Archive for the ‘PC’ Category

Beta fish is back

February 29, 2012 Leave a comment

The Windows 8 Consumer Preview is now available to download and try out for free. This is a great opportunity to take the latest public cut of Windows 8 for a spin. Whether you want to dabble with HTML5 to create Metro apps or just explore the radical UI changes, trying out Windows 8 in a VM couldn’t be simpler. If you haven’t used virtual machines before, Lifehacker has a great guide to get you started and Ars Technica has a nice series of screen grabs of the process.

I do like the cute little touches, like the Beta Fish shown during OS boot and on the default wallpaper. This admittedly cosmetic point shows the painstaking attention the whole user experience has received. I’m sure cracks in the façade will emerge, but everything I have experimented with so far, and in the earlier developer preview, has been very encouraging. Using Windows 8 does seem to require a fair amount of self-recalibration, and I found it a little tricky initially figuring out how everything worked coming from the Win95-7 world. Luckily, Ars have an excellent orientation article if you find yourself floundering a little.

It’s a shame that virtual machines (and most current physical hardware) won’t allow experimenting with the full tablet experience on offer owing to a lack of appropriate touch interfaces. That said, it is still very possible to get a good feel for Windows 8 using the familiar keyboard and mouse. Despite my early misgivings about the Metro UX I am quite excited about where Microsoft are heading with their attempted transition to the post-PC era. Time and user adoption will tell whether their gamble has paid off.

If by a man’s works ye shall know him…

February 18, 2012 2 comments

I love this business card idea – simple but creative. That and it features a text editor I couldn’t do without :-)

Distributed Computing : Folding @ Home Update

July 30, 2008 Leave a comment

I used to be quite an avid F@H contributor; I ran several folding rigs and wrote a bit of code for add-ins to the project. I have not done any real folding since late 2007 – that is, until this week. I built myself an awesome gaming system last week containing an Intel Core 2 Duo E8400 CPU, 4Gb of RAM and an ATi 4870 GPU. I am extremely pleased with the way this system works and decided to try out some of the high performance clients the F@H project offers.

These essentially break down into SMP and GPU. The former, SMP (Symmetric MultiProcessing), is a client designed to spread the unit workload over the physical (or virtual) cores available on a system, rather than using the single-threaded normal clients. Disappointingly, it doesn’t appear the cores themselves have been modified to achieve this; instead it looks like a managed series of single-threaded processes is used. I will need to investigate this further as I had some problems getting it to work – more on this hopefully at a later date. The latter takes advantage of the stream processors on graphics cards to perform folding – in essence, a great example of GPGPU. This client absolutely rocks on my ATi 4870’s 800 stream processors, currently giving me an approximate PPD (points per day) of greater than 2200 using the Gromacs GPUv2 core.
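
To illustrate the difference between the two approaches (this is purely my own sketch – none of it is F@H code, and the "chunk" maths is a stand-in), the SMP idea amounts to farming independent slices of a work unit out to separate worker processes rather than crunching them one after another:

```python
# Illustrative sketch only, not F@H code: spreading a work unit's
# independent chunks across cores with worker processes (the SMP idea),
# versus the normal client's single-threaded loop.
from multiprocessing import Pool

def simulate_chunk(n):
    """Stand-in for crunching one slice of a work unit."""
    return sum(i * i for i in range(n))

def fold_smp(chunk_sizes, workers=4):
    """SMP-style: hand chunks to a pool of worker processes."""
    with Pool(processes=workers) as pool:
        return pool.map(simulate_chunk, chunk_sizes)

def fold_single_threaded(chunk_sizes):
    """Normal-client style: one chunk after another."""
    return [simulate_chunk(n) for n in chunk_sizes]
```

Both produce identical results; on a multi-core chip the pooled version simply finishes sooner, which is exactly why a managed set of single-threaded processes still pays off.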

So, to anyone interested in Folding at Home, despite not being on the ‘officially supported’ list, the ATi 4870 is a work unit crunching beast. :)

Post XP SP3 Update problem

July 18, 2008 2 comments

Despite all the problems circulating around the web about Windows XP Service Pack 3, I thought I would go ahead anyway on a new installation. The installation part went fine and the system restarted properly with no lock-ups, stops or looping restarts. So far so good; unfortunately, I celebrated my good fortune too soon – Windows Update stopped functioning. Whilst updates were being downloaded, Windows XP would fail to actually apply them.

I did a bit of googling and whilst I didn’t find any accounts exactly matching my problem, I decided to follow the advice on this Microsoft KB article.

First of all, stop the automatic update service from the command prompt.

1. Open up Start Menu > Run

2. Type “cmd” and press Enter.

3. In the command box, type “net stop wuauserv” – you should get the following confirmation:

Now we need to reregister the DLL involved in the Windows Update process.

4. Type in “regsvr32 %windir%\system32\wups2.dll”. The following control box should pop up after a moment:

Now we need to start the update service and hopefully all should be well again.

5. Type “net start wuauserv” which should yield this confirmation:

That’s it – updates started working for me immediately afterwards. If this didn’t do the trick for you, follow the alternative methods on Microsoft’s KB article linked above.
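
The five steps above boil down to three commands run in order. As a sketch of consolidating them (my own helper, not from the KB article – and note these commands are Windows-only, so the dry-run mode just reports the sequence):

```python
# Replays steps 1-5 from the post in order. Windows-only commands;
# dry_run=True simply returns the sequence for inspection instead of
# executing anything. regsvr32's /s switch suppresses the dialog box.
import subprocess

COMMANDS = [
    ["net", "stop", "wuauserv"],                         # step 3: stop Automatic Updates
    ["regsvr32", "/s", r"%windir%\system32\wups2.dll"],  # step 4: re-register the DLL
    ["net", "start", "wuauserv"],                        # step 5: restart the service
]

def run_fix(dry_run=True):
    """Execute the repair commands, or just report them when dry_run."""
    for cmd in COMMANDS:
        if not dry_run:
            subprocess.run(cmd, check=True)
    return [" ".join(cmd) for cmd in COMMANDS]
```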

Of continuing Tuesdays…

June 17, 2008 2 comments

Yes, granted, the title of this post makes no sense and basically relates to nothing, but I thought it set up this post rather well. :) You see, this is just a brief post to apologise for the lower than usual frequency of my updates in the last couple of weeks. It’s been a bit of a busy time, so I have slightly taken my finger off the pulse of the various industries I follow. Since the majority of my posts are reactionary commentaries, rants or how-tos, it follows that I have not been writing much.

Also, the much vaunted laptop (dv2799) I bought a couple of weeks ago has now had to be swapped a third time!! The first one had a strange corruption problem in the GFX RAM and the second laptop I received had a poorly constructed USB header which shorted the whole thing out. *sigh*. Still, third time lucky I hope; it’s a lovely laptop, it’s just a shame it is let down by dubious quality control at Hewlett-Packard’s end.

So, stay tuned, I have by no means lost interest in my blog! :) For now, I leave you with the comedy genius of Simon Amstell.

Choosing your next PC’s Operating System (the 64bit fiasco)

June 5, 2008 4 comments

I am in the process of building a new gaming PC. Well, I should come clean: I have been in the process for almost 5 months now – I am mostly decided on the specification, but minor incompatibilities / annoyances cause me to stall. When this happens, real life typically takes over and by the time I look at my ‘final’ specification again, I normally rip it up and start from scratch due to new hardware being released or price drops. *exhale* I am finally on the verge of finalising the specification; the only things still holding me back are the graphics card (after news of ATi’s 4x00 series) and the amount of RAM to put into my machine. The latter is heavily influenced by the Operating System I plan to run.

There are two crucial elements to any computer system which must work in harmony: the software and the hardware. Whilst this is hardly an earth-shattering announcement, I never cease to be amazed at the backlash, in the form of blog / forum posts, from people who forget this. Realistically, when building (or buying) your next gaming PC at the moment, your choices are limited to Windows XP or Vista. Both Linux and Mac OS X suffer from platform compatibility issues with major new games, and whilst the former enjoys fair server support for online gaming, neither really has much traction in the desktop gaming market.

The difference between Vista and XP is far more than cosmetic. Whilst many are quick to criticise Vista, I am actually a fan of Microsoft’s latest Operating System for a variety of reasons. Sure, it is feature-poor compared to initial designs and has its own annoyances, but the number of extra features and advances makes it decisively the better Operating System. There is a caveat: for Vista to run comfortably for gaming purposes, it needs at least 1Gb of RAM for itself. This on its own is no big deal – RAM is extraordinarily cheap at the moment – however, the issue of platform (32bit/64bit) is now rearing its ugly head.

64bit computing is nothing new; in fact, AMD processors have had 64bit extensions (called x86-64) since the K8 platform back in 2003. Intel did not catch up (despite starting earlier than AMD) and produce viable 64bit x86 chips until the Pentium 6xx series (late 2004), having stumbled initially with the IA-64 architecture developed for their Itanium platform.

Given this was four years ago, why are we not all running 64bit XP or Vista? The answer is simple: in the same way that driver support initially crippled Vista’s adoption, 64bit drivers are fairly few and far between. This means a lot less hardware will run properly under a 64bit Operating System. Given this situation, why do we even care about 64bit computing? Why is it not relegated to high-end computing and server farms? Mathematics.

Unfortunately, with a 32bit Operating System, there is a mathematical limitation to the amount of memory the system can address. At most, Vista (or XP) in 32bit will only address 4Gb of total RAM, and this includes both the graphics card’s memory and the main system memory. This brings my point about Vista comfortably using one Gb of RAM all by itself into sharp focus. Whilst, yes, RAM is cheap, there is something about me that dislikes buying 4Gb of RAM (to enable dual channel mode) only to have a quarter of it not accessible by the system. I wrote about this in detail in a previous post.

So what is the solution? Whilst I am a huge fan of Vista (and have recently bought a Vista laptop), I do not think it is suitable for desktop gaming. With Windows XP, I have had fairly bloated installs, drivers and runtimes loaded, use no more than 300Mb of RAM, which realistically enables most PC gamers to get away with 2Gb of system RAM with no perceptible loss in gaming performance. This unfortunately would not be the case for a similar system running Vista and, as such, scuttles Vista for this market in my humble opinion.

Categories: Gaming, Linux, PC, Rant, Windows

Move over EEE PC, Say Hello to the EBOX

May 31, 2008 Leave a comment

Well, thanks to a leak covered by both Electronista and the Inquirer, we know a little more about the ASUS PC that I blogged about back in March. It will be called the EBOX and still bears a remarkable similarity to a Nintendo Wii. Apart from this, not much has changed: we have no new release images nor any concrete facts about the CPU. It is likely to be an Intel Atom, based on murmurings surrounding the series 900 second revision (or possibly third generation EEE PCs), which would make sense. Beyond that, the unit is rumoured to contain a healthy 2Gb of RAM and a traditional spindle hard disk drive with purportedly 160Gb of storage. All of this will run ASUS’s custom flavour of Xandros. Given that this is the same tailored OS that ASUS have used in the EEE PC laptop range, it is likely we will see a similar GUI / interface for the EBOX.

I am a little puzzled as to why ASUS would move away from solid state disks for the EBOX. Whilst power considerations no longer have as much importance on a desktop platform, the EBOX is unlikely to attract the ‘one PC’ crowd as it will not replace the traditional computer in the home. I think it could potentially fit very well as a satellite computer in a kitchen or other room in the house where basic surfing / computer usage is required, but this only reinforces the question: why did ASUS go for a traditional hard drive? Satellite systems are typically just that: lightweight machines which almost act like thin clients for other machines, storage or services on the home (or office) network. As such, and given the EBOX is not designed to be portable, storage capacity is not as important a commodity as it is on the EEE PC laptop.

Well, enough supposition for now; the official release / preview is apparently scheduled for the 3rd of June, and I daresay all questions will be answered then.

Categories: F/OSS, News, PC

Vista SP1 and the Red Herring (+ breaking the 32bit 4Gb limit)

May 29, 2008 6 comments

We all knew it was looming: the mathematical limit to address referencing in 32bit computing. A 32bit number can only be between 0 and 4,294,967,295, which neatly adds up to 4Gb. What this means is that, on existing architectures, a program (or Operating System) will not be able to address more than this number of bytes of system RAM via the existing scheme called byte-addressed memory allocation.
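
The arithmetic is simple enough to write down. With byte-addressed memory, every byte needs its own address, and a 32bit address can take one of 2^32 values:

```python
# The arithmetic behind the 4Gb ceiling: 32-bit byte addressing gives
# 2**32 distinct addresses, one byte per address.
ADDRESS_BITS = 32
max_address = 2 ** ADDRESS_BITS - 1   # 4,294,967,295 – the largest 32-bit number
total_bytes = 2 ** ADDRESS_BITS      # one byte per address
total_gb = total_bytes // 2 ** 30    # = 4, the familiar 4Gb limit
```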

What this means, for those among us who do not speak geek, is that a system which is built or shipped with 4Gb of RAM (and in some other cases*) will not be able to fully utilise all of that space.

Let’s take a trip back in history and imagine a room with a cupboard containing 256 drawers. Each drawer can hold one bit of binary information and is administered by a librarian. Any time anyone wants a piece (or pieces) of information, they have to ask the librarian. What I am describing here is the era of 8bit computing, circa the late 1970s / early 80s, with the cupboard representing system memory and the librarian representing the Operating System’s memory management system. During day-to-day running of the system, the librarian takes data in and returns data to people (program threads) from the corresponding drawers where the information is stored. Everything works, everyone is happy.

Now what happens if we introduce a second cupboard containing another 128 or 256 drawers? The librarian can only keep track of information stored in the first 256 drawers and, as a result, nothing can be stored in or retrieved from the newly added cupboard; in effect, it does not exist. Time to get a new librarian, i.e. go to 64bit computing (or, in this example, replace the 8bit librarian with a swanky 16bit one – who will ever use 65,536 bits of RAM? :D )

But wait, there is more… I read today that Windows Vista SP1 changes (depending on hardware configuration) the total amount of displayed RAM from 3.5Gb (the current RTM limit when 4Gb is put in the machine) to the full 4Gb, although this still does not help, given the limitation previously discussed. But this made me curious: if the Operating System could see the RAM, then surely it was not a BIOS / fundamental mathematical limitation. Turns out I was at least half right…

You see, although the fundamental mathematical limitation cannot be breached, there is a rather interesting technique called Physical Address Extension. Using this process, a 32bit Windows system can address more than 4Gb of RAM, up to a (present) maximum of 128Gb. To explain what Physical Address Extension (PAE) is, let’s go back to the previous example and introduce a new figure – an administrator.

The role of this new entity is to allocate and manage the time of their underling. Let’s also assume we are still running an 8bit system (with the 256-drawer limit) but now have 1024 drawers of memory, i.e. four times the mathematical limit. On the face of it, the extra memory is invisible to the librarian; however, the administrator is smart enough to know both about the extra memory and who (i.e. what program) is currently using what amount of it. As such, any person (program) can request the full mathematical limit of 256 drawers for their own use at the same time as another person (and another… etc.) requests more memory. The administrator can instruct the librarian which series of drawers to use per person (program).

This is loosely referred to as 36bit computing and, as the non-power-of-2 number suggests, it is a bit of a tweak. The physical address size was increased (on 32bit processors) from 32 to 36 bits back in the days of the Pentium Pro (circa 1997) and most modern CPUs have maintained this legacy. It is important to point out that this does not make all 32bit processors 36bit processors, as the change happened in the MMU (memory management unit). Modern Operating Systems use page tables to store information about the virtual memory system and allocate it based on processes’ requirements. In effect, they act like the administrator from my trivialised example and allow multiple processes to benefit from a pool of memory which traditional 32bit systems (without PAE) could not.
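
To make the analogy concrete, here is a toy model – my own illustration, nothing like real MMU code – in which each ‘program’ is capped at the librarian’s 8bit limit of 256 drawers, while the administrator’s page tables map those per-program drawers onto a physical pool four times that size:

```python
# Toy model of the librarian/administrator (PAE) analogy: each process
# may address at most 256 "drawers", but the administrator maps those
# virtual drawers onto a 1024-drawer physical pool.
class Administrator:
    PER_PROCESS_LIMIT = 256  # what the 8bit librarian can address

    def __init__(self, physical_drawers=1024):
        self.physical_drawers = physical_drawers
        self.page_tables = {}    # process name -> {virtual drawer: physical drawer}
        self.next_free = 0

    def allocate(self, process, count):
        """Give a process `count` drawers, never beyond the per-process limit."""
        if count > self.PER_PROCESS_LIMIT:
            raise MemoryError("no single process can exceed the 8bit limit")
        if self.next_free + count > self.physical_drawers:
            raise MemoryError("physical pool exhausted")
        table = {virtual: self.next_free + virtual for virtual in range(count)}
        self.next_free += count
        self.page_tables[process] = table
        return table
```

Two programs can each hold their full 256 drawers simultaneously, since the administrator hands them different physical ranges – but no single program can ever be given more than 256, which mirrors the first caveat below.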

I know what you are thinking: you are rejoicing at being able to avoid the negative aspects of migrating to 64bit computing. But hang on, there are a couple of important caveats. Firstly, each process (person in our example) can only access a maximum of the mathematical limit of RAM. That means, in a system with 16Gb of RAM, you could quite easily have 3 or 4 processes each taking up 4Gb, but no one process taking up 8 or 16Gb. The other bad point is that it is not supported** in Vista or XP. In fact, to use such a feature, you would need to be running a Server Operating System from Microsoft or a Linux equivalent. Interestingly enough, Linux has contained support for PAE since kernel version 2.6, although I will not discuss it further in this post.

Presently, the only Windows Operating Systems with suitable (or rumoured) PAE support are:

Windows 2000: Datacenter Server and Advanced Server Editions

Windows Server 2003: Enterprise and Datacenter Editions

Windows Server 2008: Enterprise and Datacenter Editions

As you can see, none are particularly home-desktop friendly. So, despite Vista displaying the correct amount of RAM in Service Pack 1, it is still fundamentally constrained by the 32bit mathematical limit, despite Microsoft having the technology to at least improve on the functionality of such systems.

On a side note, I brought this up with a few people at my head office. I work for a large UK retail company that sells PCs and laptops. I was surprised to see, when our first 4Gb models came into the stores a few months ago, that they were running 32bit editions of Vista. The UK is not as litigious as the United States, but I can’t help wondering how long it will be before the lawsuits start flying. After all, it is misrepresentation in my book to sell something that, due to a software shortcoming, can never be fully utilised to the specification it was advertised at. Particularly since an alternative is available to OEMs and yet all retailers, not just the one I work for, seem to be taking a cavalier attitude towards this.

*The total amount of addressable space inside a 32bit system must add up to 4096Mb; this includes system and video RAM, so if you have an all-singing, all-dancing SLI graphics setup with 2Gb of graphics RAM, the total amount of system RAM you will be able to address is around 2Gb.
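
That footnote is just a subtraction – a rough sketch of it follows, though note this simplification ignores the other memory-mapped devices which in practice also carve chunks out of the address space:

```python
# Rough sketch of the footnote's arithmetic: the 4096Mb of 32bit address
# space is shared between video RAM and system RAM, so a big graphics
# card eats directly into how much system RAM is addressable.
TOTAL_ADDRESS_SPACE_MB = 4096

def addressable_system_ram_mb(video_ram_mb):
    """System RAM left addressable after video RAM claims its share."""
    return TOTAL_ADDRESS_SPACE_MB - video_ram_mb
```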

**Actually, this is not strictly true: ever since Windows XP Service Pack 2, Microsoft has used PAE for security purposes, coupled with the NX bit. This is a hardware security feature built into a processor which allows program and system developers greater control over what they designate to be executable and non-executable memory space. However, Microsoft has set a fundamental limit of 4Gb on the amount of RAM usable by home versions of its 32bit Operating Systems, regardless of the fact that the technology to increase this is in place.

The 2k bug

May 12, 2008 1 comment

Whilst it seems the Internet enjoys a good Microsoft Vista bashing (see my previous post on the topic), research came out today suggesting Windows 2000, an eight-year-old operating system that recently entered its long-term support phase with Microsoft, is more ‘secure’ than Windows Vista. (Cue fanboy and antiboy posts.)

But this is rather misleading. Let us not forget, Windows 2000 was released in February 2000, a dark era when firewalls, security software and Windows Update were treated with suspicion previously reserved for black magic. Ok, so maybe I am exaggerating slightly, but back then the average PC had either a Pentium 2 or 3 processor between 600Mhz – 1.2Ghz, between 32-128Mb of RAM and a 20Gb hard disk, and Windows 2000 was aimed at the business market, not consumers, who had the privilege of running Windows ME (let the justified ME bashing commence). But we are still missing the point: now, the only users that run Windows 2000 (which accounted for about 2% of all Internet traffic in March 2008) are comfortable power users (like Steve Gibson) or those with old hardware (e.g. the Third World etc.). As such, it is not worth the malware authors’ time to target such a small percentage of the userbase when they are more likely to snare vulnerable XP or Vista users.

Worse still, serious doubts have been raised over the validity of this study, given PC Tools did not scientifically determine the state of key security features within the operating systems, like Windows Vista’s UAC, or even which service packs were installed on the computers. As noted by Ars Technica, often the first action of typical malware is to download further target package(s) onto a system immediately after it has been compromised by the usually relatively small initial exploit. This could mean that their numbers are greatly misleading, when three or four ‘infections’ could actually be a single instance of malware.

The only way to scientifically conduct such a test would be with three virtual machines – one running Windows 2000, one Windows XP and one Vista – each running with a comparable set of security tools and the latest patches. That way, after each exposure, the virtual machine could be examined to determine if the exploit was successful and, if so, the degree to which the target machine was compromised. At the end of each experiment, the virtual machine is ‘switched off’ without writing the changes to its virtual disk, then restarted to test the next exploit. Using this methodology, all exploits can be tested equally and methodically, and various configuration permutations can also be tried (e.g. operating systems with only default security measures etc.)

Let us also not forget there is no way to tell whether these threats are serious silent drive-by-download style exploits (which would constitute a serious threat) or the result of user ignorance, which even the most secure operating systems and security applications cannot guard against. Playing Devil’s advocate, I can see a case that unscientific tests like these better represent real-world conditions; however, they cannot be used to judge the reliability or security of operating systems, nor the users using them, as no conditions or variables have been held constant. As such, unfortunately, these results have no validity as far as I am concerned.

ID Officially Announce Doom 4

May 8, 2008 2 comments

In a somewhat surprising move, ID Software today announced they had begun development of Doom 4. This is not particularly earth-shattering in itself, given the spate of recent rumours to this effect; however, the reason it surprised me is that ID Software are already fairly far into a project named ‘Rage’, which appears to be a post-apocalyptic vehicle-slash-first-person shooter based on ID Software’s Tech 5 engine, currently in development. Whilst it would not be unusual for ID to be working on two games at the same time using the same engine (Quake 4 / Doom 3, anyone?), given the rumours circulating about a new Quake game, I didn’t think we would be seeing another Doom game so soon.

Judging by the Careers page, the extra staff ID Software are taking on for this project will require ‘applicable skills’ for developing for the PC, Xbox 360 and PS3 platforms, indicating ID Software are looking to make this a multi-platform game, in much the same way as Doom 3, which was also released on the Xbox. This is, however, just early supposition on my part at this stage.

Doom 3 was criticised for being too dark, too broody, too linear and having too little variation. I disagree, having found it atmospheric and a lot of fun to play, but what worries me is where ID takes us from here. Quake 4 didn’t really do it for me; I preferred Doom 3 for a number of reasons. The story was simpler and more elegant, as was the environment. Whilst being a colonial marine and interacting with other marines and military equipment was fun in Quake 4, it felt a little overdone and I never really bought into the whole Quake universe past Quake 2. There were, however, moments which I genuinely enjoyed, not just because they brought something fresh into the ID-style FPS genre but also because they were quite unexpected. (Those that have completed Quake 4 will know of the Hospital section I am referring to!)

Doom 3 really was a no-brainer in that it was classic Doom-style gameplay with a modern engine; I will be decisively underwhelmed if ID are planning to just update the graphics for Doom 4.
