I managed to acquire, for the price of a nice lunch, a brand new Elonex Artisan LX media center a couple of days back. I was initially very excited because up to then I had still been running my first media center, which was really just an experiment, built from scratch using mostly old components I had around my place. A year and a bit on, I am firmly hooked on a PC-based PVR system as the cornerstone of my entertainment setup. That first build contained an Athlon 2600+ processor with 512MB of DDR, coupled with a Hauppauge DVB-T tuner and an 80GB drive for recordings, running the open source MediaPortal software. So as you can see, there was plenty of room for improvement.
This was the first time I had really tinkered with the Windows Media Center range of operating systems that Microsoft produces, and I went in with few expectations, apart from wanting an experience at least comparable in functionality and flexibility to the one I have enjoyed with MediaPortal.
The first thing that struck me was how fickle Windows Media Center 2005 is, even with all the Rollups (essentially what Microsoft calls Service Packs for the Media Center OS) installed. Wikipedia sums up the ‘capabilities’ of WMC 2005:
‘Media Center originally had a limitation of 1 analog tuner, but was raised to 2 analog tuners with Media Center 2005. With Update Rollup 1 for Media Center 2005, support for a digital tuner was added, but an analog tuner must still be present for the digital tuner to function. With Rollup 2, up to 4 TV tuners can be configured (2 analog and 2 HDTV). All the tuners must use the same source, for example they must all be off an aerial or a set-top box using the same guide data, you cannot mix Sky Digital and DVB-T for example.’
XP Media Center really shows its age here – I do not watch any analogue transmissions, so for a media center to require a legacy piece of hardware just to access DVB (digital) seems preposterous. But that was not the worst thing! Windows Media Center 2005 cannot pull EPG data OTA (over-the-air), instead requiring an overly elaborate system that relies on a permanent, always-on Internet connection. This also raises some privacy concerns, as ‘anonymous’ data – not entirely anonymous, since Microsoft asks for your postcode during setup – is fed back to Microsoft, which can include recording and watching trends and general EPG usage. Until now, my media center has not been networked. Considering it sits in the opposite corner of my house, and I do not stream my recordings or have formal media shares, I never felt the need. It was nice to just have a static, secure system without any security programs or periodic updates – now security monitoring of my media center has been added to my list of digital chores.
Nonetheless, I was determined to give it a fair go, so I added a wifi adaptor, installed some plug-ins and configured everything. After spending eight hours getting everything working, playing around and testing… I went back to my custom build. Not all the problems can be laid squarely at Microsoft’s feet, however. Elonex declared bankruptcy shortly after launching this range, and the malicious part of me can see why, if this media center is the sum total of their expertise.
Whilst the case looks rather nice from the outside, the hardware and the design of the internals are what really let it down. The only element Elonex got right was the noise (or lack thereof) – the media center barely gives out a murmur when idle, thanks to a single fan housed inside the power supply. It runs at 690rpm, drawing air over the CPU heatsink (which has four heat pipes) and directly out the side of the case. However, I stressed ‘when idle’ for a reason: the moment the media center does anything, the incredibly noisy hard drive starts very audibly clicking and crunching away, and it completely lets the machine down.
However, that is not the worst thing about this media center. Because there is only one very slow fan, the airflow in the case is restricted to circulating around the motherboard tray and the processor, then out through the power supply. The hard drive and the PCI / AGP cards are completely neglected. This point was slammed home when the hard drive consistently reported temperatures from the high 50s up to 62 degrees Celsius! Worse still, when I idled the system, that heat did not dissipate. The hard drive is locked into place with a pretentious plastic locking mechanism which neither improves the accessibility of the drive bay nor reduces the vibrations from the drive. There is no thermally viable contact between the hard drive and the case, so the hard drive is left smouldering away with no way to cool down – even at idle there was next to no drop in temperature. There is a valid point that maintaining electronic components at a constant temperature prolongs their life by avoiding repeated thermal cycling (i.e. heating and cooling); however, the fact remains that 60+ degrees Celsius is far too hot for a hard drive. Although my brief research did not yield any definitive threshold, most sources agree that 50-55 degrees Celsius is about the absolute maximum recommended operating temperature.
Couple this practically zero thermal conduction with a lack of airflow and you have a recipe for a very short hard drive life. Even worse, this thermal issue was not limited to the hard drive: the south bridge and graphics heatsinks were equally poorly cooled and got unpleasantly hot to the touch.
Worst of all, it is just slow. CPUID and the BIOS disagreed with each other about the exact Intel processor that powers the system; I believe it to be either an Intel Pentium 4 530 (at 3.06GHz) or a Celeron D 345. There is no way the much older Athlon 2600+ with the same amount of RAM should be outperforming this setup, and yet it does so without breaking a sweat.
All in all, very disappointing – a remarkable demonstration of technical ignorance on the part of Elonex. But hey, I didn’t pay for it, and now I have an extra DVB-T tuner back in my original, self-built machine.
Design (cosmetic) : 8/10 – Pleasing, with a nice Hi-fi look.
Design (technical) : 2/10 – Poor components poorly arranged.
Cooling : 6/10 – Great CPU and power supply cooling, but everything else is woefully neglected.
Acoustics : 6/10 – Silent until it has to touch the hard drive; still a good effort though.
Connectivity : 8/10 – Lots of connectors for digital audio and video.
Capacity : 5/10 – 200GB hard drive with a portion taken for recovery. I wouldn’t trust it though, and by modern standards it is rather anaemic.
Overall : 2/10 – Great for free, if I paid anything for it I would have been annoyed.
Despite all the problems circulating the web about Windows XP Service Pack 3, I thought I would go ahead anyway with a new installation. The installation went fine and the system restarted properly with no lock-ups, stops or looping restarts. So far so good; unfortunately, I celebrated my good fortune too soon – Windows Update stopped functioning. Whilst updates were being downloaded, Windows XP would fail to actually install them.
I did a bit of googling and whilst I didn’t find any accounts exactly matching my problem, I decided to follow the advice on this Microsoft KB article.
First of all, stop the Automatic Updates service from the command prompt.
1. Open up Start Menu > Run
2. Type “cmd” and press Enter.
3. In the command box, type “net stop wuauserv”, and you should get the following confirmation:
Now we need to re-register the DLL involved in the Windows Update process.
4. Type in “regsvr32 %windir%\system32\wups2.dll”. The following confirmation box should pop up after a moment:
Now we need to start the update service and hopefully all should be well again.
5. Type “net start wuauserv” which should yield this confirmation:
That’s it – updates started working for me immediately afterwards. If this didn’t do the trick for you, follow the alternative methods in Microsoft’s KB article linked above.
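For convenience, the steps above can be wrapped in a small script. This is a minimal sketch in Python, assuming a Windows machine and an elevated prompt – the service name, DLL path and regsvr32 /s (silent) flag come from the steps above; everything else is just scaffolding, and dry_run=True lets you review the commands before running anything.

```python
import os.path
import subprocess

# The three repair steps from the post, kept as a list so they can be
# reviewed before anything is run. Windows-only; needs an elevated prompt.
REPAIR_STEPS = [
    ["net", "stop", "wuauserv"],        # stop the Automatic Updates service
    # expandvars resolves %windir% on Windows; /s makes regsvr32 silent
    ["regsvr32", "/s", os.path.expandvars(r"%windir%\system32\wups2.dll")],
    ["net", "start", "wuauserv"],       # restart the service
]

def run_repair(dry_run=True):
    """Run the steps in order; with dry_run=True, just print them."""
    for cmd in REPAIR_STEPS:
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)  # stop on the first failure

if __name__ == "__main__":
    run_repair(dry_run=True)
```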
I am in the process of building a new gaming PC. Well, I should come clean, I have been in the process for almost 5 months now – I am mostly decided on the specifications but minor incompatibilities / annoyances cause me to stall. When this happens, real life typically takes over and by the time I look at my ‘final’ specification again, I normally rip it up and start from scratch due to new hardware being released or price drops. *exhale* I am finally on the verge of finalising the specification, the only things still holding me back are the graphics card (after news of ATi’s 4xx0 series) and the amount of RAM to put into my machine. The latter is heavily influenced by the Operating System I plan to run.
There are two crucial elements to any computer system which must work in harmony: the software and the hardware. Whilst this is hardly an earth-shattering announcement, I never cease to be amazed at the backlash, in the form of blog and forum posts, from people who forget it. Realistically, when building (or buying) your next gaming PC at the moment, your choices are limited to Windows XP or Vista. Both Linux and Mac OS X suffer from platform compatibility issues with major new games, and whilst the former enjoys fair server support for online gaming, neither really has much traction in the desktop gaming market.
The difference between Vista and XP is far more than cosmetic. Whilst many are quick to criticise Vista for a number of reasons, I am actually a fan of Microsoft’s latest operating system. Sure, it is feature-poor compared to the initial designs and has its own annoyances, but the number of extra features and advances makes it decisively the better operating system. There is a caveat: for Vista to run comfortably for gaming purposes, it needs at least 1GB of RAM for itself. On its own this is no big deal – RAM is extraordinarily cheap at the moment – however, the issue of platform (32-bit versus 64-bit) is now rearing its ugly head.
64-bit computing is nothing new; in fact AMD processors have had 64-bit extensions (called x86-64) since the K8 platform back in 2003. Intel, despite starting on 64-bit earlier than AMD, did not catch up and produce viable 64-bit desktop chips until the Pentium 6xx series (late 2004), having initially bet on the incompatible IA-64 architecture developed for its Itanium platform.
Given this was four years ago, why are we not all running 64-bit XP or Vista? The answer is simple: in the same way that driver support initially crippled Vista’s adoption, 64-bit drivers are fairly few and far between, which means a lot less hardware will run properly under a 64-bit operating system. Given this situation, why do we even care about 64-bit computing? Why is it not relegated to high-end computing and server farms? Mathematics.
Unfortunately, with a 32-bit operating system there is a mathematical limitation on the amount of memory the system can address: at most, Vista (or XP) in 32-bit can address 4GB in total, and this includes both the graphics card’s memory and the main system memory. This brings my point about Vista comfortably using 1GB of RAM all by itself into sharp focus. Yes, RAM is cheap, but there is something about buying 4GB of RAM (to enable dual-channel mode) only to have a quarter of it inaccessible to the system that I dislike. I wrote about this in detail in a previous post.
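The arithmetic behind that 4GB ceiling is simple enough to sketch; the 512MB graphics card below is a hypothetical example for illustration, not any specific product:

```python
# A 32-bit pointer has 2**32 possible values, one per addressable byte.
total_bytes = 2**32
print(total_bytes / 2**30)        # 4.0 GiB -- the hard ceiling

# Everything memory-mapped into that space (graphics memory, PCI
# apertures, BIOS) displaces system RAM. With a hypothetical 512MB
# graphics card, at most this much of a 4GB kit remains visible:
usable = (2**32 - 512 * 2**20) / 2**30
print(usable)                     # 3.5 GiB
```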
So what is the solution? Whilst I am a huge fan of Vista (and have recently bought a Vista laptop), I do not think it is suitable for desktop gaming. With Windows XP, I have had fairly bloated installs – drivers and runtimes loaded – use no more than 300MB of RAM, which realistically lets most PC gamers get away with 2GB of system RAM with no perceptible loss in gaming performance. This would not be the case for a similar system running Vista, which unfortunately scuttles Vista for this market, in my humble opinion.
Whilst it seems the Internet enjoys a good Microsoft Vista bashing (see my previous post on the topic), research came out today suggesting Windows 2000, an eight-year-old operating system that recently entered its long-term support phase with Microsoft, is more ‘secure’ than Windows Vista. (Cue fanboy and anti-fanboy posts.)
But this is rather misleading. Let us not forget that Windows 2000 was released in February 2000, a dark era when firewalls, security software and Windows Update were treated with suspicion previously reserved for black magic. OK, so maybe I am exaggerating slightly, but back then the average PC had a Pentium 2 or 3 processor running between 600MHz and 1.2GHz, between 32 and 128MB of RAM and a 20GB hard disk, and Windows 2000 was aimed at the business market, not at consumers, who had the privilege of running Windows ME (let the justified ME bashing commence). But we are still missing the point: nowadays the only users running Windows 2000 (which accounted for about 2% of all Internet traffic in March 2008) are either comfortable power users (like Steve Gibson) or those with old hardware (in the developing world, for example). As such, it is not worth the malware authors’ time to target such a small percentage of the userbase when they are far more likely to snare vulnerable XP or Vista users.
Worse still, serious doubts have been raised over the validity of this study, given that PC Tools did not scientifically determine the state of key security features within the operating system, like Windows Vista’s UAC, or even which service packs were installed on the computers. As noted by Ars Technica, often the first action of typical malware is to download further packages onto a system immediately after it has been compromised by the (usually relatively small) initial exploit. This could mean the numbers are greatly misleading, when three or four ‘infections’ could actually be a single instance of malware.
The only way to scientifically conduct such a test would be with three virtual machines – one running Windows 2000, one Windows XP and one Vista – each running with a comparable set of security tools and the latest patches. That way, after each exposure, the virtual machine could be examined to determine whether the exploit was successful and, if so, the degree to which the target machine was compromised. At the end of each run, the virtual machine is ‘switched off’ without writing the changes to its virtual disk and restarted to test the next exploit. Using this methodology, all exploits can be tested equally and methodically, and various configuration permutations can also be tried (e.g. operating systems with only default security measures).
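The snapshot-and-restore loop I am describing can be sketched as a toy harness. Everything below – the SnapshotVM class, the exploit names, the vulnerability sets – is hypothetical, standing in for a real hypervisor API, but it shows why the methodology avoids double-counting:

```python
class SnapshotVM:
    """Toy stand-in for a virtual machine restored from a clean snapshot."""
    def __init__(self, os_name, vulnerable_to):
        self.os_name = os_name
        self.vulnerable_to = set(vulnerable_to)  # exploits that would succeed

    def expose(self, exploit):
        # In a real harness this would run the exploit against the guest and
        # then inspect its disk; here we just report whether it would work.
        return exploit in self.vulnerable_to

def run_trials(vms, exploits):
    # Every exploit meets a pristine machine: no cross-contamination, so
    # three 'infections' can never be one piece of malware counted thrice.
    return {(vm.os_name, e): vm.expose(e) for vm in vms for e in exploits}

# Hypothetical data purely for illustration:
machines = [SnapshotVM("Windows 2000", {"exploit-a"}),
            SnapshotVM("Windows Vista", set())]
results = run_trials(machines, ["exploit-a", "exploit-b"])
```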
Let us also not forget that there is no way to tell whether these threats were serious, silent drive-by-download style exploits (which would constitute a real danger) or the result of user ignorance, which even the most secure operating systems and security applications cannot guard against. Playing Devil’s advocate, I can see the case that unscientific tests like these better represent real-world conditions; however, they cannot be used to judge the reliability or security of operating systems, nor of the users using them, as no conditions or variables have been held constant. As such, unfortunately, these results have no validity as far as I am concerned.
I was bemused to read on BBC News earlier that a trivially simple ploy stung half a million file sharers. The concept is nothing new, having been started a fair few years ago by virus and malware writers and adopted by copyright enforcement agencies in recent years. Due to the anatomy of a decentralised file sharing system, anyone can seed a file. Once this seeded file is made available to the peer-to-peer network, it is either advertised to a localised central file distributor (referred to as a Super Node or server) or found by a spider search query run by another user logged into the network. If these files are topical or sought after, they can be transferred to a different node (client) rapidly. There they are stored in the second user’s ‘shared’ directory, where more people can download them.
Once a seeded file has been downloaded and spread over a few tens of nodes, the rate at which it can be downloaded by others increases almost exponentially, with a cascade-like effect. Other people on the peer-to-peer network are lured into downloading the file based on the number of people who already have it, assuming it must therefore be genuine and comparatively quick to obtain. Couple this with a topical or sought-after song, album or file aimed at the masses (which statistically would include a fair percentage of PC-illiterate users and those with a penchant for agreeing to every pop-up they come across) and these files explode across networks.
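That cascade is easy to model crudely. The numbers below are invented purely for illustration, assuming every node holding the file passes it to a fixed number of new peers per round:

```python
def nodes_with_file(rounds, seeds_per_round=2, start=1):
    """Toy model: each holder passes the file to seeds_per_round new peers per round."""
    n = start
    for _ in range(rounds):
        n += n * seeds_per_round  # every holder recruits more holders
    return n

# One seeder, each copy attracting two new downloads per round:
for r in (1, 5, 10):
    print(r, nodes_with_file(r))  # growth is (1 + seeds_per_round) ** rounds
```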
The malicious file in question appears to have masqueraded as an MP3 by Girls Aloud. Given that, on running, the file pops up a message saying the computer requires a codec to play the song and tries to direct you to a website to download it, you would hope most computer users would stop and re-examine what they had just downloaded. Those who brazenly proceeded and downloaded the malicious ‘codec’ package had spyware installed on their system which would ‘bombard’ them with pop-ups. The downloaded file would also spawn copies of itself within the user’s shared folder under different names, to try to make itself attractive to a greater audience.
But what happened? How were people tricked into downloading an MP3 file but ended up running a malicious program? The answer lies in the file type. Broadly speaking, there are two ways in which a file can be opened:
1) via script or binary execution (e.g. .exe, .com, .vbs, .java, .scr … and some others)
2) by being read by an external application (e.g. .txt, .doc, .wav, .mpg, .avi … and MANY more.)
MP3 files (MPEG-1 Audio Layer 3) are the latter. Upon execution, Windows searches through its list of known file extensions, stored in the registry, to see what it should do. It instantly finds the entry for MP3 and sees that this type of file is handled by a media player such as Windows Media Player, WinAMP or iTunes. Windows then executes the media player which, on loading, opens the MP3 file specified in the command-line argument, decodes a block, fills its buffer and starts to play. Unless a clever trick like a buffer overflow is used – these have historically been responsible for security breaches in various Windows programs, as well as enabling console homebrew development – this renders all ‘read’ type files harmless*. As such, we have to look elsewhere for the source of this problem.
That brings us nicely to the point I wanted to raise in this post: file extensions and, more specifically, security vulnerabilities in their implementation. Recent versions of Windows, from XP onwards (and possibly earlier, I cannot remember), have hidden the file extension by default, leaving the user to distinguish between file types by their icons. Whilst at times this is both cleaner looking and more functional, it presents an interesting security problem: what if there are two file extensions? Windows will quite happily hide the final .xxx of a file name, leaving the first ‘extension’ on display, despite the fact that, when executing, Windows ignores everything before that final .xxx. As a result, if you name a file SomethingInteresting.mp3.exe, in its default state Windows will happily display it as SomethingInteresting.mp3 but will execute it as an EXE when double-clicked. Obviously, if you queried the file by right-clicking on it and selecting Properties you would immediately be told what type of file it is, but most people will take the file at face value.
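You can see the double-extension trick in a couple of lines of Python, using the standard library’s os.path.splitext, which, like Windows, only honours the final extension:

```python
import os.path

filename = "SomethingInteresting.mp3.exe"

# splitext splits on the LAST dot -- the same rule Windows uses to
# decide how to open a file.
shown, real = os.path.splitext(filename)
print(shown)  # SomethingInteresting.mp3  <- what Explorer displays with extensions hidden
print(real)   # .exe                      <- what actually runs on double-click
```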
Luckily there is a very simple way to guard against such black magic. In Windows XP and Vista**, open the file browser, go to the Tools menu and select Folder Options.
In this dialog, uncheck ‘Hide extensions for known file types’ and click Apply, followed by ‘Apply to All Folders’.
And that’s it! A simple check box and some common sense now separates you from being lured into downloading fake or malicious files.
* Some files, such as certain movie containers, can direct the media player or operating system to web pages. It is not just media files that are vulnerable, but that is a completely different topic.
** In Vista you may have to enable the classic menu