
The 2k bug

Whilst the Internet seems to enjoy a good Microsoft Vista bashing (see my previous post on the topic), research released today suggests that Windows 2000, an eight-year-old operating system that recently entered Microsoft's long-term support phase, is more ‘secure’ than Windows Vista. (Cue fanboy and antiboy posts.)

But this is rather misleading. Let us not forget that Windows 2000 was released in February 2000, a dark era when firewalls, security software and Windows Update were treated with the suspicion previously reserved for black magic. OK, so maybe I am exaggerating slightly, but back then the average PC had a Pentium II or III processor running at 600 MHz–1.2 GHz, between 32 and 128 MB of RAM and a 20 GB hard disk, and the operating system was aimed at the business market, not at consumers, who had the privilege of running Windows ME (let the justified ME bashing commence).

We are still missing the point, though: the only users who run Windows 2000 today (it accounted for about 2% of all Internet traffic in March 2008) are either comfortable power users (like Steve Gibson) or those stuck with old hardware (in the developing world, for example). As such, it is not worth malware authors’ time to target such a small percentage of the user base when they are more likely to snare vulnerable XP or Vista users.

Worse still, serious doubts have been raised over the validity of this study, given that PC Tools did not scientifically determine the state of key security features within the operating system, such as Windows Vista’s UAC, or even which service packs were installed on the computers. As Ars Technica notes, the first action of typical malware is often to download its target package(s) onto a system immediately after it has been compromised by a relatively small initial exploit. This means the numbers could be greatly misleading, since three or four ‘infections’ may actually be a single instance of malware.

The only way to conduct such a test scientifically would be with three virtual machines, one running Windows 2000, one running Windows XP and one running Vista, each with a comparable set of security tools and the latest patches. That way, after each exposure, the virtual machine could be examined to determine whether the exploit was successful and, if so, the degree to which the target machine was compromised. At the end of each run, the virtual machine is ‘switched off’ without writing the changes to its virtual disk and restarted to test the next exploit. Using this methodology, all exploits can be tested equally and methodically, and various configuration permutations can also be tried (e.g. operating systems with only default security measures).
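The test harness described above can be sketched in a few lines of Python. This is only an illustration of the loop's logic, not a real implementation: the snapshot restore and exploit delivery are stubbed out as dictionary operations, and every name here (including the toy exploit) is hypothetical.

```python
from copy import deepcopy

# Stubbed snapshot: each trial starts from an identical clean state,
# so no compromise can leak from one exploit test into the next.
CLEAN_SNAPSHOT = {"os": None, "patched": True, "compromised": False}

def run_trial(os_name, exploit):
    vm = deepcopy(CLEAN_SNAPSHOT)   # "restore" the clean snapshot
    vm["os"] = os_name
    exploit(vm)                     # expose the VM to the exploit
    result = vm["compromised"]      # examine the machine afterwards
    return result                   # vm is discarded: changes never hit "disk"

def run_matrix(os_names, exploits):
    # Test every exploit against every OS under identical conditions.
    return {os_name: [run_trial(os_name, e) for e in exploits]
            for os_name in os_names}

# Toy exploit (hypothetical): only succeeds against Windows 2000.
def sample_exploit(vm):
    if vm["os"] == "Windows 2000":
        vm["compromised"] = True

results = run_matrix(["Windows 2000", "Windows XP", "Windows Vista"],
                     [sample_exploit])
print(results)
```

In a real setup the dictionary copy would be replaced by reverting a VM snapshot, but the control flow, expose, inspect, discard, repeat, is the same.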

Let us also not forget that there is no way to tell whether these threats were serious, silent drive-by-download exploits (which would constitute a genuine threat) or the result of user ignorance, which even the most secure operating systems and security applications cannot guard against. Playing devil’s advocate, I can see a case that unscientific tests like these better represent real-world conditions; however, they cannot be used to judge the reliability or security of operating systems, nor of the users running them, since no conditions or variables were held constant. As such, unfortunately, these results have no validity as far as I am concerned.

  1. May 13, 2008 at 6:14 pm

    Hmm.. The infoweek link seems to be down right now.
    That test should have taken a minimum of two years to get right.

    Win2000 did have its share of severe vulnerabilities, but I think with the adoption of high-speed Internet, many users opted to get a router, which in many cases does include a firewall of sorts. Sure, they may not have been of the industrial-strength Cisco variety (in fact the earlier models were limited to 10 Mbps internal routing), but a firewall is a firewall. The number of attackers who would bother with one is negligible, and the types of vulnerabilities that could easily pass through one (i.e. those that masquerade as HTTP traffic) relied on certain conditions that were not always present.

    Win2000 users were much more likely to have more than one system anyway, so some type of shared Internet connectivity became the logical choice.

    I have a feeling the study’s methodology would function in a very similar manner when gauging Win95 security, in which case it would find that OS to be the most secure platform in the history of Microsoft.

    “…nor the users using them…” Well, that’s part two of the two part path to security.
    You can only do so much for someone who insists on sticking his head out the window of a moving vehicle.

    There’s virtually nothing that can be done for anyone who insists on being insecure for the sake of user friendliness, or who insists on being qualified to use an OS while barely understanding how to be secure. But I feel software manufacturers are partly to blame for this.

    Secure by default, flexible and user-friendly are somehow seen as mutually exclusive concepts when, in fact, they should be the standard for all well-designed software and the default state of any installation.

    No, I’m not talking about Mac or Linux here, since Mac puts a straitjacket around the user when it comes to system customisation, while Linux trades user friendliness for security and flexibility.

    If someone could come up with an OS that had the eye candy of Mac, the security of OpenBSD, the flexibility of Linux and the compatibility of Windows…
    Oh well, we can dream.
