
Archive for the ‘Microsoft’ Category

Beta fish is back

February 29, 2012

The Windows 8 Consumer Preview is now available to download and try out for free. This is a great opportunity to take the latest public cut of Windows 8 for a spin. Whether you want to dabble with HTML5 to create Metro apps or just explore the radical UI changes, trying out Windows 8 in a VM couldn’t be simpler. If you haven’t used virtual machines before, Lifehacker has a great guide to get you started and Ars Technica have a nice series of screen grabs of the process.


I do like the cute little touches like the Beta Fish shown during OS boot and on the default wallpaper. This, albeit a cosmetic point, shows the painstaking attention the whole user experience has received. I’m sure cracks in the façade will emerge, but everything I have experimented with so far, and in the earlier developer preview, has been very encouraging. Using Windows 8 does seem to require a fair amount of self-recalibration, and coming from the Win95-7 world I found it a little tricky initially to figure out how everything worked. Luckily, Ars have an excellent orientation article if you find yourself floundering a little.

It’s a shame that virtual machines (and most current physical hardware) won’t allow experimenting with the full tablet experience on offer, owing to a lack of appropriate touch interfaces. That said, it is still very possible to get a good feel for Windows 8 using the familiar keyboard and mouse. Despite my early misgivings about the Metro UX, I am quite excited about where Microsoft are heading with their attempted transition to the post-PC era. Time and user adoption will tell whether their gamble has paid off.

MediaPortal 1.1 is Final(ly released)

August 15, 2010

I think I am going to give MediaPortal another go, now that 1.1 final has finally (after so much time and so many release candidates) been released. I had been using MediaPortal as my media center software of choice since about 2007 (back in the 0.2.x days), but recently I had been lured away by some of the shinier features of Windows 7 Media Center.

Although overall I have enjoyed the user experience and polish of 7MC, the frustrations and limitations (lack of good plug-ins, local content, themes, etc.) continue to mount. There are a lot of things 7MC is simply great at (series record, intelligently recording at another time to deal with timing clashes, Windows integration, four-foot configuration and so on), but I think I am ready for a bit more freedom again from my media center.

Up in the cloud

October 1, 2008

Microsoft has a number of core business revenue streams – otherwise known as cash cows. Despite the recent lightweight-application paradigm shift towards the ‘cloud’, Microsoft have remained staunchly of the view that the operating system, as we know it today, will still be present in the future. So today’s announcement, indicating a potential branching from the desktop-application-centric philosophy, is quite astonishing. According to ComputerWorld, Microsoft are looking to unveil a version of Windows codenamed ‘Windows Clouds’ within a month. It will be very interesting to see the approach Microsoft take with this project, considering they were quite keen to emphasise that it will not detract from the ongoing Windows 7 work, which is the planned successor to Windows Vista.

I previously weighed in with my opinion on cloud computing and very little has emerged to change my mind so far. I recently tried gOS v3, codenamed ‘Gadgets’, the lightweight Linux distribution formerly based on the Enlightenment DR13 window manager, and I am not that impressed. I found the integration between Google services (presented via barely concealed HTML widgets) and the operating system felt very amateurish. Couple this with the fact that version 3 is based on the more feature-rich GNOME desktop, and any assertion of this being a ‘stripped down’, lightweight operating system for ‘netbooks’ sounds rather strained.

I do not doubt that one day a certain percentage of desktops and laptops will be lightweight (or thin-client) systems accessing storage, applications and processing power from a ‘mothership’, much the way cloud computing is evolving now. However, it seems to make much more sense for a family, household or even a group of people to buy a central ‘home server’. This will, however, be very different to Windows Home Server, and will resemble more the old-style dumb terminals where multiple clients connect to one central machine.

Well, that is my prediction; we will talk in ten years! For now, long live monster power rigs! :) As a final note, it will be interesting to see where Apple fit into this in the coming years. iSlim? iWeb? iJot?

Windows ‘Mojave’ Experiment

August 21, 2008

I have been saying it all along and this just confirms it. Microsoft took a (presumably) random group of people and showed them Windows ‘Mojave’, the purported successor to the ‘current’ Windows. So, forget about 7 and take a look for yourself.

I can’t really say too much more without giving the game away, although part of me wonders just how random this actually was. Without wishing to be offensive, these people do look to be fairly PC-illiterate and it wouldn’t be too hard for Microsoft to manipulate the outcome. On the other hand, with the amount of ill-conceived rubbish being circulated about Vista, it doesn’t take too great a leap of the imagination.

EDIT: Just did a bit of reading and found out the test bed for this experiment was an HP dv2000 laptop with 2GB of RAM. I had a dv2799 (for a short duration) and I know they are very capable machines (although the workmanship is terrible – I had four go wrong, but never mind), and not outside the realms of the ‘average’ consumer system. This is good, as it at least makes it a reasonably fair demonstration.

Windows Media Center 2005 woes

July 23, 2008

I managed to acquire, for the price of a nice lunch, a brand new Elonex Artisan LX media center a couple of days back. I was initially very excited because up to then I had still been running my first media center, which was really just an experiment, built from scratch using mostly old components I had around my place. A year and a bit on, I am firmly hooked on a PC-based PVR system as the cornerstone of my entertainment system. It contained an Athlon 2600+ processor with 512MB of DDR, coupled with a Hauppauge DVB-T tuner and an 80GB drive for recordings, running the open source MediaPortal software. So as you can see, there was plenty of room for improvement.

This was the first time I have really had a tinker with the Windows Media Center range of operating systems that Microsoft produce, and I went in with few expectations, apart from wanting an experience at least comparable, in terms of functionality and flexibility, to the one I have enjoyed with MediaPortal.

The first thing that struck me was how fickle Windows Media Center 2005 was, even with all the roll-ups (essentially what Microsoft call service packs for the Media Center OS) installed. Wikipedia sums up the ‘capabilities’ of WMC 2005:

‘Media Center originally had a limitation of 1 analog tuner, but was raised to 2 analog tuners with Media Center 2005. With Update Rollup 1 for Media Center 2005, support for a digital tuner was added, but an analog tuner must still be present for the digital tuner to function. With Rollup 2, up to 4 TV tuners can be configured (2 analog and 2 HDTV). All the tuners must use the same source, for example they must all be off an aerial or a set-top box using the same guide data, you cannot mix Sky Digital and DVB-T for example.’

XP Media Center really shows its age here – I do not watch any analogue transmissions, so for a media center to require a legacy piece of hardware just to be able to access DVB (digital) seems preposterous. But that was not the worst thing! Windows Media Center 2005 is not capable of pulling EPG data OTA (over-the-air), instead requiring an overly elaborate system that relies on a permanent, always-on Internet connection. This also raises some privacy concerns, as ‘anonymous’ data – not entirely anonymous, since Microsoft asks for your postcode during setup – is fed back to Microsoft, and this can include recording/watching trends and general EPG usage. Hitherto my media center system had not been networked. Considering it is in the opposite corner of my house, and I do not stream my recordings or have formal media shares, I never felt the need to network it. It was nice to just have a static, secure system without any security programs or periodic updates – now security monitoring of my media center has been added to my list of digital chores.

Nonetheless, I was determined to give it a fair go, so I added a wifi adaptor, installed some plug-ins and configured everything. After spending eight hours getting everything working, playing around and testing… I went back to my custom build. Not all the problems can be laid squarely at Microsoft’s feet, however. Elonex declared bankruptcy shortly after launching this range, and the malicious part of me can see why, if this media center is the sum total of their expertise.

Whilst the case looks rather nice from the outside, the hardware and the design of the internals is what really lets it down. The only element Elonex got right was the noise (or lack thereof) – the media center barely gives out a murmur when idle, thanks to a single fan housed inside the power supply. It runs at 690rpm, drawing air over the CPU heatsink (which has four heat pipes) and directly out the side of the case. However, I stressed ‘when idle’ for a reason. When the media center does anything, the incredibly noisy hard drive starts very audibly clicking and crunching away, and it completely lets the machine down.

However, that’s not the worst thing about this media center. Because there is only one very slow fan, the airflow in the case is restricted to circulating around the motherboard tray and the processor, then out through the power supply. The hard drive and PCI/AGP cards are completely neglected. This point was slammed home when the hard drive consistently reported temperatures from the high 50s up to 62 degrees Celsius! Worse still, when I idled the system, that heat didn’t dissipate. The hard drive is locked into place with a pretentious plastic locking mechanism which neither improves the accessibility of the drive bay nor decreases the vibrations from the drive. There is no thermal (or thermally viable) contact between the hard drive and the case, and as such the hard drive is left smouldering away with no way to cool down – predictably, idling brought next to no drop in temperature. There is a valid point that maintaining electronic components at a set temperature prolongs their life by avoiding constantly repeated thermal differentials (i.e. heating and cooling); however, the fact remains that 60+ degrees Celsius is far too hot for a hard drive. Although my brief research on this did not yield any definitive threshold, most sources agree that 50-55 degrees Celsius is about the absolute maximum recommended operating temperature.

Couple this practically zero thermal conduction with a lack of airflow and you have a recipe for a very short hard drive life. Even worse, this thermal issue was not limited to the HDD: the south-bridge and GFX heatsinks were equally poorly cooled and got unpleasantly hot to the touch.

Worst of all, it is just slow. CPUID and the BIOS disagreed with each other about the exact Intel processor that powers the system. I believe it to be either an Intel Pentium 4 530 (at 3.06GHz) or a Celeron D 345. There is no way the much older Athlon 2600+ processor with the same RAM should be outperforming this setup, and yet it does so without breaking a sweat.

All in all, very disappointing. A remarkable demonstration of technical ignorance on the part of Elonex. But hey, I didn’t pay for it and now I have an extra DVB-T tuner back in my original, self built machine.

Design (cosmetic): 8/10 – Pleasing, with a nice hi-fi look.

Design (technical): 2/10 – Poor components, poorly arranged.

Cooling: 6/10 – Great CPU and power supply cooling, but everything else is woefully neglected.

Acoustics: 6/10 – Silent until it has to touch the hard drive; still a good effort though.

Connectivity: 8/10 – Lots of connectors for digital audio and video.

Capacity: 5/10 – A 200GB hard drive with a portion taken for recovery. I wouldn’t trust it though, and by modern standards it is rather anaemic.

Overall: 2/10 – Great for free; if I had paid anything for it I would have been annoyed.

Vista SP1 and the Red Herring (+ breaking the 32-bit 4GB limit)

May 29, 2008

We all knew it was looming: the mathematical limit to address referencing in 32-bit computing. A 32-bit number can only be between 0 and 4,294,967,295, which neatly adds up to 4GB. What this means is that, using existing architectures, a program (or operating system) cannot address more than this number of bytes of system RAM via the existing scheme called byte-addressed memory allocation.

What this means, for those among us who do not speak geek, is that a system built or shipped with 4GB of RAM (and some other cases*) will not be able to fully utilise all of that space.
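To make the arithmetic concrete, here is a minimal illustrative sketch (mine, not from any Microsoft documentation) that computes the addressable range for a couple of address widths; the 36-bit case anticipates the PAE discussion further down:

```c
#include <stdio.h>
#include <stdint.h>

/* Byte-addressed memory: an n-bit address can name 2^n bytes. */
static uint64_t addressable_bytes(unsigned bits) {
    return (uint64_t)1 << bits; /* valid for bits < 64 */
}

int main(void) {
    printf("32-bit: %llu bytes = %llu GB\n",
           (unsigned long long)addressable_bytes(32),
           (unsigned long long)(addressable_bytes(32) >> 30));
    printf("36-bit: %llu bytes = %llu GB\n",
           (unsigned long long)addressable_bytes(36),
           (unsigned long long)(addressable_bytes(36) >> 30));
    return 0;
}
```

Running it shows 32 bits topping out at exactly 4GB, while 36 bits reaches 64GB (strictly these are GiB, but the round numbers are the point).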

Let’s take a trip back in history and imagine a room with a cupboard containing 256 drawers. Each drawer could hold one bit of binary information and the cupboard was administered by a librarian. Any time anyone wanted a piece (or pieces) of information, they had to ask the librarian. What I am describing here is the era of 8-bit computing, circa the late 1970s/early 80s, with the cupboard representing system memory and the librarian representing the operating system’s memory management. During day-to-day running of the system, the librarian takes data in and returns data to people (program threads) from the corresponding drawers where the information is stored. Everything works, everyone is happy.

Now what happens if we introduce a second cupboard containing another 128 or 256 drawers? The librarian can only keep track of information stored in the first 256 drawers and, as a result, nothing can be stored in or retrieved from the newly added cupboard; in effect, it does not exist. Time to get a new librarian, i.e. go to 64-bit computing (or, in this example, replace the 8-bit librarian with a swanky 16-bit one – who will ever use 65,536 bits of RAM? :D )

But wait, there is more… I read today that Windows Vista SP1 changes (depending on hardware configuration) the total amount of displayed RAM from 3.5GB (the current RTM limit when 4GB is put in the machine) to the full 4GB, although this still does not help, given the limitation previously discussed. But this made me curious: if the operating system could see the RAM, then surely it was not a BIOS or fundamental mathematical limitation. Turns out I was at least half right…

You see, although the fundamental mathematical limitation cannot be breached, there is a rather interesting technique called Physical Address Extension. Using this process, a 32-bit Windows system can address more than 4GB of RAM, up to a (present) maximum of 128GB. To explain what Physical Address Extension (PAE) is, let’s go back to the previous example and introduce a new figure – an administrator.

The role of this new entity is to allocate and manage the time of their underling. Let’s also assume we are still running an 8-bit system (with the 256-bit limit) but have 1,024 bits of memory, i.e. four times the mathematical limit. On the face of it, the extra memory is invisible to the librarian; however, the administrator is smart enough to know both about the extra memory and about who (i.e. what program) is currently using what amount of it. As such, any person (program) can request the full mathematical limit of 256 drawers for their own use at the same time as another person (and another… etc.) requests more memory. The administrator can instruct the librarian which series of drawers to use per person (program).

This is loosely referred to as 36-bit computing and, as the non-power-of-two number suggests, it is a bit of a tweak. The physical address size was increased (on a 32-bit processor) from 32 to 36 bits back in the days of the Pentium Pro (circa 1997) and most modern CPUs have maintained this legacy. It is important to point out that this does not make all 32-bit processors 36-bit processors, as the change happened in the MMU (memory management unit). Modern operating systems use page tables to store information about the virtual memory system and allocate it based on processes’ requirements. In effect they act like the administrator from my trivialised example and allow multiple processes to benefit from a pool of memory which traditional 32-bit systems (without PAE) could not.
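As an aside, on Windows you can actually query at runtime whether the kernel is running with PAE. A minimal sketch (mine, not from the original post) using the documented IsProcessorFeaturePresent API; it assumes a Windows toolchain where the PF_PAE_ENABLED constant is defined:

```c
#include <stdio.h>
#include <windows.h>

int main(void) {
    /* PF_PAE_ENABLED reports whether the OS is running the PAE
       kernel, i.e. with 36-bit (or wider) physical addressing. */
    if (IsProcessorFeaturePresent(PF_PAE_ENABLED))
        puts("PAE kernel: enabled");
    else
        puts("PAE kernel: not enabled");
    return 0;
}
```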

I know what you are thinking: you are rejoicing at being able to avoid the negative aspects of migrating to 64-bit computing. But hang on, there are a couple of important caveats. Firstly, each process (person in our example) can only access a maximum of the mathematical limit of RAM. That means, in a system with 16GB of RAM, you could quite easily have three or four processes each taking up 4GB, but no one process taking up 8 or 16GB. The other bad point is that it is not supported** in Vista or XP. In fact, to use such a feature, you would need to be running a server operating system from Microsoft or a Linux equivalent. Interestingly enough, Linux has contained support for PAE since kernel version 2.6, although I will not discuss it further in this post.

Presently, the only operating systems with suitable (or rumoured) PAE support are:

Windows 2000: Datacenter Server and Advanced Server Editions

Windows Server 2003: Enterprise and Datacenter Editions

Windows Server 2008: Enterprise and Datacenter Editions

As you can see, none are particularly home-desktop friendly. So, despite Vista displaying the correct amount of RAM in Service Pack 1, it is still fundamentally constrained by the 32-bit mathematical limit, despite Microsoft having the technology to at least improve on the functionality of such systems.

On a side note, I brought this up with a few people at my head office. I work for a large UK retail company that sells PCs and laptops. I was surprised to see, when our first 4GB models came into the stores a few months ago, that they were running 32-bit editions of Vista. The UK is not as litigious as the United States, but I can’t help wondering how long it will be before the lawsuits start flying. After all, it is misrepresentation in my book to sell something that, due to a software shortcoming, can never be fully utilised to the specification it was advertised at. Particularly since an alternative is available to OEMs, and yet all retailers, not just the one I work for, seem to be taking a cavalier attitude towards this.

*The total amount of addressable space inside a 32-bit system must add up to 4096MB, and this includes system and video RAM. So if you have an all-singing, all-dancing SLI graphics card with 2GB of graphics RAM, the total amount of system RAM you will be able to address is around 2GB.

**Actually this is not entirely true: ever since Windows XP Service Pack 2, Microsoft has used PAE for security purposes, coupled with the NX bit. This is a hardware security feature built into a processor which allows program and system developers greater control over what they designate to be executable and non-executable memory. Microsoft has, however, capped the amount of RAM usable by home versions of its 32-bit operating systems at 4GB, regardless of the fact that the technology to increase this is in place.
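The NX side can be checked the same way as PAE; another small sketch (again mine, assuming the PF_NX_ENABLED constant, which Windows toolchains have shipped since the XP SP2 era):

```c
#include <stdio.h>
#include <windows.h>

int main(void) {
    /* PF_NX_ENABLED reports whether no-execute (NX/DEP) memory
       protection is enabled for this boot of the OS. */
    if (IsProcessorFeaturePresent(PF_NX_ENABLED))
        puts("NX/DEP: enabled");
    else
        puts("NX/DEP: not enabled");
    return 0;
}
```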

The 2k bug

May 12, 2008

Whilst it seems the Internet enjoys a good Microsoft Vista bashing (see my previous post on the topic), research came out today suggesting that Windows 2000, an eight-year-old operating system that recently entered its long-term support phase with Microsoft, is more ‘secure’ than Windows Vista. (Cue fanboy and antiboy posts.)

But this is rather misleading. Let us not forget that Windows 2000 was released in February 2000, a dark era in which firewalls, security software and Windows Update were treated with a suspicion previously reserved for black magic. OK, so maybe I am exaggerating slightly, but back then the average PC had either a Pentium 2 or 3 processor between 600MHz and 1.2GHz, between 32 and 128MB of RAM and a 20GB hard disk, and was aimed at the business market, not consumers, who had the privilege of running Windows ME (let the justified ME bashing commence). But we are still missing the point. Nowadays the only users that run Windows 2000 (which accounted for about 2% of all Internet traffic in March 2008) are either comfortable power users (like Steve Gibson) or those with old hardware (e.g. in the third world). As such, it is not worth the malware authors’ time to target such a small percentage of the userbase when they are more likely to snare the vulnerable XP or Vista users.

Worse still, serious doubts have been raised over the validity of this study, given PC Tools did not scientifically determine the state of key security features within the operating systems, like Windows Vista’s UAC, or even which service packs were installed on the computers. As noted by Ars Technica, often the first action of typical malware is to download the target package(s) onto a system immediately after it has been compromised by the usually relatively small initial exploit. This could mean that their numbers are greatly misleading, when three or four ‘infections’ could actually be a single instance of malware.

The only way to scientifically conduct such a test would be with three virtual machines – one running Windows 2000, one Windows XP and one Vista – each running with a comparable set of security tools and the latest patches. That way, after each exposure, the virtual machine could be examined to determine whether the exploit was successful and, if so, the degree to which the target machine was compromised. At the end of each test, the virtual machine would be ‘switched off’ without writing the changes to its virtual disk, then restarted to test the next exploit. Using this methodology, all exploits can be tested equally and methodically, and various configuration permutations can also be tried (e.g. operating systems with only default security measures, etc.).

Let us also not forget that there is no way to tell whether these threats are serious, silent drive-by-download-style exploits (which would constitute a serious threat) or the result of user ignorance, which even the most secure operating systems and security applications cannot guard against. Playing devil’s advocate, I can see a case that unscientific tests like these better represent real-world conditions; however, they cannot be used to judge the reliability or security of operating systems, nor the users using them, as no conditions or variables have been held constant. As such, unfortunately, these results have no validity as far as I am concerned.
