Archive

Posts Tagged ‘Microsoft’

Beta fish is back

February 29, 2012 Leave a comment

The Windows 8 Consumer Preview is now available to download and try out for free. This is a great opportunity to take the latest public cut of Windows 8 for a spin. Whether to dabble with HTML5 to create Metro apps or just explore the radical UI changes, trying out Windows 8 in a VM couldn’t be simpler. If you haven’t used virtual machines before, Lifehacker has a great guide to get you started and Ars Technica have a nice series of screen grabs of the process.


I do like the cute little touches like the Beta Fish shown during OS boot and on the default wallpaper. This point, albeit a cosmetic one, shows the painstaking attention the whole user experience has received. I’m sure cracks in the façade will emerge, but everything I have experimented with so far, and in the earlier developer preview, has been very encouraging. Using Windows 8 does seem to require a fair amount of self-recalibration; coming from the Win95-7 world, I initially found it a little tricky figuring out how everything worked. Luckily, Ars have an excellent orientation article if you find yourself floundering a little.

It’s a shame that virtual machines (and most current physical hardware) won’t allow experimenting with the full tablet experience on offer owing to a lack of appropriate touch interfaces. That said, it is still very possible to get a good feel for Windows 8 using the familiar keyboard and mouse. Despite my early misgivings about the Metro UX I am quite excited about where Microsoft are heading with their attempted transition to the post-PC era. Time and user adoption will tell whether their gamble has paid off.

Up in the cloud

October 1, 2008 Leave a comment

Microsoft has a number of core business revenue streams – otherwise known as cash cows. Despite the recent paradigm shift towards lightweight applications in the ‘cloud’, Microsoft have remained staunchly of the view that the operating system, as we know it today, will still be present in the future. So today’s announcement, indicating a potential branching from the desktop-application-centric philosophy, is quite astonishing. According to ComputerWorld, Microsoft are looking to unveil a version of Windows codenamed ‘Windows Clouds’ within a month. It will be very interesting to see the approach Microsoft take with this project, considering they were quite keen to emphasise that it will not detract from the ongoing Windows 7 work, which is the planned successor to Windows Vista.

I previously weighed in with my opinion on cloud computing and very little has emerged to change my mind so far. I recently tried gOS v3, codenamed ‘Gadgets’ – the lightweight Linux distribution formerly based on the Enlightenment DR13 window manager – and I am not that impressed. The integration between Google services (presented via barely concealed HTML widgets) and the operating system felt very amateurish. Couple this with the fact that version 3 is based on the more feature-rich Gnome desktop environment, and any assertion of this being a ‘stripped down’, lightweight operating system for ‘netbooks’ sounds rather strained.

I do not doubt that one day a certain percentage of desktops and laptops will be lightweight (or thin client) systems accessing storage, applications and processing power from a ‘mothership’, much the way cloud computing is evolving now. However, it seems to make much more sense for a family, household or even a group of people to buy a central ‘home server’. This would be very different to Windows Home Server and would more closely resemble the old-style dumb terminal model, where multiple clients connect to one central machine.

Well, that is my prediction – we will talk in ten years! For now, long live monster power rigs! 🙂 As a final note, it will be interesting to see where Apple fit into this in the coming years. iSlim? iWeb? iJot?

Windows ‘Mojave’ Experiment

August 21, 2008 Leave a comment

I have been saying it all along and this just confirms it. Microsoft took a (presumably) random group of people and showed them Windows ‘Mojave’, the purported successor to the ‘current’ Windows. So, forget about 7 and take a look for yourself.

I can’t really say too much more without giving the game away, although part of me wonders just how random this actually was. Without wishing to be offensive, these people do look to be fairly PC-illiterate, and it wouldn’t be too hard for Microsoft to manipulate the outcome. On the other hand, with the amount of ill-conceived rubbish being circulated about Vista, it doesn’t take too great a leap of the imagination.

EDIT: Just did a bit of reading and found out the test bed for this experiment was an HP dv2000 laptop with 2GB of RAM. I had a dv2799 (for a short duration) and I know they are very capable machines (although the workmanship is terrible – I had 4 go wrong, but never mind); however, they are not outside the realms of the ‘average’ consumer system. This is good, as it at least makes it a reasonably fair demonstration.

Windows Media Center 2005 woes

July 23, 2008 Leave a comment

I managed to acquire, for the price of a nice lunch, a brand new Elonex Artisan LX media center a couple of days back. I was initially very excited because, up until then, I had still been running my first media center, which was really just an experiment, built from scratch mostly out of old components I had around my place. A year and a bit on, I am firmly hooked on a PC-based PVR system as the cornerstone of my entertainment setup. The old machine contained an Athlon 2600+ processor with 512MB of DDR, coupled with a Hauppauge DVB-T tuner and an 80GB drive for recordings, running the open source MediaPortal software. So as you can see, there was plenty of room for improvement.

This was the first time I have really had a tinker with the Windows Media Center range of operating systems that Microsoft produce, and I went in with few expectations, apart from wanting an experience at least comparable in functionality and flexibility to the one I have enjoyed with MediaPortal.

The first thing that struck me was how fickle Windows Media Center 2005 was, even with all the roll-ups (essentially what Microsoft call Service Packs for the Media Center OS) installed. Wikipedia sums up the ‘capabilities’ of WMC 2005:

‘Media Center originally had a limitation of 1 analog tuner, but was raised to 2 analog tuners with Media Center 2005. With Update Rollup 1 for Media Center 2005, support for a digital tuner was added, but an analog tuner must still be present for the digital tuner to function. With Rollup 2, up to 4 TV tuners can be configured (2 analog and 2 HDTV). All the tuners must use the same source, for example they must all be off an aerial or a set-top box using the same guide data, you cannot mix Sky Digital and DVB-T for example.’

XP Media Center really shows its age here – I do not watch any analogue transmissions, so for a media center to require a legacy piece of hardware just to be able to access DVB (digital) seems preposterous. But that was not the worst thing! Windows Media Center 2005 is not capable of pulling EPG data OTA (over-the-air), instead requiring an overly elaborate system that relies on a permanent, always-on Internet connection. This also raises some privacy concerns, as ‘anonymous’ data – not entirely anonymous, since Microsoft asks for your postcode during set up – is fed back to Microsoft, which can include recording / watching trends and general EPG usage. Hitherto my media center system had not been networked. Considering it is in the opposite corner of my house, and I do not stream my recordings or have formal media shares, I never felt the need to network it. It was nice to just have a static, secure system without any security programs or periodic updates – now security monitoring of my media center has been added to my list of digital chores.

Nonetheless, I was determined to give it a fair go, so I added a wifi adaptor, added some plug-ins and configured everything. After spending eight hours getting everything working, playing around and testing… I went back to my custom build. Not all the problems can be laid squarely at Microsoft’s feet, however. Elonex declared bankruptcy shortly after launching this range and the malicious part of me can see why, if this media center is the sum total of their expertise.

Whilst the case looks rather nice from the outside, the hardware and the design of the internals is what really lets it down. The only element Elonex got right was the noise (or lack thereof) – the media center barely gives out a murmur when idle, thanks to having only a single fan, housed inside the power supply. It runs at 690rpm, drawing air over the CPU heatsink (which has four heat pipes) and directly out the side of the case. However, I stressed ‘when idle’ for a reason: whenever the media center does anything, the incredibly noisy hard drive starts very audibly clicking and crunching away, and it completely lets the machine down.

However, that’s not the worst thing about this media center. Because there is only one very slow fan, the airflow in the case is restricted to circulating around the motherboard tray and the processor, then out through the power supply. The hard drive and PCI / AGP cards are completely neglected. This point was slammed home when the hard drive consistently reported temperatures in the high 50s to 62 degrees Celsius! Worse still, when I idled the system, that heat didn’t dissipate. The hard drive is locked into place with a pretentious plastic locking mechanism which neither improves the accessibility of the drive bay nor decreases the vibrations from the drive. There is no thermally conductive contact between the hard drive and the case, so the drive is left smouldering away with no way to cool down, showing next to no drop in temperature. There is a valid point that maintaining electronic components at a constant temperature prolongs their life by avoiding repeated thermal cycling (i.e. heating and cooling); however, the fact remains that 60+ degrees Celsius is far too hot for a hard drive. Although my brief research on this did not yield any definitive threshold, most sources agree that 50-55 degrees Celsius is about the absolute maximum recommended operating temperature.

Couple this practically zero thermal conduction with a lack of airflow and you have a recipe for a very short hard drive life. Even worse, this thermal issue was not limited to the HDD; the south-bridge and GFX heatsinks were equally poorly cooled and got unpleasantly hot to the touch.

Worst of all, it is just slow. CPUID and the BIOS disagreed with each other about the exact Intel processor that powers the system; I believe it to be either an Intel Pentium 4 530 (at 3.06GHz) or a Celeron D 345. There is no way the much older Athlon 2600+ with the same amount of RAM should be outperforming this setup, and yet it does so without breaking a sweat.

All in all, very disappointing. A remarkable demonstration of technical ignorance on the part of Elonex. But hey, I didn’t pay for it and now I have an extra DVB-T tuner back in my original, self built machine.

Design (cosmetic) : 8/10 – Pleasing, with a nice Hi-fi look.

Design (technical) : 2/10 – Poor components poorly arranged.

Cooling : 6/10 – Great CPU and powersupply cooling, but everything else is woefully neglected.

Acoustics : 6/10 – Silent until it has to touch the hard drive; still a good effort though.

Connectivity : 8/10 – Lots of connectors for digital Audio and Video

Capacity : 5/10 – 200GB hard drive with a portion taken for recovery. I wouldn’t trust it though, and by modern standards it is rather anaemic.

Overall : 2/10 – Great for free, if I paid anything for it I would have been annoyed.

Post XP SP3 Update problem

July 18, 2008 2 comments

Despite all the problems circulating the web about Windows XP Service Pack 3, I thought I would go ahead anyway on a new installation. The installation part went fine and the system restarted properly with no lock-ups, stops or looping restarts. So far so good; unfortunately, I celebrated my good fortune too soon – Windows Update stopped functioning. Whilst updates were being downloaded, Windows XP would fail to actually perform the update.

I did a bit of googling and whilst I didn’t find any accounts exactly matching my problem, I decided to follow the advice on this Microsoft KB article.

First of all, stop the automatic update service from the command prompt.

1. Open up Start Menu > Run

2. Type “cmd” and press Enter.

3. In the command box, type “net stop wuauserv”; you should get the following confirmation:

Now we need to reregister the DLL involved in the Windows Update process.

4. Type in “regsvr32 %windir%\system32\wups2.dll”. The following control box should pop up after a moment:

Now we need to start the update service and hopefully all should be well again.

5. Type “net start wuauserv” which should yield this confirmation:

That’s it – updates started working for me immediately afterwards. If this didn’t do the trick for you, follow the alternative methods in Microsoft’s KB article linked above.
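For anyone who has to repeat this fix on several machines, the steps above can be collected into a small script. This is just a sketch of my manual procedure, not an official tool: it assumes a Windows XP box where the `wuauserv` service and `wups2.dll` exist, and it must be run from an administrator account.

```python
import os
import platform
import subprocess

def repair_commands():
    """The three repair steps from the post, in order."""
    # regsvr32's /s switch suppresses the confirmation dialog so the
    # script can run unattended.
    dll = os.path.expandvars(r"%windir%\system32\wups2.dll")
    return [
        ["net", "stop", "wuauserv"],   # 1. stop the Automatic Updates service
        ["regsvr32", "/s", dll],       # 2. re-register the Windows Update DLL
        ["net", "start", "wuauserv"],  # 3. start the service again
    ]

def run_repair():
    if platform.system() != "Windows":
        raise RuntimeError("these commands only exist on Windows")
    for cmd in repair_commands():
        subprocess.run(cmd, check=True)  # stop on the first failure

if __name__ == "__main__":
    run_repair()
```

The `check=True` means the script aborts if, say, the service refuses to stop, rather than blindly re-registering the DLL against a running service.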

When the file extension… is not the file extension.

May 8, 2008 Leave a comment

I was bemused to read on BBC News earlier that a trivially simple ploy stung half a million file sharers. The concept is nothing new, having been started a fair few years ago by virus / malware writers and adopted by copyright enforcement agencies in recent years. Due to the anatomy of a decentralised file sharing system, anyone can seed a file. Once this seeded file is made available to the peer-to-peer network, it either becomes advertised to a localised central file distributor (referred to as a Super Node or Server) or is found during a spider search query run by another user logged into the network. If these files are topical or sought after, they can be transferred onto a different node (client) rapidly. There they are stored in the second user’s ‘shared’ directory, where more people can download them.

Once a seeded file has been downloaded and spread over a few tens of nodes, the rate at which it can be downloaded by others increases almost exponentially, with a cascade-like effect. Other people on the peer-to-peer network are lured into downloading the file based on the number of people who already have it, assuming it must therefore be genuine and would be comparatively quick to obtain. Couple this with a topical or sought-after song / album or file aimed at the masses (who statistically will contain a fair percentage of PC-illiterate users and those with a penchant for agreeing to every pop-up they come across) and these files explode across networks.
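To see why the spread is ‘almost exponential’, here is a toy model of the cascade (my own illustration with made-up parameters – real swarm dynamics are far messier): assume every node holding the file passes it to a fixed number of new peers each round.

```python
def holders_after(rounds, fanout=2, initial=1):
    """Toy cascade model: each node holding the file hands it to
    `fanout` new peers every round, so the holder count multiplies
    by (1 + fanout) per round."""
    holders = initial
    for _ in range(rounds):
        holders += holders * fanout
    return holders

# With one seeder and a fanout of 2, the holder count goes
# 1, 3, 9, 27, 81, ... reaching a few tens of nodes in three rounds.
```

Even with a modest fanout, a single seeded fake reaches the ‘few tens of nodes’ tipping point described above within a handful of rounds.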

The malicious file in question appears to have masqueraded as an MP3 by Girls Aloud. Given that, on running, the file pops up a message saying the computer requires a codec to play the song and tries to direct you to a website to download it, you would hope most computer users would stop and re-examine what they had just downloaded. People who brazenly proceeded and downloaded the malicious ‘codec’ package had spyware installed on their system which would ‘bombard’ them with pop-ups. The downloaded file would also spawn copies of itself within the user’s shared folder under different names, to try to make itself attractive to a greater audience.

But what happened? How were people tricked into downloading an MP3 file but ended up running a malicious program? The answer lies in the file type. Broadly speaking, there are two ways in which a file can be opened:

1) via script or binary execution (e.g. .exe, .com, .vbs, .java, .scr … and some others)

2) by being read by an external application (e.g. .txt, .doc, .wav, .mpg, .avi … and MANY more.)

MP3 files (MPEG-1 Audio Layer 3) are the latter type. Upon execution, Windows searches through its list of known file extensions stored in the registry to see what it should do. It instantly finds the entry for MP3 and sees that this type of file is handled by a media player like Windows Media Player, WinAMP, iTunes etc. Windows then executes the media player which, on loading, opens the MP3 file specified in the command line argument, decodes a block, fills its buffer and starts to play. Unless a clever trick like a buffer overflow is used – a technique which has historically been responsible for security breaches in various Windows programs as well as console homebrew development – this renders all ‘read by an application’ type files harmless*. As such, we have to look elsewhere for the source of this problem.

That brings us nicely to the point I wanted to raise in this post: file extensions and, more specifically, security vulnerabilities in their implementation. Versions of Windows from XP onwards (and possibly earlier, I cannot remember) have automatically hidden the file extension by default, leaving the user to distinguish between file types by iconographic representations. Whilst at times this is both cleaner looking and more functional, it does present an interesting security problem: what if there are two file extensions? Windows will quite happily truncate the final .xxx from a file name, leaving the first ‘extension’ on display, despite the fact that Windows ignores everything before that final .xxx. As a result, if you name a file SomethingInteresting.mp3.exe, in its default state Windows will happily display the file as SomethingInteresting.mp3, but will execute it as an EXE when double clicked. Obviously, if you queried the file by right clicking on it and selecting Properties you would be immediately told what type of file it is, but most people will take the file at face value.
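The mismatch between what Explorer shows and what Windows executes can be demonstrated in a few lines of code. This is an illustrative model only, not Explorer’s actual implementation – the real list of ‘known’ extensions lives in the registry and is far longer than the handful assumed here.

```python
import os

# A hypothetical subset of extensions Explorer would treat as "known".
KNOWN_EXTENSIONS = {".exe", ".mp3", ".txt", ".doc", ".avi", ".scr"}

def displayed_name(filename):
    """Mimic 'Hide extensions for known file types': strip the FINAL
    extension from the display name if it is a known one."""
    root, ext = os.path.splitext(filename)
    return root if ext.lower() in KNOWN_EXTENSIONS else filename

def executed_as(filename):
    """Windows decides how to open a file from its FINAL extension only."""
    return os.path.splitext(filename)[1].lower()

trap = "SomethingInteresting.mp3.exe"
print(displayed_name(trap))  # SomethingInteresting.mp3 - looks like music
print(executed_as(trap))     # .exe - but runs as a program
```

The user sees an innocent MP3 while Windows keys off the trailing .exe – exactly the discrepancy exploited by the fake codec file.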

Luckily there is a very simple way to guard against such black magic. In Windows XP and Vista**, in the file browser, go to the Tools menu and select Folder Options.

In this dialog, uncheck ‘Hide extensions for known file types’ and click Apply followed by clicking Apply to all folders.

And that’s it! A simple check box and some common sense now separate you from being lured into downloading fake or malicious files.

* Some files, such as certain movies, use containers which can direct the media player or operating system to web pages. It is not just media files which are vulnerable, but that is a completely different topic.

** In Vista you may have to enable the classic menu

The Wow is here! (With some tweaking)

April 30, 2008 1 comment

I just came across a great site called MyVistaBoot.com. As the name suggests, it is dedicated to sprucing up that fairly boring Vista boot screen. Each new boot screen is packaged with an installer, so it is trivial to get them onto your system without resorting to third party applications, as was necessary with Windows XP. Take a look – there are some very elegant ones on there to suit every taste.

UPDATE: My mistake – the downloaded file replaces the winload.exe.mui file directly. It is not as simple as just running an installer, but the instructions are clear and concise.

Minor Vista Bug

April 21, 2008 2 comments

Has anyone else noticed this strange GUI bug? When you go to the Notification Area property page you are met with the following screen.

Now if you opt to not show the network icon, this happens:

For some reason, the power icon is also removed, even if it is checked. If you actually apply this, only the network icon is removed; it’s just strange that Microsoft have not found and fixed this yet, given it is a trivial GUI issue.

Categories: Microsoft, Random, Windows Tags: , , , ,

Hidden World of Linux: Follow up Part 1 – NAS

April 10, 2008 4 comments

Since my previous post on the hidden uses of Linux attracted so much attention, I thought I would do a brief follow up adding a bit more to my conclusion in which I discussed the main drawback to all these great Linux distributions – power consumption. At some point I am going to buy a power meter and test a variety of old computers I have around the house to see how much power they draw, but for now I just want to give some illustrated examples of low power hardware that can be bought which are ideal for some of the uses described in my prior post.

This is the first of two follow-up posts, so I can go into detail about each specific section. In this post I will be discussing NAS (Network Attached Storage) and will follow up shortly with a post on firewalls.

Realistically, retail NAS devices fall into two categories: ones with a single hard drive and ones with multiple hard drives.

Single hard drive setups

There are a large variety of single hard drive NAS systems available at fairly reasonable prices and, unless you need a specific feature that a Linux/BSD distribution like FreeNAS provides, it will likely be better to purchase an off-the-shelf NAS drive. This way you do not need to worry about installing / upgrading potentially buggy software, and the power requirement will be in the tens of Watts.

For the sake of argument, let us consider three hardware examples for building (or reusing an old computer for) a single drive NAS. The first is by far the cheapest – reusing your old PC. All that is really required is a new harddrive to replace the small one the PC would originally have been shipped with.

At an average price of £35 for a 250GB SATA drive (slightly less for an IDE version), simply reusing an old PC is by far the cheapest option; however, there are a number of things to watch out for. Old computers used to have limitations on the maximum hard drive capacity the BIOS on the motherboard could address. Back in the days of single-GB hard drives, a then-theoretical limit of 137GB must have seemed as far off as 32GB of desktop RAM does today. Fast forward to today: whilst modern systems very happily address far more than 137GB thanks to 48-bit logical block addressing (LBA), chances are you will want at least around 160GB of space for your NAS, meaning this could be a problem on some really old hardware. The reason for this so-called “ATA interface limit” (by no means the first such limit in computing – check out this great article) is a mathematical limitation in the way hard drives used to be addressed at a very low level, using discrete geometry (cylinder, head and sector numbers). BIOS patches are available, although these are few and far between.
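The 137GB figure falls straight out of the addressing arithmetic: 28-bit ATA addressing with classic 512-byte sectors gives 2^28 × 512 bytes. A quick sanity check (in decimal gigabytes, as drive makers count them):

```python
SECTOR_BYTES = 512  # the classic ATA sector size

def max_capacity_bytes(address_bits):
    """Largest disk addressable with the given number of LBA bits,
    assuming 512-byte sectors."""
    return (2 ** address_bits) * SECTOR_BYTES

# 28-bit ATA addressing: the famous ~137GB (decimal) ceiling.
print(max_capacity_bytes(28))  # 137438953472, i.e. ~137.4 billion bytes

# 48-bit LBA pushes the limit out to roughly 144 petabytes.
print(max_capacity_bytes(48))  # 144115188075855872
```

So any BIOS stuck on 28-bit addressing simply cannot see past 137GB, no matter how large the drive you fit.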

Worn power supplies are also a potential hazard: before deploying a system for 12/24 hour use, check that the power supply cooling fan is in good condition and that there are no overheating issues caused by an old or clogged cooling system elsewhere in the hardware. Please do not open up a power supply – doing so can be dangerous if you do not know what you are doing. When in doubt, replace it; it will be cheaper in the long run than setting fire to your house or destroying your data through a power-spike-induced head crash. In summary, this option is by far the cheapest of the three, but there can be some problems along the way.

The second option I explored would be to buy a complete, custom tailored PC system for use as a headless NAS. I went to one of the e-shops I frequently purchase from and quickly, virtually, built a low powered, cheap PC suitable for the purpose. Surprisingly, it turns out that building your own NAS box is a lot less expensive than I would have thought, with my NAS PC costing a total of £108 (full specification and links in the appendix at the end of this post), inclusive of the £35 250GB hard drive used in the previous example. This compares very favourably with the (currently) cheapest single-HDD NAS box available from the same e-shop, which is £77. With your own PC you get the advantage of customising the services your NAS provides, giving you greater control coupled with expandability down the road – an option unavailable when buying a retail NAS. The downside is the increased power consumption. To mitigate this, I picked recent components with power saving features like AMD’s Cool’n’Quiet, as well as the special low power consumption versions, rather than going for technology a generation (or two) old, which was roughly the same price anyway.

The final ‘self-built’ NAS hardware option I wanted to explore is building a NAS with the ultra-low power embedded components frequently used in routers / modems and in actual NAS systems. It is possible to buy a limited selection of embedded motherboards, some even with low power processors like the VIA C7 or AMD Geode. VIA C7 boards seem to be a lot cheaper, and I selected one which had everything minus RAM, the HDD (hard disk) and a power supply. Unfortunately, due to the limited production scales of some of these ITX boards (you pay a premium for the miniaturisation), the cost of building such a low power device was higher than I anticipated. The total price for a small, very low power embedded NAS build was £143 (full specification in the appendix at the end of this post), also inclusive of a 250GB hard disk drive.

As you can see, the cheapest option (predictably) would be to reuse old hardware, assuming it is only two or three generations old. In all three PC specifications I have kept the hard drive size and cost the same to allow a fair comparison, but I find it hard to recommend either self-build option, even given the extra flexibility that such a computer would yield running a BSD distribution like FreeNAS. Also, although FreeNAS is a fairly mature product, there is no guarantee that it will work flawlessly with the hardware you have (I had some ACPI issues with my test machine), which would render the effort wasted. If you have an old PC and hard drives lying around then you have nothing to lose by trying FreeNAS – I would even encourage it. Otherwise I must stick to my original comment: if you only want a NAS for casual backup on a single drive, buy an off-the-shelf product.

Multiple hard drive setups

If, on the other hand, you want more than a single HDD, this is where things start to get interesting: there are very few (reasonably) priced multi-disk NAS systems on the market. The key exception is a piece of hardware I alluded to in my previous post, which I would like to talk about briefly now. (I am sure other options exist, but this is the only reasonably priced one currently available in the UK market.)

The enclosure I found which would allow two drives to be used is made by Nanopoint: the ‘Icy Box IB-NAS4220-B’. It has an interesting feature set, supporting 2 SATA hard drives with Samba, NFS and FTP, RAID 1 & 0, as well as a USB port to act as a print server. Unfortunately it is twice the price here in the UK than in the US, but it seems to be one of the very few NAS enclosures that allows RAID 1 across two hard drives. This was important, as I am after a system with built-in redundancy – if one hard disk fails, the other automatically has a copy of all the files. (Although the theory behind RAID is somewhat flawed – more on this another time.) I am seriously tempted to buy one of these, and if I do I will write a full review of how it compares to FreeNAS at a later stage. UPDATE: I have found another similar device by Netgear (the SC101 SAN/NAS device); although it only supports IDE drives, the other features seem roughly the same.

This is the point where FreeNAS starts to really distinguish itself from some of the commercial offerings. The reason is simple: anything with more than one or two hard drives is seen as either SOHO (Small Office / Home Office) or corporate grade, and has an appropriate price tag and feature set. FreeNAS can, and will, scale beautifully with a number of hard drives (even performing fault tolerant RAID 5 as well as the more popular RAID 1), although at the moment it does not support clustering or failover. This is a relatively trivial omission, as those features are getting into the realms of enterprise grade computing.

Due to the relatively simple firmware required to get these devices working (even with a variety of services), it will likely be cheaper over the course of a year to skip distributions like FreeNAS or OpenFiler and instead opt for a NAS drive enclosure, unless you specifically need some of the features FreeNAS offers or you are using several hard disks.

Related Idea : Virtualisation

Thumos made an interesting point in one of my posts about using a server running multiple virtual environments, with each role (e.g. firewall, NAS / SAN, MythTV etc.) running on one PC. The downside of this would be, as he noted, dramatically increased hardware requirements and, to be honest, I am not confident such a system would be able to handle all those roles effectively – but I am not an expert on virtualisation. Windows Server 2008 can do some pretty amazing things in this respect with its hypervisor based virtualisation system.

Related Idea: Windows Home Server

Although strictly speaking Windows Home Server is a completely different program (and incompatible with freedom (or F/OSS) software philosophies), it deserves a mention given the subject of this post. Built on a modified Windows Server 2003 R2 core, Windows Home Server adds automated backup as well as some impressive disk management tools. Perhaps most striking to me was the absence of RAID as we classically see it. RAID has become ubiquitous for redundant, performance or server/enterprise grade storage solutions, mostly because the only practical alternative is confined to high end data centers. Ask an IT expert or geek about the various ways to connect multiple hard disks and invariably you will get a discussion involving RAID 0, 1 and 5 (or nested modes like 0+1 and 5+0, or RAID 6) and JBOD spanning, with likely no mention of DFS or FRS. These are Microsoft technologies developed “in-house” by their Advanced Technologies Lab (ATL).

To understand DFS and its roots, I had to take a brief crash course in enterprise level computing, as the technology was not initially developed for Windows Home Server, having its origins a few years earlier; however, the storage technology in WHS is very similar to DFS, as Paul Thurrott notes in an early preview of WHS. In fact, DFS started life as a way to transparently link various SMB (Server Message Block) shares such that there would be greater flexibility, transparency and reliability in corporate environments with multiple data centers. DFS can generally be used in one of two ‘modes’: the first being locally administered (without an Active Directory), and the second being domain-based roots, which by design provide redundancy and are the most commonly used. There is an excellent demo of this technology on the Microsoft website.

The key to the software implementation of data redundancy in Windows Home Server is found in the transparent way storage shares are presented to the end user – not through a network mapped drive letter or a (classic) network share. In fact, WHS automatically shadow copies data in such a way that a copy of it exists on more than one hardware device, protecting against failure. This is completely different from RAID 1, which directly mirrors the contents of an entire drive (byte for byte) onto another to provide redundancy. In the event of a hard disk failure (or capacity upgrade), the RAID array must be taken offline and rebuilt with a replacement disk. Furthermore, because the data is mirrored from one hard drive to another, the maximum size of the mirrored array is constrained to the smallest drive in the array. Windows Home Server supports hot swapping of disks, meaning that if a hard disk fails there is (likely) no data loss nor interruption in service. If an extra drive is added (e.g. via USB) or an existing drive is hot swapped, WHS expands the overall space available to encompass the new storage and automatically (shadow) copies the data on its existing drive(s) to (re)create redundancy.
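The capacity difference between classic mirroring and WHS-style duplication can be sketched numerically. This is my own back-of-envelope model (it ignores filesystem overhead and the fact that WHS lets you choose duplication per shared folder): a mirror is constrained to its smallest member, while a pool that duplicates every file can use roughly half its combined space.

```python
def mirror_capacity(drive_sizes):
    """Classic RAID 1 style mirror: usable space is limited by the
    smallest member of the set."""
    return min(drive_sizes)

def duplicating_pool_capacity(drive_sizes):
    """Rough model of WHS-style pooling with every file duplicated:
    about half the combined space is usable."""
    return sum(drive_sizes) // 2

drives_gb = [250, 500, 750]  # hypothetical mismatched drives, in GB
print(mirror_capacity(drives_gb))            # 250 - the larger drives are mostly wasted
print(duplicating_pool_capacity(drives_gb))  # 750 - the pool grows with every drive added
```

With mismatched drives – exactly the situation when you hot-add a USB disk – the pooled approach keeps all the space in play, which is why WHS can absorb new drives without rebuilding anything.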

The hardware requirements are significantly higher than those for just running FreeNAS: a minimum of a 1 GHz processor and 512 MB of RAM are required before the installation will even continue, making it roughly two to four times more resource-hungry than the F/OSS equivalents. The ability to access your data remotely (through Windows Live integration) is interesting, acting like a remote-access dynamic DNS service, but it means trusting a third party with your authentication. A properly configured local network with secure FTP or Samba services would provide exactly the same (if less flashy) functionality, with the advantage of giving you complete control over who, what and where your network can be accessed from.

Conclusion

There are features that FreeNAS provides which 'off-the-shelf' NAS enclosures will not, and for this it is an extremely good piece of software. For multiple hard drives and/or multiple users all requiring different services, I would recommend FreeNAS every time, possibly even with some of the ITX hardware (coupled with a PCI RAID card) suggested above. However, for someone wishing to turn a single HDD into a NAS for occasional home use it is unlikely to be a smart choice.

Appendix : Example hardware costs

Please note, these are example prices correct at the time of research. Please do not take this as a recommendation of a system specification; it is for illustration only.

First example: Equipment already in your home.

Existing hardware eliminates a lot of initial outlay.

Hard drive: £35 (Seagate 250 GB SATA HDD) – I am not a fan of Seagate; there are better drives available.

Total Cost: £35

Second example: Building a very basic / cheap PC

Processor: £19 (AMD Low Power (45 W) AM2 Sempron)

Motherboard: £27 (MSI Motherboard)

RAM: £7 (512 MB Extra Value PC2-5400 RAM)

Hard drive: £35 (Seagate 250 GB SATA HDD) – I am not a fan of Seagate; there are better drives available.

Power Supply: £10 (Budget 350 W) – although I would STRONGLY recommend never buying a budget PSU.

Case: £10 (Budget ATX case)

Total Cost: £108


Third example: Building a low power ‘ITX’ form factor PC

Motherboard & Processor: £50 (Via iDOT) – Very cheap low power board

RAM: £7 (512 MB Extra Value PC2-5400 RAM)

Hard drive: £35 (Seagate 250 GB SATA HDD) – I am not a fan of Seagate; there are better drives available.

Case & Power Supply: £42 (Simple small case)

Total Cost: £134
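As a quick sanity check on the arithmetic, the three example builds can be totalled programmatically (prices in GBP as listed above):

```python
# Totals for the three example builds, using the prices listed above (GBP).
builds = {
    "existing hardware": {"hard drive": 35},
    "basic cheap PC": {"cpu": 19, "motherboard": 27, "ram": 7,
                       "hard drive": 35, "psu": 10, "case": 10},
    "low-power ITX PC": {"board + cpu": 50, "ram": 7,
                         "hard drive": 35, "case + psu": 42},
}

for name, parts in builds.items():
    print(f"{name}: £{sum(parts.values())}")
```

This gives £35, £108 and £134 respectively.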

Vista Bashing = Cheap Traffic!?

April 5, 2008 12 comments

It seems the web (and certainly the blogosphere) is full of posts damning Vista for various reasons, and I do not believe all this harsh criticism is justified. It all came to a head when I read a particular blog entry tonight. I started writing a brief reply in order to express my feelings on the matter, but it turned into a semi-lengthy rant which I would like to reproduce in a somewhat tweaked / edited form here.

What worries me is that it is very fashionable to bash Vista. It feels like any self-proclaimed tech expert thinks it is almost their prerogative to write long anti-Vista articles based on, and citing, other anti-Vista articles – does anyone else see a pattern emerging here?!

For the record I should say I am a huge fan of Linux; I run more Linux boxes than Windows, and of those Windows boxes the majority are XP and only one is Vista. I am very happy with Vista as well as XP, but it is about managing your expectations. It is completely unrealistic to assume Vista will run well on hardware that is a couple of years old (or even on some budget machines).

Surprise surprise, it won't. Vista has been plagued by hardware and software incompatibilities – what does this tell us? Simply that Microsoft was not lying when it said Vista is a major update to the Windows platform. Historically, all major updates have had driver and software compatibility issues (anyone remember XP five or six years ago?!). Drivers are the responsibility of the manufacturer, NOT Microsoft. For years prior to release Microsoft were talking to hardware companies, asking them to update their drivers, but most ignored them. Why? Very simply because they will sell more hardware if people have to go out and buy Vista-certified equipment. It is not in their interest to revisit hardware they released two years ago – it does not make them any more money, and the consumer be damned.

That said, there are a number of platforms / situations where Vista is clearly not suitable, and for those I still run XP – it is more responsive on such hardware and has the added bonus of the comfort factor (i.e. I have been using it for years and am very familiar with it). But let's not forget, this is old technology that has not really been worked on since 2004 (SP2). SP3 is nothing more than a security roll-up with a few extra Vista-developed features added. The desktop rendering in XP (GDI+) is based on a software stack that is several years old and incapable of hardware-accelerated desktop compositing – the same thing Mac OS X and Linux have been capable of for years.

The problem is, no one seems to have a long enough memory to remember the Windows 2000 / 98 saga, or the Windows XP / 2000 saga that followed it…

There is nothing wrong with Vista*, similarly nothing wrong with XP*, nothing wrong with Linux*, and nothing wrong with OS X* – it depends on what hardware you have and what you want to do with it.

* Of course it is not as black and white as this, all platforms have their inherent strengths and weaknesses.

I wish we would move beyond this fanboy-like bashing. If there is merit to a discussion I am all for it, but I am getting fed up with reading the same FUD constantly. Most of it is simply fishing for cheap traffic.

/Rant 🙂