I purchased a computer just a few weeks ago – a travel laptop with a whole lot of power in a two-pound package (Sony Vaio G1 – available only as a Japanese import from places like Dynamism). There was an option to get the computer with the new Windows Vista installed, so I figured I should get with the times and step up to the new operating system.
Yes, it was a mistake on my part. Nothing really works, anything I need to access is embedded in flashy but useless and counter-intuitive menus, and the things I install to get around all of Windows' roadblocks end up being *removed* by Windows when it does its automatic updates.
Why is Vista so bad even after six years of development and a myriad of promises? Because instead of seeing computer users as its customers, Microsoft has apparently chosen to put the needs of its fellow corporations first. This shouldn't be surprising, particularly when we stop to remember that Windows' primary customers are the giant companies who install Windows on each and every one of their workers' machines.
But the way Microsoft chose to show its loyalty through Vista was through an elaborate set of Digital Rights Management (DRM) limitations. Microsoft wants to help corporations prevent the illegal use of their content, such as music and videos. The easiest way to do that is to limit the use of non-secure file formats. And the easiest way to do that is to not supply the codecs required to view them in any of the installed media players.
But Microsoft has taken this a step further: not only do they not supply the necessary codecs, but – at least in my experience so far – they remove those third-party codecs if the user installs them himself. During certain updates, Vista deletes files and utilities it suspects of being able to permit illegal movie watching.
If a user chooses to stop an update, there might even be repercussions. One report indicates that Windows “Genuine Advantage” notification calls Microsoft automatically, telling the company who has aborted the installation.
The utter uselessness of my new Vista machine, combined with the week I was going to have to wait for an XP disk and the current barriers to installing the Mac OS on a non-Mac machine, convinced me to download and install Ubuntu, one of the more user-friendly versions of Linux out there.
Yes, I'm working on it right now, and it makes even the Mac OS seem like a forest of unnecessary gizmos. Linux is blazingly fast compared with Microsoft's OS, utterly simple, complete with any application you can imagine and – more amazingly – based on an entirely different philosophy than Windows. There's a spirit of abundance and transparency in this Linux universe. Need something, and you just grab it. Pay, if you like, what you like, when you've determined it's of value to you.
Working in Linux reminds you that your computer is just one drive in a network. That getting your machine to do something new really just means grabbing a few lines of code from someone who has tried it before. It means working in a collaborative space where productivity and creativity are more important than protecting a movie studio’s futile efforts at maintaining control-by-force over the digital media it releases.
But the fact that I'm now using it as my principal operating system means something else: that soon a whole lot of people will be, too. Linux has finally arrived. Maybe not this year, but 2008 will almost certainly be the year of Ubuntu in the same way 2005 or 2006 was the year of BitTorrent. It will reach critical mass, penetrate the general market, or do whatever it is that means coming of age.
And it will herald the beginning of a new era in computing – one characterized less by limitation than by possibility, less by locked-down "security" than by open systems (Linux, after all, exposes its workings through the shell), and less by consumerism than by collaboration and do-it-yourself. In a sense, the age of computing will finally begin. Again.