There used to be a point to operating systems. They provided a necessary context for software applications to exist, run and interact with users. Of course, they are still necessary at a fundamental level, but they are much less relevant to their users than they used to be.
After using Windows 7 non-stop since January, about the most complimentary thing I can say about it - and I'm not being in any way facetious - is that I don't notice it. Unlike its predecessor, Vista, it doesn't randomly draw attention to itself by interrupting you with unnecessary warnings and alerts, and it doesn't progressively slow to a crawl or get confused while offering up the computing equivalent of the thousand-yard stare with its spinning donut of death. And Windows XP, for all it had going for it, couldn't help demanding that you chickity-checked it before you rickity-wrecked it with a tedious wipe and reinstall every year or so.
I have used OS X exclusively at home for five years, and once I got over the initial "oooh, I'm a cool Mac user, look at me!" phase, my amazement plateaued at 'moderately impressed' for much the same reasons as Windows 7. (This may also be partly because OS X has seen only modest improvement and innovation over the last seven or so years of updates. I mean, a transparent menu bar and Spotlight? Whoop-dee-doo.)
And as much as virtualizing one operating system within another, or setting up a dual-boot partition on my Mac so I can run Windows or OS X as I choose, appeals to my inner geek, the practical reality is that I've never needed to do it. It's a cool tech demo to impress friends, or a security blanket for Windows users worried about losing something important when they switch to a Mac.
I think my point is that operating systems are at their most relevant when they get noticed - and I don't mean in a good way. You couldn't help but notice that you were running MS-DOS or an early version of Windows, as they operated more like an extra application layer upon which you ran your actual applications. Operating systems also provided a proprietary context around which their makers could build successful businesses.
But today, good operating systems are the ones that just get out of the way and let you edit your photos, build your websites, write your emails and use your apps. And as such, they are at best only momentarily relevant.
Soon, I don't think people will draw much, if any, distinction between one piece of hardware and another based upon which operating system it runs. And if that's the case, it makes you wonder what Google hopes to achieve by adding another one to the list. Unless it's just lashing out in many different directions, hoping that one of its many bets will finally pay off and save it from its over-dependence on search advertising.
Hey Eric, 1989 just called and says it wants its business plan back.