
Should operating systems have a 'best before' expiry date?

In hindsight, delivering a complete headshot to an operating system with a kill-switch once it hits a proposed 'expiry date' is a rubbish idea. How about this instead?
Written by Zack Whittaker, Contributor

I'm an idiot.

When openly hypothesising and pontificating about an idea to a world of many millions of potential readers, it would of course be wise not to be an idiot about it.

Dear readers of America (and the wider world), you were right. Having a kill switch implanted into the very core of an operating system, forcing users to upgrade to the latest and greatest, was a stupid idea.

This post was of course going to be called "Who is the biggest idiot in tech punditry this week?", but according to my colleagues this would have been 'SEO unfriendly' and would probably have nudged Google into bumping me up the results for the search term 'idiot'.

Granted, it probably wasn't as big a cock-up as the Morro story, which haunts me like a rabid badger chasing me in my sleep... but everybody gets one.

Now that's out of the way...

With that admin out of the way, let me revise my original hypothesis.

Hardware changes rapidly, to the point where Moore's law is outpaced by huge jumps in development. But these operating systems live on, either on the hardware they were designed for back in the day, which still exists for those unable to upgrade, or on computers of today's specification, on which they still work well.

Back in 2001, when XP was originally released, The Register summed up its thinking:

"[Microsoft] itself is home of the expiration date, habitually unleashes a 'new' operating system on us every year, and deploys a battery of weapons - retirement of MCP (Microsoft Certified Professional) qualifications, price juggling, withholding of handy widgets, menacing new licensing regimes - in order to achieve de facto expiration of the old ones."

The fact is that old operating systems are phased out, as XP has been: pulled from retail shelves, with updates ceasing except for those most important and critical to protecting worldwide network security.

Only last week, I was walked through the new IT infrastructure of a London college, showing off some of the most up-to-date equipment and technology on offer for the students studying there: 21.5" iMac computers, Ubuntu-run library catalogue machines, and Windows 7 running on the vast majority of computers, complete with fingerprint readers and graphics tablets.

Yet when passing through an academic department - essentially a corridor of offices - I noticed most of the staff were still using Windows XP. The college favoured the best technology for its students, and not so much for its staff.

So if operating systems are upgraded every few years, as Mac OS X is (more regularly than Windows, it is fair to say), perhaps it would be acceptable to give these operating systems a pre-defined expiry date, on the premise that users are aware of it and sign up to it at purchase.

Perhaps an all-out gunshot wound to the proverbial head would not be wise. But just as non-genuine copies of Windows kick up a fuss and change the desktop background, a similar feature could be applied to remind users to upgrade. And for long-term users, a discount could be offered as an incentive to upgrade to the latest operating system, providing their hardware can support it.
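To make that concrete, here is a rough sketch of how such a reminder might be wired up; the release date, the five-year window and the nag message are purely illustrative assumptions, not anything any vendor actually ships:

    from datetime import date

    # Hypothetical values, for illustration only.
    RELEASE_DATE = date(2001, 10, 25)   # e.g. Windows XP's retail launch
    EXPIRY_YEARS = 5                    # the proposed 'best before' window

    def past_expiry(today: date | None = None) -> bool:
        """Return True once the OS has outlived its proposed expiry window."""
        today = today or date.today()
        expiry = RELEASE_DATE.replace(year=RELEASE_DATE.year + EXPIRY_YEARS)
        return today >= expiry

    if past_expiry():
        # A gentle nag rather than a kill switch: change the wallpaper,
        # show a banner, or surface a discounted upgrade offer here.
        print("This operating system has passed its 'best before' date - time to upgrade?")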

With the benefit of hindsight, a complete meltdown of an operating system once it hits this expiry date - maybe a full five or ten years after it is first released - would be stupid, and probably illegal. I suspect that is where I went wrong the first time around. Nevertheless, an expiry date would keep versions current, keep money rolling into the manufacturers, and allow for consistency and fluidity across most end users.

What do you think?
