I read a great piece by my colleague, Chris Dawson, this morning entitled "I love Linux, but it's not going to save the world," and after reading a few of his readers' comments, I felt compelled to get in my two cents. His point (at least what I took away from the piece) is that just because Linux can run on old hardware does not mean that you should keep that old hardware lying around -- despite the somewhat specious argument of a UK study that pointed out that the old hardware still in use was not in a landfill somewhere. (More likely, it will end up in a storage room somewhere, waiting for someone like Chris to get the time to fix it!)
To be sure, Linux is a remarkably flexible operating system and a suitable replacement for UNIX in settings where UNIX was once king. Further, it can be configured with a robust GUI and perform as a remarkably flexible desktop operating system.
In other words, both UNIX and Linux can be configured to run on very modest hardware, as long as your needs are modest. And, when your needs are not so modest, it's up to the job when deployed on robust hardware.
Some of Chris's readers scoffed at the three-to-five-year life-cycles to which Chris referred -- arguing that with Linux, much older hardware would do just as well. Others took pride in being able to turn old Windows 95 systems into Linux thin-clients. Well, that's fine and dandy -- if you ignore the human costs associated with maintaining old hardware.
It's not about cost; it's about productivity.
Chris, with his depth of knowledge of IT, can spend his time replacing broken floppy drives, or he can answer his students' computing questions. He can hunt down an intermittent memory or hard drive problem, or he can teach computer science courses. Or he can write a proposal for a new service that will help his educators teach -- and his students learn -- more effectively, using the latest tools. Which do you suppose is the best use of his time?
In his own school, Chris has an extensive mixed environment designed to meet a variety of needs for his students and his faculty. In a university setting, the goals are exactly the same, except that the range of needs is far greater -- as is the diversity of skill levels of the user population -- faculty, staff, and students. Further, the extent and reliability of services provided by the university can have a far-reaching impact -- not just on faculty and students on your campus but on researchers throughout the world who have become dependent upon those services for collaboration with your faculty.
Cost pressures, plus a lack of expertise (in both accounting and IT) on the part of administrators and ed-tech staff, often lead to a consumer's approach to IT upgrades:
- If it ain't broke, don't fix it.
- Buy it and use it until it breaks.
- It must be less expensive to fix than to replace.
- Buy no more than what I need now.
- Buy the least expensive; the vendor doesn't matter.
The enterprise (be it in a business or university setting) must ask larger questions:
- What does each of my users need? How long will they have this need?
- Will this product still meet those needs for three to five years?
- Will the vendor fix it if it breaks during its life-cycle?
- What on-going expenses will be incurred by choosing this product?
Many readers assume that three-to-five years is an arbitrary period of time for hardware life-cycles. It is not:
Accountants like five years because hardware is a capital expense. Tax law dictates how capital expenses are treated, and whether you are in a not-for-profit environment (as educational institutions are) or in business, those rules of accounting still apply.
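To make the accountants' view concrete, here is a minimal sketch of straight-line depreciation, the simplest way a capital expense is spread over a fixed life-cycle. The dollar figures are hypothetical, and the five-year life is the one the article assumes; actual schedules depend on your jurisdiction's tax rules.

```python
def straight_line_depreciation(cost, salvage=0.0, years=5):
    """Annual expense when a capital purchase is written off
    evenly over its life-cycle (straight-line method)."""
    return (cost - salvage) / years

# A hypothetical $1,500 workstation, expensed over five years:
print(straight_line_depreciation(1500))  # 300.0 per year
```

On the books, that machine is not "paid for" until year five -- one reason accountants resist replacing it sooner.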
IT professionals like three years because of Moore's Law, which (as popularly stated) holds that processor power doubles every 18 months. Though merely an empirical observation, it has proved a reliable measure for decades. This means that the machine you buy today will be, at best, 25% as powerful as the machine you will buy three years from now.
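The arithmetic behind that 25% figure is easy to check. A quick sketch, assuming the 18-month doubling period stated above:

```python
def relative_power(years_ahead, doubling_months=18):
    """Fraction of a future machine's power that today's machine
    delivers, assuming power doubles every `doubling_months`."""
    doublings = (years_ahead * 12) / doubling_months
    return 1 / (2 ** doublings)

print(relative_power(3))  # 0.25 -> two doublings in three years
print(relative_power(5))  # ~0.099 -> under 10% after five years
```

Stretch the life-cycle to five years and today's machine delivers less than a tenth of the power of its eventual replacement -- which is why pushing much past five years gets painful.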
A number of our readers like to criticize the apparent trend toward bloated (others might say feature-rich) software. Their argument is that adding all that bling to software makes it no more useful to the average user yet makes us dependent upon more and more robust hardware -- thus shorter hardware life-cycles. (I find it ironic that other readers argue that an apparent lack of competition thwarts innovation. So which is it? Bling or innovation?)
Looking to the second point above, it becomes clear that software which is upgraded often will need more robust hardware sooner rather than later. Considering that software publishers always lag the capabilities of the hardware, three years comes out about right.
Case in point: Vista will run (albeit poorly) on six-year-old hardware and it runs easily on properly configured three-year-old hardware -- though it runs best only on the newest hardware (at any price-point).
The dynamics of the environment -- not the average user -- dictate the need: the needs of the few today should dictate life-cycle purchases for the many who must still be able to use that hardware in three to five years. Try to get more than five years out of your hardware and you will leave your most demanding users (be they educators or students) out in the cold, with no way to meet their needs.