Datacentres are by their nature somewhat sterile and antiseptic places, but many of them hide a dirty little secret: cables so tangled they make the plots of Days Of Our Lives look logical by comparison.
I was reminded of this while touring through the datacentre of ISP Internode in Adelaide this week.
Opened earlier this year, the CBD facility is the second one built by Internode in its home town; the first experienced a common problem. "Our previous datacentre ran out of power before it ran out of space," explained product manager Jim Kellett.
The new facility shouldn't have that problem for quite a while, in part because it's making increasing use of virtualisation to boost the number of businesses that can be handled on a single server.
The benefits of virtualisation are well understood -- lower hardware costs, reduced power consumption, fewer systems to manage -- and Kellett says that the shift has been entirely beneficial. "It's really hard to come up with a downside to virtualisation."
Currently, the virtualised systems are used to provide specific packaged hosted services, but Kellett says the company is examining the potential for running virtual private servers, where companies could rent access to a specific basic virtual server (or servers) which could be configured however they liked.
The challenge, Kellett notes, is working out how to price and market such services, since virtualisation is still in its relative infancy.
As I mentioned at the start, one concealed upside of virtualisation is in cable management. Internode's own boxes within the datacentre, used for running its internal systems and for its hosted customers, look like textbook examples of how to cable: neatly bundled sets of exactly the right length, with nothing tangled and no confusion about what goes where.
Sadly, it's a different story in some of the racks rented by other businesses that install their own equipment, where the key design principle seems to be Spaghetti Junction.
One system has so many tangled cables I can only assume that the systems administrator involved decided to make things as confusing as possible before switching jobs.
Quite apart from making life difficult, tangled and excessively long cables can have performance implications. Keeping datacentres cool relies on efficient air flow, and a thick clump of cables can interfere with that. Having said that, everything seems to be in working order, so maybe I'm just being paranoid in my desire for order.