Datacentre security is vital: biometric readers and secure mantraps (pictured) on both levels of the datacentre control access to the data halls, and are programmed so that the first set of doors must close before the second set opens. Each door also has a proximity card reader pre-programmed with a client's access information.
To prevent unwanted guests from accessing the facility, there is a 3m-high fence and a gate manned by security 24/7. If somebody were to penetrate the fence, they'd have a tough time evading the intruder alarms and the 50 infrared CCTV cameras positioned around the complex.
These security features help make the Virtus facility a Tier III datacentre. Clients also look for resilience in a datacentre, so they can be confident their services will be available at all times.
"The reason people go for Tier III is twofold. One is because there is a level of redundancy built in so there can be some failure and the service will continue," David Watkins, Virtus operations director, told ZDNet. "More importantly, Tier III gives you the ability to concurrently maintain equipment."
The Enfield site fits into the Tier III category because it abides by the "N+1 rule", which means there is always one spare piece of equipment, be it a generator or an air-conditioning unit, to provide backup in the case of a failure.
For example, the Enfield site needs two generators to function, but there are three in case one fails. A Tier IV site, meanwhile, would need a complete spare set.
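The difference between N+1 and a full spare set can be sketched in a few lines of Python. This is purely illustrative arithmetic based on the figures above; the function name and scheme labels are assumptions, not Virtus's own tooling:

```python
def units_required(n: int, scheme: str) -> int:
    """Total units installed for a given redundancy scheme,
    where n is the number of units needed to operate."""
    if scheme == "N+1":   # Tier III: one spare unit
        return n + 1
    if scheme == "2N":    # Tier IV: a complete spare set
        return 2 * n
    raise ValueError(f"unknown scheme: {scheme}")

# Enfield needs two generators to run:
print(units_required(2, "N+1"))  # → 3 generators installed
print(units_required(2, "2N"))   # → 4 would be needed for Tier IV
```

The gap between the two schemes widens with the amount of equipment involved, which is one reason the Tier IV upgrade would have cost so much more.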
Watkins explained that it would have cost Virtus an extra £15m to make the site Tier IV and would have involved adding large quantities of extra equipment.
Image: Sam Shead
The heat-generating servers are kept at the right temperature by pumping cool air up into the 'cold aisles' that they sit either side of.
"There's a fan like you have on a PC that pulls the hot air out the back of the server while pulling the cold air in through the front to keep the server cool," said Watkins.
The hot air is expelled out of the back and rises towards the ceiling by convection. It eventually gets sucked back into one of the air-conditioning units and is transported to the large cooling units outside the warehouse, where it is cooled down again.
Again, the site has N+1 cooling units: there are four, but only three are needed for operation at any one time.
When air temperatures at Enfield drop below a certain threshold, the pumps use the water's natural temperature to cool the facility, avoiding unnecessary energy use.
Image: Sam Shead
The datacentre has diverse fibre and power supplies, which enter the building at opposite ends and enable the facility to continue operating as normal even if one of the feeds experiences problems.
The power supply consists of two 8MW, 11kV feeds from the National Grid, but Watkins said the site will never use the full 8MW. He claimed that the site is currently drawing less than 20 percent of that, or under 1.6MW.
Nonetheless, datacentres are power-hungry environments and all that power doesn't come cheap. Ultimately, the amount of energy used is determined not by Virtus but by the clients inside the datacentre.
"The energy bill depends upon the amount of power that tenants draw, which we can't control," said Watkins.
When operating at full capacity, the site will draw 8MW of power, equivalent to the consumption of around 27,000 houses.
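The comparison above implies a rough average per-house figure, which can be checked with a quick calculation (the per-house number is derived from the article's figures, not a separately quoted statistic):

```python
# Rough arithmetic implied by the figures above.
site_draw_w = 8_000_000   # 8MW at full capacity
houses = 27_000           # the equivalent number of houses quoted

per_house_w = site_draw_w / houses
print(round(per_house_w))  # → 296, i.e. roughly 300W average per house
```

That works out to roughly 300W of continuous draw per household, a plausible average once intermittent appliance use is smoothed out.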
Image: Sam Shead