Photos: Inside Rackspace's new datacenter and how it aims to cut millions from running costs
Bare earth 15 months ago, this 15-acre site is now home to managed cloud company Rackspace's new 130,000 square foot datacenter, which it unveiled this week.
A series of datacenter design and power-saving features gives the facility a Power Usage Effectiveness (PUE) rating of 1.15 – well below the industry average of 1.7. Rackspace expects that rating to translate into multi-million dollar savings over the next 20 years.
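PUE is simply total facility power divided by the power delivered to the IT equipment, so the gap between 1.15 and 1.7 is pure overhead – cooling, lighting and electrical losses. A minimal sketch of the ratio (the 10,000kW IT load is an assumed illustrative figure, not from the article):

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_load_kw

# Illustrative figures (assumed): 10,000 kW of IT load.
print(pue(11_500, 10_000))  # 1.15 -> only 1,500 kW of overhead
print(pue(17_000, 10_000))  # 1.7  -> 7,000 kW of overhead for the same IT load
```

At the same IT load, the 1.7-PUE facility burns more than four times the overhead power of the 1.15-PUE one.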
The installation has been created by datacenter development firm Digital Realty near Gatwick airport, south of London, close to the existing Rackspace metro fiber ring and the main European long-haul fiber-optic network.
The Crawley, Sussex, datacenter will serve all regions but Rackspace is expecting particular demand from UK and European customers.
Entering the facility
Here's the main entrance to Rackspace's Sussex datacenter, which contains four data suites with a total capacity of 12MW. The whole site could eventually house two further data hubs of similar scale, taking the total capacity up to about 30MW.
The company already has a datacenter in southern England, in Slough to the west of the capital.
Attention to the new installation's green credentials extended to its construction, which minimised landfill and environmental impact and used locally-sourced material wherever possible, according to Rackspace.
Development firm Digital Realty also adopted a modular approach to key items of plant, such as uninterruptible power supplies and generators, which were assembled in workshops offsite and installed as finished units.
Along with the usual perimeter fencing, barriers and an inconspicuous location that even local taxi drivers find hard to track down, the new datacenter uses badge and biometric scanners to control access.
In the background you can see the mantrap – a space with secure doors at either end – through which visitors and staff must pass to gain access to the main facility, where further access restrictions apply.
The building is designed with a central spine – containing facilities such as the build room for testing and assembling new hardware – flanked by the four data halls, two on either side.
The tech ops room
The technical operations room is the facility's nerve center. From here, Rackspace staff can monitor events ranging from rack-level power consumption to a generator failing to start.
Every rack has its own independent A and B power feeds. Whatever current a rack is drawing is visible to technicians in the tech ops room, contributing to their comprehensive view of load management across the whole site.
The Rackspace campus is served by its own 132kV substation, which will support 72MW. That substation is supplied via an A and a B power feed from the national grid at nearby Three Bridges.
50,000 physical servers
The initial configuration for the Rackspace datacenter allows for 50,000 physical servers, seen here in data hall one.
All the equipment arrives at the site in an external deboxing area, to minimise the amount of dust and debris in the installation, before the hardware is moved to the build room.
As well as about 30 Rackspace staff, the site will be staffed by third-party workers in areas such as building management and physical security.
Indirect air cooling
With an average temperature of 24 degrees Centigrade – about 75 Fahrenheit – the data halls are flooded with air from indirect air cooling systems from Excool, delivered through the white grilles seen on the right of this image.
Behind the server racks, the closed doors seen on the left of the picture slide open to reveal the hot aisles, where temperatures range between 32 and 34 degrees Centigrade, or 90 and 93 degrees Fahrenheit.
Inside a server rack
This is the view from inside a server rack, looking up to the ceiling of the datacenter. All the cables are colour coded to aid maintenance.
So far, some 200,000 metres of cable have been used in the construction of the site.
This picture shows how the designers of the Crawley facility have minimised structural columns to improve air flow and dispensed with the typical datacenter raised floors, using solid concrete instead.
Rackspace said this measure is consistent with its aim of eliminating infrastructure where possible, including power-distribution units.
The company has also used transparent-pane LED lighting throughout the site after a survey of staff movements in its other datacenters showed the more expensive technology could pay for itself within 12 months. With about 1,000 staff movements a week in its data halls, the survey found, LEDs are a cost-effective choice.
Up on the roof
The roof is gently pitched to collect as much rainwater as possible for later use in the sprays of the indirect air cooling units.
To the left of the gantry is one of the uninterruptible power supply rooms that back up the main supply.
Each 3MW data hall has 10 UPS units in an N+2 configuration: eight are needed to carry the load, with the other two on standby. In practice, all 10 usually run at the same time at a lower level for greater energy efficiency; if one unit fails, the remaining machines simply take more of the load.
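The load sharing described above is easy to sanity-check: if eight units must be able to carry a full 3MW hall, each unit is sized for 375kW, and running all 10 together drops each unit to 300kW. A rough sketch, assuming even load distribution (the per-unit sizing is inferred from the 3MW and N+2 figures, not stated in the article):

```python
HALL_LOAD_KW = 3_000     # one 3MW data hall
UNITS_TOTAL = 10         # N+2: 8 required, 2 redundant
UNITS_REQUIRED = 8

unit_capacity = HALL_LOAD_KW / UNITS_REQUIRED    # minimum per-unit rating
share_all_running = HALL_LOAD_KW / UNITS_TOTAL   # load per unit with all 10 on
print(unit_capacity)      # 375.0 kW
print(share_all_running)  # 300.0 kW

# If one unit fails, the remaining nine pick up the slack:
share_after_failure = HALL_LOAD_KW / (UNITS_TOTAL - 1)
print(round(share_after_failure, 1))  # 333.3 kW -- still under the 375 kW rating
```

Even with one unit down, each survivor runs at about 89 percent of its rating, which is why the failure of a single unit is a monitoring event rather than an outage.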
The equally-spaced small domes below and to the right of the gantry are the light intakes for the sun pipes, or sun tubes, which help provide natural illumination during the day to the building's main spine.
The sun pipes
Seen here in close-up, each sun pipe, or sun tube, forms another element in attempts to make the site as green as possible.
The sunlight passes through lenses at either end of the tube, whose internal surfaces are highly polished.
Seen from below, in the right conditions the sun pipes can appear as bright as the LED units next to them.
This roof-mounted indirect air cooling system, which operates without mechanical refrigeration, is at the heart of Rackspace's energy-saving measures.
The 1.15 PUE rating the cooling technology helps deliver saves £40 ($60) per kW per month compared with an average 1.7 PUE installation. Over the life of Rackspace's 20-year lease, assuming the site is running at its full 30MW capacity, the savings should run into millions of dollars.
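The arithmetic behind that claim is straightforward. A back-of-the-envelope sketch using the article's figures – £40 per kW per month, 30MW and a 20-year lease – which gives an upper bound, since the site will ramp up gradually rather than run at full capacity from day one:

```python
SAVING_PER_KW_MONTH_GBP = 40   # vs an average 1.7-PUE facility
CAPACITY_KW = 30_000           # full eventual capacity: 30MW
LEASE_MONTHS = 20 * 12         # 20-year lease

total_saving = SAVING_PER_KW_MONTH_GBP * CAPACITY_KW * LEASE_MONTHS
print(f"£{total_saving:,}")  # £288,000,000 at constant full load
```

Real-world savings will be well below this ceiling, but even a fraction of it comfortably justifies the "multi-million dollar" framing.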
Room for expansion
For the moment, only part of the Crawley datacenter has been fitted out. Rackspace said the overall campus can be expanded further in stages, taking advantage of improvements in technology as they arrive.
Rackspace and Digital Realty have contributed the design of the building to the Open Compute design project.