Why Capgemini thinks its Merlin datacentre is the most sustainable around...
This is Capgemini's Merlin datacentre, due to open on 12 October in Swindon, Wiltshire. The IT service provider touts the 3,000 square metre datacentre as the most sustainable facility in the world.
Capgemini's sustainability claim is based in part on the Power Usage Effectiveness (PUE) rating of the site, and also takes into account the embedded carbon from constructing the datacentre, its water use and the contaminants it produces.
Merlin's PUE is 1.1, which is lower than Google's E datacentre (1.12) and HP's Wynyard facility (1.16), although Yahoo! recently opened a datacentre with a PUE of 1.08.
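For reference, PUE is simply the ratio of total facility power to the power consumed by the IT equipment alone, so a figure of 1.1 means roughly 10 per cent overhead on cooling, lighting and power distribution. A minimal illustration (the kilowatt figures below are invented for the example; only the ratio matters):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Invented figures: 1,100 kW drawn by the whole site,
# 1,000 kW of that reaching the IT equipment.
print(round(pue(1100, 1000), 2))  # 1.1
```

A perfectly efficient facility would score 1.0, with every watt drawn going to the IT kit itself.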
The datacentre is located in an existing warehouse (shown above), previously used by car maker Honda to store components for its manufacturing plant down the road. Because the building already existed, no new embedded carbon went into constructing the site itself.
Capgemini will use Merlin to allow businesses to place their hardware into the datacentre which the company will then manage.
Merlin is made up of a number of 250 square metre modules which contain up to 1,248 server racks apiece. There are currently four modules in place but the warehouse will eventually hold 12.
The modules are transported by lorry to the Merlin site where they are placed inside the warehouse.
When a customer's service contract with Capgemini ends, they have the option of removing their module from the Merlin facility and taking it to another location, where they can run it themselves or hand maintenance over to another service provider. The datacentre can therefore move with the business, rather than a new facility having to be built whenever a company relocates its datacentre operations.
The modules have been built using sustainable materials sourced within a 100-mile radius of where they were manufactured. Only one component came from outside this radius, and the modules are 95 per cent recyclable.
Shown above is the entrance to one of the four modules already in place. The entrance is an air lock that stops the air inside the module being contaminated with dust and helps maintain its temperature.
Security measures at Merlin include biometric fingerprint detectors, shown above, fitted on several of the interior doors.
Merlin is rated at Intrusion Level 3 and also includes a perimeter fence that triggers an alarm if vibration is detected. The area between the perimeter fence and the datacentre building is covered at night by infra-red beams to detect intruders, and by intelligent CCTV that responds to movement.
Outside the datacentre, Merlin's security features also include these gates and metal bollards, which are designed to withstand the impact of an articulated lorry travelling at 30mph.
This is the touchscreen control panel within one of the datacentre modules. It monitors conditions within the datacentre including air temperature, fan speed and PUE.
The datacentre is also being touted as cheap to run: "We wanted green to be seen as not costing lots of money," Paul Anderson, European infrastructure outsourcing programme director for Capgemini UK, told silicon.com.
The air optimiser (pictured at the back, centre) draws in air from outside the building and filters it before it goes in to cool the server equipment. The air can also be cooled by passing it over water-filled elements (adiabatic cooling), but Capgemini says this will only be used in emergencies.
Merlin was located in Swindon because of the region's stable climate: the temperature there has never risen above 31.5C since records began. Swindon is also within 40 miles of Capgemini's Bristol datacentre, close enough for the synchronous connectivity required to back up data.
This is a view into one of four 'cold aisles' in one of the Merlin modules, through vents in the access door. The datacentre draws air from the air optimiser through the door into the datacentre to cool the server equipment. The vents in the door are fully open in this case.
The vents in the cold aisle access door can also be closed, as shown above. This happens automatically when less cooling is required, in this case because there is not much server equipment operating in this aisle.
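Capgemini has not published how that automation works, but the behaviour described, opening the vents further as the heat load rises, can be sketched as a simple proportional control rule. All of the setpoints and gains below are invented for illustration:

```python
def vent_opening_pct(aisle_temp_c: float, setpoint_c: float = 24.0,
                     gain: float = 20.0) -> float:
    """Sketch of proportional vent control: open the cold-aisle vents
    further as the aisle runs hotter than the setpoint.
    The setpoint and gain are invented, not Capgemini's figures."""
    error = aisle_temp_c - setpoint_c
    # Clamp the opening to the physical 0-100 per cent range.
    return max(0.0, min(100.0, error * gain))

print(vent_opening_pct(22.0))  # 0.0   -> little kit running, vents stay shut
print(vent_opening_pct(29.0))  # 100.0 -> full cooling demand, vents fully open
```

A real building-management system would add hysteresis and rate limits so the vents do not hunt back and forth around the setpoint.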
This is one of the 'hot aisles', where the air used to cool the servers emerges and is drawn out of the datacentre and the building via large vents. The temperature of the air coming out of the modules can hit 47C.
This picture shows the power cables that bring electricity to each module. The cable trays are 5m above the ground, meaning they're extremely hard to access should anyone wish to tamper with them.
Each module provides 1,000 watts of power per square metre, which can be upgraded to 2,000 watts per square metre.
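Taken with the 250 square metre module size mentioned earlier, that rating works out at 250kW per module as standard, or 500kW when upgraded. A quick back-of-the-envelope check (all figures from the article, arithmetic only):

```python
MODULE_AREA_M2 = 250       # module size quoted in the article
STANDARD_W_PER_M2 = 1_000  # standard power density
UPGRADED_W_PER_M2 = 2_000  # upgraded power density
MODULES_AT_CAPACITY = 12   # warehouse capacity quoted earlier

standard_kw = MODULE_AREA_M2 * STANDARD_W_PER_M2 / 1_000
upgraded_kw = MODULE_AREA_M2 * UPGRADED_W_PER_M2 / 1_000

print(standard_kw)                        # 250.0 kW per module
print(upgraded_kw)                        # 500.0 kW per module
print(standard_kw * MODULES_AT_CAPACITY)  # 3000.0 kW site-wide at standard rating
```

So a full warehouse of 12 modules implies roughly 3MW of IT load at the standard rating, or 6MW fully upgraded.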
These are the three diesel-powered generators that kick into action in the event of the national grid electricity supply failing. There will be six generators when the Merlin facility is running at full capacity.
Merlin uses flywheel technology to provide an uninterruptible power supply (UPS) during an outage, before the generators kick in. The flywheel is driven by mains electricity and continues to spin to provide 18 seconds of power between the mains supply being cut and the generators starting up.
Traditional UPS systems use batteries, but because of its sustainability commitments for Merlin, Capgemini wanted to reduce the contaminants (such as battery acid) used in the datacentre.
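As a rough sketch of why a flywheel can bridge the gap, the energy stored in a spinning rotor is E = ½Iω², and the ride-through time is the usable portion of that energy divided by the load. The rotor inertia, speed and load below are invented purely for illustration; only the 18-second figure comes from the article:

```python
import math

def flywheel_ride_through_s(inertia_kg_m2: float, rpm: float,
                            load_kw: float,
                            usable_fraction: float = 0.5) -> float:
    """Seconds of support from a spinning flywheel: E = 0.5 * I * w^2.

    usable_fraction reflects that only part of the stored energy can be
    extracted before the wheel drops below its minimum working speed.
    All parameters here are illustrative, not Merlin's actual figures.
    """
    omega = rpm * 2 * math.pi / 60                   # angular speed, rad/s
    energy_j = 0.5 * inertia_kg_m2 * omega ** 2      # stored kinetic energy
    return usable_fraction * energy_j / (load_kw * 1_000)

# Invented example: a 28 kg*m^2 rotor at 7,700 rpm feeding a 250 kW load.
print(round(flywheel_ride_through_s(28, 7700, 250), 1))  # 18.2
```

A rotor of roughly this size would cover the 18-second window the article describes before the diesel generators take over.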
Merlin also has several rooms, pictured above, where customers can set up and test their kit before moving it into one of the datacentre modules.