Datacentres have begun to rule the world — vast, secure, climate-controlled IT palaces that provide the backbone of our working lives.
This round-up of datacentres from around the world gives an insight into the powerful technology that toils behind the scenes to keep the modern world turning.
Capgemini's Merlin datacentre (above) is housed within an old 86,000-square-foot warehouse in Swindon, UK. It has a total datacentre area of 10,000 square feet, and a power usage effectiveness (PUE) rating of 1.1.
PUE is the ratio of the total power drawn by a facility to the power consumed by the IT equipment itself: the racks, servers, network equipment and other essential components of the datacentre. A PUE rating of 1.1 means that cooling, lighting and other support infrastructure add only 10 percent on top of the power that actually runs the IT equipment. A PUE of 1.1, therefore, is pretty impressive, and it makes Capgemini's centre one of the most efficient in this collection.
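The arithmetic behind the metric can be sketched in a few lines (the figures below are illustrative, not measurements from any of the facilities in this round-up):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power.

    1.0 would be a (theoretical) facility with zero cooling/lighting overhead;
    real-world values in this article range from about 1.1 to 1.65.
    """
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,100kW in total to run a 1,000kW IT load:
print(pue(1100, 1000))  # 1.1 -- support systems add 10% on top of the IT load
```

Note that the overhead is expressed relative to the IT load, not the total: at a PUE of 1.1, the support infrastructure accounts for roughly 9 percent of the facility's total draw.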
The Las Vegas Sands Corporation runs two casino resorts in the gambling mecca, the Venetian and the Palazzo.
The two establishments run off the same infrastructure, which uses 300 servers to support more than 11,000 suites and rooms, 3,000 slot machines and 200 gambling tables, and to run nine websites.
MORE DETAILS: Inside a Las Vegas casino resort datacentre
The bulk of the Venetian and Palazzo's core systems run on six IBM iSeries servers (pictured), formerly known as AS/400s.
TelecityGroup's Powergate co-location facility in west London (above) is used as a spillover capacity site for data from the company's five Docklands-based and two City-based datacentres.
The facility has a PUE rating of around 1.65. As a co-location provider, Telecity cannot choose the hardware its customers install, but it does have some control over the facility's energy efficiency.
It offers varying levels of security to companies, ranging from open racks (left) to cages (centre back) to cages with biometric and/or password protection (right).
TAKE THE TOUR: Inside Telecity's co-location datacentre
Petroleum Geo-Services (PGS) operates its European datacentre in Weybridge, Surrey. The facility crunches seismic data gathered by a fleet of ships spread across the globe.
READ MORE: Inside an oil industry datacentre
The datacentre, dubbed a "megacentre" by the firm, opened in November 2008 and replaced a 15-year-old facility. It was designed by Keysource, a datacentre specialist contractor, and has a PUE rating of 1.148. This was achieved in part through the use of cooling techniques such as filter bags (pictured), which filter warm air and send it back through a water-cooled gate for reuse.
HP's latest datacentre is built on 13.4ha of land in Western Sydney. Currently under construction, it has been designed from the ground up as three separate cells that are able to operate independently of each other.
Each cell provides a space of 2000m2 for racks in a hot-aisle configuration, and an additional 500m2 of raised floor space.
The majority of racks in the first cell sit on a concrete slab (pictured). Hot air is exhausted into the hot aisles.
READ MORE: HP opens Sydney datacentre: photos
Rackspace's Slough-based datacentre houses the hardware for its UK cloud, along with the other servers rented by its customers. It has 1,600 racks in place, of which 120 support its cloud.
The datacentre, in operation since June 2008, was in the process of being expanded when ZDNet visited in 2011, with the company adding a further data hall. At the same time, Rackspace was bringing in new cooling systems to increase the efficiency of the site and cut its power costs.
Rackspace's server hardware is predominantly supplied by Dell. It operates a multi-vendor networking approach: Cisco is the predominant provider of switching technology, while Juniper Networks supplies backbone services and Brocade provides equipment for load balancing.
TAKE THE TOUR: Inside Rackspace's UK cloud datacentre
Perhaps the datacentre with the most impressive home of all is housed at the Barcelona Supercomputing Centre.
Located amid the colonnades and Romanesque arches of the Torre Girona chapel, MareNostrum is one of the fastest supercomputers in the world.
No longer a place of worship, today the chapel is the site of supercomputing research into computer, Earth and life sciences. The machine has 10,240 IBM PowerPC 970MP processors that have a combined peak performance of 94.21 teraflops.
Of course, not all datacentres are located on dry land. The Celebrity Equinox cruise ship is 17 decks high and 317m long, and has to offer some serious IT infrastructure to meet the tech needs of up to 2,852 passengers.
For a start, Celebrity Equinox, built in 2009 for Celebrity Cruises at the Meyer Werft shipyard in the German North Sea port of Papenburg, has three datacentres, 1,600 Mac Minis, 1,350 Apple TVs, 967 hotspots and a high-performance network. Ensuring all that equipment works is the job of an IT department of seven, run by infrastructure and operations manager Marc de Lange.
Pictured is the back-end of the cabin TV and entertainment system, which is housed in the third datacentre, and consists of Apple's now-discontinued Xserve hardware.
ASG Group's datacentre and cloud computing facility in Perth, Australia opened in September 2011. The company claims it can withstand a one-in-100-year flood event, and it features two separate raised data halls totalling 550m2. It has a PUE of 1.5.
Above, power to the racks and cages is supplied through under-floor cabling trays, while data cabling is kept separate in overhead trays.
TAKE THE TOUR: Inside ASG's Perth datacentre: photos
Some datacentres are pushing the boundaries of cooling tech, for instance by dipping IT systems in liquid. Green Revolution Cooling says its dielectric coolant provides the most efficient cooling and lowest cost per watt in the industry, reducing total energy consumption by 95 percent.
Pictured are two quads, each with eight racks and two water modules, part of the CGGVeritas installation in Houston, Texas, which boasts 24 racks with 600kW capacity.
IBM's 'green datacentre' in Poughkeepsie, New York has been designed as a showroom of 'best practices' for datacentre design. The facility, which performs workloads for IBM and some of its customers, has an air-cooled side and a water-cooled side.
It pumps in water from the nearby Hudson River to cool its racks and employs energy-saving techniques such as rear-door heat exchangers. Tidy cable management and heat-mapping help bring its PUE down to about 1.27.
TAKE THE TOUR: IBM's recipe for a happy datacentre, in pictures
General Electric's Adrian Shankln, Global Data Center Manager (above), shows off one of the racks of new high density servers in GE's $48m state-of-the-art datacentre in Louisville, Kentucky.
The facility is one of the first in the world with LEED Platinum certification. LEED stands for Leadership in Energy and Environmental Design and is awarded by the US Green Building Council for projects that go above and beyond standard building codes to create sustainable, energy-efficient buildings. It's tough to get the basic LEED certification, and only six percent of all LEED buildings achieve the Platinum certification.
The site has a rich history: in 1954, the Louisville GE complex became home to the first UNIVAC computer deployed in a private business (before that, all computers were part of government projects).
Colocation specialist NextDC opened the doors to its shiny new datacentre in Melbourne, Australia in July.
The 17,500-square-metre facility delivers a 12MW ICT load. It has six data halls within a large concrete bunker, and a PUE rating of 1.35.
Pictured above is the hot-aisle rack containment in Data Hall 2.
READ MORE: NextDC's Melbourne M1 datacentre: photos
Pictured above is the BladeRoom datacentre factory in Mitcheldean, Gloucestershire, UK.
BladeRoom has been in the business of building and selling datacentres for five years, putting to work its expertise gained over 20 years of making self-contained facilities for the healthcare and food sectors. Its containers have helped Capgemini achieve a high level of efficiency with its Merlin datacentre in Swindon.
It takes a day to mate the floor section of a module with the ceiling section via struts and to panel its floor with wood. Once the floor and ceiling have been combined (pictured, left), the module is panelled (right) and then modified internally to conform to the design specifications of the buyer.
Installed BladeRoom modules have reported PUE ratings of between 1.13 and 1.34, the company told ZDNet when we visited in 2011.
GALLERY: Inside a datacentre factory
Above: a Colt modular datacentre, part of London 3, a much larger legacy Colt site in Welwyn Garden City, UK. The modules are stacked on top of one another inside Colt's 100,000-square-foot warehouse (pictured).
Colt operates 19 datacentres across Europe providing colocation and managed infrastructure services to a variety of businesses. With over 1,000 racks and an aggregate power drawdown of 33MVA, London 3 is one of Colt's most important datacentres. It represents six percent of the co-location provider's datacentre capacity in Europe, the Middle East and Africa. For this reason, Colt has chosen the site as a flagship for its new modular hall design for datacentres.
Colt manufactures and tests the modules before shipping them to the customer site.
Iceland, with its abundant geothermal resources, is aiming to become a destination for low-cost datacentres, and colocation specialist Verne Global is one of the first to set up a facility in the country.
The company has shipped a Colt datacentre module to Iceland and installed it at a former NATO military base. The module consumes around 1.5MW of power. The site has a substation that can supply up to 60MW, and the company has secured guaranteed low-cost electricity from Icelandic utility Landsvirkjun for the next 20 years.
State-owned Landsvirkjun is able to provide Verne Global with 100-percent 'green' electricity, as it generates power from renewable hydroelectric and geothermal sources native to Iceland.
Colt's modules are built to an exoskeletal design where their power distribution (pictured), cooling and fire-suppression systems are placed on the outside of the module. This means they can be maintained without having to have lots of people traipsing in and out of the server rooms.
READ MORE: Inside an Icelandic datacentre