Inside an oil industry datacentre

Summary: ZDNet UK has toured an oil industry 'megacentre' to see what demands this strenuous, computing-intensive industry places on its datacentres


  • PGS aisle

    The megacentre has a power usage effectiveness (PUE) rating of 1.148, which includes the power cost of the separate mini-datacentre used to store the tapes. The main processing hall itself has a PUE of 1.127.

    Power usage effectiveness expresses the ratio between a facility's total power draw and the power used by the IT equipment itself. The closer a PUE rating gets to one, the greater the proportion of the facility's power that goes to the IT equipment and the less to the supporting infrastructure.
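    As a rough sketch, the ratio reduces to a single division; the kilowatt figures below are invented for illustration and are not PGS's metered numbers.

    ```python
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT power."""
        return total_facility_kw / it_equipment_kw

    # Illustrative only: an IT load of 1,000kW at a PUE of 1.127 implies
    # just 127kW spent on cooling and other supporting infrastructure.
    print(round(pue(total_facility_kw=1127.0, it_equipment_kw=1000.0), 3))  # 1.127
    ```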

    The PUE of 1.127 was achieved by separating, cooling and recirculating air within the datacentre, using a combination of adiabatic cooling, outside air and filtering to keep the air cool without drawing much power. The facility can be free-cooled for all but 100 hours per year.
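    To put that figure in context, 100 hours is a little over one percent of the year, as a quick back-of-the-envelope check shows:

    ```python
    HOURS_PER_YEAR = 24 * 365              # 8,760 hours
    mechanical_cooling_hours = 100         # figure quoted for the facility
    free_cooled_fraction = 1 - mechanical_cooling_hours / HOURS_PER_YEAR
    print(f"{free_cooled_fraction:.1%}")   # 98.9% of the year on free cooling
    ```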

    The rack rows face one another, with their exhaust vents opening into a central corridor that is sealed off from the rest of the datacentre. Inside the corridor, the hot air rises into a ceiling aisle and passes through to the cooling systems.

    Photo credit: Jack Clark


    Want to know more about PGS's 'lunatic fringe' computing? Read ZDNet UK's datacentre tour diary.


  • Heat aisle corridor

    Inside the heat aisle corridor, temperatures can climb as high as 49°C. This is partly because of the intensity of the jobs run by the servers, Turff said. Typical co-location facilities draw around 5kW per rack, but PGS runs at between 15kW and 20kW, due to the relative power intensity of its high-performance computing servers, he added.
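    A back-of-the-envelope comparison shows what that density difference means in practice; the 40-rack row count below is an assumption made for illustration, not a PGS figure.

    ```python
    # Compare per-rack heat load using the figures quoted in the article.
    colo_kw_per_rack = 5.0                 # typical co-location draw
    pgs_kw_per_rack = (15.0 + 20.0) / 2    # midpoint of the quoted 15-20kW range
    racks_per_row = 40                     # hypothetical row length, for scale

    print(f"Density ratio: {pgs_kw_per_rack / colo_kw_per_rack:.1f}x")  # 3.5x
    print(f"Heat per row: {pgs_kw_per_rack * racks_per_row:.0f}kW vs "
          f"{colo_kw_per_rack * racks_per_row:.0f}kW")                  # 700kW vs 200kW
    ```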

    The hot air rises and is conducted through an overhead plenum into an adjoining sequence of interconnected rooms, before being pushed back into the datacentre to be drawn through the front of the servers.

    Photo credit: Jack Clark




  • Filter bags

    After the air has left the rack cabinets, it travels across the plenum above the megacentre's ceiling. It then comes down into a room filled with filter bags (pictured), through which it is strained.

    Photo credit: Jack Clark




Topics: Datacentre Tour, Networking


About Jack Clark

Currently a reporter for ZDNet UK, I previously worked as a technology researcher and reporter for a London-based news agency.
