
A peek inside NextDC’s S2 data centre

NextDC let ZDNet inside its second of three facilities in Sydney. Here’s a look at the 30MW facility located in Macquarie Park, 13 kilometres north-west of the Sydney CBD.

nextdc-pic-01.jpg
1 of 11 Asha Barbaschow/ZDNet

Welcome to S2

NextDC opened its Sydney S2 data centre doors to ZDNet, offering a glimpse at the company's ninth facility in Australia.

S2, as the name suggests, is NextDC's second facility in Sydney, based in Macquarie Park, 13 kilometres north-west of the Sydney CBD. S2 is 700 metres away from S1. S3, which is on its way, will be located in Gore Hill, 5 kilometres down the road from S1 and S2.

S1, which opened in 2013, boasts a 16MW total load and took five and a half years to fill. S2's total load will be 30MW, and NextDC has already sold more than 20MW in less than two years.

S3's initial IT load is expected to be 12MW, with total capacity of 80MW expected to be available by the first half of 2022.

Here's a look inside S2.

nextdc-pic-02.jpg
2 of 11 Asha Barbaschow/ZDNet

Still nearing completion

S2 went live in May last year, but the company was still building the walls, the floors, and everything else around its first live data hall.

The facility achieved functional completion in the first week of August 2020 and is mostly complete, with a few bits of work still going on. There are also not many people around, but that can be attributed to COVID-19.

Work on S3 has already started, with NextDC head of channels Steve Martin saying the grounds have been cleared and piers are being dug, with construction to commence "soon".

nextdc-pic-03.jpg
3 of 11 Asha Barbaschow/ZDNet

A bit of background on S2

This is a model of the building, the easiest way to explain the engineering behind the facility.

In simple terms, it is a building of two halves: The eastern half is full of infrastructure -- the plant and equipment that power the facility and keep it up and running; the western half is all data halls -- customer racks.

"What's interesting about this, it's a bit like a bee's nest, a whole bunch of consistent holes in the building, that's because we pre-built all of our infrastructure, all of our rotary UPS infrastructure, all of our diesel generators, our electrical infrastructure," Martin explained.

"We built them on skids and simply slid them into the building -- it was a plug and play style of expansion with the building."

In most of NextDC's previous buildings, the diesels were rolled in and provisioned inside the building, and the electrical equipment was brought in and installed in a similar fashion.

"We actually did something similar in Brisbane in B2, but that infrastructure was actually on the roof of the building rather than at the side, and we also did something similar in P2 in Perth. But this is the first time we've actually installed the infrastructure from the side of the building," NextDC chief sales officer Adam Scully added.

The idea is that as demand on the facility increases, the company can slot in more supporting infrastructure as required.

nextdc-pic-04.jpg
4 of 11 NextDC

There’s even a customer chillout area

"One of the beginning DNA components of NextDC is our strong focus on the people that come to our data centres," Martin said.

To that end, NextDC provides customers with a "chillout" space, as well as meeting space for customers to use.

There's even a vending machine boasting data centre spare parts.

"When you're onsite and you need to get a patch cable or some rack nuts, cable ties, whatever it might be, you can grab what you need straight out of the vending machine," Martin said.

"We try to think of all of these details to make the human experience a bit more enjoyable."

The massage chair, however, is taking a rest during COVID-19.

nextdc-pic-05.jpg
5 of 11 NextDC

An upping of security

"We build our data centres so that you don't ever have to interact with us if you don't want to," Martin joked.

From a security point of view, customers and visitors must carry an access card at all times, with fingerprint-protected access defining each individual's authorised path through the facility.

After swiping your card and having your fingerprint approved, you enter a capsule before you can get into the fun stuff.

Each data hall aisle has a swipe point and that defines which rack/s you can access.

Access can additionally be controlled through the NextDC 1DC app.

There are also over 500 cameras monitoring the entire facility, which is open 24/7.

nextdc-pic-06.jpg
6 of 11 NextDC

Interconnect rooms

This is a photo of the exterior only, owing to the sensitive nature of what's inside these rooms. There are four interconnect rooms inside S2, which is where NextDC's carrier and telecommunications partners bring their networks into the building.

"We have lots of networks come in, all of the major carriers and many of the minor carriers will run cabling infrastructure into here," Martin said.

The interconnect rooms are connected into different pits in the street for customers to get that "redundancy and diversity they require".

Martin said many of NextDC's larger customers take both path diversity and often carrier diversity.

nextdc-pic-07.jpg
7 of 11 NextDC

Powering the S2 beast

There's a lot going on inside an electrical plant room, but the centrepiece is a rotary flywheel. It weighs about 3 tonnes and spins at 3,000 revolutions per minute, storing enough kinetic energy to power S2 for 15 seconds.

"If there is a mains power outage, the mains power comes through this flywheel, spins the flywheel, and then delivers power to the whole building," Martin explained.

"If the mains power is no longer spinning that flywheel inside the switch room, it detects there is no mains power coming in, so it sends a signal to our diesel generator, and the diesel is a hot start diesel, so it means that we can turn it on and be flat to the floor delivering power in about 5 seconds."

There are 24 flywheels and 24 diesel generators in S2 at the moment.
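As a rough sanity check of those figures, a back-of-envelope calculation works out the kinetic energy a flywheel of that mass and speed would hold. Note the flywheel radius and the per-unit share of S2's load are illustrative assumptions not given in the article, and a solid-disc approximation is used.

```python
import math

# Back-of-envelope check of the rotary UPS figures quoted above.
# Mass (~3 tonnes) and speed (3,000 rpm) come from the article;
# the radius and per-unit load are assumptions for illustration.

mass_kg = 3000.0   # ~3 tonnes (article)
rpm = 3000.0       # 3,000 revolutions per minute (article)
radius_m = 1.0     # assumed flywheel radius

omega = rpm * 2 * math.pi / 60           # angular speed in rad/s
inertia = 0.5 * mass_kg * radius_m ** 2  # solid-disc approximation, I = (1/2) m r^2
energy_j = 0.5 * inertia * omega ** 2    # stored kinetic energy, E = (1/2) I w^2

# S2's 30MW total load spread across 24 flywheels is about 1.25MW per unit.
per_unit_load_w = 30e6 / 24
ride_through_s = energy_j / per_unit_load_w

print(f"Stored energy per flywheel: {energy_j / 1e6:.0f} MJ")
print(f"Ride-through at 1.25 MW: {ride_through_s:.0f} seconds")
```

Under these assumptions each flywheel stores on the order of 70-80 MJ, comfortably more than 15 seconds of power at full per-unit load, which is plausible given that only part of the stored energy can be drawn down before the flywheel slows too far to hold the output frequency.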

NextDC classes S2 as a "tier 4" facility, the only one in Sydney and one of four in Australia -- the other three are B2 in Brisbane, M2 in Melbourne, and P2 in Perth.

"Our generation one data centres and a number of other data centres around the country are tier 3 or tier 3-like, meaning they are concurrently maintainable; this means you can pull down any piece of infrastructure and do maintenance on it, and still deliver 100% uptime," Martin said.

"Tier 4 we are fault tolerant -- I can have a disaster with any infrastructure … we build every single piece of infrastructure in its own IR-rated room."

nextdc-pic-08.jpg
8 of 11 Asha Barbaschow/ZDNet

In the data halls

The idea, Martin said, is to make it easy for customers to deploy at any scale they wish.

There are quarter racks and cages available, and everything in between.

"Customers can just buy a rack and put their kit in it, ready to go. Very quick, very simple, very easy process," he said.

"For larger deployments or compliance requirements, they might take something like [a cage] and that cage is a secure environment dedicated to that customer."

Cages are built to the specific size that a customer requires.

One customer takes up three-quarters of an S2 data hall.

nextdc-pic-09.jpg
9 of 11 Asha Barbaschow/ZDNet

Some empty racks

One of the rack examples offered to NextDC customers.

nextdc-pic-10.jpg
10 of 11 Asha Barbaschow/ZDNet

The cooler part of the facility

"We use hot aisle containment, that means we contain the hot air and we extract it out of the back of the racks," Martin said. "It goes up through the roof and into the computer room air conditioning systems; the hot air gets cooled, and the cold air blows back into the data hall through the vents in the wall."

It is audibly louder where there's more compute.

The blue indicates the front of the racks, with a "room" behind each rack dedicated to sucking out the heat.

"The cooling all happens through the front, the cold air is going in through the front, sucking through the fans in the individual equipment, and blowing hot air through the back," Martin told ZDNet.

"What we do with the containment is keep the cold air separate from the hot air; if we didn't, that cold air would be hot by the time it reached the racks at the end of those aisles."

NextDC has already gone carbon neutral for its own IT load, but there's more on the horizon where "green" initiatives and sustainability are concerned, Martin said.

nextdc-pic-11.jpg
11 of 11 NextDC

From S3 to S4 and beyond

Scully said NextDC likes to do everything in-house, boasting that its own team of engineers is charged with designing new facilities and upgrading existing ones.

Although work on S3 has only just begun, S4 is on the drawing board -- as are many others -- with a focus on meeting customer needs.

"We will carry on tier 4 as our standard into generation three, but we like to make sure that we're focused on what customer needs are, so through our evolution we've seen customer rack density increase," Scully said.

He said when NextDC started, average customer rack density was about 2-3kW; with "generation two", it's about 6-10kW. For "generation three", the compute needs of customers are 10kW and beyond in a single rack.

Martin said you can expect high-density data centres from NextDC, so the company can accommodate a large amount of compute in a single rack.

"We're also looking at the growth in public cloud and hyperscale, so they have a very different approach to their needs in a data centre," Martin added.

The council planning process after land is acquired is actually the longest part.
