SUWANEE, GA. — Situated along a corridor of datacenters near Atlanta, chip-making powerhouse AMD boasts a new facility that it hopes will transform its business by reducing costs and increasing efficiency — cutting its global datacenter operations from 18 facilities to just two.
While it may not scream excitement from the outside, AMD's Atlanta facility will eventually become the company's sole U.S. datacenter, with the other in Cyberjaya, Malaysia.
Disclosure: Zack Whittaker travelled as a guest of AMD. No agreements or non-disclosures were signed.
The building itself is relatively unremarkable. But AMD isn't splashing its cash on decorative furniture and plush office space. It's investing vast sums into the new 150,000 sq. ft. datacenter in order to save as much as $8.5 million annually. AMD has an 11-year lease on the property, which runs through the end of the current decade.
(From left-to-right: Andy Bynum, Corporate Vice President, Global Infrastructure & Operations; Jake Dominguez, Chief Information Officer; Lamar Washington, Critical Environment Manager)
There is tight access control at the Atlanta datacenter, including wireless card readers to control access to various sectors of the building. Some areas are more restricted than others, including the main networking room — where the datacenter receives its 10 Gbps fiber line.
Just a handful of employees work at the Atlanta datacenter — 21 on-site staff, to be exact — who share an open office space. The premises use only one-quarter of the available space so far, according to executives. But the savings will fund further investment in the coming months and years.
In the monitoring room, Lamar Washington, the datacenter's manager, explains the screens on desks and the wall that keep an eye on the vital functions of the facility. Through the dial displays, teams can monitor power consumption, cooling, and other functions, which, as you might expect, need to stay in the green or yellow; otherwise something clearly isn't working.
Here he is explaining that a good portion of the facility's monitoring focuses on temperature. Atlanta, during the summer, is anything but cool. During the late-fall, winter, and spring months, the datacenter can run in economizer mode, which uses less power to cool the data racks.
Here, you can see a number of spare Dell tape drives that can store up to 3 terabytes (compressed) of data. These can be slotted into servers when drives fail or need replacing. And AMD uses technology from rivals and partners alike, including Dell and HP.
AMD executives weren't kidding — they have hundreds of spare drives in case others fail. There are dozens of boxes of drives which, according to executives, hold up to 2 petabytes in this room alone. Overall, the Atlanta facility's tape storage is approaching 5 petabytes of capacity.
Build it, overclock it, and burn it to the ground: a crucial litmus test for the various technologies and hardware components that may eventually find their way into the main datacenter. This room is about the size of an average New York City apartment — about 600-700 sq. ft. — and houses more than a dozen machines packed with hardware that run rapid-fire simulations. The room itself is significantly warmer than the others, and to maximize the stress on the hardware, it does not have air conditioning.
These computers are intended to run hot and fail, executives said. The components are yanked out and replaced once they give out, which helps determine the best technologies and hardware components for the server racks in the main part of the datacenter.
This vast space will eventually hold about 5-10 halls, segregated from each other, and packed with server racks. AMD executives said a modular concept will be used in order to expand as and when it's necessary. Right now, the warehouse-sized room is relatively bare, but it can house hundreds of servers. This is one of two main data rooms in the facility.
AMD considers the main distribution frame (MDF) room, where these cables run, a corporate secret, and photos were not permitted. In this central networking room, where the datacenter's fiber cables flow in, network appliances from HP, Cisco, and Fortinet supply firewalls and network distribution. It houses a fiber line capable of 10 Gbps speeds, but AMD declined to disclose how much of that capacity is used.
Again, security remains tight within the building, including revolving doors that allow only one person through at a time. But don't worry about the fire regulations — you can just head out through the door on the right.
The datacenter has room for 10 individual data halls. This, the "diamond" data hall, is one of two that are currently operational. This one room alone is about 1,500 sq. ft. — a fraction of the overall 6,000 sq. ft. of IT space that AMD plans to utilize.
The floors in the "diamond" data hall, like its other server rooms, are fitted with under-floor ventilation to help draw out the warm air booted out of the servers. This data hall isn't yet up to full capacity so it's not particularly warm, but the plastic sheets prevent the heat from escaping, and the warm air is channelled into the floor.
More fiber and networking cables connect the main distribution frame (MDF) room to the server racks, maximizing data flow and increasing the speed at which desktop apps run across AMD's various U.S. and global locations.
AMD executives said the company was able to condense its global datacenter operations by decommissioning 76 percent of its physical servers, and 72 percent of its virtual servers.
AMD now virtualizes more than 90 percent of its corporate datacenter needs. The company said this was not just to cut down on its own IT costs and expenditure, but also to increase efficiency, which the chipmaker wants to pass on to its customers. How? By making it faster at developing and rolling out products to the end-user by focusing on its own internal productivity. According to the executives, tasks that would normally take weeks now take just a few days thanks to the datacenter consolidation.
This display, housed in an innocuous-looking wall-mounted panel, shows how much demand and output is needed to power the various chilling systems. According to AMD executives, corporate datacenters (not limited to AMD's) across the U.S. consume about 1.5 percent of the country's power. In the next four years, that figure is expected to quadruple.
Every datacenter needs a place to relax — no, it's not that kind of room; it's surprisingly loud. But its vital function keeps the server racks cool so they can operate at maximum efficiency. Each of these rooms has three chill engines, and eventually there will be four chill rooms. There's currently just one, but AMD plans to bulk out its cooling network as it expands through stages two, three, and four, concluding in 2015.
The next room packed with servers, dubbed the "citrine" data hall, is as warm as a New York City summer in mid-afternoon — so to the vast majority, it's not pleasant to stand around in for very long.
This data hall is significantly warmer than the "diamond" data hall because it is running at capacity, and not very humid as a result of the cooling engines ticking over. Here you can see a temperature gauge on the ceiling that monitors the room. If the cooling units were turned off, the room would exceed 100F in just five minutes, Washington said.
Each hall is packed with rows of more than 200 servers — Dell and HP machines with various AMD chips. The company said hundreds of two- to 16-core processors with 256GB of memory replaced the single- and dual-core processors that were used in the Austin datacenter.
These servers, along with those in the "diamond" data hall, have been consolidated and condensed to the point where they are powerful enough to run as many as 23 million jobs per month. That's roughly 31,000 each hour.
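Those throughput figures hold up to a quick sanity check — assuming a 30-day month, with the per-hour figure in the text rounded down:

```python
# Sanity check: 23 million jobs per month, expressed per hour and per second.
JOBS_PER_MONTH = 23_000_000
HOURS_PER_MONTH = 30 * 24  # assuming a 30-day month

jobs_per_hour = JOBS_PER_MONTH / HOURS_PER_MONTH
jobs_per_second = jobs_per_hour / 3600

print(round(jobs_per_hour))    # ~31,944 — i.e. "roughly 31,000 each hour"
print(round(jobs_per_second))  # ~9 jobs every second
```

In other words, the consolidated halls are chewing through on the order of nine jobs every second, around the clock.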
Another slightly smaller warehouse at the other end of the building won't be filled with natural light for much longer. Currently used as a storage room, this space will house another batch of servers.
This room in the very back of the facility helps to recondition the warm air around the facility. It draws in fresh air from outside and conditions it, using free cooling techniques. If it's warm outside (particularly during the summer), the compressors kick into gear — and it's entirely automated.
The datacenter uses "free cooling" technology, which helps to economize the costs of running the datacenter. Low outside temperatures can help to cool the water that flows through the datacenter, in order to cool the server racks inside. This can be finely tuned to run overnight on particularly cool evenings, even during warmer months — automatically and autonomously — to lower power costs.
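The automated switching described above boils down to simple control logic: below some setpoint, outside air cools the chilled-water loop; above it, the compressors kick in. The threshold here is an illustrative assumption, not AMD's actual setpoint:

```python
# Illustrative free-cooling controller. Below the economizer setpoint,
# outside air cools the chilled-water loop; above it, the mechanical
# chillers (compressors) engage automatically.
ECONOMIZER_SETPOINT_F = 55.0  # assumed threshold, not AMD's real figure

def cooling_mode(outside_temp_f: float) -> str:
    """Pick a cooling mode based on the outside air temperature."""
    if outside_temp_f <= ECONOMIZER_SETPOINT_F:
        return "free cooling"        # low-power economizer mode
    return "mechanical cooling"      # compressors kick into gear

# A cool winter night vs. a hot Atlanta summer afternoon:
print(cooling_mode(40.0))  # free cooling
print(cooling_mode(92.0))  # mechanical cooling
```

Real economizer controllers weigh humidity and water-loop temperatures too, but the payoff is the same: every hour spent in the first branch is an hour the power-hungry compressors stay off.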
The datacenter has a row of diesel generators that can kick in at a moment's notice. With that moment taking about 10 seconds, vast uninterruptible power supply (UPS) batteries bridge the gap so that servers don't suddenly lose power or stop functioning. Each data hall has its own UPS room. AMD staff said the servers are on a closed transition switch, which means the servers (should) never suffer power downtime.
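To get a feel for what bridging that 10-second gap demands, consider a back-of-the-envelope calculation. The 1 MW load below is a hypothetical figure for illustration — the article does not state the facility's actual draw:

```python
# Back-of-the-envelope: energy the UPS must supply while generators spin up.
LOAD_KW = 1_000.0        # assumed IT load in kilowatts (hypothetical)
BRIDGE_SECONDS = 10.0    # generator start time cited in the article

# Energy = power x time, converted from kilowatt-seconds to kilowatt-hours.
energy_kwh = LOAD_KW * (BRIDGE_SECONDS / 3600)
print(f"{energy_kwh:.2f} kWh")  # about 2.78 kWh to ride through the gap
```

The total energy is modest — the hard part is delivering a megawatt of it instantaneously, which is why the UPS rooms are sized for power, not capacity.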
Uninterruptible power supply batteries sit opposite the diesel generators. According to Washington, there have been very few failures of anything — from server drives to power supplies and motors. And where there is wear and tear, more often than not it's something small that can be easily replaced, he said.
Many of these wall-mounted alarms are situated around the datacenter, which flash if there are issues with the coolant systems, or if hydrogen levels are higher than expected.
These tags are attached to long closed-linked chains that allow staff to close valves and isolate areas in the pipework above in case of leaks or to get secondary flow to the pipes. These overhead pipes run chilled water that flows throughout the facility.
These massive chillers each weigh about as much as 20 mid-sized U.S. Army tanks, and are designed to cool the water in-flow so it can be pumped around the facility and keep the server racks at a stable temperature. The entire water system can pump as much as 752 gallons of water through the facility each minute, Washington said.
AMD's datacenter has contingencies in place for almost every eventuality. The water that flows through the facility is filtered and treated to maximize overall efficiency. The blue pipes feed water from the city.
And should something catastrophic happen, the water tower in the back can pump water through the facility for as much as half an hour. Replacing the water in the tower during uptime would take a couple of hours at minimum.