Data centre 101

Secrecy seems to shroud the data centre arena -- all well and good for security's sake, but not so great when trying to pick a provider. RMIT IT Test Labs' Kire Terzievski pulls back the curtains to find what data centre options exist.



Behind closed doors


Contents
Security
Building management
Cabling
Managed services
Second data centre
Data centre checklist
Sidebar: HP develops smart rack
About RMIT

Most companies, for some time now, have recognised the benefits of using data centres to increase reliability, redundancies, and processing speeds.

SANs (storage area networks), NAS (network attached storage) devices, and server farms: the data centre places all of this equipment in one controlled local environment.

There are two types of data centres -- the corporate data centre and the Internet data centre. The corporate data centre is maintained from within the corporation, while the Internet data centre is typically operated by an Internet service provider (ISP).

We are going to look at what makes up a data centre and what to look out for when choosing one to host your equipment. Finding detailed information about data centre make-up can be hard, as most data centres won't make certain information public because of concerns over security.

Designing data centres is definitely a specialty area. The equipment that is housed inside is complex and there are specific requirements for heating and cooling as well as safety that must be met.

For this reason, data centres typically abide by standards when it comes to power and general safety but the rest is left to best practice.



Security

Security is of the utmost importance when looking at data centres -- you are putting an enormous amount of trust in someone else with your servers so you want to make sure they are using state-of-the-art security equipment and that they follow best practices.

Data centre security is generally handled on a physical and electronic level. This should start at the main entry to the data centre, which should be through a main reception area that is staffed 24x7. You should also make sure there is no more than one entrance and access should be by appointment only.

Customers should be required to provide photo ID at main reception. Once approved, visitors should be issued with a temporary ID pass by the security person. This pass should be handed back to the data centre staff when leaving, at which point the visitor will be signed off the premises.

The data centre reception should also be equipped with surveillance cameras that capture digital images of visitors at all times. A good data centre will also have specific procedures to be followed in order to get access and this might even entail having visitors escorted at all times.

The next line of security should be mantraps. A mantrap is a small room with two doors where a person is authenticated by a security guard, biometric system, or swipe card, and then allowed through the first door into the mantrap.

If an alarm is activated due to failed authentication, the first door locks, trapping the person. The person has to be authenticated again at this point to be able to get past the second door.

Some clever mantraps will use devices to weigh a person who enters to make sure only one person at a time is walking in (and not piggybacking someone else).
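The sequence described above -- authenticate at the outer door, trap on failure, check weight for piggybacking, authenticate again at the inner door -- can be sketched in a few lines. This is an illustrative model only; the function name, weights, and tolerance are assumptions, not taken from any real access-control system.

```python
# Hypothetical sketch of the two-door mantrap sequence described above.
# All names and figures here are illustrative assumptions.

def mantrap_cycle(first_auth_ok: bool, second_auth_ok: bool,
                  measured_kg: float, enrolled_kg: float,
                  tolerance_kg: float = 15.0) -> str:
    """Return the outcome of one pass through a two-door mantrap."""
    if not first_auth_ok:
        return "denied at outer door"
    # The person is now inside the trap; a floor scale checks that only
    # one person entered (no piggybacking).
    if abs(measured_kg - enrolled_kg) > tolerance_kg:
        return "trapped: weight mismatch, possible piggyback"
    # A second authentication is required before the inner door opens.
    if not second_auth_ok:
        return "trapped: failed second authentication"
    return "admitted through inner door"

print(mantrap_cycle(True, True, 82.0, 80.0))   # admitted through inner door
print(mantrap_cycle(True, True, 160.0, 80.0))  # trapped: weight mismatch, possible piggyback
```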

Both the mantrap and the main reception area are required to be bulletproof.

Data centres can occupy just one room of a building, or a floor or even a whole building. As you would imagine, they house racks full of servers (mainly 1RU) that are locked in cabinets. These cabinets are locked in cages which go the full height of the ceiling. Floors are laid out with customer cages spread over a number of rooms.

Typically, a person from the data centre would open the cage where your cabinets are. You don't get a key to your cage -- this is kept by the data centre. They, however, don't have a key to get into your cabinet, you get to hold onto that.

Cages should also be equipped with cameras that are recording at all times. The data centre should also keep a list of assets that are checked in and out by customers.

Physical security is just as important to your data centre as IT security.

We're not talking about a big security guard walking around a data centre holding a torch but rather things that will protect your assets from earthquake, fires, and flood -- all events that can spell disaster.



Building management

All physical hardware devices come with environmental requirements that include acceptable temperature and humidity ranges. For that reason monitoring systems have to be in place.

HVAC (the heating, ventilation, and air conditioning system) controls the ambient environment temperature, humidity, air flow, and air filtering within a data centre. The data centre is required to connect this to the fire alarm and suppression systems, so the centre can properly shut down if a fire starts.

As fire needs oxygen, the HVAC system can feed the fire and can also spread deadly smoke into all areas of the centre, making the fire alarm even more important.

Raised floors
Servers and communications equipment squeezed into small spaces can create problems with cooling. Data centres have been designed with this in mind.

Today's servers dissipate so much heat that they produce hot spots, and can ultimately fail if not cooled properly. Cooling the entire room cannot remove heat fast enough from all the racks, so data centres use raised floors.

Servers are placed in racks on raised floors with air conditioning units underneath to provide cold air directly to the front of the racks, enabling the servers to draw cold air from the front and blow it out the back. Also the rear of one rack always faces the rear of another -- this way the hot air is kept away from the front of the servers. Data centres are also monitored to keep them at a constant temperature.

Fire detection and suppression
Fire detection systems are quite sophisticated. They include early warning systems that sample the air and can pick up signs of a potential fire -- an overheating server, say -- up to two days before it ignites, and pinpoint its location.

Fire suppression systems are usually FM-200 based, which are similar to the older Halon-based systems except that they don't damage the ozone layer. These systems are specifically designed not to cause damage to computer equipment.

Water sprinklers are simpler and less expensive than FM-200 systems. These can come in different forms. Wet pipes always contain water and are discharged by temperature control sensors. The disadvantage of this is if a nozzle or pipe breaks it can cause extensive water damage.

Dry pipes, on the other hand, don't hold any water. Water is held back by a valve until a certain temperature is reached. There is, however, a delay before the water is released, as it is not allowed into the pipes until the fire alarm has sounded.

UPS/Generator
Some data centres draw power from multiple power grids run by different power companies. Power can also come from UPS battery backup units and diesel generators. Generators typically carry enough fuel to provide power in the event of a long power disruption.
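As a back-of-the-envelope illustration of what "enough fuel" means, generator runtime is simply tank capacity divided by consumption at load. The figures below are assumptions for illustration, not from any particular facility.

```python
# Rough generator runtime estimate under assumed figures.
tank_litres = 10_000       # on-site diesel storage (assumed)
burn_rate_lph = 250.0      # litres per hour at full load (assumed)

runtime_hours = tank_litres / burn_rate_lph
print(runtime_hours)  # 40.0 hours of full-load runtime
```

Under these assumptions the site could ride out a 40-hour outage before refuelling, which is why refuelling contracts matter as much as tank size.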

Connectivity
Typically, a data centre will have multiple Tier 1 connections (100Mbps) entering the building. This enables them to offer multiple levels of redundancy for your Internet connection. Some are connected to some of the world's largest Internet backbone service providers and strategically connected to major hubs to improve data speeds to the US or other parts of the world.

By having direct links into the Telstra, Optus and Vodafone networks, you not only have redundancy like we mentioned before but also the expertise and the high quality of service that large telcos can have in wide area networks.



Cabling

Cabling pathways in a data centre are found in overhead cable trays and under raised floors. The raised floors hide cables quite well and provide easy access. Data cables are mainly run in separate pathways from power cables and fire suppression systems. This applies to both standard Cat 5 cables and fibre cables, which are also run in separate pathways or ducts, mainly because fibre has different stress requirements to plain copper.

Bandwidth
Data centres will usually guarantee you a minimum level of bandwidth, while at the same time allowing you to take advantage of spare bandwidth so you can burst up to a maximum level. Customers can choose from a number of bandwidth plans, which include monthly metered bandwidth with excess bandwidth charged per MB. These plans are suitable for customers who don't expect much traffic.

Customers can also purchase burstable plans which may vary in speed. These plans would be used by customers who expect heavy traffic.
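The metered-plus-excess model described above is easy to sketch: a flat fee covers an included allowance, and anything over is billed per MB. The function name and all fee figures below are hypothetical, for illustration only; real plans vary by provider.

```python
# Sketch of a metered bandwidth plan with per-MB excess charges.
# All fee figures are hypothetical assumptions.

def monthly_bill(base_fee: float, included_gb: float,
                 excess_per_mb: float, used_gb: float) -> float:
    """Flat fee covers included_gb; anything over is billed per MB."""
    excess_mb = max(0.0, (used_gb - included_gb) * 1024)
    return base_fee + excess_mb * excess_per_mb

# Hypothetical plan: AU$200 base, 10GB included, AU$0.15 per excess MB.
# Using 12GB means 2GB (2048MB) of excess: 200 + 2048 * 0.15
print(monthly_bill(200.0, 10.0, 0.15, 12.0))
```

The steep per-MB excess rate is why heavy-traffic customers are steered toward burstable plans instead: a few GB over the allowance can easily exceed the base fee.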

Network
Another focal point of a data centre is the internal network.

Networks in data centres contain routers and switches that transfer data from your servers to the outside world. Not being able to communicate with the outside world would spell disaster for your company, so you will find data centres will offer fully redundant networks so there is no single point of failure.

Service levels
Data centres provide customers with service level agreements measured on a monthly basis, 24 hours a day, seven days a week.

Most of the time you will see data centres advertising 99.9 percent uptime, or with a few more nines after the decimal point. This information is usually found on data centre Web sites. Scheduled maintenance and outages caused by other carriers are not counted as downtime.
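Those extra nines matter more than they look. A quick calculation shows the downtime each figure allows in a 30-day month (scheduled maintenance excluded, as noted above):

```python
# Downtime budget implied by common SLA uptime figures (30-day month).
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes

for pct in (99.9, 99.99, 99.999):
    downtime = MINUTES_PER_MONTH * (1 - pct / 100)
    print(f"{pct}% uptime allows about {downtime:.2f} minutes of downtime per month")
```

At 99.9 percent that is roughly 43 minutes a month; at 99.999 percent ("five nines") it shrinks to under half a minute, which is why each extra nine commands a premium.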

Accessing your equipment
In an effort to conserve space within data centres, KVM (keyboard, video, and mouse) switches are becoming a lot more popular.

A KVM switch allows a single keyboard, monitor, and mouse to control multiple servers. Hardware such as the Command Centre from Raritan, which is new to the market, allows you to remotely monitor all your servers through one box, and you only have to remember one IP address -- that of the Command Centre.

Once connecting to the Command Centre you can view information on ports, users and devices connected to your servers.

Cost
The cost of leasing rack space can vary dramatically. It depends on the levels of security that are offered and the bandwidth that is purchased. You also have to take into consideration managed services fees. It's really best to contact the hosting facility so they can work out what's best for you. Data centres will rarely advertise cost, preferring instead to provide this information upon request.

Data centres offer two types of racks -- open racks and locked cabinets. Open racks work out to be a lot cheaper but they are less secure.

With an open rack, your rack is enclosed in a cage with other customers' equipment. The racks aren't individually locked, but are locked in the cage. When you need to access your equipment a security person would typically escort you to your rack.

With locked cabinets you get the flexibility of having your own security key to the cabinet. You can also lease a whole cage if you think you will fill it up, as well as a secondary redundant cage or a rack at a different site.

Bandwidth is usually charged per MB, often for incoming traffic only, and you tend to pay more when the data centre has access to multiple Tier 1 connections.



Managed services

Most data centres will offer managed services, providing everything from servers to communications equipment, along with skills, security consulting, and operations for UNIX/Linux, Windows, and Mac platforms. Suppliers of managed services usually offer term contracts ranging from six months to five years. They have highly qualified operators working on a 24x7 shift structure, with skills in all operating systems and in LAN and WAN systems.

Managed services suppliers can take control of your IT equipment end to end: financing the purchase of the equipment, locating it within their data centre, installing links between it and the customer (if need be), and, most importantly, providing a service level agreement (SLA).

There is quite an extensive range of managed services that are available. General offerings include:

  • Managed hardware.
  • Server configuration.
  • Managed server hardware, operating system and applications.
  • Firewall, bandwidth, antivirus, content filtering, remote communications and backups management.
  • Load balancing and clustered solutions.
  • Disaster recovery.

Security services offerings cover:
  • Firewall logging.
  • VLANs.
  • Intrusion detection systems.
  • Honey pot servers.

Internet services offered are:
  • SMS & WAP services.
  • Payment gateways (credit card clearing services).
  • SSL certificates.
  • Domain name registration, POP mail accounts.




Second data centre

Without a doubt, any serious data centre will have a second data centre used for disaster recovery, backup, and testing purposes. You have a few options here as to how you want to handle disaster recovery.

Make sure the backup site you choose is far enough from the primary site so that the same disaster that affects the primary location cannot affect your secondary data centre. There are a couple of different types of backup sites you can have, including cold sites, hot sites, and warm sites.

Cold sites are typically large warehouses or empty office buildings that have no computing facilities preinstalled. They do have standby telecommunications links that can be activated at short notice.

The advantage is that cold sites are inexpensive: you don't have to maintain any computers and you don't have telecommunications bills every month. The disadvantage is that you have to bring all your equipment over to the new site, configure it, and enable the communications links before you can resume operations.

A hot site is a fully operational site with all hardware and communications links ready to take over from the primary site in case of disaster.

The data on the primary site is also continuously replicated to the hot site. This will give you unsurpassed disaster recovery but you can imagine the cost of doing this is extremely high -- you virtually have to double your budget for hardware, software and services.

A warm site is somewhere in between hot and cold, and contains the equipment and communications links so you can quickly continue with operations. These sites have equipment that is already preconfigured, but they don't replicate any data from the primary site. Hence, the only thing you would need to do is move the data from the primary site to the warm site.

We don't have any scenario winners or an editors' choice award this time. Verifying whether data centres adhere to best practices would take a very long time, so we didn't go to the lengths of visiting and spot-checking how these data centres are run. Instead, we decided to let readers know what can be found inside these data centres and what generally denotes good practice.

However, we have created a best-practices checklist which you might want to tick off when looking for a data centre to host your equipment. A lot more could be added to the list, but we decided to include the main points so that you can at least get a shortlist together much quicker. We also asked the following data centre companies to fill out the checklist: CSC, Datacom, Equinix, Fujitsu, Global Switch, Hostway, Hostworks, IBM, InfrastruXure, Macquarie Corporate, Optus, Raritan, Telstra, Unisys, Virtual.Offis, and Web Central. Many were reluctant to fill it out, either for security reasons or because they didn't trust their counterparts in the industry to fill it out honestly. We have included those that did respond, but advise you to take it as a guide only. Reference: CISSP Study Guide (Tittel, Stewart, Chapple).

Data centre checklist

Company (columns, left to right): Equinix | Macquarie Telecom | Unisys Australia | Virtual.Offis | WebCentral
Phone: 02 8337 2000 | 02 8221 7777 | 02 9647 7777 | 02 9776 2300 | 1800 800 099
Facility specifications
Multiple physical separate connections to public power grid Yes Yes Yes Yes Yes
Continuous power supply with backup UPS systems Yes Yes Yes Yes Yes
Generators for continuous operation during long-term interruptions Yes Yes Yes Yes Yes
Conform to or exceed building structural codes (bulletproof glass, fire doors, reinforced walls, and so on) Yes Yes Yes Yes Yes
Heat and very early smoke detection alarm Yes Yes Yes Yes Yes
Flood sensors Yes Yes Yes Yes Yes
Easily removable panels in raised flooring Yes Yes Yes Yes Yes
Separate grounding systems to prevent grounding loops Yes Yes Yes Yes Yes
Physical security
Rules on badge sharing and piggyback entry Yes Yes Yes Yes Yes
Written statement of work upon sign in Yes Yes Yes Yes Yes
Entry from lobby to the data centre through mantrap turnstile Yes Yes Yes No Yes
Building access
24x7 physical security Yes Yes Yes Yes Yes
Limited number of building entrances Yes Yes Yes Yes Yes
Visitor logging procedures Yes Yes Yes Yes Yes
Card, biometric, or similar entry locks Yes Yes Yes Yes Yes
Security video monitoring Yes Yes Yes Yes Yes
Equipment
Video surveillance and motion sensors for entrances and equipment cages Yes Yes Yes Yes Yes
Locked cages with ceilings and locking cabinets Yes Yes No Yes Yes
Secure rooms available Yes Yes No Yes Yes
Operations
Database of all installed equipment Yes Yes Yes Yes Yes
Performance reporting Yes Yes Yes Yes Yes
Spare equipment on site of key network equipment Yes Yes Yes Yes Yes
Connectivity
Multiple connections to Tier 1 carriers Yes Yes Yes Yes Yes
Aggregate bandwidth sufficient to scale network for service needs Yes Yes Yes Yes Yes
Formalised SLA policies Yes Yes Yes Yes Yes
Cabling
Cable runs located under raised flooring and marked Yes Yes Yes Yes Yes
Cables physically protected via tie downs Yes Yes Yes Yes Yes
Cabling designed to Cat 6 specifications Yes Yes Yes Yes Yes
Cabling on raceways tied down Yes Yes Yes Yes Yes
Managed services
Vendor can provide servers and network equipment Yes Yes Yes Yes Yes
Vendor can manage your equipment Yes Yes Yes Yes Yes
Offsite storage available Yes Yes Yes Yes Yes
Regular scheduled security audits Yes Yes Yes Yes Yes



HP develops smart rack

Business executives and bureaucrats are salivating over the potential labour-saving benefits of radio frequency identification (RFID) technology, and soon technology workers may find reason to be enthusiastic, too.

HP is working on a smart rack that incorporates RFID systems to make data centres easier to manage. The racks, which could hit the market within the next two years, can instantly take stock of servers and alert staff to problems such as overheating.

A group of researchers here at HP Labs demonstrated a smart rack prototype this week during a media tour of the company's new RFID Demo Center.

Each shelf in the rack is equipped with an RFID reader designed to read high-frequency signals from servers with special chips storing the machine's unique ID number. By giving data centre managers the ability to take instant inventories, smart racks could eliminate labour-intensive stock checks requiring the physical inspection of each rack, says Salil Pradhan, HP Labs' chief technologist for RFID.

The next-generation server racks should also reduce the risk of lost or misplaced machines whose hard drives store important information. In addition, they could help streamline the servicing and maintenance of gear by keeping better work records for each machine. HP is working on temperature-sensitive RFID chips to help quickly spot heat problems and avoid outages -- Alorie Gilbert.

This article was first published in Technology & Business magazine.




About RMIT IT Test Labs

RMIT IT Test Labs
RMIT IT Test Labs is an independent testing institution based in Melbourne, Victoria, performing IT product testing for clients such as IBM, Coles-Myer, and a wide variety of government bodies. In the Labs' testing for T&B, they are in direct contact with the clients supplying products and the magazine is responsible for the full cost of the testing. The findings are the Labs' own -- only the specifications of the products to be tested are provided by the magazine. For more information on RMIT, please contact the Lab Manager, Steven Turvey.
