
A data disaster? How enterprise customers and data center operators are prepping for IoT

It'll be harder to find data management solutions that scale. One CTO's thoughts on the uncertain road forward.
Written by Greg Nichols, Contributing Writer

The Gartner report is now doctrine. There will be 6.4 billion connected "things" by 2020. In 2016, 5.5 million new things will be connected every day.

It's no secret that that's going to pose enormous problems for data center operators, who have to deal not only with a truckload (technical term) of new data coming from enterprise customers who rely on connected devices for things like robotic warehouse automation and fleet management, but also with data that needs to be accessed and handled in unique ways.

Data centers are now having to rethink how they manage security, servers, storage, and network assets. Here's the diplomatic understatement of the century from Gartner's Joe Skorupa: "Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT."

Lay interpretation: Batten the damn hatches!

To get some ground-level insights, I got on the phone with Mike Fuhrman, CTO of Peak 10, a national IT infrastructure and cloud services provider. Peak 10 just opened a data center facility in Raleigh, NC, number 27 for the company nationwide. Mike sees big challenges and changes on the horizon with IoT. Here are his thoughts.

WAN links in data centers today are designed for "moderate" bandwidth requirements, right? What does that mean going forward?

Mike Fuhrman, CTO of Peak 10 (Photo: Matt Sweadner, Photography)

That's a huge issue. We've designed and built WAN links based on the historical usage of applications and the historical database transactions we've grown accustomed to. When we think about IoT, the amount of data and the number of transactions that have to take place once that data is transferred change the usage entirely.

And I don't know if anyone knows what that means. I think there's going to be dramatic change as adoption and usage change. We're smart enough to know change is coming, but we still don't know how big a pipe to build, or what different pipes need in terms of prioritization, classification, or further intelligence.

So a lot of uncertainty. How do you feel your way forward when the stakes are so high?

I think we'll find our solutions as we build out more deployments, as we start modeling different kinds of connected devices to bandwidth, and as we start to break things out by particular sector. So you really need different models for agriculture, for healthcare, because those customers have different data needs.

One thing I'll point out, I don't think all the data needs to go into a single data center, which is the way we think of things today. Ultimately you'll see a mix of centralization and decentralization of data. Let's take the example of a manufacturing company that's putting more robots into manufacturing operations. All of those will be network connected going forward, and they'll be processing connections and sending off telemetry data twenty-four hours a day, seven days a week. Analytics have to be done on that data, such as health and welfare diagnostics so that a manager can know if a robot arm starts to wear out before it takes down a whole line.

Well, I don't think that data needs to go back to a central place for analysis. At the edge, for example, each plant could collect and analyze a certain amount of localized data from the robots in that plant without sending it back to a central place to be analyzed. So you're going to see a mix and match of centralized and decentralized solutions.

How are data center operators like Peak 10 going to come to grips with that new paradigm?

One of the key things is an ongoing expansion of our data center footprint. As an example, our customers aren't limited to just one or two or three places where they can transfer and analyze data. We just opened our 27th data center in Raleigh, and our expansion is ongoing. So I think overall you're going to see more geographical choice in the market.

The other shift is that we're investing in the ability to do cheap storage: object-based storage. A lot of data is unstructured. We're talking about pictures, emails. These are things that need to be fed into more modern unstructured databases so analytics can be conducted on them, but they don't necessarily need to sit in high-performance, high-capacity storage arrays that cost a lot of money. We're focusing on building out solutions that are geographically redundant, scalable, and cheap, so you can take unstructured data and put it in object storage.

Traditional enterprise storage vendors are going to have to move to cheaper and faster storage devices. The old model of investing in high-performance, high-capacity arrays simply doesn't scale. Storage vendors know that. The enterprise storage providers that understand those trends, move to embrace them, and are willing to disrupt their own business model are going to be the winners. I think there will be fewer left in five years, because some are going to miss the tech swings and be on the outside, or they won't be able to disrupt their own technology base to move to new technology.

Are enterprise customers going to have to change the way they think about data storage?

Yes. I don't think every piece of data is going to be backed up the way it's backed up today. The enterprise view of backup and retention is this: I've got 30 days of retention on every piece of data backed up, I'm able to quickly retrieve that data, and it's backed up in two or three different forms of media, maybe on a spindle drive and a tape drive in a locked facility.

Well, that won't fly anymore. The volume of data is going to be too much. I think you're going to have a place to store it, sure, but backup and retrieval are going to change. We want it to be safe and redundant. That's what we're driving for. So we want to build large volumes of cheap data storage, and then introduce geographic redundancy and scale.

For some customers, too, we're seeing new challenges with compliance and regulation based on their geography or industry. I met with a customer yesterday in the agricultural business, and they have operations in over a hundred countries. Each has its own regulatory environment with regard to data. We're going to see larger customers that span industries and geographies have to deal with new complexities when they think about storing and transferring their data.
