SAN FRANCISCO — As more content migrates to the cloud and more mobile devices connect to the Internet, data is accumulating far faster than storage capacity is being developed to contain it.
During a panel discussion at IBM’s offices in San Francisco on Wednesday morning, Steve Wojtowecz, vice president of storage software development at IBM, said that more than a trillion devices will be connected to the Internet by 2015.
According to a survey of 255 IT professionals that polling agency Zogby International conducted on behalf of IBM about storage spending priorities and organizational needs, 43 percent of respondents said they are concerned about managing big data.
One way IBM plans to meet storage demands is by taking an industry-specific route. So far, the company has outlined projects that focus primarily on healthcare and cosmology.
Of course, there are many other financial and sustainability motives as well. For example, at its data centers in Boulder, Colo., IBM has reduced block storage facilities by 50 percent, redirecting that money to building out the cloud and creating other business opportunities.
At its Rochester, Minn., campus, IBM has installed approximately 240,000 sensors on everything from air ducts to toilets. Data is collected from 15 percent of the sensors every 15 minutes, which works out to roughly 20 million data points a week. Not only does that provide a good deal of information that can be used to manage the campus more efficiently, it also racks up a lot of data that needs to be stored somewhere.
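Those figures are easy to sanity-check. A quick back-of-the-envelope sketch, using only the numbers reported above, lands in the low twenty millions, consistent with the "roughly 20 million" estimate:

```python
# Sanity check of the Rochester campus sensor figures reported above:
# 240,000 sensors, 15 percent of them sampled every 15 minutes.
sensors_total = 240_000
sampled = sensors_total * 15 // 100        # 36,000 sensors per collection cycle
cycles_per_week = (60 // 15) * 24 * 7      # 4 cycles/hour -> 672 cycles/week
data_points = sampled * cycles_per_week    # about 24.2 million
print(f"{data_points:,} data points per week")  # prints "24,192,000 data points per week"
```

The exact count depends on assumptions the article doesn't spell out, such as whether the same 15 percent is sampled each cycle, but the order of magnitude holds either way.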
“We need to understand how to use that data in a way that is beneficial to IBM,” Wojtowecz said. “To me this is the start of a smart building.”
Although virtually every industry will need a plan to meet growing storage demands, entertainment is a particularly striking example.
Peter Ward, former senior vice president of information operations and content licensing at Sony Pictures, explained during the panel discussion that shooting movies digitally, and even more so in 3D these days, requires storage to be factored into the budget. Many film sets, Ward said, now have data centers in trailers, some of which transfer footage back to Hollywood studios as filming proceeds.
To get a better idea of just how much storage an average Hollywood film requires, Ward pointed out that a single film, including cutting-room-floor content, can take up a petabyte of space. The average finished project, however, is about 10 to 20 terabytes.
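Put differently, the finished cut is a sliver of what gets shot. A quick sketch of the ratio, using the sizes Ward cited and assuming decimal storage units (1 PB = 1,000 TB):

```python
# Ratio of a finished film (10-20 TB) to its raw footage (1 PB), decimal units.
raw = 1000**5                   # 1 petabyte in bytes
finished_low = 10 * 1000**4     # 10 terabytes in bytes
finished_high = 20 * 1000**4    # 20 terabytes in bytes
print(f"{finished_low / raw:.0%} to {finished_high / raw:.0%}")  # prints "1% to 2%"
```

In other words, roughly 98 to 99 percent of what is shot never makes the final cut, yet all of it has to live somewhere during production.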
Although major studios, and likely many global corporations, have already figured out their storage needs or are doing so now, storage remains a problem for the vast majority of businesses.
“There are lots of tools out there,” said Andrew Reichman, a principal analyst at Forrester Research, “but what’s lacking is where to apply the tools.”
Acknowledging that even many Fortune 500 companies are struggling, Reichman explained that many businesses are sticking with their original approach, which was largely to do the same thing and just add more storage somewhere. But that doesn’t solve the problem, as data accumulates at exponential rates and older storage systems can’t scale to match.
There isn’t one right answer, Reichman said, but he added that analytics, and the infrastructure that enables these changes, need to improve faster.
This post was originally published on SmartPlanet.