Never mind trying to tame "big data" -- can it even be measured? How big is the big data market, anyway? Deloitte is attempting to do just that.
The first question is: what, exactly, is "big data"? In a recent interview (video posted below), Duncan Stuart, director of research for technology, media and telecommunications (TMT) at Deloitte Canada, defined it as 5 petabytes or more.
Many in the industry define big data not just by volume, but also by velocity and variety. Still, 5 PB is a good, simple threshold for current measurements. It's also a moving target -- bear in mind that 5 PB may be what you find in a tablet computer three years from now.
Such multi-petabyte data stores are likely to proliferate. In a survey I helped conduct last fall as part of my work with Unisphere Research/Information Today Inc., nine percent of the participating companies reported data stores exceeding 1 petabyte. For perspective, a petabyte is 1,000 times bigger than the 1-terabyte databases that made news just a decade ago.
In its report, Deloitte spells out the challenges of sizing the big data market -- definitions of big data vary, adoption of big data technologies is still in its early stages, and most companies pursuing big data do not disclose their spending.
Nevertheless, Deloitte pegs the size of the big data market at about $1.3-$1.5 billion in 2012. The consultancy also predicts that this year, we'll see big data experience accelerating growth and market penetration:
"As recently as 2009 there were only a handful of big data projects and total industry revenues were under $100 million. By the end of 2012 more than 90 percent of the Fortune 500 will likely have at least some big data initiatives under way."
But the industry is still in its infancy, Deloitte cautions. "Big data in 2012 will likely be dominated by pilot projects; there will probably be fewer than 50 full-scale big data projects (10 PBs and above) worldwide."
There are compelling reasons for companies to pursue big data. "Big data can see through time; big data basically allows you to see everything all at once, and in much finer detail," says Stuart. "Instead of looking at my customer's behavior once a month, I can look at it every minute of every day. That kind of insight is very, very powerful. It allows me to serve my customer better." Data that is "either very large or very fast or both," he adds, "requires the big data toolset."
Challenges include the fact that solutions are still maturing -- "software is still being written," says Stuart. A looming skills shortage may also make big data projects difficult to bring to reality: an estimated 140,000 to 190,000 skilled big data professionals will be needed in the US alone over the next five years.
Big data will also require businesses to align workflows, processes and incentives to get the most out of it. In addition, Stuart says, enterprises should not concentrate on big data at the expense of 'current data,' or business-as-usual information. "There is still a lot of value left to be extracted from the information inside their traditional databases," he says. "Can I solve this using traditional relational database tools or traditional BI tools? Use the right tools where the need is."
The rise of big data also has the full attention of the venture capital community, Deloitte notes in its report. "Big data companies are attracting funding rounds of over $50 million, big data venture funds are being created, and large existing software players are validating the markets by partnering with or acquiring outright early stage leaders in the space."
(Illustration by Joe McKendrick.)
This post was originally published on Smartplanet.com