The amount of data that must be collected and processed to drive meaningful, real-time smart grid applications is nothing short of daunting. That's a reality that big-data software company Versant is eager to tackle.
The developer is already at the center of several academic data-analysis projects related to global-warming research. More recently, it joined the IntelliGrid Research Program hosted by the Electric Power Research Institute (EPRI), with an eye toward helping address the smart grid's data management challenges.
That problem is twofold: utilities must build a holistic view of consumption trends across traditional generation sources while also keeping tabs on distributed sources that rely on renewable energy, such as solar or wind generation.
"Utilities are poised for a data management revolution," said Dirk Bartels, vice president of strategy and marketing for Versant, in a statement about the EPRI relationship. "The smart grid will transform utilities' operational and information models to resemble those of modern telecommunications companies with large interconnected networks."
Those models will require utilities to rethink how they approach reliability, speed, accessibility and security. That will in turn require data analytics technologies that go far beyond traditional relational databases, according to Versant. The company believes the complexities of the smart grid will be akin to those of complex trading systems or financial transaction processing.
When I spoke recently with Robert Greene, Versant's vice president of technology, he said that some companies, such as smart building technology company Echelon, already use Versant processing algorithms to help analyze power consumption trends and other energy efficiency metrics.
"These are extreme data management scenarios," Greene said.
It actually makes sense that Versant would gravitate to the smart grid side of the climate-related data management game: the company has also been involved in a number of real-time data analysis projects focused on understanding weather dynamics and their relationship to other variables, such as metrics for the Greenland ice sheet.
For example, Versant is behind a project at the National Snow and Ice Data Center (NSIDC) at the University of Colorado.
The idea there is to correlate image data with other known data sets over time to predict what impact changes will have on things like water supply or gas emissions. The database being used for the NSIDC project includes more than 10 billion persistent objects, which is a pretty mind-boggling number. Over time, the database will be made publicly available so that academics and the general public will be able to run queries against it, Greene said.
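To make the idea of correlating time-indexed data sets a bit more concrete, here is a minimal, hypothetical sketch. The numbers and variable names are entirely made up for illustration (Versant's actual NSIDC database and schemas are not described in detail); the point is simply how two yearly series, say an ice-extent metric and a downstream water-supply metric, can be lined up by year and tested for correlation.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Fabricated example values keyed by year -- illustration only.
ice_extent = {2000: 14.2, 2001: 13.9, 2002: 13.5, 2003: 13.6, 2004: 13.1}
water_supply = {2000: 102.0, 2001: 100.5, 2002: 97.8, 2003: 98.4, 2004: 95.1}

# Align the two series on their shared time index before correlating.
years = sorted(ice_extent)
r = pearson([ice_extent[y] for y in years],
            [water_supply[y] for y in years])
print(f"correlation over {len(years)} years: r = {r:.3f}")
```

At the scale Greene describes, the alignment and aggregation would of course happen inside the database rather than in application code, but the underlying question -- how strongly does one measured series track another over time -- is the same.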
"We have been putting satellites into orbit from way back in the 1970s, but we really didn't have a lot of places to process this data," he said. "We stored these images in the archives. We can look at what the polar ice caps looked like. But the task of being able to slice, dice and query is very, very difficult."
In the future, you're likely to see Versant involved in projects that have to do with predicting the possible impact of weather -- such as the business and financial impact of extreme weather scenarios, Greene said.
"We can study cloud formations and how they move over time, predicting where you could have the greatest impact from a storm that is coming through," he said. "You could, potentially, guard against energy outages with this sort of information. That is where we distinguish ourselves."
Who knows -- maybe so-called big data analysis technologies will answer the climate debate question once and for all. Personally, I think the idea of finding a way to minimize the human and economic impact of catastrophes related to extreme weather has far more appeal.