That is on top of the millions of dollars already being spent on high-performance computing technology that can help analyze the scads of data being created and collected by sensors, surveillance cameras, spreadsheets, Internet analytics tools, data warehouses and all manner of other systems that collect information in digital form.
The far-reaching initiative could have implications for defense, for education, and for the pace of academic and corporate innovation, according to the Obama Administration.
The coordinated effort represents the government's attempt to make better use of the massive data sets at its disposal -- data sets that are growing by gigabytes, if not terabytes, every day. By automating the raw analysis, the hope is that scientists, security experts, law enforcement officials, first responders and healthcare professionals will be able to make better-informed decisions and predictions.
"By improving our ability to extract knowledge and insights from large and complex collections of digital data, the initiative promises to help accelerate the pace of discovery in science and engineering, strengthen our national security, and transform teaching and learning," wrote Tom Kalil, deputy director for policy at the Office of Science and Technology Policy, in a blog post describing the initiative.
The fact sheet published alongside the program outlines an astonishing array of ways in which technology for analyzing massive data sets could affect many walks of life in the United States, touching virtually every federal agency you could imagine: the Department of Defense, the Department of Homeland Security, the Department of Energy, the Department of Veterans Affairs, Health and Human Services, the Food and Drug Administration, the National Archives & Records Administration, the National Aeronautics & Space Administration, the National Institutes of Health, the National Science Foundation, the National Security Agency and the United States Geological Survey.
Yep, every single one of those agencies is investing in technology aimed at accelerating the pace of automation and innovation.
For example, the DoD, which spends $250 million annually on big data (including $60 million on new projects), views big data technology as a means of extracting better information from intelligence gathered in many different languages.
"The Department is seeking a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities and events that an analyst can observe," the fact sheet noted. Big data technology could also help humans detect anomalies that could point to cyber-espionage or real-world security threats, the DoD noted.
Programs at the Department of Energy's Office of Science include a next-generation networking research collaboration intended to keep all of this data from clogging up networks as it is analyzed and shared with academia and the business world.
The healthcare-related big data initiatives are especially profound and numerous.
For example, the Center for Medicare and Medicaid Services is developing a data warehouse based on Apache Hadoop for figuring out how to analyze and report on "accumulated" data. The Cancer Genome Atlas project at the National Cancer Institute, which is studying the molecular nature of cancer, will be working with several petabytes of raw data by 2014.
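To give a sense of what a Hadoop-style warehouse actually does with "accumulated" data, here is a toy sketch of the map/shuffle/reduce pattern that Hadoop applies across clusters of machines. The record fields (state, claim cost) are hypothetical placeholders for illustration, not the agency's actual schema, and everything here runs on one machine rather than a distributed cluster.

```python
# Toy illustration of the map/shuffle/reduce pattern a Hadoop-style
# warehouse applies at scale. Record fields are hypothetical.
from collections import defaultdict

def map_phase(records):
    """Emit (key, value) pairs -- here, claim cost keyed by state."""
    for rec in records:
        yield rec["state"], rec["claim_cost"]

def reduce_phase(pairs):
    """Group values by key and aggregate -- here, summing costs per state."""
    grouped = defaultdict(float)
    for key, value in pairs:  # grouping by key is the "shuffle" step
        grouped[key] += value
    return dict(grouped)

claims = [
    {"state": "OH", "claim_cost": 120.0},
    {"state": "TX", "claim_cost": 250.0},
    {"state": "OH", "claim_cost": 80.0},
]
totals = reduce_phase(map_phase(claims))
print(totals)  # {'OH': 200.0, 'TX': 250.0}
```

The appeal of this pattern for multi-petabyte collections is that the map and reduce steps are independent per key, so the framework can spread them across thousands of machines without changing the analysis logic.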
Imaging is another big focus for many of the healthcare-related projects, including one that seeks to make it easier for patients to control how images are shared with doctors, hospitals and other caregivers.
There are those among us who view big data projects as further evidence of a Big Brother age, in which government and businesses can gather an unsettling amount of data about individuals, to be used for various altruistic and profit-motivated purposes.
But you have to admit that something really needs to be done to get our arms around the flood of digital data that swirls around us every day. Otherwise, there really isn't much use collecting it in the first place.
(Images courtesy of the National Science Foundation. Credit for Hurricane Ike simulation: Gregory Johnson, Romy Schneider, John Cazes, Karl Schulz, Bill Barth, Frank Franks, Fuqing Zheng and Yonghui Weng. Photo of concurrent monitors, Falko Kuester)
This post was originally published on Smartplanet.com