Reid Hoffman, co-founder of the business-oriented social networking service LinkedIn, recently said the following:
If Web 1.0 involved go search, get data and some limited interactivity, and if Web 2.0 involves real identities and real relationships, then Web 3.0 will be real identities generating massive amounts of data.
Riding on that concept, entrepreneur and Singularity University professor Vivek Wadhwa wrote at TechCrunch:
The information that Reid described is just the tip of the iceberg. We are already gathering a thousand times more data than that. The growth is exponential, and the innovation opportunities are even bigger than Silicon Valley can imagine they are.
Wadhwa goes on to describe a Web 3.0 (or "Semantic web") world, in which data on facts, personal habits and social inclinations encompass every area of our lives.
- Healthcare. Standardized, electronic health records offer instant access to a patient's medical history, as well as visibility into trends across entire populations when that data is aggregated. And with costs dropping dramatically, work on the human genome is more advanced than ever before.
- Government. The efficiency of government services and our socioeconomic situation can be better understood through the use of data on poverty, education and more.
- Security. The inexpensive ubiquity of video capture gives us worldwide visibility. (Even if we can't really detect what's actually in a video without textual tags.)
Now combine and cross-reference those datasets. "We are entering an era of crowd-sourced, data-driven, participatory, genomic-based medicine," Wadhwa writes.
Or, the "Information Age" in its strictest definition.
It's true that we can't begin to imagine what innovations will come from all this data. But in his essay describing the phenomenon, Wadhwa glosses over a major component of it: sifting through that data.
If Web 3.0 is boundless amounts of data spewing forth from real, connected entities (people, places and things), Web 4.0 -- or whatever you'd like to call the next step -- must be the development of the intelligence necessary to discern which data is of value and which is just noise.
Theoretically, you could say that every piece of data offers a valuable insight into something. That may be true, but our systems won't be robust enough to handle every meaning, every intent and every mistake.
After all, does that minor cold (date, time, location, duration, severity) I had last week really affect my overall health picture? Probably not, aside from whatever the speed with which my white blood cells did away with it says about the state of my immune system.
But unless we know what kind of germ gave me that cold, where and how I caught it, how much sleep I've been getting, and what my diet and exercise routines look like, we won't really have a significant data point. More precisely, we won't know whether it's significant or not. Perhaps one day. But not now.
So it seems to me that without the full picture -- and that effort goes on to infinity -- we'll be collecting lots and lots of less-than-valuable data, making it harder to figure out just what we want to do with it all.
As Wadhwa puts it: "This period of history has been called the Information Age because it makes available instant access to knowledge that would have been difficult or impossible to find previously. I would argue that we are way beyond this; we're at the beginning of a new era: the New Information Age."
But it's not that easy. Access to data and understanding it are two entirely different ballgames, and Wadhwa only hints at all the interesting places from which we're collecting data.
So when the semantic web is fully realized -- moving beyond text toward deciphering images, audio, video and actions natively, automatically and in real time, then weighing each data point against all the others collected -- it seems to me that the Information Age will have only just begun.
Succinctly: it's one challenge to get everyone in the same room. It's another to derive value from the exercise.
To use one of Wadhwa's examples, Mark Zuckerberg created Facebook as the first public, widely available, global, self-updated information platform centered on people. But his company is just beginning to understand the implications and applications of that platform.
(What does the world's insistence on "poking" other people say about cultural and social norms? Does it really matter, if we can target ads to those people and make money on them?)
Think about it: what would and could you do if you knew what everyone in the world was doing (and where) right now?
Illustration: Andrew M./DeviantArt
This post was originally published on Smartplanet.com