Saturday morning at the Singularity Summit at Stanford University. All 12 panelists for the day are seated in order of their scheduled presentations, with an audience of at least a thousand in Memorial Auditorium on campus. Very orderly, and probably not very comfortable for the panelists whose presentations are hours away.
See the image gallery for a closer look at the event's participants.
If you aren’t familiar with the concept of singularity, here is the elevator pitch:
Sometime in the next few years or decades, humanity will become capable of surpassing the upper limit on intelligence that has held since the rise of the human species. We will become capable of technologically creating smarter-than-human intelligence, perhaps through enhancement of the human brain, direct links between computers and the brain, or Artificial Intelligence. This event is called the "Singularity" by analogy with the singularity at the center of a black hole - just as our current model of physics breaks down when it attempts to describe the center of a black hole, our model of the future breaks down once the future contains smarter-than-human minds. Since technology is the product of cognition, the Singularity is an effect that snowballs once it occurs - the first smart minds can create smarter minds, and smarter minds can produce still smarter minds.—Singularity Institute for Artificial Intelligence
The first speaker was Ray Kurzweil (pictured below), the leading proponent of the Singularity, who reprised his recent 672-page book, The Singularity Is Near: When Humans Transcend Biology. He whizzed through the charts from the book, showing how the law of accelerating returns is leading to the transformation of humanity. Kurzweil has concluded that intelligence will become increasingly nonbiological and increase by the trillions. He writes, "In this new world, there will be no clear distinction between human and machine, real reality and virtual reality. We will be able to assume different bodies and take on a range of personae at will. In practical terms, human aging and illness will be reversed; pollution will be stopped; world hunger and poverty will be solved. Nanotechnology will make it possible to create virtually any physical product using inexpensive information processes and will ultimately turn even death into a soluble problem."
Here's Kurzweil's take on the impact of accelerating returns:
An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense "intuitive linear" view. So we won't experience 100 years of progress in the 21st century -- it will be more like 20,000 years of progress (at today's rate). The "returns," such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth. Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light.
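Kurzweil's headline figure can be roughly reproduced with back-of-envelope arithmetic. Here is a minimal sketch, assuming (as Kurzweil does elsewhere) that the rate of progress doubles every ten years; the doubling time and step count are my assumptions, and different choices move the total toward or away from his 20,000 figure:

```python
def equivalent_years(century=100, doubling_years=10, steps=10_000):
    """Total progress over a century, measured in "years of progress
    at today's rate," when the rate itself doubles every decade."""
    dt = century / steps
    # Left Riemann sum of the accelerating rate 2**(t / doubling_years)
    return sum(2 ** (i * dt / doubling_years) * dt for i in range(steps))

print(round(equivalent_years()))  # on the order of 15,000 "today-rate" years
```

Summing an exponentially growing rate over a century yields roughly 15,000 "years of progress at today's rate" under these assumptions, the same order of magnitude as the 20,000 Kurzweil cites.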
By reverse engineering the brain and leveraging pattern recognition, Kurzweil expects to develop artificial intelligence far beyond the human mind within a few decades. "The bulk of human intelligence is based on pattern recognition...it's the quintessential example of self-organization," Kurzweil said. He gave an example of pattern recognition applied to large databases, without symbolic rules, to discover real-time language translation, which he expects to be available in cell phones in the next few years.
Reverse engineering is not thoughtlessly porting brain software onto a computational substrate, Kurzweil said, but getting 'hints' from reverse engineering. He said the brain's genome, which describes its design, could be compressed to about 20 megabytes of data. "It's a level of complexity we can handle," Kurzweil said. "The cerebellum has trillions of incredibly tangled bundles, but only tens of thousands of bytes in the genome."
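The 20-megabyte figure is easier to evaluate with the raw numbers in hand. A quick sketch of the arithmetic; the genome size is the standard ~3 billion base pairs, while the compression step is Kurzweil's claim, not something the code verifies:

```python
# Back-of-envelope for the genome-size claim. The human genome has
# roughly 3 billion base pairs, and each of the four bases (A, C, G, T)
# fits in 2 bits.
base_pairs = 3_000_000_000
raw_bytes = base_pairs * 2 // 8   # 2 bits per base pair, 8 bits per byte
print(raw_bytes // 1_000_000)     # 750 (MB, uncompressed)

# Kurzweil's ~20 MB figure rests on his claim that the genome's heavy
# repetition allows lossless compression by well over an order of magnitude.
```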
He gave an example of reverse engineering the auditory cortex to derive principles of operation that can be expressed as mathematics and simulated. He demonstrated a reader for blind people that took advantage of speech and vision research, distinguishing between cats and dogs and reading from his book. Biotech is also key to Kurzweil's vision. He cited efforts to create artificial blood cells, 'respirocytes,' by the late 2020s that would allow people to sit at the bottom of a swimming pool for an hour or sprint for 15 minutes without getting winded. By 2020, you should be able to have the power of the human brain in a personal computer for $1,000.
Kurzweil acknowledged that the Singularity could lead to an unappealing or cataclysmic future, but he believes his vision will have a soft landing. Treating the technologies as too dangerous to pursue, he argued, would require a totalitarian society, deprive people of the benefits of technological innovation, and drive development underground. In his view, narrow relinquishment of dangerous information, combined with investment in defenses, is the more likely, and more hopeful, outcome.
Kurzweil and Hofstadter
Douglas Hofstadter followed Kurzweil, offering his critique of the Singularity. Hofstadter, professor of cognitive science and computer science, adjunct professor of history and philosophy of science, philosophy, comparative literature, and psychology at Indiana University, and author of Gödel, Escher, Bach: An Eternal Golden Braid, doesn't buy into the whole Singularity vision.
He raised the 'human' concern about our destiny of uploading ourselves into cyberspace, becoming software entities inside computing hardware. "If that's the case, how will the entire world, the environment in which we live, be modeled?" he asked. "What does it mean for humans to survive in cyberspace, and what is the core of a person? It's not clear what a human being would be in such an environment."
Hofstadter said he asked many of his friends, "highly informed intellectual people," and their reactions to the Singularity ranged from "it's nutty" to "it's scary" to "I don't know." It could be reasonable or probable, but none of the people he queried had read the book. "You get the feeling the scientific world is not taking this seriously. I don't see serious discussions among physicists when they get together, and most are skeptical," he said.
Hofstadter said he was less skeptical than those he discussed the topic with, but argued that the ideas in the book are marred by blurring with too much science fiction, calling it "wild beyond any speculation I am willing to accept."
"I see a large number of things that are partially true, blurry," Hofstadter continued. "I can't put a finger on where it's wrong, but when you multiply them together, you get down to a small number...maybe 1 in 1,000 of what Ray is talking about taking place."
"When listening to Ray, I feel like I am listening to one side of a divorce...I would like to hear serious scientists giving it a serious response. It's all to Ray's credit...he raised important issues. We are about to be transformed in incredible ways, and we have to take these ideas seriously."
Hofstadter illustrated his points with some of his own cartoons.
These are big ideas, and so far in this conference there hasn't been any further discussion or debate to bring different viewpoints on the Singularity into focus. My own take is that capturing the mechanisms of the human brain is inevitable. The question is whether those mechanisms are enough to replicate the range of human behavior, and how that man-machine relationship will play out.
As my friend futurist Paul Saffo said, "If we have superintelligent robots, the good news is that they will view us as pets; the bad news is that they will view us as food."
Kurzweil's response came near the end of the day long event...