Scientists, futurists and big thinkers increasingly worry about a key point in the future of computing: the singularity, the moment when humankind could be mentally dwarfed by its own machines and we become an inferior species.
Utopians see this as the crossover point, where consciousness ascends to a higher state. Dystopians see--well, Skynet. Either way, the singularity is a very real possibility to some very smart people. Here's everything you need to know.
The word "singularity" was first used in this tech sense in 1958 by mathematical physicist John von Neumann, in a conversation with Manhattan Project math whiz Stanislaw Ulam.
Summarizing the concept, Ulam noted the "ever accelerating progress of technology...which gives the appearance of approaching some essential singularity in the history of the race, beyond which, human affairs, as we know them, could not continue."
And consider: If tech innovation seemed blindingly fast in '58, it's done nothing but speed up since.
Singularity acolytes invariably back up their theory by citing Moore's Law. Postulated in 1965 by Intel co-founder Gordon E. Moore, the law observes that the number of transistors on a chip--a rough proxy for computing power--doubles approximately every two years.
If your brain's power were to double every two years for 50 years, you'd be way smarter by now. Possibly even superintelligent, rule-the-world smarter.
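The arithmetic behind that thought experiment is easy to check. Here's a toy calculation (an illustration of the doubling math, not a claim about real chips or brains):

```python
# Toy illustration of Moore's Law-style exponential doubling:
# something doubles every 2 years for 50 years.
years = 50
doubling_period = 2

doublings = years // doubling_period   # 25 doublings in 50 years
growth_factor = 2 ** doublings         # 2 raised to the 25th power

print(growth_factor)  # 33554432 -- roughly a 33.5-million-fold increase
```

Twenty-five doublings gets you a factor of more than 33 million, which is why exponential curves sneak up on people: the first few doublings look tame, and the last few are staggering.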
"We need to be super careful with AI. Potentially more dangerous than nukes," Elon Musk has Tweeted."Hope we're not just the biological boot loader for digital superintelligence.
"Unfortunately, that is increasingly probable."
The term "technological singularity" was popularized by mathematician and sci-fi writer Vernor Vinge.
Vinge estimated the singularity could be upon us by 2030. Yikes! Or hooray, depending.
If Vinge popularized the concept, he's not the singularity squad's biggest cheerleader.
That would be futurist Ray Kurzweil, author of The Singularity Is Near (2005), who thinks the big day could arrive as soon as 2029, and that by 2045 we will have increased our brainpower a billionfold.
At the 2012 Singularity Summit (yes, there is such a thing), Oxford research fellow Stuart Armstrong presented results from his survey of experts and laypeople predicting the date of the singularity.
Happily, the predictions were all over the map--though the vast majority fell in a range of 5 to 100 years out.
"There will be no distinction, post-Singularity, between human and machine or between physical and virtual," Kurzweil writes.
Sound crazy? Hey, he's a director of engineering at Google, so what could he know?
Advances in DNA research, digital brain modeling and nanotechnology suggest that one possible way for the singularity to arrive is through mind uploading into a computer simulation or physical robot container.
It's mostly speculative at this point, but only mostly; the Blue Brain project in Switzerland is currently working on a complete, simulated mapping of a human brain.
Cognitive science professor Steven Pinker told the Institute of Electrical and Electronics Engineers that there isn't the slightest reason to believe in a coming singularity.
The fact that you can imagine something, he notes, "is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings and nuclear-powered automobiles--all staples of futuristic fantasies when I was a child that have never arrived."
But if the tech singularity does happen, the aftermath is fraught with perils, according to Nick Bostrom, author of the best-selling Superintelligence: Paths, Dangers, Strategies (2014).
Post-singularity, Bostrom theorizes, machines could swiftly advance through recursive self-improvement, triggering an intelligence explosion--smart machines creating smarter machines, and so on.
Bostrom sees the prospect of runaway A.I. as a possible extinction event for us, potentially more dangerous than nukes. "Before the prospect of an intelligence explosion," he writes, "we humans are like small children playing with a bomb."
"The development of full artificial intelligence could spell the end of the human race," Hawking told the BBC. "Once humans invent artificial intelligence, it would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."
If all of this is a bit too much to assimilate, fear not. There's a place where you can get yourself schooled in exponential tech. At Singularity University, located at NASA Research Park in Silicon Valley, you can learn on a campus that includes a giant wind tunnel, a huge supercomputing center and a flight simulator. Cool.
Champion of the singularity movement Ray Kurzweil is co-founder and chancellor of Singularity U., which offers programs in A.I. & Robotics, Biotech & Bioinformatics, Nanotechnology and much more.
Catch their flavor at the Singularity News Hub, where you'll find stories like "Hacking Talent in the Age of the Exponential Human" and "Why Email Is Broken and What Will Replace It."