
Joy warns of tech Armageddon

Sun's tech chief warns that the convergence of computer technology and genetic engineering could pose very real dangers to humanity and the ecosystem
Written by Matthew Rothenberg, Contributor

Citing "Unabomber" Theodore Kaczynski and other information-age Cassandras, Sun Microsystems' co-founder and chief scientist, Bill Joy, is now warning of the potential dangers in the computer technologies he helped create.

In an article scheduled for publication on Tuesday in the April edition of Wired magazine, Joy cautions that the convergence of genetic engineering and computer technology could pose a very real threat to humanity and the ecosystem.

Joy quotes from a passage of Kaczynski's Unabomber "Manifesto", which posits a scenario where computers become sufficiently sophisticated to make their own decisions, and where any human control over the systems devolves to a "tiny elite" who will either find means to eliminate the masses or -- "if the elite consists of soft-hearted liberals" -- tend to their physical needs while eliminating their free will through biological or psychological engineering.

"These engineered human beings may be happy in such a society, but they will most certainly not be free," Kaczynski wrote. "They will have been reduced to the status of domestic animals."

Joy is quick to decry Kaczynski's bombing campaign as a "criminally insane" act. Nevertheless, Joy writes, "as difficult as it is for me to acknowledge, I saw some merit in the reasoning in this single passage. I felt compelled to confront it."

In Joy's view, both past precedent and current technology point to the danger that human creations may present. For example, the emergence of antibiotic-resistant bacteria may be merely a foretaste of "Borg-like disasters" that could occur if the latest scientific advances spin out of control. "Accustomed to living with almost routine scientific breakthroughs, we have yet to come to terms with the fact that the most compelling 21st century technologies -- robotics, genetic engineering and nanotechnology -- pose a different threat than the technologies that have come before," Joy writes.

"Specifically, robots, engineered organisms and nanobots [molecular-sized, intelligent machines that can be programmed to alter microscopic structures such as DNA] share a dangerous amplifying factor -- they can self-replicate. A bomb is blown up only once, but one bot can become many, and quickly get out of control. Much of my work over the past 25 years has been on computer networking, where the sending and receiving of messages creates the opportunity for out-of-control replication. But while replication in a computer or a computer network can be a nuisance, at worst it disables a machine or takes down a network or network service. Uncontrolled self-replication in these newer technologies runs a much greater risk: A risk of substantial damage in the physical world."

According to Joy, current advances in molecular electronics mean that by the year 2030, "we are likely to be able to build machines in quantity a million times as powerful as the personal computers of today", and imbue them with human-level intelligence.
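Joy's million-fold figure is consistent with a straightforward Moore's-law extrapolation (an illustrative assumption here, not a calculation spelled out in the article): computing power doubling roughly every 18 months yields about 20 doublings over 30 years.

```python
# A rough sketch of the extrapolation behind a "million times as powerful
# by 2030" estimate. The 18-month doubling period is an assumed Moore's-law
# figure for illustration, not a number quoted from Joy's article.
years = 30                    # roughly the horizon Joy describes
months_per_doubling = 18      # assumed doubling period
doublings = years * 12 // months_per_doubling   # 20 doublings
growth_factor = 2 ** doublings
print(growth_factor)          # 1048576 -- about a million-fold
```

Under these assumptions, 2^20 is just over one million, which matches the order of magnitude Joy cites.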

"In designing software and microprocessors, I have never had the feeling that I was designing an intelligent machine," Joy writes. "The software and hardware is so fragile and the capabilities of the machine to 'think' so clearly absent that, even as a possibility, this has always seemed very far in the future. But now, with the prospect of human-level computing power in about 30 years, a new idea suggests itself: I may be working to create tools which will enable the construction of the technology that may replace our species. How do I feel about this? Very uncomfortable. Having struggled my entire career to build reliable software systems, it seems to me more than likely that this future will not work out as well as some people may imagine. My personal experience suggests we tend to overestimate our design capabilities."

Joy offers some qualified words of optimism that humanity will be able to alter this course by establishing curbs modelled after global governments' handling of weapons of mass destruction. "We are being propelled into this new century with no plan, no control, no brakes. Have we already gone too far down the path to alter course? I don't believe so, but we aren't trying yet, and the last chance to assert control -- the fail safe point -- is rapidly approaching. We have our first pet robots, as well as commercially available genetic engineering techniques, and our nanoscale techniques are advancing rapidly. We are getting a belated start on seriously addressing the issues around 21st century technologies and further delay seems unacceptable."

