
Singularity Summit 2007: A change is gonna come

Written by Dan Farber

Guest post: This weekend I am at the Singularity Summit 2007 in San Francisco at the Palace of Fine Arts. About 800 people showed up to hear about the issues related to a future in which humans won't be the driving force in delivering scientific and technological innovations, eclipsed cognitively by "posthumans" or machine intelligences. I talked to four of the speakers prior to the event--the podcasts are here.

I am joined by Chris Matyszczyk, who will be offering his views of the Summit. Chris has spent most of his career as an award-winning creative director in the advertising industry. He is perhaps best known for his advertising campaign against domestic violence in Poland, which had a major impact on cultural behavior. He has also been a journalist, covering the Olympics, the Super Bowl and other sporting events. He brings a refreshingly non-techie and humorous perspective to the Singularity Summit. Check out his "Pond Culture" blog.

Slimy, wet, gray stuff.

Such are our brains, according to Eliezer Yudkowsky. [Listen to Dan's podcast interview with Yudkowsky about "friendly AI"] Right now, artificial intelligence has not even caught up with the slimy, wet, gray stuff of a village idiot.

Which means it will be at least a couple of years before a machine becomes the Commissioner of Baseball.

There will be an explosion, although no one is quite sure when this explosion will be. But it’s coming. Because computers are getting more citius, more altius, and significantly more fortius. And because there are a lot of clever people involved in this enterprise.

There are already three schools of thought.

Oh, don’t expect me to understand what these three schools of thought are actually saying. I never attended an American university.

But I’m reminded of my time living in my ancestral home, Poland, when it was explained to me (by a professor, as it happens) that whenever two Poles meet, there are three opinions.

I sense that Eliezer Yudkowsky’s greatest fear is that humans, driven by their egos and the pressure of their bankers, won’t do the right thing and pool their great minds for the good of the world. Instead, they will race each other, contradict each other and derail the optimal progression of development.

The first question asked of Yudkowsky when he finished his talk was “are you optimistic?” His reply was so swift that it almost seemed as if someone had fired a laser into his brain from off the stage. “I think that question is more appropriate for my talk tomorrow,” he said, with just a hint of nervousness.

Who does he need to talk to before he can answer this question? His lawyer? His banker? His conscience?

I have a suspicion that tomorrow he will produce a robot that will do his entire talk for him. A robot that will appeal to all the minds and all the machines involved in this enterprise, one that is the most important in the history of the world.

The robot will be called Bud.
