Singularity: Technology, spirituality and the close box

Summary: Ray Kurzweil responded to various critiques and questions, sticking to the PowerPoint slides of data points that convey his Singularity theory. He emphasized that because of the accelerating pace of change, technology will be able to solve all problems, from the practical challenges of climate change and energy efficiency to reducing poverty and disease.

TOPICS: Health

Ray Kurzweil responded to various critiques and questions, sticking to the PowerPoint slides of data points that convey his Singularity theory. He emphasized that because of the accelerating pace of change, technology will be able to solve all problems, from the practical challenges of climate change and energy efficiency to reducing poverty and disease and bridging the digital divide, but only through the scale the new technologies will bring.

On radical life extension, including immortality, Kurzweil said that mastering and reprogramming who we are, in terms of health processes, through biotechnology is on the horizon. Biology and medicine in the post-information era are about reprogramming biology to eliminate health problems, he said. "Within 15 years add a year to life expectancy every year," he predicted, which supports the "real goal of life," which he said "is to expand human knowledge."

He added that "creating communities is what holds people together and enhance human relationships, and I would like more time to partake of that."



Kurzweil's notion of community includes the more extreme concept of uploading the contents of a brain onto a computing substrate. It's difficult to imagine the social dynamics of communities with computational intelligences as separate entities, fully rendered proxies for a biological being, or some merged entity running at one million times the speed of human brain function.

"The whole uploading idea isn't necessary," Kurzweil said. "The idea is to pass the Turing test. We don't need your body and brain...we have this new, better one in a computing substrate. We can observe and scan from inside and see what's going on in sufficient detail....One can make an argument that it is a different person, but this scenario is not integral. We are talking about capturing strong AI, and it's becoming less narrow over time. Integrate the compute with who you are...more convenience."

Kurzweil dismissed Hofstadter's critique, saying that there are some limitations in the accelerating pace of technology, but they are "not very limiting."

As a final piece, the panelists were asked whether they thought the shift to the Singularity would have a happy ending. The majority of panelists were optimistic, or weren't willing to guess. Predictions on when human-level machine intelligence would exist ranged from 2029 (Kurzweil) to 2100 (Hofstadter, although he said he is not a "futurist"). Kurzweil also doesn't believe that even a world catastrophe, which would of course be painful, would disrupt the trends on which the Singularity is based.

Eliezer Yudkowsky of the Singularity Institute for Artificial Intelligence doesn't believe that the arrival of human-level AI can be predicted without knowing how to build real AI, how much work is involved, and which organizations and people are doing it. "We have to be careful not to mistake our ignorance for knowledge about it," he said, adding that AI doesn't fit into a Moore's Law expression. "We need more public funding for specialists who can think about these things full time," Yudkowsky said.

Kurzweil responded that hardware and software are two sides of the problem. On the software side, 10^16 calculations per second should be sufficient to reach human-level intelligence. "It's a matter of getting to the right level of representation. The brain's a fairly complex system, but at a level we can handle." Hardware will follow Moore's Law, continuing to deliver cheaper, faster, smaller, cooler systems.
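Kurzweil's 10^16 figure invites a quick sanity check. As a purely illustrative back-of-the-envelope sketch (the 2006 starting throughput and the two-year doubling period below are assumptions for illustration, not figures from the talk), here is how a Moore's-Law-style extrapolation to that threshold works out:

```python
# Illustrative extrapolation: count the doublings needed to go from an
# assumed commodity throughput in 2006 to the 10**16 ops/sec figure
# Kurzweil cites, given a Moore's-Law-style doubling cadence.
import math

start_year = 2006
start_ops = 1e11           # assumed 2006 commodity throughput (illustrative)
target_ops = 1e16          # Kurzweil's threshold for human-level intelligence
doubling_period_years = 2  # classic Moore's Law cadence

doublings_needed = math.log2(target_ops / start_ops)
year_reached = start_year + doublings_needed * doubling_period_years

print(round(doublings_needed, 1))  # ~16.6 doublings
print(round(year_reached))         # ~2039 under these assumptions
```

Different starting assumptions shift the projected date by years in either direction, which is essentially Yudkowsky's point about mistaking ignorance for knowledge.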

Author Bill McKibben, who was remotely connected to the Summit from his country home in the Adirondacks, advised the Singularists to slow down. "One of the ways we fool ourselves is thinking the past is a good predictor of the future. It's a natural human tendency to extrapolate out. That's possible, and it's how lots of people lose their shirts in the stock market. The other possibility is that there are threshold points at which it makes sense to say enough for now. We are clearly at one of those points," McKibben said.

He went on to say that in the last 150 years things have gotten faster, and the physical and social disintegration of the world is all around us. "We are living in a society with increasing depression. In the real world, to the degree that it's possible, we should slow down the progress of technological change and instead think hard about how to summon human abilities and try to bring them to the fore."

Cory Doctorow made the important point that "hitting the close box quickly is going to be one of the more important skills."

McKibben also addressed the intersection of the Singularity and spirituality. "It is taking something that is a deep part of the human experience and somehow confusing quality with quantity. In the end, the spiritual response has a great deal to do with the idea that there is something important about human mortality. It would be an interesting process, and to a certain extent in conflict [with ideas of the Singularity]."

"Technology can empower us on the positive side," Kurzweil responded. "We can see both the promise and the perils, but at the same time social networks and blogs are democratizing. Some of the true values of religion, such as the golden rule, we keep in mind. The story of the 21st century hasn't been written."


The most important inventions of the last century, according to McKibben, were the wilderness area and the practical invention of non-violence as a political technique--the unique human ability to rein oneself in. That statement drew the greatest applause of the day.

Kurzweil responded that free-enterprise competition makes it hard to slow down, and he clearly wants to continue marching forward.

McKibben chimed in that governments could intervene. "If we decided that the endless consolidation of agriculture was producing results we don't like, in ecological terms and human cost, we could pass a series of laws to slow that down. We aren't beyond that possibility--we still live in a world in which we can make a difference."

Doctorow cited a characterization of the Singularity as the "rapture of nerds," saying that the attraction of the Singularity is partly psychological: transcending the problems of the world, wiping the slate clean. Kurzweil countered that the idea of the Singularity doesn't just emerge from anxiety. "It's applying knowledge to solve problems, and we have to reach to transcend the problems in real, rather than imaginary, ways."

It boils down to whether you believe in the Singularity concept as a scientific endeavor, and how it will impact individuals and society. It's not hard to believe in the substance of the science behind the Singularity. The harder question is how it will change our species, society, and public policy, beyond the notions of increasing life spans, pervasive virtual reality, and smart agents that do our bidding, or that, in the worst-case scenario, enslave us over time. As McKibben suggests, the concepts of the Singularity need to be part of the big conversation.

Kurzweil concluded the session: "It's a complex topic." Indeed... 




  • Love, marriage and sex with AI-based robots

    I agree with the gist of Ray Kurzweil's prognosis for the future, a future in which AI will dramatically change the lives and opportunities for the majority of people on our planet. One topic within this area that has, as yet, been relatively unexplored, is how human-robot relationships will develop when robots become super-intelligent and acquire emotions. My take on this is that an increasing percentage of the adult population will enjoy emotional relationships with robots, including love and sex, and that by the middle of this century or thereabouts the first human-robot marriages will be taking place.

    I asked Ray Kurzweil, during a radio phone-in show a few months ago, when he thought this was likely to happen. His response was that it would be around the time that robots (or computers) pass the Turing test, which he predicts will be in 2029, even earlier than my own prediction.

    I discuss this whole area in my new book "Robots Unlimited", and would be very interested to hear the views of visitors to this site on this fascinating topic.

    David Levy
    London, England.
    • Who needs people?

      Wow, in a world where we systematically isolate ourselves from other humans, why on earth would we want to take steps that would virtually eliminate all need for interaction with other humans? I think I'd rather throw my computer out the window than risk it replacing human relationships. Besides, think about it: endowing robots with superior intellect and HUMAN emotions? Are you kidding me? That would be a disaster waiting to happen! What if the robot rejects me for the same reasons my ex-girlfriend did, because it has human emotions now? Or what if a robot isn't attracted to me? And if a robot can think and act like a human, is it ethical to "own" it? Isn't that slavery? What if my intelligent robot initiated a support group with other super-intelligent, human-emotion-laden robots and they decided we were all a bunch of unnecessary piles of flesh and bone? Don't we as humans resort to irrational conclusions and actions based on emotion? So WHY on earth would we want to create machines smarter than we are, with emotions?

      I don't want to live in a world where people who are too afraid to seek real human interaction can replace it with a robot. Heck, if you carry this conversation to its logical conclusion, why not just do away with these inefficient and slowly dying human bodies and plug our personalities into a massive programmed...

      I'm afraid I have to agree with McKibben's points here. There's a point to our mortality; call it term limits, in an odd sort of way, if you will. We keep creating machines that can do more and more for us, which to a certain extent is good, as long as we don't just get more and more lazy as a result. However, I draw the line at using machines to replace all of the things that define what it means to be human.
  • Digitizing Reality

    I do not know enough about this subject to endorse or discredit one side or the other. It occurs to me, though, that for all our increases in speed, we forget that computing is only a sample of reality (with extrapolations in between). As we learn more about the speed it will take to traverse that gulf, will we then realize our limitations? (Not unlike quantum physics, and not that we shouldn't try.)
  • Singularity

    The problem is not whether we are going to develop more sophisticated technologies to help people think and do better and faster, but when and who is going to get this ability before whom. Kurzweil may be exaggerating the extent and time frame of human evolution or revolution but he is not exaggerating that it will cause quite an effect as it happens.

    If this happens where others have opportunities to participate, understand and control, then the world may accept it. But if it's done in a revolutionary way, with a super race coming into being with little opportunity to adjust, then we will be pets and food.

    Most science fiction books, movies, and TV programs deal with these moral dilemmas all the time. The results are as varied as the nature of people who write them.
    • I fail to see the upside in developing....

      A species (of whatever) that is smarter than we are. Knocking yourself off the top of the food chain just doesn't seem to be a very "intelligent" move.

      Is this simply going to happen because someone predicts it's inevitable and there are way too many moronic smart people that want to do it first?

      Again, just because you CAN do something does not mean you SHOULD.