Scientist computes limits of the universe

MIT physicist Seth Lloyd estimates how much computing power would be needed to accurately simulate the entire universe throughout its history. And we're struggling with just DNA.
Written by Robert Lemos, Contributor
A Massachusetts Institute of Technology physicist, known for calculating the absolute, physical limit of a laptop computer's storage potential, has now imagined what it would take for a computer to accurately simulate the entire universe throughout its history.

A report in this week's Nature magazine says Seth Lloyd estimated that such a computer would have to contain 10 to the 90th bits of information and perform 10 to the 120th operations on those bits to model the universe in all its various incarnations since the big bang.

The second figure was drawn from Lloyd's idea that a fundamental particle's move from one quantum state to another can be seen as a computation, and that the universe itself can thus be viewed as a giant computer, the Nature report stated.

Numbers of that size are nearly impossible to comprehend. To put them in perspective, the total information required to model the universe is about 10 billion times the number of elementary particles--neutrons, protons, electrons and photons--in the universe, the Nature report said. Lloyd could not be reached for comment.
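The 10-billion figure is a simple ratio of the two estimates: roughly 10 to the 90th bits needed to simulate the universe, versus the commonly cited order-of-magnitude estimate of about 10 to the 80th elementary particles in the observable universe. A minimal back-of-the-envelope check, with the particle count treated as an assumption rather than a figure from the article:

```python
# Back-of-the-envelope check of the "10 billion times" comparison.
# Assumptions: ~10**90 bits to simulate the universe (Lloyd's estimate),
# ~10**80 elementary particles in the observable universe (common estimate,
# not a number quoted in the article).
bits_to_simulate = 10**90
particle_count = 10**80

ratio = bits_to_simulate // particle_count
print(f"Bits per particle: {ratio:e}")  # ~1e+10, i.e. about 10 billion
```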

The idea of cosmic calculators is not new.

One example appears in the popular comic novel "The Hitchhiker's Guide to the Galaxy" by Douglas Adams. In that book, Adams dreamed up a race of alien mice that--after learning that the answer to "Life, the Universe and Everything" was "42"--created a planetary computer, Earth, to find out what the question was.

Unfortunately, the Earth was destroyed in the opening chapters of the book.

Lloyd, a quantum physicist and professor of mechanical engineering at MIT, has made a habit of discovering the limits of computers.

In an August 2000 paper in Nature, the professor calculated that a computer the size of a laptop could theoretically store as much as 10 to the 31st bits of information, about 100 quintillion times the approximately 100 billion bits, or 12.5GB, that today's standard laptop can hold.
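Both comparisons in that sentence are straightforward arithmetic: 10 to the 31st bits against roughly 100 billion (10 to the 11th) bits gives a factor of 10 to the 20th, or 100 quintillion, and 100 billion bits divided by 8 bits per byte comes to 12.5GB. A quick sketch of the check:

```python
# Check the comparisons quoted from the August 2000 paper.
ultimate_laptop_bits = 10**31     # Lloyd's theoretical storage limit
todays_laptop_bits = 100 * 10**9  # ~100 billion bits, per the article

ratio = ultimate_laptop_bits / todays_laptop_bits
gigabytes = todays_laptop_bits / 8 / 10**9  # 8 bits per byte, decimal GB

print(f"Capacity ratio: {ratio:.0e}")      # 1e+20, i.e. 100 quintillion
print(f"Today's laptop: {gigabytes} GB")   # 12.5 GB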

To achieve such storage capacity, the laptop would have to exploit Einstein's E=mc² and convert all of its matter into energy, making the compact computer more like a piece of the interior of a star.
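For a sense of what that conversion means, a laptop massing about 1 kilogram (an illustrative assumption, not a figure from the article) turned entirely into energy via E=mc² would release roughly 9 x 10 to the 16th joules:

```python
# Energy released if a laptop's entire mass were converted to energy (E = m * c**2).
# The 1 kg mass is an illustrative assumption, not a figure from the article.
mass_kg = 1.0
speed_of_light = 299_792_458.0  # metres per second

energy_joules = mass_kg * speed_of_light**2
print(f"E = mc^2 for 1 kg: {energy_joules:.2e} J")  # ~8.99e+16 J
```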
