
Energy Dept., IBM working on 20-petaflop computer

If it weren't for nuclear weapons, we wouldn't have superfast supercomputers, would we?
Written by Richard Koman, Contributor

If it weren't for nuclear weapons, we wouldn't have superfast supercomputers, would we? Now, thanks to the Energy Dept. and IBM, the latest supercomputer will run at an astounding 20 petaflops. The machine, Sequoia, is slated to boot up in 2012 at Lawrence Livermore.

With that much power, Wired notes, Sequoia could do lots of different things, but for now the baby steps will be simulating nuclear explosions. In the future, consider the simulations that could be run for climate change or protein interactions.

"Every time you do predictive science, the next question is: How confident are you in that prediction? It turns out that's a very easy question to ask and a very profound question to try to answer," said computer scientist Mark Seager of Lawrence Livermore National Laboratory [pictured here with a Lawrence Livermore computer]. "The way that we do that is by running a whole bunch of simulations. Instead of just one simulation, you do 4,000."

Fun specs: 1.6 million processing cores, 1.6 petabytes of memory, 96 racks, 98,304 computing nodes, and a footprint of only 3,400 square feet. Energy-wise, it's a dreamboat: it will draw about six megawatts of power, equivalent to the consumption of just 500 homes.
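For scale, a back-of-the-envelope calculation from the figures quoted above (my arithmetic, not IBM's or the lab's published numbers) works out to roughly 12.5 gigaflops per core and about 3.3 gigaflops per watt:

# Back-of-the-envelope check of the quoted figures (an assumption-based sketch, not an official spec sheet).
peak_flops = 20e15    # 20 petaflops
cores = 1.6e6         # 1.6 million processing cores
power_watts = 6e6     # six megawatts

print(f"per-core peak: {peak_flops / cores / 1e9:.1f} gigaflops")
print(f"efficiency:    {peak_flops / power_watts / 1e9:.2f} gigaflops per watt")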

With this much power, Seager thinks, computer simulation will stand side by side with experimental science. Science's traditional model has been theory plus experiment; in the future, it may be theory plus simulation.

"Scientific simulation is the telescope of the mind," Seeger [sic] said. "We work with highly non-linear systems that have very complicated mathematics and models. It's just too difficult to hold all that in our brain and analyze it. So by simulating them, we're extending our brains' capabilities."
