
Stellar tech to ignite big-bang project

Scientists at CERN, the Swiss laboratory where the World Wide Web was developed, are enlisting luminaries such as Intel, HP and IBM to test the big-bang theory.
Written by Ian Fried, Contributor
Scientists in Switzerland are building a machine to test the big-bang theory of how the universe began. But first they have to construct a computer network that can handle the volumes of data the device will produce.

The new, more powerful particle accelerator, known as the Large Hadron Collider, is being built at CERN, the same Swiss laboratory where Tim Berners-Lee developed the World Wide Web. With such a tool, scientists say, they will either produce the particles thought to have existed when the universe was formed or prove that such particles simply don't exist.

But before they can test their theory, the scientists will need a computer network capable of processing and storing the massive amounts of data that will begin spewing from the collider once it starts smashing particles together in 2007. As a result, researchers at CERN created Openlab, a grid computing network designed to test out the type of equipment that is likely to be standard by the middle to end of this decade when the project really gets underway.

"The hope is that we, together with partners, solve whatever bugs there are," said Francois Grey, development manager for the Openlab effort. Grey notes that the particle accelerator is likely to be up and running for a decade, meaning they not only need to predict what will be the latest and greatest technology in 2007, but also which architectures will persevere throughout the life of the project.

The institute's first partners were Hewlett-Packard and Intel, which are delivering Itanium 2-based systems, as well as Enterasys Networks, which is providing 10-gigabit-per-second networking gear. On Wednesday, IBM is announcing that it is joining the effort, providing six xSeries servers, 20 terabytes of storage as well as IBM's new Storage Tank storage management software.

The computing network is designed to link thousands of scientists who will use the accelerator to try to prove the existence of particles known as Higgs bosons by recreating the conditions thought to have existed shortly after the big bang occurred. Higgs refers to Peter Higgs, the physicist who first theorized the existence of such particles, while bosons refer to the class of particles named for another physicist, S.N. Bose. In any case, Grey said scientists are pretty sure the collider will produce the conditions they need to create the particles, if they exist at all.

"We are talking about, over the span of the project, billions of dollars," of investment, Grey said. "You wouldn't make this investment on a hunch."

The project involves a vast amount of data, which should give IBM fertile ground to test Storage Tank, said Jai Menon, an IBM Research Fellow who helped develop the software.

"Where they are really leading edge or bleeding edge is the size of the data the are creating," Menon said. "They are talking about maybe generating 5 petabytes (5 million gigabytes) of data a year once they are up and running.

But even with its large budget, CERN can't build its computing effort from scratch. Indeed, the laboratory is looking to tap existing computing and storage resources to supply much of the network's capacity.

That is what makes the project interesting to IBM, said Tom Hawk, the general manager of the company's grid computing business. CERN's need to find a cost-effective way to manage a massive computing project while using existing computing resources is the same issue that business customers face, Hawk said, adding, "not that we aren't really excited about particle physics."

For HP and Intel, the project is a chance to promote Itanium as the next generation of mainstream server technology. Although many large companies are still testing Itanium-based servers, HP and CERN are betting that they will be far more widespread by the time the particle accelerator is ready.

"Itanium will be mainstream, almost commodity," promised Michel Benard, a Geneva-based manager for HP's university relations program.

Being included in the project is something of a coup, since there are plenty of companies touting their technology as the next big thing, said Charles King, research director of market researcher The Sageza Group in Mountain View, Calif.

"There certainly have been a lot of pretenders to the throne," King said. "The CERN project provides a really interesting crucible of sorts...just being involved in the project is going to be validating to a certain extent."

Plus, King said, IBM and the other participants could not ask for a better testing ground. "It's one thing to do it in your own test lab," he said. "It's another thing to be used by a third party in a pretty hardcore environment where (the technology) is being pushed to its limits."

Even though CERN is just receiving the first of the giant pieces that will form the accelerator, the computing network needs to be built by 2005, Grey said. Although the accelerator will not yet be running, the scientists involved in the project will have to begin checking all of the simulation software they will use to measure and replicate the results produced by the accelerator itself.
