
Grid computing to solve Big Bang queries?

The world's biggest grid is in production, with the hope of confirming some untested theories about the universe
Written by Steve Ranger, Global News Director

The mysteries of dark matter, multiple dimensions and even the conditions following the Big Bang could be solved with the help of the world's biggest computer grid — a big chunk of which is being built in sleepy Oxfordshire.

The Large Hadron Collider (LHC), being constructed at CERN, near Geneva, will be the largest scientific instrument on the planet, and the LHC Computing Grid will provide the hugely powerful computing needed to process the 15PB of data the collider will produce each year.
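To put that figure in perspective, a rough back-of-envelope calculation (ours, assuming decimal petabytes, which the article does not specify) shows the sustained data rate 15PB a year implies:

    # What does 15PB per year mean as a sustained data rate?
    PETABYTE = 10**15                      # bytes, decimal petabyte (our assumption)
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    rate = 15 * PETABYTE / SECONDS_PER_YEAR
    print(f"{rate / 10**6:.0f} MB/s, or {rate * 8 / 10**9:.1f} Gbit/s on average")
    # Roughly 475 MB/s, or 3.8 Gbit/s, around the clock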

The LHC will smash protons and ions in head-on collisions to help scientists understand the structure of matter.

Discovering new types of particles can only be done by statistical analysis of the massive amounts of data the experiments will generate — which is where the LHC Computing Grid project comes in.

Although the LHC won't be up and running until 2007, work has already begun on the grid, with the UK being one of the largest contributors.

Of the 150 grid sites around the world, 18 are in the UK. Much of the UK work is being done at the Rutherford Appleton Laboratory (RAL) in Oxfordshire.

Because of the scale of processing needed, a grid is the best way to go, explained John Gordon, deputy director of the Council for the Central Laboratory of the Research Councils e-Science Centre at RAL. "The computing has been planned for years; we've been looking at distributed computing for a long time," said Gordon.

"They couldn't afford to do all the computing at CERN so we knew we would have a big distributed computing problem of sifting the data around the world and finding it again. It's the biggest production grid in the world," Gordon added

The grid will use a four-tier model: data will be stored on tape at CERN, the Tier-0 centre. From there, data will be distributed to Tier-1 sites, which have the storage and processing capacity to cope with a chunk of the data. These sites make the data available to the Tier-2s, which are able to run particular tasks. Individual scientists can then access data from Tier-3 sites, which could be local clusters or individual PCs.
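To make the fan-out concrete, here is a minimal Python sketch of such a tiered distribution model; the class, method and site names are our own illustration, not the actual LCG software:

    from dataclasses import dataclass, field

    @dataclass
    class Site:
        """A grid site that stores data and passes it down to the tier below."""
        name: str
        tier: int
        datasets: set = field(default_factory=set)
        children: list = field(default_factory=list)

        def distribute(self, dataset: str):
            # Keep a copy locally, then fan the dataset out to child sites.
            self.datasets.add(dataset)
            for child in self.children:
                child.distribute(dataset)

    # Hypothetical topology mirroring the article's description.
    cern = Site("CERN", tier=0)                      # Tier-0: master tape store
    ral = Site("RAL", tier=1)                        # Tier-1: national centre
    ral.children = [Site("Lancaster", tier=2), Site("Imperial", tier=2)]
    cern.children = [ral]

    cern.distribute("collision-run-001")
    print(ral.children[0].datasets)                  # {'collision-run-001'}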

RAL hosts the UK's Tier-1 site, with the universities of Lancaster and Edinburgh and Imperial College operating Tier-2 sites.

While real data won't start flowing until 2007, scientists are already using lots of processing power on simulations. "They need to know what they are looking for so they do lots of simulations," Gordon said.
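Loosely, the point of those simulations is to know in advance what the detector should record both with and without a hypothetical new particle, so the real data can later be compared against the two templates. A toy illustration, entirely ours and nothing like the experiments' real Monte Carlo code:

    import random

    def simulate_events(n, new_particle=False):
        """Toy 'detector readings': smooth background, plus a narrow
        bump at a chosen mass if the hypothetical particle exists."""
        events = [random.gauss(100.0, 15.0) for _ in range(n)]
        if new_particle:
            events += [random.gauss(125.0, 2.0) for _ in range(n // 50)]
        return events

    background_only = simulate_events(10_000)
    with_signal = simulate_events(10_000, new_particle=True)
    # Statistical analysis then asks which template the real events match.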

Commodity hardware and open source software are being used to keep costs down. "Because it's worldwide we are all looking at open source," said Gordon. "All the grid stuff is done in open source, that's taken for granted. Grid should use standard protocols; it's across administrative domains."

Network bandwidth will also be key. At the moment RAL has a dedicated 2Gbps link to CERN, the same amount of bandwidth the lab uses for all the rest of its Internet traffic, and the plan is to build a dedicated fibre-optic network between the sites.

"What we are looking at is setting up a network of private light-paths to Tier-1 sites," said Gordon.

Managing the huge number of files the experiments will generate is another problem the team is working on, according to Gordon. "You end up with millions of files and the problem comes in handling them and that's where the data management comes in. Data management is key."
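The broad shape of a solution is a replica catalogue: a lookup from a file's logical name to every physical copy held across the sites, so jobs can be sent to run near the data. A minimal sketch, with an interface of our own invention rather than the LCG's actual data-management API:

    from collections import defaultdict

    class ReplicaCatalogue:
        """Toy catalogue mapping logical file names to physical replicas."""

        def __init__(self):
            self._replicas = defaultdict(set)

        def register(self, lfn: str, site: str, path: str):
            # Record that a physical copy of the logical file exists at a site.
            self._replicas[lfn].add((site, path))

        def locate(self, lfn: str):
            # Every site holding a copy of this logical file.
            return self._replicas[lfn]

    cat = ReplicaCatalogue()
    cat.register("run001/events-0001.dat", "CERN", "/tape/run001/0001.dat")
    cat.register("run001/events-0001.dat", "RAL", "/disk/run001/0001.dat")
    print(cat.locate("run001/events-0001.dat"))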

But beyond all the exciting technology, much of the work will be in persuading different organisations to share, Gordon said. "A lot of it is sociological — you are persuading people that they gain by connecting all their computers together. It's about collaboration; it's not about people sitting in London using computers all over the world, it's about groups of people working on the same problem."
