Scientists are currently constructing the world's largest particle accelerator. Its aim is to recreate conditions similar to those of the Big Bang that created the Universe. When it is ready in 2007, it will run in a huge circle under the Swiss countryside.
The particles fired down its length will generate enough data in a year to fill a pile of CDs the size of the Eiffel Tower. What to do with that data and how to share it with scientists around the world is the essence of the Grid, the name given to the international network of supercomputers that promises to revolutionise not just the way we use the internet, but computing itself.
For now the Grid is composed of noisy, air-conditioned computer centres dotted around the world, full of PCs all linked together to make a giant supercomputer. Scientists are currently limited in what they can do by their own hardware's memory, software and processing power.
Once the Grid is up and running anyone hooked up to it will have all the programs, power and storage they could dream of.
Speed of light
“The idea is that even in a small institute where a physicist may need to have very large computing power for his particular problem, he can go off from his desk, launch the job and get the job done using computer centres all over the world, get the results back and write the publication afterwards,” said Wolfgang von Ruden, head of information technology at Cern.
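The workflow Mr von Ruden describes can be sketched in a few lines of code. This is purely illustrative, with made-up names, and not Cern's actual middleware: a job is launched from the physicist's desk, its work units are farmed out to whichever remote centres are least busy, and the results come back to be merged.

```python
# Illustrative sketch of launching a job across remote computer centres.
# Each "centre" here is a local stand-in for a remote machine.

def submit(job, centres):
    """Send each work unit to the least-loaded centre and merge results."""
    results = []
    for unit in job["units"]:
        centre = min(centres, key=lambda c: c["load"])  # pick the idlest centre
        centre["load"] += 1                             # account for the new work
        results.append(centre["run"](unit))             # execute "remotely"
    return sorted(results)

# Toy centres: each just squares its input to stand in for real computation.
centres = [
    {"name": "Geneva",  "load": 0, "run": lambda x: x * x},
    {"name": "Chicago", "load": 2, "run": lambda x: x * x},
]
job = {"units": [1, 2, 3, 4]}
print(submit(job, centres))  # [1, 4, 9, 16]
```

The physicist never chooses a machine by hand; the scheduler's load comparison does it, which is the point of the Grid.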
When it is ready, Cern's Large Hadron Collider will be sunk underground, where particles will smash into each other at just below the speed of light.

[Image: The accelerator will test the Big Bang theory (Image: Cern)]

Just as Cern has used contractors to find the tools and talent for the job of putting the collider together, so the Grid will employ a similar agent.
It will use software designed to sniff out where in the world are the computing resources – software, memory, processing power – for a particular task. In short, computing is becoming a utility to be piped into your home or office, like electricity or gas.
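The "sniffing out" step amounts to matchmaking: each centre advertises the resources it offers, a task states what it needs, and software finds the centres that satisfy every requirement. The sketch below uses hypothetical field names (`cpus`, `memory_gb`, `storage_tb`) purely to illustrate the idea, not any real Grid interface.

```python
# Illustrative resource discovery: match a task's needs against the
# resources each computer centre advertises.

def matches(offer, need):
    """True if a centre's advertised resources cover all of a task's needs."""
    return all(offer.get(key, 0) >= amount for key, amount in need.items())

def discover(task, centres):
    """Return the names of every centre able to run the task."""
    return [c["name"] for c in centres if matches(c["resources"], task)]

centres = [
    {"name": "Lyon",    "resources": {"cpus": 500, "memory_gb": 64, "storage_tb": 100}},
    {"name": "Bologna", "resources": {"cpus": 50,  "memory_gb": 8,  "storage_tb": 5}},
]
task = {"cpus": 100, "memory_gb": 32}
print(discover(task, centres))  # ['Lyon']
```

From the user's side this lookup is invisible, which is what makes the utility comparison apt: you ask for computing, not for a particular computer.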
“If you understand the web…