
Collider probing mysteries of the universe at the speed of light

Worldwide computer grid helps scientists make sense of data coming from collider experiments

September 9, 2008 12:00 PM ET

Computerworld - With the world's biggest physics experiment ready to fire up tomorrow, scientists from around the world are hoping to find answers to a question that has haunted mankind for centuries: How was the universe created?

The Large Hadron Collider (LHC), which has been under construction for 20 years, will shoot its first beam of protons around a 17-mile, vacuum-sealed loop at a facility that sits astride the Franco-Swiss border. The test run of the largest, most powerful particle accelerator in the world is a precursor to the experiments to come, when scientists will accelerate two particle beams toward each other at 99.9% of the speed of light.

Smashing the beams together will create showers of new particles that should re-create conditions in the universe just moments after its conception.

Tomorrow's test run is a critical milestone on the way to that ultimate undertaking. And a worldwide grid of servers and desktops will help scientists make sense of the data they expect to come pouring in.

"This will move the limit of our understanding of the universe," said Ruth Pordes, executive director of the Open Science Grid, which was created in 2005 to support the LHC project. "I'm very excited about the turning on of the accelerator. Over the next two years, our grids will be used by thousands of physicists at LHC to make new scientific discoveries. That's what it's all for."

Pordes noted that the U.S. portion of the global grid is a computational and data storage infrastructure made up of more than 25,000 computers and 43,000 CPUs. The mostly Linux-based machines are linked into the grid from universities, the U.S. Department of Energy, the National Science Foundation and software development groups. Pordes also said the U.S. grid provides about 300,000 compute hours a day, with 70% of that capacity going to the particle collider project.
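As a rough back-of-envelope illustration of those figures (the numbers below are taken from Pordes' estimates, not from official Open Science Grid accounting), the LHC's daily share works out to roughly 210,000 compute hours:

```python
# Back-of-envelope sketch based on the figures quoted above; these are
# approximate estimates, not official grid accounting.

DAILY_COMPUTE_HOURS = 300_000   # approximate U.S. grid capacity per day
LHC_SHARE = 0.70                # approximate fraction devoted to LHC work

lhc_hours_per_day = DAILY_COMPUTE_HOURS * LHC_SHARE
print(f"Estimated daily compute hours for the LHC: {lhc_hours_per_day:,.0f}")
# -> Estimated daily compute hours for the LHC: 210,000
```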

Harvey Newman, a physics professor at the California Institute of Technology, told Computerworld that there are about 30,000 servers and more than 100,000 cores around the world hooked into grids that support the LHC project.

"The distributed computing model is essential to doing the computing, storage and hosting of the many petabytes of data from the experiments," said Newman. "Coordinating data distribution, processing and analysis of the data collaboratively by a worldwide community of scientists working on the LHC are key to the physics discoveries. Only a worldwide effort could provide the resources needed."

The computer infrastructure is critical to the work being done at the particle collider, which sits in a tunnel buried 50 meters to 150 meters below the ground. The machine is designed to facilitate and control a head-on collision between two beams of the same kind of particles -- either protons or ions. Traveling through a vacuum comparable to outer space, the beams are guided around the ring by superconducting magnets.


