
Planet-Scale Grid

A particle collider pushes data grid developers to unprecedented scale.

By Patrick Thibodeau
October 10, 2005 12:00 PM ET

Computerworld - In 2007, scientists will begin smashing protons and ions together in a massive, multinational experiment to understand what the universe looked like tiny fractions of a second after the Big Bang. The particle accelerator used in this test will release a vast flood of data on a scale unlike anything seen before, and for that scientists will need a computing grid of equally great capability.


The Large Hadron Collider (LHC), which is being built near Geneva, will be a circular structure 17 miles in circumference. It will produce data at a rate in the neighborhood of 1.5GB/sec., or as much as 10 petabytes annually, 1,000 times the size of the Library of Congress' print collection. The data flows will likely begin in earnest in 2008.
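Those two figures fit together only if the accelerator produces data for part of each year. A rough back-of-the-envelope check, with the beam-time figure being an assumption for illustration rather than anything from CERN's published schedule:

    # Rough sanity check of the article's figures. The beam-time
    # assumption here is ours, not CERN's.
    RATE_GB_PER_SEC = 1.5       # sustained detector output
    ANNUAL_TARGET_PB = 10       # quoted yearly volume

    # 1 PB = 1,000,000 GB (decimal units)
    seconds_needed = ANNUAL_TARGET_PB * 1_000_000 / RATE_GB_PER_SEC
    days_needed = seconds_needed / 86_400
    print(f"~{days_needed:.0f} days of beam time fills {ANNUAL_TARGET_PB} PB")
    # Prints ~77 days: consistent with a machine that runs for only
    # part of each year rather than around the clock.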


As part of this effort, which is costing about 5 billion euros ($6.3 billion U.S.), scientists are building a grid from some 100,000 CPUs, mostly PCs and workstations, available at university and research labs in the U.S., Europe, Japan, Taiwan and other locations. The grid must harness that raw computing power to meet the experiment's computational demands and give researchers a single view of this dispersed data.


This latter goal—creating a centralized view of data that may be located in Europe, the U.S. or somewhere else—is the key research problem.


Centralizing the data virtually, or creating what is called a data grid, means extending the capability of existing databases, such as Oracle 10g and MySQL, to scale to these extraordinary data volumes. And it requires new tools for coordinating data requests across the grid in order to synchronize multiple, disparate databases.
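In outline, the coordination problem looks something like the sketch below: a replica catalog maps one logical dataset name to its physical copies at multiple sites, and the middleware routes each request to a suitable copy, so the researcher never sees the plumbing. Everything here, from the dataset and site names to the route_query function, is invented for illustration; the actual LHC middleware and the underlying databases are far more elaborate.

    # Toy model of a data-grid replica catalog: one logical name,
    # many physical copies. All names below are hypothetical.
    replica_catalog = {
        "lhc/run2008/collisions-001": ["cern.ch", "bnl.gov", "in2p3.fr"],
        "lhc/run2008/collisions-002": ["cern.ch", "fnal.gov"],
    }

    # Current load at each site, 0.0 (idle) to 1.0 (saturated).
    site_load = {"cern.ch": 0.9, "bnl.gov": 0.4,
                 "in2p3.fr": 0.2, "fnal.gov": 0.5}

    def route_query(logical_name: str) -> str:
        """Pick the least-loaded site holding a copy of the dataset."""
        replicas = replica_catalog[logical_name]
        return min(replicas, key=lambda site: site_load[site])

    # A researcher sees one logical namespace; the grid decides
    # where the bytes actually come from.
    print(route_query("lhc/run2008/collisions-001"))   # -> in2p3.fr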


Tony Doyle, project leader of the Grid Particle Physics (GridPP) project

"It's all about pushing the envelope in terms of scale of robustness," says Tony Doyle, project leader of Grid Particle Physics (GridPP) project, a U.K.-based scientific grid initiative that's also part of the international effort to develop the grid middleware tools.


Researchers believe that improving the ability of a grid to handle petabyte-scale data, split up among multiple sites, will benefit not only the scientific community but also mainstream commercial enterprises. They expect that corporations—especially those involved in fields such as life sciences—will one day need a similar ability to harness computing resources globally as their data requirements grow.


"If this works, it will spawn companies that will just set up clusters to provide grid computing to other people," says Steve Lloyd, who chairs the GridPP Collaboration Board, based at the Rutherford Appleton Laboratory in Oxfordshire, England. GridPP is working with the international team to develop the grid the LHC will use.


CERN, the European laboratory for particle physics, is leading the LHC and its grid effort. From CERN's facility near Geneva, the data produced by the particle accelerator will be distributed to nine other major computing centers, including the Brookhaven National Laboratory and the Fermi National Accelerator Laboratory in the U.S., says Fabio Hernandez, grid technical leader at one of the major project sites in France.
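The resulting topology is a hierarchy: CERN holds the master copy and fans data out to the major centers, which in turn feed smaller university clusters. A minimal sketch of that fan-out, with only the three centers named above listed and the lower-tier sites as placeholders:

    # Simplified tiered distribution, after the model described above.
    # The center list is partial; lower-tier entries are placeholders.
    tiers = {
        "tier0": ["cern.ch"],
        "tier1": ["bnl.gov", "fnal.gov", "in2p3.fr"],  # 3 of 9+ centers
        "tier2": ["university-cluster-a", "university-cluster-b"],
    }

    def fan_out(dataset: str) -> None:
        """Trace a dataset's path from CERN down through the tiers."""
        for tier in ("tier0", "tier1", "tier2"):
            for site in tiers[tier]:
                print(f"{tier}: {site} receives {dataset}")

    fan_out("collisions-001")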


