A particle collider pushes data grid developers to unprecedented scales.
Computerworld - In 2007, scientists will begin smashing protons and ions together in a massive, multinational experiment to understand what the universe looked like tiny fractions of a second after the Big Bang. The particle accelerator used in this test will release a vast flood of data on a scale unlike anything seen before, and for that scientists will need a computing grid of equally great capability.
The Large Hadron Collider (LHC), which is being built near Geneva, will be a circular structure 17 miles in circumference. It will produce data in the neighborhood of 1.5GB/sec., or as much as 10 petabytes of data annually, 1,000 times the size of the Library of Congress' print collection. The data flows will likely begin in earnest in 2008.
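A quick back-of-the-envelope check shows how those two figures fit together: at a sustained 1.5GB/sec., 10 petabytes corresponds to only part of a year, which is consistent with an accelerator that records data only during beam time. The calculation below is purely illustrative; the rates and totals are the article's round numbers, not official LHC specifications.

```python
# Illustrative arithmetic only: how many days of continuous recording at
# 1.5 GB/sec does 10 PB per year imply? (Figures are the article's round
# numbers, using decimal units: 1 PB = 1,000,000 GB.)

RATE_GB_PER_SEC = 1.5
ANNUAL_PB = 10

annual_gb = ANNUAL_PB * 1_000_000
seconds_needed = annual_gb / RATE_GB_PER_SEC
days_needed = seconds_needed / 86_400  # seconds per day

print(f"{days_needed:.0f} days of recording per year")  # ~77 days
```

In other words, 10 petabytes a year implies roughly 77 days of full-rate data taking, a plausible duty cycle for a machine that runs in scheduled physics periods rather than around the clock.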
As part of this effort, which is costing about 5 billion euros ($6.3 billion U.S.), scientists are building a grid using 100,000 CPUs, mostly PCs and workstations, available at university and research labs in the U.S., Europe, Japan, Taiwan and other locations. Scientists need to harness raw computing power to meet computational demands and to give researchers a single view of this dispersed data.
This latter goal, creating a centralized view of data that may be located in Europe, the U.S. or elsewhere, is the key research problem.
Centralizing the data virtually, or creating what is called a data grid, means extending the capability of existing databases, such as Oracle 10g and MySQL, to scale to these extraordinary data volumes. And it requires new tools for coordinating data requests across the grid in order to synchronize multiple, disparate databases.
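The idea of a "single view" over databases scattered across sites can be sketched in a few lines: a replica catalog records which sites hold a given logical dataset, and a query fans out to each site's local database before merging the results. The sketch below uses in-memory SQLite databases as stand-ins for the distributed site databases; the site names, schema, and catalog structure are invented for illustration, and the real LHC grid middleware is far more elaborate.

```python
# Minimal sketch of a data-grid query, assuming a simple replica catalog.
# In-memory SQLite databases stand in for each site's local database
# (the real sites run systems such as Oracle or MySQL, not shown here).
import sqlite3

def make_site(events):
    """Create one site's local database with a toy 'events' table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE events (run INTEGER, energy REAL)")
    db.executemany("INSERT INTO events VALUES (?, ?)", events)
    return db

# Hypothetical sites, each holding a partition of one logical dataset.
sites = {
    "cern":       make_site([(1, 7.0), (2, 6.8)]),
    "brookhaven": make_site([(3, 7.1)]),
    "fermilab":   make_site([(4, 6.9), (5, 7.2)]),
}

# Replica catalog: which sites hold which logical dataset.
catalog = {"run-data": ["cern", "brookhaven", "fermilab"]}

def grid_query(dataset, sql):
    """Run the same SQL at every site holding the dataset; merge rows."""
    rows = []
    for site in catalog[dataset]:
        rows.extend(sites[site].execute(sql).fetchall())
    return rows

results = grid_query("run-data",
                     "SELECT run, energy FROM events WHERE energy > 6.9")
print(sorted(results))  # [(1, 7.0), (3, 7.1), (5, 7.2)]
```

The hard problems the article alludes to, keeping replicas synchronized, scheduling queries near the data, and surviving site failures, all live inside what this sketch hand-waves as a simple loop over sites.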
Researchers believe that improving the ability of a grid to handle petabyte-scale data, split up among multiple sites, will benefit not only the scientific community but also mainstream commercial enterprises. They expect that corporations, especially those involved in fields such as the life sciences, will one day need a similar ability to harness computing resources globally as their data requirements grow.
"If this works, it will spawn companies that will just set up clusters to provide grid computing to other people," says Steve Lloyd, who chairs the GridPP Collaboration Board, based at the Rutherford Appleton Laboratory in Oxfordshire, England. GridPP is working with the international team to develop the grid the LHC will use.
CERN, the European laboratory for particle physics, is leading the LHC and its grid effort. From CERN's facility near Geneva, the data produced by the particle accelerator will be distributed to nine other major computing centers, including the Brookhaven National Laboratory and the Fermi National Accelerator Laboratory in the U.S., says Fabio Hernandez, grid technical leader at one of the major project sites in France.