A particle collider pushes data grid developers into unprecedented territory.
Computerworld - In 2007, scientists will begin smashing protons and ions together in a massive, multinational experiment to understand what the universe looked like tiny fractions of a second after the Big Bang. The particle accelerator used in this test will release a vast flood of data on a scale unlike anything seen before, and for that scientists will need a computing grid of equally great capability.
The Large Hadron Collider (LHC), which is being built near Geneva, will be a circular structure 17 miles in circumference. It will produce data in the neighborhood of 1.5GB/sec., or as much as 10 petabytes of data annually, 1,000 times bigger than the Library of Congress' print collection. The data flows will likely begin in earnest in 2008.
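A quick back-of-the-envelope check puts those two figures in perspective. The 1.5GB/sec. rate and the 10-petabyte annual total come from the article; the duty-cycle estimate below is derived from them and is illustrative only, since the accelerator does not write data around the clock.

```python
# Back-of-the-envelope check of the LHC data rates quoted above.
# The 1.5 GB/s and ~10 PB/yr figures are from the article; the implied
# duty cycle is computed here for illustration.

GB = 10**9          # bytes in a gigabyte (decimal units)
PB = 10**15         # bytes in a petabyte (decimal units)
SECONDS_PER_YEAR = 365 * 24 * 3600

rate_bytes_per_sec = 1.5 * GB
continuous_pb_per_year = rate_bytes_per_sec * SECONDS_PER_YEAR / PB
print(f"Continuous 1.5 GB/s for a year: {continuous_pb_per_year:.1f} PB")

# The quoted ~10 PB/yr is well below that ceiling, so the machine
# cannot be recording continuously; the implied duty cycle is:
duty_cycle = 10 / continuous_pb_per_year
print(f"Implied duty cycle: {duty_cycle:.0%}")
```

Running at full rate year-round would yield roughly 47 petabytes, so the quoted 10 petabytes corresponds to the accelerator recording data only a fraction of the time.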
As part of this effort, which is costing about 5 billion euros ($6.3 billion U.S.), scientists are building a grid using 100,000 CPUs, mostly PCs and workstations, available at university and research labs in the U.S., Europe, Japan, Taiwan and other locations. Scientists need to harness raw computing power to meet computational demands and to give researchers a single view of this dispersed data.
This latter goal, creating a centralized view of data that may be located in Europe, the U.S. or somewhere else, is the key research problem.
Centralizing the data virtually, or creating what is called a data grid, means extending the capability of existing databases, such as Oracle 10g and MySQL, to scale to these extraordinary data volumes. And it requires new tools for coordinating data requests across the grid in order to synchronize multiple, disparate databases.
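The core idea behind such a data grid can be sketched in a few lines: fan a single query out to several independent databases and merge the results, so that callers never need to know where the rows physically live. The site names and schema below are invented for illustration; the real LHC grid middleware is far more elaborate, with replica catalogs, job scheduling and cross-site synchronization.

```python
import sqlite3

def make_site(name, rows):
    """Stand-in for one site's local database (here an in-memory SQLite DB)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (run INTEGER, energy REAL)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    conn.commit()
    return name, conn

# Two hypothetical sites holding disjoint slices of the same logical dataset.
sites = [
    make_site("cern",       [(1, 7.1), (2, 6.8)]),
    make_site("brookhaven", [(3, 7.3)]),
]

def federated_query(sql, params=()):
    """Run the same query at every site and concatenate the results,
    presenting one logical view of physically dispersed data."""
    results = []
    for name, conn in sites:
        for row in conn.execute(sql, params):
            results.append((name, *row))
    return results

# One merged result set drawn from two physically separate databases.
print(federated_query("SELECT run, energy FROM events WHERE energy > ?", (7.0,)))
```

The hard problems the article alludes to, keeping replicas synchronized and coordinating requests at petabyte scale, begin exactly where this toy sketch stops.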
Researchers believe that improving the ability of a grid to handle petabyte-scale data, split up among multiple sites, will benefit not only the scientific community but also mainstream commercial enterprises. They expect that corporations, especially those involved in fields such as life sciences, will one day need a similar ability to harness computing resources globally as their data requirements grow.
"If this works, it will spawn companies that will just set up clusters to provide grid computing to other people," says Steve Lloyd, who chairs the GridPP Collaboration Board, based at the Rutherford Appleton Laboratory in Oxfordshire, England. GridPP is working with the international team to develop the grid the LHC will use.
CERN, the European laboratory for particle physics, is leading the LHC and its grid effort. From CERN's facility near Geneva, the data produced by the particle accelerator will be distributed to nine other major computing centers, including the Brookhaven National Laboratory and the Fermi National Accelerator Laboratory in the U.S., says Fabio Hernandez, grid technical leader at one of the major project sites in France.