Collider probing mysteries of the universe at the speed of light

Worldwide computer grid helps scientists make sense of data coming from collider experiments

With the world's biggest physics experiment ready to fire up tomorrow, scientists from around the world are hoping to find answers to a question that has haunted mankind for centuries: How was the universe created?

The Large Hadron Collider (LHC), which has been under construction for 20 years, will shoot its first beam of protons around a 17-mile, vacuum-sealed loop at a facility that sits astride the Franco-Swiss border. The test run of what is the largest, most powerful particle accelerator in the world is a precursor to the coming experiments in which scientists will accelerate two particle beams toward each other at 99.9% of the speed of light.

Smashing the beams together will create showers of new particles that should re-create conditions in the universe just moments after its birth.

Tomorrow's test run is a critical milestone in getting to that ultimate undertaking. And a worldwide grid of servers and desktops will help the scientific team make sense of the information that they expect will come pouring in.

"This will move the limit of our understanding of the universe," said Ruth Pordes, executive director of the Open Science Grid, which was created in 2005 to support the LHC project. "I'm very excited about the turning on of the accelerator. Over the next two years, our grids will be used by thousands of physicists at LHC to make new scientific discoveries. That's what it's all for."

Pordes noted that the U.S. portion of the global grid is a computational and data storage infrastructure made up of more than 25,000 computers and 43,000 CPUs. The mostly Linux-based machines are linked into the grid from universities, the U.S. Department of Energy, the National Science Foundation and software development groups. Pordes also said the U.S. grid provides about 300,000 compute hours a day, with 70% of that capacity going to the particle collider project.
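The grid figures quoted above imply a straightforward split of daily capacity. A minimal sketch of that arithmetic, using only the round numbers reported in the article (not independently verified beam-time accounting):

```python
# Back-of-the-envelope check of the U.S. Open Science Grid figures
# cited above: ~300,000 compute hours offered per day, with 70%
# allocated to the LHC project, spread over ~25,000 machines.
total_hours_per_day = 300_000   # compute hours offered per day (as reported)
lhc_share = 0.70                # fraction going to the collider project
machines = 25_000               # computers in the U.S. portion of the grid

lhc_hours_per_day = total_hours_per_day * lhc_share
hours_per_machine = total_hours_per_day / machines

print(f"LHC share: {lhc_hours_per_day:,.0f} compute hours/day")
print(f"~{hours_per_machine:.0f} compute hours/day per machine on average")
```

That works out to roughly 210,000 compute hours a day for the LHC, or an average of about 12 hours of work per machine per day across the U.S. grid.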

Harvey Newman, a physics professor at the California Institute of Technology, told Computerworld that there are about 30,000 servers and more than 100,000 cores around the world hooked into grids that support the LHC project.

"The distributed computing model is essential to doing the computing, storage and hosting of the many petabytes of data from the experiments," said Newman. "Coordinating data distribution, processing and analysis of the data collaboratively by a worldwide community of scientists working on the LHC are key to the physics discoveries. Only a worldwide effort could provide the resources needed."

The computer infrastructure is critical to the work being done in the particle collider, which sits in a tunnel buried 50 meters to 150 meters below the ground. The tunnel, or tube, is designed to facilitate and control a head-on collision between two beams of the same kind of particles -- either protons or ions. Traveling through a vacuum comparable to outer space, the beams are guided around the tube by superconducting magnets.

According to documents from CERN, as the European Organization for Nuclear Research is known, each of the two beams will contain about 3,000 bunches of particles. Each bunch will hold as many as 100 billion particles. Despite these huge numbers, the particles are so tiny that the chance of any two colliding is quite small. However, since the beams will be traveling at near light speed around the 17-mile tube, they'll cross each other about 30 million times per second, resulting in an estimated 600 million collisions per second.
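Those two round numbers together imply how busy each bunch crossing is. A quick sketch of the division, assuming the article's figures rather than precise beam parameters:

```python
# Rough arithmetic behind the CERN collision figures cited above:
# ~30 million bunch crossings per second and ~600 million collisions
# per second imply about 20 proton-proton collisions per crossing.
crossings_per_second = 30e6     # bunch crossings per second (as reported)
collisions_per_second = 600e6   # estimated collisions per second (as reported)

collisions_per_crossing = collisions_per_second / crossings_per_second
print(f"~{collisions_per_crossing:.0f} collisions per bunch crossing")
```

In other words, on average about 20 particle collisions occur each time two bunches pass through each other.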

If a beam circulates around the tunnel for 10 hours, for instance, it will travel more than 10 billion kilometers -- roughly the distance to Neptune and back.
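The 10-billion-kilometer figure follows directly from treating the beam as moving at essentially the speed of light for 10 hours. A minimal sanity check:

```python
# Sanity check of the "more than 10 billion km in 10 hours" figure:
# distance = speed of light * time, since the beam moves at ~99.9% of c.
c_km_per_s = 299_792.458        # speed of light in km/s
hours = 10
seconds = hours * 3600

distance_km = c_km_per_s * seconds
print(f"{distance_km / 1e9:.2f} billion km in {hours} hours")
```

At full light speed the beam would cover about 10.8 billion kilometers; at 99.9% of that speed the total is just under 10.8 billion km, consistent with the article's "more than 10 billion kilometers."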

According to the Big Bang theory, which most scientists accept, more than 13 billion years ago an extraordinarily dense object, perhaps the size of a coin, expanded into the universe that we know now -- with planets, stars, black holes and life.

Bolek Wyslouch, a professor of physics at MIT who has been working on the collider project for the last seven years, said that a main goal of the experiments is to find the elusive Higgs particle, which is believed to be responsible for giving other particles their mass. Though its existence hasn't been proven yet, it's believed that the Higgs particle is what gives electrons their mass, for instance.

Scientists are also hoping the particle collider will give them information about dark energy and dark matter.

"This is part of the quest to explore our surroundings. It's part of the quest to understand our world and ourselves," said Wyslouch. "We are trying to describe the basic elements of the matter surrounding us -- to understand the basic infrastructure of how things work. The knowledge of this microscopic world can be translated into knowledge of the whole universe -- how it was formed, where all the matter is coming from."

As the time for tomorrow's experiment has neared, rumors have increasingly circulated around the Internet that the experiments might destroy the universe by accidentally creating a black hole that would suck everything and everyone into it.

CERN released a report late last week saying that safety fears about the LHC are "unfounded." CERN Director General Robert Aymar was quoted as saying that any suggestion that there's a risk is "pure fiction."
