Will LHC compute grid think deeply and then say, "42"?

In Thursday's IT Blogwatch, we're wowed by the Large Hadron Collider's enormous compute grid. Not to mention Kevin Kelly's Styrobot...

Sharon Gaudin reports:

With the world's biggest physics experiment ... scientists from around the world are hoping to find answers to a question that has haunted mankind for centuries: How was the universe created? The Large Hadron Collider (LHC) ... under construction for 20 years ... [is] a 17-mile, vacuum-sealed loop at a facility that sits astride the Franco-Swiss border ... buried [150 ft] to [450 ft] below the ground.

...

And a worldwide grid of servers and desktops will help the scientific team make sense of the information that they expect will come pouring in ... The computer infrastructure is critical to the work being done in the particle collider.

...

The U.S. portion of the global grid is a computational and data storage infrastructure made up of more than 25,000 computers and 43,000 CPUs. The mostly Linux-based machines are linked into the grid from universities, the U.S. Department of Energy, the National Science Foundation and software development groups. [The Open Science Grid's Ruth] Pordes also said the U.S. grid offers up about 300,000 compute hours a day, with 70% of it going to the particle collider project.
more
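
Quick back-of-the-envelope check on those numbers, by the way. Here's a rough Python sketch using only the figures quoted above; the utilization estimate is our own inference, not Pordes's:

    # Sanity-check the quoted U.S. grid figures.
    cpus = 43_000                 # CPUs on the U.S. grid
    hours_per_day = 300_000       # compute hours delivered per day
    lhc_share = 0.70              # fraction going to the LHC

    # 300,000 CPU-hours/day equals this many CPUs running flat out:
    busy = hours_per_day / 24
    print(f"Equivalent fully-busy CPUs: {busy:,.0f}")                  # ~12,500
    print(f"Implied utilization: {busy / cpus:.0%}")                   # ~29%
    print(f"LHC compute hours/day: {hours_per_day * lhc_share:,.0f}")  # 210,000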

Barbara Krasnoff adds:

The tech story of the year -- perhaps the decade -- was nearly lost in all the recent cacophony about Chrome, iPods, and iPhone updates. What am I talking about? The first test runs of the Large Hadron Collider, of course.

...

The Large Hadron Collider includes a facility the size of a small town ... [and] includes a computational and data storage infrastructure made up of tens of thousands of computers around the world. It was built to help us (or, at least, those of us who understand particle physics) understand conditions in the universe just moments after its conception. In other words -- how things work ... (and with apologies to Douglas Adams): life, the universe, and everything.
more

John Naughton looks at CERN's stats:

No black holes — but a data tsunami ... "The Large Hadron Collider will produce roughly 15 petabytes (15 million gigabytes) of data annually ... CERN is collaborating with institutions in 33 different countries to operate a distributed computing and data storage infrastructure: the LHC Computing Grid (LCG). Data from the LHC experiments is distributed around the globe ... After initial processing, this data is distributed to eleven large computer centres – in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei, the UK, and two sites in the USA – with sufficient storage capacity for a large fraction of the data, and with round-the-clock support for the computing grid."

...

Hopefully, all of this is not orchestrated by Windows servers.
more
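
To put that tsunami in perspective: here's what 15 petabytes a year works out to as a sustained rate (a rough sketch built only from the figures quoted above):

    # Convert CERN's quoted 15 PB/year into a sustained data rate.
    PB = 10**15                        # bytes per petabyte (decimal)
    annual_bytes = 15 * PB
    seconds_per_year = 365 * 24 * 3600

    avg_rate = annual_bytes / seconds_per_year
    print(f"Average rate: {avg_rate / 10**6:.0f} MB/s")    # ~476 MB/s, nonstop
    # Shared across the eleven large centres, that's still ~43 MB/s
    # each, around the clock:
    print(f"Per centre: {avg_rate / 11 / 10**6:.0f} MB/s")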

Dr. Douglas Eadline assuages those fears:

The LHC data processing effort is without a doubt a worldwide computer. Using grid, storage, and cluster technology, a worldwide computer of the largest scale will jump to life when collisions at the smallest scale take place. There is a certain kind of irony in that kind of experiment. But that is not all. To help build the LHC, the LHC@home project was developed.

...

Much of this huge endeavor is based on GNU/Linux, Globus, Condor, and a slew of other middleware packages. That “open thing” again: it just seems to make those world-changing, monumental scientific projects work a little better.
more

Laurianne McLaughlin has more:

As Pierre Vande Vyvre, a project leader for data acquisition at CERN, told us, he had to design a storage system for one of the four experiments: ALICE (A Large Ion Collider Experiment). It's one of the biggest physics experiments of our time, boasting a team of more than 1,000 scientists from around the world.

For one month per year, the LHC will be spitting out project data to the ALICE team at a rate of 1GB per second. That's 1GB per second, for a full month, "day and night," Vande Vyvre says.

For this month, that data rate is an entire order of magnitude larger than the rate from each of the other three experiments being done with the LHC. In total, the four experiments will generate petabytes of data.
more
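
For the record, that adds up fast. A quick sketch with Vande Vyvre's numbers; the per-experiment rate for the other three is just our reading of "an entire order of magnitude":

    # Total up "1GB per second, day and night, for a month" from ALICE.
    rate = 10**9                  # bytes per second
    month = 30 * 24 * 3600        # seconds in a 30-day run

    total = rate * month
    print(f"ALICE data for the month: {total / 10**15:.1f} PB")       # ~2.6 PB
    # "An order of magnitude larger" implies the other three experiments
    # each run at roughly a tenth of that rate:
    print(f"Other experiments, each: ~{rate / 10 / 10**6:.0f} MB/s")  # ~100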

Meanwhile, areReady deals with the end-of-the-world question:

Our current understanding is that black holes DO dissipate, through Hawking radiation. Tiny black holes fade away almost instantaneously ... [and] tiny black holes are formed all the time. When interstellar dust hits the atmosphere, the resulting energy discharge can form tiny black holes, and fairly often does. Most of them dissipate harmlessly.

Wait, there's more! Some black holes that DO form when those particles hit the atmosphere survive. Know what happens to them? Well, first consider how small a chunk of matter has to be to qualify as a black hole when it's composed of the equivalent of a few protons. We are talking sub-electron size here. These black holes sink to the center of the Earth, but are so small they don't interact with any atoms on the way down. They sit at the center of the Earth, absorbing a new particle every few thousand years.

Events with the power of the LHC happen all the time at the edges of the atmosphere, and if they really had a reasonable capacity to cause a catastrophic event, it would have happened naturally many times over already.

That said, the night before collisions start, I'm having an End of the Universe party.
more
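
The "almost instantaneously" part checks out, too. Here's what you get plugging "a few protons" into the textbook Schwarzschild-radius and Hawking-evaporation formulas (a rough sketch; three proton masses is our arbitrary stand-in for "a few"):

    import math

    G    = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
    c    = 2.998e8       # speed of light, m/s
    hbar = 1.055e-34     # reduced Planck constant, J s
    m_p  = 1.673e-27     # proton mass, kg

    M = 3 * m_p          # "the equivalent of a few protons"

    # Schwarzschild radius: r_s = 2GM / c^2
    r_s = 2 * G * M / c**2
    print(f"Radius: {r_s:.1e} m")          # ~7e-54 m: sub-electron indeed

    # Hawking evaporation time: t = 5120*pi*G^2*M^3 / (hbar*c^4)
    t = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)
    print(f"Evaporation time: {t:.1e} s")  # ~1e-95 s: gone instantly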

And finally...

Kevin Kelly's Styrobot

Like this stuff? Subscribe to the RSS feed.

Richi Jennings is an independent analyst/adviser/consultant, specializing in blogging, email, and spam. A 22-year cross-functional IT veteran, he is also an analyst at Ferris Research. You can follow him on Twitter, pretend to be Richi's friend on Facebook, or just use boring old email: blogwatch@richi.co.uk.



Copyright © 2008 IDG Communications, Inc.
