Scientists studying Earth system processes, including climate change, are now working with one of the largest supercomputers on the planet.
The National Center for Atmospheric Research (NCAR) has begun using a 1.5 petaflop IBM system, called Yellowstone, that is among the top 20 supercomputers in the world, at least until the global rankings are updated next month.
For NCAR researchers it is an enormous leap in compute capability: roughly a 20-fold improvement over the center's existing 77-teraflop supercomputer. Yellowstone is a 1,500-teraflop system capable of 1.5 quadrillion calculations per second.
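The arithmetic behind that comparison is straightforward; a minimal sketch using the figures quoted above:

```python
# Compare Yellowstone's peak speed with NCAR's previous system.
# Figures from the article: 1.5 petaflops vs. 77 teraflops.
old_flops = 77e12    # 77 teraflops = 77 trillion calculations per second
new_flops = 1.5e15   # 1.5 petaflops = 1.5 quadrillion calculations per second

speedup = new_flops / old_flops
print(f"Speedup: {speedup:.1f}x")  # prints "Speedup: 19.5x"
```

These are peak (theoretical) rates; sustained performance on real geoscience codes is typically a fraction of peak.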
The NCAR-Wyoming Supercomputing Center in Cheyenne, where this system is housed, says that with Yellowstone, it now has "the world's most powerful supercomputer dedicated to geosciences."
Along with climate change, this supercomputer will be used on a number of geoscience research issues, including the study of severe weather, oceanography, air quality, geomagnetic storms, earthquakes and tsunamis, wildfires, subsurface water and energy resources.
The supercomputer gives scientists new capabilities: they can run more experiments, at greater complexity and at higher resolution, according to interviews with researchers.
Scientists will be able to use the supercomputer to model the regional impacts of climate change. A model with 100 km (62 mile) grid spacing is considered coarse because each grid cell covers a large area. The new system may make it possible to refine the grid to as fine as 10 km (6.2 miles), giving scientists the ability to examine climate impacts in much greater detail.
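Finer grids are dramatically more expensive because the cell count grows with the square of the refinement factor. A rough back-of-the-envelope sketch (the ~510 million km² Earth-surface figure is a standard value, not from the article, and real climate models also shrink their time step as the grid is refined, so total cost grows even faster than the cell count):

```python
# Approximate number of horizontal grid cells needed to tile
# Earth's surface at a given grid spacing.
EARTH_SURFACE_KM2 = 510e6  # Earth's surface area, ~510 million km^2

def horizontal_cells(spacing_km: float) -> float:
    """Approximate count of square cells with side length spacing_km."""
    return EARTH_SURFACE_KM2 / spacing_km ** 2

coarse = horizontal_cells(100)  # 100 km grid: ~51,000 cells
fine = horizontal_cells(10)     # 10 km grid: ~5.1 million cells
print(f"100 km grid: {coarse:,.0f} cells")
print(f"10 km grid:  {fine:,.0f} cells ({fine / coarse:.0f}x more)")
```

Going from 100 km to 10 km spacing multiplies the horizontal cell count by 100, which is why a jump in machine capability translates directly into finer, more regionally useful models.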
People "want to know what [climate change] is going to do to precipitation in Spain or in Kansas," said Rich Loft, the director of technology development at the center.
Loft said they plan to give 11 research projects first crack at the machine "to try to do some breakthrough science straight away and try to shake the machine."
"We want to see what happens when users beat on it instead of just doing acceptance testing," said Loft.
Yellowstone is running in a new $70 million data center. The value of the supercomputer contract was put at $25 million to $35 million. It has 100 racks, with 72,288 compute cores from Intel Sandy Bridge processors.
Scientists have been able to run some of their work on larger systems at other facilities, but there they must compete for time with researchers from other fields.
Among the scientists who will be using the NCAR system is Marika Holland, whose research includes studying climate change in the polar regions. Earlier systems run models that are "more of an approximation than we would like," she said.
As with understanding precipitation in temperate regions, Holland said, the higher resolutions enabled by the new system will let researchers look explicitly at the influence of storms on Arctic sea ice, as well as at ice reductions along the coast and coastal erosion.
Arctic sea ice set a new record minimum this year, about 300,000 square miles below the previous satellite-era low of 1.61 million square miles recorded in September 2007, according to NASA figures released last month.
The loss of summer sea ice cover over the last several years "has been pretty extreme and more extreme than most of our climate models predict," said Holland.
The work accomplished by scientists through observation, theoretical studies and other scientific efforts builds knowledge that is incorporated into computer models, which then become better predictive tools, said Holland.
There is a lot of understanding and fundamental research that needs to go on, said Holland, "but we also need bigger computers."
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed. His e-mail address is firstname.lastname@example.org.