Climate change research gets petascale supercomputer
1.5-petaflop IBM Yellowstone system runs 72,288 Intel Xeon cores
Computerworld - Scientists studying Earth system processes, including climate change, are now working with one of the largest supercomputers on the planet.
The National Center for Atmospheric Research (NCAR) has begun using a 1.5 petaflop IBM system, called Yellowstone, that is among the top 20 supercomputers in the world, at least until the global rankings are updated next month.
For NCAR researchers it is an enormous leap in compute capability -- roughly 20 times the peak performance of the center's existing 77-teraflop supercomputer. Yellowstone is a 1,500-teraflop system capable of 1.5 quadrillion floating-point calculations per second.
The NCAR-Wyoming Supercomputing Center in Cheyenne, where this system is housed, says that with Yellowstone, it now has "the world's most powerful supercomputer dedicated to geosciences."
Along with climate change, the supercomputer will be used for a range of geoscience research, including the study of severe weather, oceanography, air quality, geomagnetic storms, earthquakes and tsunamis, wildfires, subsurface water and energy resources.
The supercomputer gives researchers new capabilities. They can run more experiments with increased complexity and at a higher resolution, according to interviews with researchers.
Scientists will be able to use the supercomputer to model the regional impacts of climate change. A model with a 100 km (62-mile) grid is considered coarse because each grid cell covers a large area. The new system may allow grid spacing as fine as 10 km (6.2 miles), letting scientists examine climate impacts in far greater detail.
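Finer grids are expensive, which is why petascale hardware matters here. A rough sketch of the scaling (an illustration, not NCAR's actual model cost): refining horizontal spacing by a factor of 10 multiplies the number of grid columns by 100, and stability limits on the time step typically shrink it by the same factor, so total work grows roughly with the cube of the refinement.

```python
# Illustrative cost model (an assumption for this sketch, not NCAR's
# published figures): horizontal refinement in 2 dimensions plus a
# proportionally smaller time step gives cubic cost growth.

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Relative compute cost of refining horizontal grid spacing.

    Halving the spacing quadruples the number of grid columns
    (two horizontal dimensions) and, via the stability limit on
    the time step, also doubles the number of steps per simulated
    year -- so cost scales with the cube of the refinement factor.
    """
    refinement = coarse_km / fine_km
    return refinement ** 3

print(relative_cost(100, 10))  # 10x finer grid -> 1000.0x the work
```

Under this simplified assumption, the jump from a 100 km to a 10 km grid is on the order of a thousandfold more computation per simulated year, which is the kind of gap a move from teraflops to petaflops helps close.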
People "want to know what [climate change] is going to do to precipitation in Spain or in Kansas," said Rich Loft, the director of technology development at the center.
Loft said they plan to give 11 research projects first crack at the machine "to try to do some breakthrough science straight away and try to shake the machine."
"We want to see what happens when users beat on it instead of just doing acceptance testing," said Loft.
Yellowstone is running in a new $70 million data center. The value of the supercomputer contract was put at $25 million to $35 million. The system has 100 racks, with 72,288 compute cores from Intel Sandy Bridge processors.
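The quoted 1.5-petaflop figure is consistent with that core count. A back-of-the-envelope check, where the 2.6 GHz clock and 8 double-precision flops per cycle are assumptions typical of Sandy Bridge-generation Xeons rather than figures from the article:

```python
# Peak-performance sanity check. The core count is from the article;
# the clock speed and per-core throughput are assumed values for
# Sandy Bridge Xeons (AVX: 4-wide double-precision add + multiply).

cores = 72_288
clock_hz = 2.6e9          # assumed 2.6 GHz clock
flops_per_cycle = 8       # assumed AVX double-precision throughput

peak_flops = cores * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e15:.2f} petaflops")  # ~1.50 petaflops
```

Peak theoretical performance is what such headline numbers usually describe; sustained performance on real climate codes is typically a fraction of it.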
Scientists have been able to run some of their work on larger systems at other facilities, but they are competing for time with other sciences.
Among the scientists who will use the NCAR system is Marika Holland, whose research includes studying climate change in the polar regions. Earlier systems ran her models at "more of an approximation than we would like," she said.
As with precipitation in temperate regions, Holland said the higher resolutions enabled by the new system will let researchers look explicitly at the influence of storms on Arctic sea ice, as well as ice reductions along coasts and coastal erosion.
Arctic sea ice set a new record minimum this year: according to NASA last month, the ice extent fell 300,000 square miles below the previous satellite-era record of 1.61 million square miles, set in September 2007.
The loss of summer sea ice cover over the last several years "has been pretty extreme and more extreme than most of our climate models predict," said Holland.
The work accomplished by scientists through observation, theoretical studies and other scientific efforts builds knowledge that is incorporated into computer models, which then become better predictive tools, said Holland.
There is a lot of understanding and fundamental research that needs to go on, said Holland, "but we also need bigger computers."
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed. His e-mail address is email@example.com.