G8 nations set aside $13.6M for exascale software work
Funding aims to shore up an open source development model that can't produce next-generation software on its own
Computerworld - A coalition of countries, including the United States, has agreed to fund projects to develop software for the next generation of supercomputers, which are expected to arrive in 2019 and be 1,000 times more powerful than today's fastest machines.
Most of the software components that run today's supercomputers were built through open source processes such as discussion lists and code repositories, an approach that has left some development gaps.
By agreeing to set aside funds for supercomputer software development projects, the U.S., Canada, France, Germany, Japan, Russia and the United Kingdom are heeding the arguments of some top researchers who believe that the open source development model alone cannot deal with all the issues posed by exascale technology, or even by the just-arrived petascale systems.
The G8 Research Councils in the nations backing this effort this month quietly began a program offering 10 million euros ($13.6 million U.S.) for projects that support exascale software development. Developers have until May to submit preliminary proposals for the money.
Long before the movie Avatar, supercomputers were creating complex 3-D simulations of natural disasters, climate change and other events. Simulation and modeling "has become the third pillar of science," the G8 said in announcing the availability of the development funds. The G8 specifically singled out climate change, energy, water and the environment as key areas of study for the next generation of computing systems.
The challenge of developing software for these new systems "is really daunting," said Jack Dongarra, a professor of computer science at the University of Tennessee and a distinguished research staff member at Oak Ridge National Laboratory. Machines that have a quarter of a million compute cores today are expected, within the decade, to have as many as 100 million cores.
"We're interested in looking at what is needed in terms of standards, in terms of a real software stack for exascale, and we have to start planning now," said Dongarra.
These exascale systems, capable of a million trillion, or a quintillion, calculations per second, are orders of magnitude beyond what today's software can handle, said Dongarra. There is a lack of programming languages that can deal with parallelism on an exascale level, he said. The software will also face fault-tolerance problems when handling component failures, Dongarra said, and communications delays will be an issue as well.
A year ago, Dongarra and Pete Beckman, director of the Argonne Leadership Computing Facility, helped form the International Exascale Software Project to develop roadmaps and coordinate research for exascale systems.
The international agreement to spend on software development projects comes at the same time that nations have been cutting back spending on HPC projects focusing on climate and weather systems. In 2009, worldwide spending on high performance computing climate and weather projects was $353 million versus $392 million in 2008, according to market research firm IDC.
HPC spending on weather and climate projects is expected to increase to $470 million worldwide in 2013, said IDC.
Climate change is getting more government attention. On Feb. 8, the National Oceanic and Atmospheric Administration announced a reorganization and the creation of the NOAA Climate Service to focus on climate change issues.
While the funding for high performance computing may be uncertain, the path of supercomputing development is not. Even though the architecture and technology of exascale is still a work in progress, advances in computing power have occurred at predictable points. The first petascale system, running at one thousand trillion (one quadrillion) sustained floating-point operations per second, was produced in 2008 by IBM.
The G8 forecast for the near future is: 10 petaflops by 2013, 100 petaflops by 2016 and one exaflop by 2019.
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, send e-mail to email@example.com or subscribe to Patrick's RSS feed .