U.S., EU, Russia set aside $13.6M for exascale software work
Look to upgrade open source model that can't produce next generation on its own
Computerworld - A coalition of countries, including the United States, has agreed to fund projects set up to develop software for the next generation of supercomputers, which are expected to arrive in 2019 and be 1,000 times more powerful than the fastest machines today.
Most of the software that runs today's supercomputers was built through open source processes such as discussion lists and code repositories, an approach that has left gaps in development.
By agreeing to set aside funds for supercomputer software development projects, the U.S., Canada, France, Germany, Japan, Russia and the United Kingdom are heeding the arguments of some top researchers who believe that the open source development model alone cannot deal with all the issues posed by exascale technology, or even by the just-arrived petascale systems.
The G8 Research Councils in the nations backing this effort quietly began a program this month offering 10 million euros ($13.6 million U.S.) for projects that support exascale software development. Developers have until May to submit preliminary proposals for the funding.
Long before the movie Avatar, supercomputers were creating complex 3-D simulations of natural disasters, climate change and other events. Simulation and modeling "has become the third pillar of science," the G8 said in announcing the availability of the development funds. The G8 specifically singled out climate change, energy, water and the environment as key areas of study for the next generation of computing systems.
The challenge of developing software for these new systems "is really daunting," said Jack Dongarra, a professor of computer science at the University of Tennessee and a distinguished research staff member at Oak Ridge National Laboratory. Machines that have a quarter of a million compute cores today are expected, within the decade, to have as many as 100 million cores.
"We're interested at looking at what is needed in terms of standards, in terms of a real software stack for exascale, and we have to start planning now," said Dongarra.
These exascale systems, capable of a million trillion, or a quintillion, calculations per second, are an order of magnitude beyond what today's software can handle, said Dongarra. There is a lack of programming languages that can manage parallelism at exascale, he said. The software will also face fault-tolerance problems when handling component failures, and communications delays will be an issue as well.
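To see why fault tolerance looms so large, a rough back-of-envelope sketch helps; the per-core reliability figure below is an illustrative assumption, not a number from the article or from Dongarra. If each core can fail independently, the expected time between failures for the machine as a whole shrinks roughly in proportion to the core count.

```python
# Back-of-envelope: system-level mean time between failures (MTBF)
# shrinks as core counts grow. The per-core MTBF is an assumed,
# illustrative figure, not a measured value.

PER_CORE_MTBF_HOURS = 5_000_000  # assumption: one failure per core every ~570 years

for cores in (250_000, 1_000_000, 100_000_000):
    system_mtbf_hours = PER_CORE_MTBF_HOURS / cores
    print(f"{cores:>11,} cores -> system MTBF ~ {system_mtbf_hours:.2f} hours")
```

Even with that optimistic per-core assumption, a 100-million-core machine would see a failure every few minutes, which is why checkpointing and other fault-tolerance support would need to be built into the exascale software stack rather than bolted on by each application.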
A year ago, Dongarra and Pete Beckman, director of the Argonne Leadership Computing Facility, helped form the International Exascale Software Project to develop roadmaps and coordinate research for exascale systems.
The international agreement to fund software development projects comes as nations have been cutting back spending on HPC projects focused on climate and weather systems. Worldwide spending on high-performance computing for climate and weather projects fell to $353 million in 2009 from $392 million in 2008, according to market research firm IDC.
HPC spending on weather and climate projects is expected to increase to $470 million worldwide in 2013, said IDC.
Climate change is getting increasing government attention. On Feb. 8, the National Oceanic and Atmospheric Administration announced a reorganization and the creation of the NOAA Climate Service to focus on climate change issues.
While funding for high performance computing may be uncertain, the path of supercomputing development is not. Even though exascale architecture and technology are still a work in progress, advances in computing power have arrived at predictable points. The first petascale system, sustaining one thousand trillion (one quadrillion) floating-point operations per second, was produced by IBM in 2008.
The G8 forecast for the near future is: 10 petaflops by 2013, 100 petaflops by 2016 and one exaflop by 2019.
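For readers keeping the prefixes straight, the arithmetic behind those milestones is simple; the sketch below just restates the article's figures using the standard definitions (a petaflop is 10^15 operations per second, an exaflop is 10^18, or 1,000 times the first petascale system).

```python
# The scale of the G8 roadmap figures cited in the article.
PETAFLOP = 10**15   # one quadrillion floating-point operations per second
EXAFLOP = 10**18    # one quintillion flop/s, 1,000x a petaflop

roadmap = {2013: 10 * PETAFLOP, 2016: 100 * PETAFLOP, 2019: EXAFLOP}

for year, flops in roadmap.items():
    factor = flops / PETAFLOP
    print(f"{year}: {flops:.0e} flop/s ({factor:,.0f}x the first petascale system of 2008)")
```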
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, send e-mail to email@example.com or subscribe to Patrick's RSS feed .