Obama sets $126M for next-gen supercomputing
'Exascale' arrives for first time in federal budget
Computerworld - WASHINGTON -- President Barack Obama has included funding in his 2012 budget proposal for development of the next generation of supercomputers, an exascale system.
The money is going to the U.S. Department of Energy, which has led in developing the world's fastest computers.
If Congress approves Obama's request, DOE will get $126 million for exascale development, with about $91 million for the DOE's Office of Science and $36 million for the National Nuclear Security Administration.
In seeking this funding, the Obama administration made a little history. A DOE spokesman said it marked the first time that the budget explicitly references "exascale." The DOE had budgeted just over $24 million in 2011 in the context of "extreme scale" computing.
An exascale system would be 1,000 times more powerful than a one-petaflop machine, and hundreds of times more powerful than the Tianhe-1A, the Chinese supercomputer that was recently ranked as the world's fastest.
The exascale funding is part of an overall DOE advanced-computing request of $465 million for next year, a 21% increase over the 2010 budget (a two-year comparison).
The White House isn't comparing spending to the 2011 budget because Congress, for now, is funding the government through Continuing Resolutions, which could change the budget amount for this year. The current funding resolution expires March 4.
In setting aside money for exascale computing, the White House is planning for a predictable future in high-performance computing. Roughly every 10 or 11 years, high-performance computing crosses a thousandfold performance barrier, thanks largely to improvements in chip performance.
In 1997, ASCI Red, a computer at DOE's Sandia National Laboratories, sustained 1.3 teraflops, or 1.3 trillion floating-point operations per second. In 2008, IBM's Roadrunner, at DOE's Los Alamos National Laboratory, was the first system to reach one petaflop, more than one thousand trillion (one quadrillion) operations per second.
An exaflop is a million trillion calculations per second, or a quintillion, and is a thousand times faster than a petaflop.
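The thousandfold jumps described above can be sketched with some back-of-the-envelope arithmetic. The milestone figures below are order-of-magnitude illustrations drawn from the article, not benchmark results:

```python
# Rough scale of the supercomputing milestones mentioned in the article.
# Figures are approximate, for illustration only.
MILESTONES = {
    "ASCI Red (1997)": 1.3e12,                # ~1.3 teraflops
    "Roadrunner (2008)": 1.0e15,              # ~1 petaflop
    "Exascale target (~2018-2020)": 1.0e18,   # 1 exaflop (a quintillion ops/sec)
}

for name, flops in MILESTONES.items():
    # Express each milestone in trillions of operations per second.
    print(f"{name}: {flops:.1e} FLOPS = {flops / 1e12:,.0f} trillion ops/sec")

# Each named step is roughly a 1,000x jump, arriving about once a decade.
ratio = MILESTONES["Exascale target (~2018-2020)"] / MILESTONES["Roadrunner (2008)"]
print(ratio)  # 1000.0
```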
An exascale system is expected in the 2018-2020 time frame, but its development is also contingent on software that can harness what may be 100 million cores.
Supercomputers are used for modeling and simulation, and the larger the systems, the higher the resolution. An exascale system, for instance, may be able to simulate the workings of an entire human cell as well as improve forecasting and understanding of climate change.
Also, the advances needed to build these systems, such as faster networking, may ultimately find their way into business-class servers.
The DOE has not yet said how the exascale funding will be used, but the supercomputing research community has active research efforts under way. In the interim, DOE is building 10-petaflop systems, such as the recently announced IBM system planned for Argonne National Laboratory.
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed. His e-mail address is firstname.lastname@example.org.