HPC Software Shortfall Limits User Benefits

WASHINGTON -- High-performance computing is emerging as a critical IT need at many large companies that use simulation and virtualization to design and test their products. But there's a growing gap between the hardware and software capabilities in HPC systems.

Although hardware vendors can build systems with hundreds or even thousands of processors, the HPC applications developed by software vendors typically use only 12 or 16 processors in parallel, according to IT managers who attended a conference here last week. That view was echoed in a newly released IDC report.
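One common way to see why adding processors stops helping is Amdahl's law, which bounds speedup by the fraction of a program that actually runs in parallel. The sketch below is illustrative only (the figures are not from the report): if even 5% of an application's work is serial, a 16-processor run already captures most of the achievable gain, and a 1,000-processor system adds little.

```python
def speedup(p: float, n: int) -> float:
    """Amdahl's law: ideal speedup on n processors when fraction p
    of the work is parallelizable and (1 - p) remains serial."""
    return 1.0 / ((1.0 - p) + p / n)

# With 95% of the work parallel, speedup can never exceed 1 / 0.05 = 20x,
# no matter how many processors are added.
for n in (16, 100, 1000):
    print(f"{n:>5} processors: {speedup(0.95, n):.1f}x")
```

Running this shows roughly 9x at 16 processors versus under 20x at 1,000, which is one reason vendors see limited payoff in rewriting applications for massive parallelism.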

"Hardware is getting there," said Thomas Lange, director of corporate research and development, modeling and simulation at The Procter & Gamble Co. "Software is way behind."

If companies such as Cincinnati-based P&G could test new products in fully computer-generated environments, they might be able to reduce development time and bring goods to market more quickly.

But, Lange said, "our need for speed is huge." In P&G's case, simulating even an action as seemingly simple as removing a bottle cap can involve millions of calculations. Because of the current HPC application limits, physical testing of products may still be necessary, Lange noted. "Full virtualization is impossible," he said.

The software shortfall was one finding cited in IDC's report on high-performance applications, which was sponsored by the Defense Advanced Research Projects Agency and the Council on Competitiveness, a Washington-based advocacy group. The report was released in conjunction with the High Performance Computing Users Conference, which was organized by the council.

Most software vendors focus on the technical systems market, which revolves around PCs, workstations and small servers, because that's where most of the demand and revenue is, said Earl Joseph, an analyst at Framingham, Mass.-based IDC. The number of users that want to scale systems across hundreds or thousands of processors isn't large enough to justify the cost of rewriting and testing applications, Joseph said.

Loren Miller, director of IT research, development and engineering at The Goodyear Tire & Rubber Co. in Akron, Ohio, said the packaged HPC applications that he has installed can't scale beyond a 32-processor system, which is used to simulate processes related to tire manufacturing. Miller called that limiting from a usage standpoint.

But he said he's hopeful that vendors will begin to adapt their applications to run on more processors. "I think all it takes is for one of them to get it out there, and we will see a lot of adoption in parallel computing," Miller said.

Some vendors already support large numbers of CPUs. Paul Bemis, vice president of product marketing at Fluent Inc., said the Lebanon, N.H.-based company's fluid dynamics software can scale up to 1,000 processors. But Bemis added that fostering wider adoption of high-performance computing will require making it more accessible to smaller companies.

Fluent began offering its software as an online service two years ago, providing users with access to a 32-processor system. Bemis said he would like to move that service to a computing grid that could scale up to hundreds of CPUs.

"I think there is tremendous opportunity with grid," he said. But, he noted, the middleware needed to support high-performance computing use of grids doesn't exist.

According to the IDC report, many software vendors said they would be willing to partner with government agencies and academic institutions to accelerate the development of HPC applications.

Donald Paul, chief technology officer at Chevron Corp. in San Ramon, Calif., said the key role for government is at the research end. "The key role for industry is to connect into that research," he said.

Copyright © 2005 IDG Communications, Inc.
