
Some say U.S. supercomputing needs a jump-start

They want to see a larger push by the federal government

July 29, 2004 12:00 PM ET

Computerworld - WASHINGTON – Legislation intended to strengthen supercomputing development in the U.S. is being endorsed by a Ford Motor Co. IT official who maintains that the government's emphasis on parallel processing in supercomputing is undercutting research and hurting the country's ability to compete.

The U.S. House of Representatives this month passed two supercomputing-related bills: HR 4218, the High-Performance Computing Revitalization Act of 2004, and HR 4516, which seeks about $200 million in funding for supercomputer development at the U.S. Department of Energy.

The legislation aims to bring a coordinated approach to federal supercomputing development and to require U.S. agencies to make supercomputers available to researchers.

Vincent Scarafino, manager of numerically intensive computing in Ford's supercomputing program, has testified in Congress on the need for a larger federal role in supercomputing development.

In an interview, Scarafino said the U.S. has been losing its edge in supercomputing because of a shift, beginning in the mid-1990s, to parallel processing using relatively inexpensive commodity components instead of concentrating on developing new kinds of processors. That has led to reduced investment by the government, he said.

Not Pushing the Envelope

Parallel processing has cut prices and helped increase productivity. But there has been a trade-off, said Scarafino.

"We can do analysis now that is cheaper than it was five years ago, and that's great. But we're not pushing the envelope like we used to," he said. "Instead of learning how to do new things, we're learning how to do old things cheaper." Scarafino compared it to eating one's seed corn.

Parallel processing is also labor-intensive, requiring the expertise of computer scientists to decompose problems so the pieces can be solved simultaneously. In contrast, classic supercomputers that rely on very fast, specially designed vector processors "could be programmed in Fortran," Scarafino said. "They could be programmed in a language that mere mortals . . . could program in."
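The contrast Scarafino describes can be sketched in a small, hypothetical example (in Python rather than Fortran, purely for illustration; this is not code from Ford or any system mentioned in the article). On a vector machine, the programmer writes a plain loop and the compiler handles the rest; on a commodity parallel cluster, the programmer must split the work, distribute it, and reassemble the results by hand:

```python
# Illustrative sketch only: the same array computation written two ways.
# The plain loop is the style a vectorizing compiler could handle directly;
# the second version shows the manual decomposition parallel machines demand.
from multiprocessing import Pool


def scale(values, factor):
    """Plain loop -- the 'mere mortals' style a vector compiler speeds up."""
    return [v * factor for v in values]


def _scale_chunk(args):
    """Worker function: process one chunk of the data."""
    chunk, factor = args
    return [v * factor for v in chunk]


def scale_parallel(values, factor, workers=4):
    """Hand-decomposed version: split, farm out, reassemble.

    This bookkeeping (chunking, dispatch, reordering) stands in for the
    extra expertise Scarafino says parallel programming requires.
    """
    size = max(1, len(values) // workers)
    chunks = [values[i:i + size] for i in range(0, len(values), size)]
    with Pool(workers) as pool:
        results = pool.map(_scale_chunk, [(c, factor) for c in chunks])
    # Flatten the per-chunk results back into a single list.
    return [v for chunk in results for v in chunk]


if __name__ == "__main__":
    data = list(range(8))
    assert scale(data, 2.0) == scale_parallel(data, 2.0)
```

Both functions compute the same result; the point is only how much more machinery the parallel version needs for even a trivially divisible problem. Problems that do not divide this cleanly require far more effort still.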

Kevin Wohlever, director of operations of the Ohio Supercomputer Center's Springfield facility, agreed that the push toward parallel processing in the U.S. has been a hindrance.

"If we keep trying to make all codes fit into the cluster environments, we're losing the opportunity to make the codes that run best in the vector environment," said Wohlever. He said government-backed supercomputer development efforts in Japan and Europe have improved weather forecasting there. Japan has the world's largest supercomputer.

The Washington-based Computing Research Association, which represents academic and business research groups, praised the legislative effort but noted that next year's proposed federal budget for IT research is 0.7% below this year's allocation.

Rep. Judy Biggert (R-Ill.), one of the bills' authors, said she hopes the measures will get federal agencies "to really jump-start the next generation of high-end computers." Biggert's legislation has White House support.
