Some say U.S. supercomputing needs a jump-start
They want to see a larger push by the federal government
Computerworld - WASHINGTON -- Legislation intended to strengthen supercomputing development in the U.S. is being endorsed by a Ford Motor Co. IT official who maintains that the government's emphasis on parallel processing in supercomputing is undercutting research and hurting the country's ability to compete.
The U.S. House of Representatives this month passed two supercomputing-related bills: HR 4218, the High-Performance Computing Revitalization Act of 2004, and HR 4516, which seeks about $200 million in funding for supercomputer development at the U.S. Department of Energy.
The legislation aims to bring a coordinated approach to federal supercomputing development and require U.S. agencies to make supercomputers available to researchers.
Vincent Scarafino, manager of numerically intensive computing in Ford's supercomputing program, has testified in Congress on the need for a larger federal role in supercomputing development.
In an interview, Scarafino said the U.S. has been losing its edge in supercomputing because of a shift, beginning in the mid-1990s, to parallel processing using relatively inexpensive commodity components instead of concentrating on developing new kinds of processors. That has led to reduced investment by the government, he said.
Not Pushing the Envelope
Parallel processing has cut prices and helped increase productivity. But there has been a trade-off, said Scarafino.
"We can do analysis now that is cheaper than it was five years ago, and that's great. But we're not pushing the envelope like we used to," he said. "Instead of learning how to do new things, we're learning how to do old things cheaper." Scarafino compared it to eating one's seed corn.
Parallel processing is also labor-intensive, requiring the expertise of computer scientists to restructure code so that pieces of a problem can be solved simultaneously. In contrast, classic supercomputers that rely on very fast, specially designed vector processors "could be programmed in Fortran," Scarafino said. "They could be programmed in a language that mere mortals . . . could program in."
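The contrast Scarafino draws can be sketched in miniature. The toy Python/NumPy example below (mine, not from the article; the function names are invented) computes the same a*x + y update two ways: once as a single whole-array expression, roughly the way a vector compiler handled a plain Fortran loop, and once with explicit problem decomposition across workers, the extra bookkeeping that parallel machines demand of the programmer.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# "Vector style": one whole-array expression. A vector compiler turned a
# plain Fortran loop into hardware vector instructions; here NumPy plays
# that role, and the programmer states only the math.
def saxpy_vector(a, x, y):
    return a * x + y  # y := a*x + y in a single line

# "Parallel style": the programmer must decompose the problem into
# independent chunks, dispatch them to workers, and reassemble the
# results -- the added labor Scarafino describes.
def saxpy_parallel(a, x, y, nworkers=4):
    index_chunks = np.array_split(np.arange(len(x)), nworkers)
    with ThreadPoolExecutor(max_workers=nworkers) as pool:
        parts = pool.map(lambda idx: [a * x[i] + y[i] for i in idx],
                         index_chunks)
    return np.concatenate([np.asarray(p, dtype=float) for p in parts])
```

Both functions return identical results; the difference is entirely in how much scaffolding the second one requires for the same arithmetic.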
Kevin Wohlever, director of operations of the Ohio Supercomputer Center's Springfield facility, agreed that the push toward parallel processing in the U.S. has been a hindrance.
"If we keep trying to make all codes fit into the cluster environments, we're losing the opportunity to make the codes that run best in the vector environment," said Wohlever. He said government-backed supercomputer development efforts in Japan and Europe have improved weather forecasting there. Japan has the world's fastest supercomputer.
The Washington-based Computing Research Association, which represents academic and business research groups, praised the legislative effort but noted that next year's proposed federal budget for IT research is 0.7% below this year's allocation.
Rep. Judy Biggert (R-Ill.), one of the bills' authors, said she hopes the measures will get federal agencies "to really jump-start the next generation of high-end computers." Biggert's legislation has White House support.