High-performance computing (HPC) is emerging as a critical business need for companies that use simulation and virtualization to test and design products, because such companies can potentially cut product development time and improve speed to market. "Our innovation process is insatiable and perpetual," said Thomas Lange, director of corporate research and development, modeling, and simulation at The Procter & Gamble Co., "and our need for speed is huge."
But there's a growing gap in hardware and software capabilities in high-performance systems. While hardware vendors can build systems with hundreds or thousands of processors, many HPC applications produced by independent software vendors typically use just 12 or 16 processors in parallel, according to users and a newly released study by market research company IDC.
"Hardware is getting there," said Lange, but "software is way behind."
That was one of the conclusions in the IDC study on HPC applications. The study, sponsored by the Defense Advanced Research Projects Agency and the Council on Competitiveness, was released yesterday in conjunction with the council's High Performance Users Conference here. Lange and other HPC users, vendors and researchers attended the one-day event.
The study found that most HPC vendors focus on the technical system market, which relies on PCs, workstations and smaller servers, because that's where the demand and revenue are, said Earl Joseph, an analyst at IDC. The number of users looking to scale their systems across hundreds or thousands of processors isn't large enough to justify the cost of rewriting and testing applications on large systems, he said.
The council believes HPC is critical for US companies to be competitive. For instance, in Procter & Gamble's case, even something as simple as removing a bottle cap can require millions of calculations to recreate and test in a computer-generated environment. But because of the limits of software, physical testing is often still needed, said Lange.
"Full virtualization is impossible," he said.
Conference attendees generally agreed that there's great business value in being able to conduct product development in a virtual environment, and some industries -- including life sciences, transportation, and oil and gas -- develop code that can scale across many processors. But broader adoption of HPC is needed, some argued at the conference.
"Innovation will be the single most important factor in determining America's competitiveness," council President Deborah Wince-Smith said at the conference. HPC "is a fundamental platform for business advantage," she said.
Donald Paul, vice president and chief technology officer at Chevron, said the main role for government should be research, and "the key role for industry is to connect into that research."
Another attendee, Loren Miller, director of IT research, development and engineering at The Goodyear Tire & Rubber Co., said that packaged HPC applications used to simulate tire-manufacturing processes can scale up to only 32 processors. Miller called that limiting and said he hopes vendors will begin to adapt applications to run on more processors.
"All it takes is for one of them to get it out there, and I think we will see a lot of adoption in parallel computing," said Miller.
Some vendors are already doing that. Paul Bemis, vice president of product marketing at Fluent, a Lebanon-based maker of fluid-dynamics software, said his software can scale up to 1,000 processors. But Bemis said he believes that wider adoption of HPC will depend on its accessibility to smaller businesses.
Fluent started offering software as a service two years ago with access to a 32-processor system. But Bemis said he would like to move that service onto a grid that could scale up to hundreds of processors.
"I think there is tremendous opportunity with grid," said Bemis, noting that although hardware to support that model exists, there isn't the middleware software layer needed to support periodic HPC use of the grid.