Petaflop Imperative
Breaking the petaflop barrier has all the symbolism of the four-minute mile. But businesses want progress on a more important measurement: time to insight.
Still unsettled is whether the computer industry can develop new designs that overcome performance limits of off-the-shelf supercomputers, while pleasing big-science government users and the time-constrained private sector at the same time. David Shaw, chairman of the D.E. Shaw Group, an investment and technology-development company that applies computational techniques to financial trading, isn't positive it can. "If we have to develop novel architectures to achieve world leadership in supercomputing, is there enough commonality between national and commercial needs to support a common architecture for both?" he asks. "We don't fully know the answer." Shaw, a former computer-science professor and Clinton administration technology adviser, is involved in a supercomputing survey of big companies being fielded by the Council on Competitiveness. Regardless of whether science and business are aligned, Shaw says, both fields are reaching the point that "hooking commercial microprocessors together" isn't enough to solve all emerging problems in defense, intelligence, drug design, and materials science.
Include finance in the mix as well. One technologist on Wall Street says his company will probably manage 1 million networked computing devices within five years and is going to need high-performance computers that can grow to handle them. "Blue Gene is right on the money for us," he says. "We'd like to trade away from clock speed, power consumption, and cooling if we could. Heck, we'd take that in a heartbeat."
Super Size Me
The world's five fastest supercomputers, according to a new survey:
1. Earth Simulator: 35.9 teraflops, NEC, Earth Simulator Center, Japan
2. Thunder: 19.9 teraflops, California Digital, Lawrence Livermore National Lab, California
3. ASCI Q: 13.9 teraflops, Hewlett-Packard, Los Alamos National Lab, New Mexico
4. Blue Gene/L: 11.7 teraflops, IBM, Watson Research Center, New York
5. Tungsten: 9.8 teraflops, Dell, National Center for Supercomputing Applications, Illinois
Data: Top500.org
Not so fast, some experts say. The specialized architectures that IBM and Cray say will reach a petaflop first are so hard to develop software for that there likely won't be many companies needing that much juice. "If they could get certain kinds of database apps to run on [Blue Gene], that would be interesting to CIOs," says Larry Smarr, director of the California Institute for Telecommunications and Information Technology and founder of the National Science Foundation's supercomputing program. "The problem with specialized architectures is it takes too long" to develop software.
Merck & Co. is a big user of supercomputers from IBM, Sun, and others in its research division, which tests new molecular compounds for their efficacy as potential drugs. Company scientists have their eyes on Blue Gene, but historic milestones don't mean much to them. "The petaflop in and of itself isn't enough to gain our enthusiasm," says Irene Qualters, VP for research information systems. "It has to be a sustainable architecture we can invest in over the long haul." In many cases, traditional wet-lab instruments yield cheaper, better results than computer simulations, adds senior computing director Jeff Saltzman. "Blue Gene is a long-term investment with an uncertain probability of success," Saltzman says. "We're a moving target as far as our requirements. If we can't compute it, then we'll do experiments."
Yet it's not just biotech, aerospace, and finance companies that depend on superpowerful computers. Procter & Gamble Co., for example, used Silicon Graphics Inc.'s newest Altix 3000 supercomputer to design a new aroma-preserving container for Folgers coffee that costs about $7. Advocates of government support for the supercomputing effort contend it's that kind of tech-enabled innovation that will allow U.S. industry to compete with nations such as China, where wages are lower. "The worst thing we can do is think we can compete in advanced manufacturing with low wages," says Council on Competitiveness president Wince-Smith, a former technology policy adviser in the Reagan and first Bush administrations. "We'll compete in rapid prototyping using high-end computing."
Since Japan's Earth Simulator supercomputer shocked Washington two years ago, there's been a sense that the United States could lose its lead in other scientific disciplines, just as it did in climate science. The National Science Foundation last month reported that U.S. dominance in critical scientific fields is slipping, as measured in the number of patents awarded and papers published. The percentage of American Nobel Prize winners has fallen during the 2000s amid competition from Europe and Japan. And fewer American students are training to become scientists and engineers. Meanwhile, Japan, Taiwan, and South Korea have seen rapid growth in the number of patents awarded over the past 20 years. Europe is poised to take the lead in particle physics, with the world's largest supercollider in Switzerland scheduled to open in 2007. Spain is planning to build the second-most-powerful computer for general scientific use.
There's a business threat inherent in that shift. The Earth Simulator Center in Japan is reportedly negotiating deals with Japanese automakers to use time on the world's fastest computer to boost their quality and productivity. And NEC Corp. could disclose an even faster system next year. "We in the United States need to create new computer architectures that can boost computing power by many times over our current machines--and everybody else's," Energy Secretary Spencer Abraham said at a press conference last month.
In April, people from DuPont, BellSouth, GM, IBM, Lockheed Martin, Merck, and Morgan Stanley held the first meeting of the Council on Competitiveness' high-performance computing committee. Their goal: figure out how to square the interests of government scientists and policy makers with those of private-sector computer users. One area of work is forming new public-private partnerships so companies can access experimental computing architectures years before they become affordable enough for budget-conscious businesses.
Boeing director Budworth, who's active in the project, says the Dreamliner will be Boeing's first airplane whose assembly will be modeled more or less end to end on a supercomputer. Software that can plot the location of every part and tool on the factory floor is becoming so sophisticated that a petaflop computer may soon be necessary to run it.
That kind of talk can only please Donofrio, who admits the Blue Gene project carries a lot of risk. "Blue Gene is an incredibly bold adventure for us," he says. "There are a lot of people who like this idea of deep computing. And we'd have to say Blue Gene is a pretty deep computer." His greatest fear? That there won't be users bold enough to follow IBM's lead and try to run one big problem on the whole thing. "The new horizon is 'What can you do with this thing?' Not 'What can a thousand of you do with it?'" he says. "That's innovation. That will light up the board."