Seismic Shift
The nation's supercomputing centers have for years incubated innovation. Now their research agenda is being upended.
Critics of cyberinfrastructure add that the entire plan is underfunded, that it puts too much emphasis on experimental techniques such as grid computing instead of reliable access to big, centralized machines, and that it destroys the budgetary mechanism, in place since the '90s, that let the centers directly disburse research grants to interdisciplinary teams of computer engineers and scientists. That process helped move promising new ideas into production computing environments. "Many people in the community feel as if the National Science Foundation has gone off the deep end in its handling of the PACI centers as a computer-science experiment," says Bill McCurdy, a chemistry and engineering professor at the University of California, Davis, and a senior scientist at Lawrence Berkeley National Lab. McCurdy previously headed the Energy Department's flagship computing center at Berkeley and the Ohio Supercomputer Center. "It's definitely a sea change in the philosophy of how the centers and the NSF will function."
Shunted aside, critics say, is basic computer-science research in areas such as operating systems, architecture, and semiconductor materials--the work that would let supercomputers keep pace with the scientific community's exploding computational needs and growing libraries of collected data. As an example, the NSF last year ended its IT Research program, which funded projects such as the National Virtual Observatory, an astronomy database championed by Microsoft distinguished engineer Jim Gray, and reallocated its $40 million budget. "It's sort of viewed that the NSF is anti-computer," says one supercomputer industry veteran, who spoke on the condition of anonymity. "This [attitude] may be all over Washington," he says. Lawmakers "see companies like Microsoft, IBM, Intel, Hewlett-Packard, and Dell, add up all their revenue and all their R&D, and ask why Washington has to support R&D. They don't realize that most of [the vendors' spending] is development."
The NSF is pushing forward its new ideas even as its own budget is shrinking, from $5.65 billion last year to $5.47 billion this year, as the Bush administration redirects money into defense, intelligence, and homeland security. Next year, the NSF is slated to get an increase of 2.4%, back to $5.61 billion, barely keeping pace with inflation. "The budget environment overall for the NSF is terrible," says Chris Jehn, Cray Inc.'s senior VP of government programs.
The changes also are deeply entwined with a debate about how best to fund supercomputing research in the United States to stay competitive at a time when China, Japan, and other countries are making development of the machines a priority. Steve Wallach, a VP in the office of technology at Chiaro Networks Inc., a former member of the President's Information Technology Advisory Committee, which issued a key report on supercomputing in 1999, and a member of the National Research Council team that issued a report in November, says the studies show that "the U.S. lacks a consistent, long-term research strategy in supercomputing, and, without it, in five to 10 years we may be asking why we're behind other countries. The cybercenters are an integral part of this long-term research strategy."
Freeman points to his own report: an influential, 84-page document penned two years ago by a panel of computing experts chaired by Dan Atkins, an electrical engineering and computer science professor at the University of Michigan. It recommends sweeping changes in the way the United States manages supercomputing R&D. The NSF's computing directorate will pay for part of the cyberinfrastructure effort--$123 million this year and about $125 million in '06--while the foundation's scientific disciplines each chip in their own funds. "If you take a five- or 10-year point of view, clearly this is an activity that needs to be funded across all areas of the NSF," Freeman says.
Cyberinfrastructure's budget, though, appears hamstrung. "We've been hearing about cyberinfrastructure for years," Larry Smarr, a computer-science professor at the University of California, San Diego, founder of the NSF supercomputer centers program, and director of the NCSA from 1985 to 2000, said last year. "It was originally talked about as having a $1 billion budget," the amount recommended by the Atkins report. The NSF's allotment for cyberinfrastructure today is about half that amount: roughly $400 million this year, and $509 million in 2006. Because of budget woes, the Atkins report's aggressive recommendations for building the program are "basically on idle right now," Dunning says.
Despite those constraints, the centers are moving forward on the cyberinfrastructure plan as best they can. Dunning says he's interested in supplementing the center's traditional role of providing supercomputer time to U.S. scientists and engineers with development of software that would create "cyberenvironments" to help users move large amounts of data long distances, tap into linked public databases for fields such as gene research, and submit jobs to the NSF centers using common tools (see story, Change: NCSA Plans For Future).
Within research circles, however, the controversy seems sure to continue. For example, the NCSA's decision to focus on software development could make it harder for users to quickly get the computational cycles they need. "My center of gravity lies on the intense computational side," McCurdy says. "Thom Dunning will have a different vision for the NCSA, that hardware is no more important than software development. But that also leaves us out in the cold for doing large computations. Either we fend for ourselves with clusters or make do with less time on the hardware, since more money is going to software."
Other concerns, such as how to team computer engineers and scientists now that the formal mechanism for doing so is no longer in place, will have to be addressed as well. "Doing science today is interdisciplinary, and PACI was in place to foster that interdisciplinary work directly with the centers," says Jack Dongarra, a computer-science professor at the University of Tennessee and a distinguished research staff member at Oak Ridge National Lab who last month started an online newsletter to track the development of the cyberinfrastructure program. "That added value is no longer tightly connected to the centers' program."
The NSF isn't making any promises about the future, either. When asked about the new competition in 2007, Freeman points out that the centers also get university funding, proof that they aren't "captives" of the NSF, and says there's no road map for 2008 and beyond. "The future of those centers is ultimately up to them," he says. "Will demand be more skewed toward databases, visualization software, and services? That's not clear. We're trying to stay flexible in what we do."