In Charge Of Change: The Man Who's Turning Supercomputing Centers' Mission On Its Head

Peter Freeman of the National Science Foundation is driving the shift from providing supercomputing cycles to developing software and services as part of a national cyberinfrastructure.

Aaron Ricadela, Contributor

March 11, 2005

The National Science Foundation is phasing out its most recent supercomputing research program and getting ready to fund new IT priorities for the rest of the decade. Assistant director Peter Freeman is the man who'll make many of the decisions about how scientists and engineers get early access to supercomputing centers' high-end technologies.

Freeman, who works at NSF headquarters outside of Washington, D.C., has since 2002 headed the NSF's Computer and Information Science and Engineering Directorate, which manages the nation's three largest supercomputing research centers in Champaign, Ill., San Diego, and Pittsburgh. But a shift in the centers' priorities from providing supercomputing cycles to developing software and other services as part of a national "cyberinfrastructure" has many in the industry saying Freeman's goals aren't in everyone's best interests. information senior writer Aaron Ricadela spoke with Freeman last week.

information: In the "shared cyberinfrastructure" that the NSF is building, new requirements like common software stacks, faster networks, and better management of large data sets are supplementing the need for computational cycles. How could the NSF's new IT priorities benefit high-performance computing users?

Freeman: Every issue that industry faces -- large databases, massive simulations, online access -- either has been or will be faced by scientists or engineers first. Their work demands technology that's not yet widely available. After we work out the kinks, industry comes to us. Until about four years ago, we didn't call it cyberinfrastructure. That term was coined to describe the integration of supercomputers, advanced networks, middleware, visualization software, and services -- the whole range of things scientists and engineers are continually using in their research. What we're seeing now is the integration of all of these things.

Until about 10 years ago, most areas of science and engineering research were using computers in some way, but they were fairly pedestrian uses: word processing, e-mail, small databases, spreadsheets, or very specialized uses. They didn't change the nature of the research being done much. What has grown exponentially since then is that, in field after field, scientists and engineers have discovered that modern computers, when properly integrated, have the power to revolutionize their conduct of science -- not just giving them a nicer processor or a faster adding machine, but literally permitting them to change the very nature of how they carry out their science.

information: At the same time, the NSF's computing directorate is ending its $40 million IT research program, and redistributing the money to what you call emerging priorities. What was funded under that program, and why is it ending?

Freeman: That program ended last year. It dramatically increased the cooperation between computer scientists and physicists, chemists, biologists, anthropologists, and others. It funded things like the National Virtual Observatory, which [Microsoft distinguished engineer] Jim Gray helped spark. That permits astronomers to make new discoveries not by peering through a telescope on a cold, starry night, but by mining a database. Or it lets them go back through digitized records to see if they can recognize a supernova in photos from 1930, in drawings from 1850, or in writings by a Chinese astronomer in 1000. At the same time, it illustrates the power of IT. The NSF has five priority areas, and fields get transformed. Unlike many agencies in this town, we believe in sunsetting things from time to time.

information: Last September, the NSF ended the Partnerships for Advanced Computational Infrastructure, which let the supercomputer centers fund teams of computer scientists and physical scientists to incubate new technologies. What's changing that led to the end of PACI?

Freeman: As an analogy, there's also been a transition in corporate computing from mainframes, to when it looked like everything was going to the desktop, to a mix of those approaches. But the mainframes today are used, managed, and seen differently than they were 20 years ago. In the area of supercomputing, you see a similar kind of thing. Twenty years ago, when the NSF started its supercomputing program, there weren't very many supercomputers around. It took a large budget and a large number of specialized operators and programmers. That world has changed -- the field is going through the kind of transformation we saw in business.

When people talk about PACI, they're talking about the way high-performance computing used to be provided to the research community, versus the way we think it will be provided five to 10 years down the road. We're right in the middle of that kind of transformation. Twenty years ago there were relatively few scientists working on computational problems using those facilities. The NSF started the PACI program in 1997, and at that time it deemed that an appropriate way to fund those kinds of activities: providing money to the centers and letting them parcel it out to small teams.

information: That process is different now that PACI has ended. And in 2007, there will be a new competition among the centers to see which get funded. What's the role of those centers today, and what's their future?

Freeman: They are not and never have been captives of the NSF. We provide 40% to 50% of their budget. But they're on university campuses -- in the case of the Pittsburgh center, it's two universities in a consortium -- and they get support from there as well.

The world is changing. Their role in the future is certainly not going to be the same as it was in the past. To be perfectly honest, we don't know what the role will be, because this whole area is changing very rapidly. Not just the technology, but also the use of cyberinfrastructure in research and education is changing quickly. Will demand be more skewed toward databases, visualization software, and services? That's not clear. We're trying to stay flexible in what we do. We don't have a road map beyond 2008, 2009, 2010. And so the future of those centers is ultimately up to them.

Return to main story, Seismic Shift
