From the issue dated May 20, 2005

The Chronicle: Budget Cuts at NSF May Signal a Crisis in Computing

Critics say the U.S. has no clear plan for the future of supercomputers

Many researchers warn that a crisis looms for academic supercomputing in the United States, largely because of what they see as the National Science Foundation's failure to support the technology adequately.

The agency is the principal source of supercomputing time for most scholarly researchers in the country, yet the foundation decided last fall to withdraw financing for its three supercomputer centers starting in 2008.

What will happen to the centers after that is uncertain. Although NSF officials say they are as committed to supercomputing as ever, many researchers and policy watchers say that the move by the NSF is just the latest sign that the federal government's supercomputing efforts are rudderless.

Even some advisers to the Bush administration have recently called on government agencies to develop a clearer road map for purchasing and operating cutting-edge supercomputers and for developing supercomputer software.

Lawmakers have signaled their concern as well: The House of Representatives passed a bill last month that would require the Bush administration to "provide for sustained access by the research community in the United States to high-performance computing systems that are among the most advanced in the world."

"There is at the moment a lot of uncertainty," says Daniel A. Reed, a former director of the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the vice chancellor for information technology at the University of North Carolina at Chapel Hill.

The science foundation's plans are crucial because its supercomputers offer capabilities that many researchers can't find elsewhere.

Some other federal agencies operate supercomputers far more powerful than the NSF's, among them the Energy Department, the National Aeronautics and Space Administration, and the Naval Oceanographic Office. But researchers in academe generally cannot get access to those machines unless their research is closely allied to the agencies' work.

Some universities have assembled their own supercomputers, but they are generally less powerful than NSF machines.

The stakes, researchers say, are high, both for the intellectual vitality of academe and for the nation's industrial competitiveness. High-performance computing has evolved into an essential tool for scientists. With complex computer simulations and models, researchers can both compare theoretical predictions against real-world measurements and simulate experiments that they could never actually perform.

"Computing is becoming a third pillar in the sciences," along with experimentation and theoretical research, says Klaus J. Schulten, a physics professor at the University of Illinois at Urbana-Champaign who uses supercomputers to simulate biological processes.

Tighter Purse Strings

The science foundation is spending about $301-million on supercomputing this fiscal year and has requested $307-million for the fiscal year that starts October 1, according to the National Coordination Office for Information Technology Research and Development, a White House unit that tracks high-performance computing.

While that budget request is still substantial, it is far less than what many researchers think is needed. Two years ago, for example, an advisory panel to the National Science Foundation recommended that the agency spend an additional $1-billion or more annually on "cyberinfrastructure" projects, including supercomputing.

Little of that additional money has materialized, but the science foundation has embraced the advisory panel's recommendation to emphasize the creation of "ubiquitous, comprehensive digital environments" for researchers to conduct research and collaborate with one another. Many researchers see the science foundation's move to retool its supercomputing program as a step toward shifting supercomputing funds into a broader cyberinfrastructure effort.

Indeed, Arden L. Bement Jr., the director of the science foundation, touted his agency's cyberinfrastructure projects during an address this month to the Internet2 high-speed networking organization -- which includes the universities that employ most of the nation's academic supercomputer users.

"We are reaching a point in which bold, novel research is being hampered by a lack of sophisticated cyber tools, and that is simply unacceptable," Mr. Bement told the conference. "An effective cyberinfrastructure will help ensure that the boldest ideas are not constrained for want of tools."

Even so, some officials say they were surprised by the science foundation's decision last year to allow the five-year contracts for its two principal supercomputer centers to expire: one, the San Diego Supercomputer Center, at the University of California at San Diego; the other, the National Center for Supercomputing Applications, at the University of Illinois at Urbana-Champaign. Since making the decision to end financing for the centers, the foundation has provided additional funds to allow them to continue to operate for three more years.

But the ultimate fate of the two centers -- as well as that of the Pittsburgh Supercomputing Center, which is operated by Carnegie Mellon University and the University of Pittsburgh and whose financing also has been withdrawn -- is up in the air because the science foundation plans to invite new bids for the operation of supercomputer facilities for researchers.

There is no guarantee that any of the three existing centers will win a contract. And in the short term, while the existing centers are on hold, the NSF is not budgeting any money for buying new computers for the centers, even though their supercomputers are going stale almost as fast as day-old bread.

"Not having a well-defined upgrade path for the center poses a problem for us," says Thom Dunning, director of the National Center for Supercomputing Applications, for which NSF has committed $35-million this year, $20-million next year, and $17-million in 2007. "I would say we are missing opportunities because we don't have that technology refresh."

Officials at the science foundation, however, say their plan makes sense and has precedent. In the 1990s, they note, the foundation was underwriting five supercomputer centers but pared its support down to the current three. Competition helps make sure that the centers serve scholars' needs and are making good use of taxpayers' funds, they say.

"Having a relatively large sum of money uncompeted for a long period of time is not a good idea," says Sangtae Kim, director of the science foundation's Division of Shared Cyberinfrastructure.

Even if the existing centers do not win the new competition, he says, they will most likely be able to get other grants from the NSF or other agencies, as they do today.

Not Just Big Machines

Supercomputer centers do more these days than just provide time on mammoth machines. The centers have also moved into software and network projects that do not necessarily rely on supercomputers. For example, the Mosaic Web browser, which was the foundation for both Netscape's and Microsoft's browsers, was developed at the National Center for Supercomputing Applications.

Some researchers say that, with that expansion, the supercomputer centers may have strayed from their mission of providing researchers with access to supercomputers.

Consequently, "it's probably appropriate for the NSF to be considering some institutional evolution," says Larry Smarr, director of the California Institute for Telecommunications and Information Technology and a former director of the National Center for Supercomputing Applications. The California institute, a joint venture of the University of California's Irvine and San Diego campuses, seeks to broaden the reach of the Internet.

But many academic scientists worry that the changes in the mission of these centers and the NSF's financing decisions could upend American supercomputing research. If none of the incumbents win a new contract from the NSF, building a new supercomputer center from scratch would not be easy or inexpensive, they say. It might not even be smart.

"You don't build a highway and decide a few years later that you're going to take it away," says Kelvin K. Droegemeier, a professor of meteorology at the University of Oklahoma who relies heavily on supercomputers in his research.

Moreover, the host institutions for the current centers have invested hundreds of millions of dollars, from sources other than the federal government, in the supercomputer centers, and that investment would be lost if the supercomputer centers moved elsewhere, researchers say.

And even if the centers stay where they are, the current cloud of uncertainty is destructive to staff morale, says Russ Miller, a professor of computer science at the State University of New York at Buffalo and director of its Center for Computational Research. "Their jobs can't be hanging in the balance every two years," he says. "Otherwise, they'll all find a more stable position somewhere else."

"I wonder if NSF really understands the turmoil and pain that the lack of planning can cause," says Mr. Dunning, of the National Center for Supercomputing Applications. Other agencies, such as the Energy Department, are much more protective of their supercomputing facilities, he says. "I don't think they think heavily about that" at NSF, he says. Asked about that comment, the NSF's Mr. Kim replies that he believes that the agency supports the supercomputer centers.

Supercomputer Shortage

Many researchers say they cannot get enough time on the NSF's supercomputers as it is.

"High-end computing resources are not readily accessible and available to researchers with the most demanding computing requirements," the President's Information Technology Committee said in April in a summary of a draft report expected to be released later this year.

Indeed, demand for supercomputing is on the rise. Computer models are becoming more realistic, which fuels interest by scholars in using them, says Mr. Dunning. New disciplines, like molecular biology and environmental modeling, are breaking into supercomputing, he says.

Not all supercomputer users are equal. A relative handful, doing research in fields such as cosmology and high-energy physics that require enormous computations even by supercomputing standards, could monopolize the existing machines. Many others, in disciplines such as political science and musicology, need more-limited amounts of supercomputer time, and there is not enough available time on the supercomputers to fully serve both groups.

The NSF supercomputer centers, at the foundation's behest, have often chosen to meet the needs of the small users, many researchers say. "They're trying to keep as many people happy as they can," says Calvin Ribbens, an associate professor of computer science in Virginia Tech's College of Engineering and deputy director of the institution's Terascale Computing Facility, which consists of 1,000 advanced Apple computers working in unison. But the centers' strategy effectively bars truly revolutionary research, which requires huge blocks of computing time, he says. "To get the Nobel Prize, you need to use the machine for a long time."

"If we had 10 times the amount of supercomputing power that we have today, there are problems that we could solve," says Ralph Roskies, scientific director of the Pittsburgh Supercomputing Center.

Robert Sugar, an emeritus professor of physics at the University of California at Santa Barbara, says he needs far more time, even though his team of theoretical physicists has been allotted "several million hours" of supercomputing time on NSF facilities this year. That allocation is smaller than it sounds, because each separate processor in the supercomputer counts against that time limit, so if 1,000 processors work on a project, the time limit will be reached in a few thousand hours.
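As a rough illustration of that arithmetic (the figures below are hypothetical, not Mr. Sugar's actual allocation), a short Python sketch shows how quickly a processor-hour allocation is consumed when a job runs on many processors at once:

    # Hypothetical figures: a "several million hour" allocation charged per processor.
    allocation_processor_hours = 3_000_000   # assumed allocation size
    processors_per_job = 1_000               # processors running one project

    # Every hour of wall-clock time consumes one allocation hour per processor,
    # so the allocation lasts only a few thousand hours of actual running time.
    wall_clock_hours = allocation_processor_hours / processors_per_job
    print(f"{wall_clock_hours:,.0f} wall-clock hours")   # 3,000 hours, roughly four months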

High-end computing is "a generic intellectual amplifier," says Mr. Reed, the vice chancellor at Chapel Hill, meaning that it is useful in many different disciplines. But that flexibility has its own drawback: orphanhood during budget debates. "It's everybody's second priority but often not anybody's first priority," he says.

And it should be a top priority, says William A. Wulf, president of the National Academy of Engineering and a former NSF official. Supercomputing is not just an intellectual curiosity, he and others say. Rather, supercomputing is essential for America's industrial and academic competitiveness.

Even researchers who don't need supercomputers today have a stake in the debate, he says, because tomorrow's desktop computers and software will be influenced by today's decisions about research into supercomputing technology.

"Supercomputers are time machines," he says. "What you've bought when you've bought a supercomputer is being able to do something sooner rather than later."

SUPERCOMPUTERS OF YESTERDAY AND TODAY

There are as many ways to compare supercomputers as there are to compare cars or blenders. The most popular measure is the number of calculations on numbers that include decimal points, known as floating-point operations, or flops, that a machine can perform in one second: a computer that can churn out one trillion such operations per second is rated at one teraflop; one that can handle one billion is rated at one gigaflop; and one that can perform only one million is rated at one megaflop.
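To make those units concrete, here is a minimal sketch, with an assumed operation count and elapsed time, of how such a rating would be computed:

    # Minimal sketch: label a machine's rate in teraflops, gigaflops, or megaflops.
    def flops_rating(operations, seconds):
        rate = operations / seconds          # floating-point operations per second
        for label, scale in (("teraflops", 1e12), ("gigaflops", 1e9), ("megaflops", 1e6)):
            if rate >= scale:
                return f"{rate / scale:.2f} {label}"
        return f"{rate:.0f} flops"

    # Assumed example: 5 trillion operations completed in 2 seconds -> "2.50 teraflops"
    print(flops_rating(5e12, 2))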

 
 
http://chronicle.com
Section: Information Technology
Volume 51, Issue 37, Page A1