The Chronicle of Higher Education
From the issue dated March 5, 2004


http://chronicle.com/weekly/v50/i26/26b01401.htm

Recipe for an Affordable Supercomputer: Take 1,100 Apples ...

By HASSAN AREF

It is sometimes small comfort that we perform so many reasoning tasks infinitely better than even the most powerful machines. When a computer can play chess about as well as the human world champion, we begin to wonder how long our intellectual superiority in any area will last. No real worries thus far, I contend, but stay tuned: Machines are improving at impressive rates, while human intelligence isn't. We are quickly approaching a time when human mental capability across a broad range of tasks will be eclipsed by machines. In less than a decade, a dinner conversation with your laptop will be more interesting than a similar conversation with many of your co-workers or business partners.

Today's supercomputers, for instance, will be commonplace a decade from now. In fact, "supercomputer" is a relative term. Somewhat like "superstar," it reflects the best of breed for a particular era. A computer that is "super" one year may seem much less so the following year. And after a couple of years, it is sure to have lost most of its superiority. The Top 500 list (see http://www.top500.org) -- produced by the University of Tennessee at Knoxville; the University of Mannheim, in Germany; and the Lawrence Berkeley National Laboratory, and updated twice a year -- ranks the computers that are the fastest in the world. The top entries always attract attention, which probably says more about human nature than about technology.

The most recent version of the list includes some interesting entries. All of the fastest computers in the world today are massively parallel cluster machines -- that is, they consist of banks of chips or computers, each with roughly the power of a desktop machine. The computers within a cluster are made to work in parallel, producing results faster than a single computer by a factor that is, roughly speaking, equal to the number of computers in the cluster.
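The division of labor behind that speedup is easy to sketch. The few lines of Python below are a toy illustration under simple assumptions -- not any real cluster's software -- splitting one large sum into chunks that separate worker processes tackle simultaneously:

    # Toy illustration of cluster-style parallelism: split a big job
    # into chunks, give each chunk to its own worker process, and
    # combine the partial results at the end.
    from multiprocessing import Pool

    def partial_sum(bounds):
        # The chunk one "node" handles: the sum of squares over [lo, hi).
        lo, hi = bounds
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        n_workers = 4                   # a cluster like X has ~1,100 nodes
        n = 10_000_000
        step = n // n_workers
        chunks = [(k * step, (k + 1) * step) for k in range(n_workers)]
        with Pool(n_workers) as pool:   # the chunks run in parallel
            total = sum(pool.map(partial_sum, chunks))
        print(total)                    # same answer, in roughly 1/4 the time

In the ideal case the running time falls by a factor equal to the number of workers; in practice, communication among the machines eats into that gain.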

At the top of the list is the Earth Simulator, begun in 2001 by the Japanese. As its name suggests, the giant project was designed to produce, through computer simulation, insights into climate-change processes affecting our planet. It can sustain some 36 trillion floating-point operations per second (36 teraflops, in computer jargon). The human brain can't come close to that kind of speed.

The second entry on the Top 500 list is ASCI Q, housed at the Los Alamos National Laboratory. The geopolitical role of ASCI Q is noteworthy. When the United States signed the nuclear test-ban treaty, one issue that arose was how we would guarantee that our nuclear stockpile remained secure and ready for use. That area of technological endeavor goes by the benevolent-sounding name "stockpile stewardship." The government decided that supercomputers could do the job. The name for the activity became ASCI, the Accelerated Strategic Computing Initiative -- a play on ASCII (American Standard Code for Information Interchange), the early standard that serves as a comprehensive translation table between numbers and the full set of characters needed in text documents. ASCI Q performs at almost 14 teraflops, so that seems to be the amount of computational power needed to safeguard national security and global stability.
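That translation table is easy to inspect from any modern programming language. A purely illustrative look in Python:

    # ASCII assigns each text character a number from 0 to 127.
    for ch in "ASCI":
        print(ch, "->", ord(ch))   # 'A' -> 65, 'S' -> 83, 'C' -> 67, 'I' -> 73
    print(chr(81))                 # and back again: 81 -> 'Q'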

No. 3 on the list is Virginia Tech X, the name of my institution with the Roman numeral appended. "X," as we call it, is a cluster of 1,100 Apple G5 desktop computers running the latest release of the company's Unix-based OS X operating system with certain technological additions: InfiniBand communications hardware and software, produced by Mellanox; a new cooling system and customized racks for the computers, produced by Liebert; and a proprietary software package known as Déjà Vu that compensates for the inevitable glitches that occur on individual components of the cluster when it is running a calculation involving many machines.

X got its name by being the first academic machine to exceed 10 teraflops. The name is also a play on OS X. Clocked at 10.28 teraflops, X may reach even higher speeds after a planned upgrade this spring, from the Apple G5 machines to Xserve G5's, which -- unlike the Apple G5's -- are designed specifically for a network setting. And we at Virginia Tech hope to follow with the clusters L and, in due course, C -- capable of 50 and 100 teraflops, respectively. Those powerful machines will be used primarily for problems in science and engineering, like modeling biomolecules and global climate change, comparing designs for complex structures, and finding huge prime numbers.

Not only is X's computational speed impressive, but its hardware cost a mere $5.2-million, only a fraction of the cost of ASCI Q or the Earth Simulator. And the Virginia Tech machine was assembled in record time: just three months.

The fourth and fifth entries on the Top 500 list are at the National Center for Supercomputing Applications, on the campus of the University of Illinois at Urbana-Champaign, and the Pacific Northwest National Laboratory, respectively. They have not thus far passed the 10-teraflop benchmark.

One very interesting lesson from the Top 500 list is that supercomputers have suddenly become affordable. If, for argument's sake, we define a supercomputer to be a machine that is 1,000 times as fast as the average desktop computer, and if we agree that hooking up desktop computers in parallel is one way to make a supercomputer, it follows that any institution or company that can afford to set aside 1,000 desktop machines, and to invest in the communications software to link them, can own a supercomputer. That is extremely good news for universities and corporations. It has more-sinister overtones when one thinks about wealthy groups or nations unfriendly to the United States.
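The back-of-envelope arithmetic is worth spelling out. Under illustrative assumptions about 2004-era hardware (the figures below are guesses for the sake of argument, not quoted prices or benchmarks), a thousand desktops add up to a machine in X's class:

    # Back-of-envelope arithmetic for the affordability claim; every
    # figure here is an assumption chosen for illustration.
    desktop_gflops = 10     # assumed sustained speed of one desktop
    nodes = 1_000
    print(desktop_gflops * nodes / 1_000, "teraflops")    # 10.0 teraflops

    node_cost = 3_000       # assumed dollars per desktop machine
    overhead = 1.5          # assumed multiplier for networking and cooling
    print("$%.1f-million" % (nodes * node_cost * overhead / 1e6))  # $4.5-million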

The great scientific minds of earlier times, like Leonhard Euler and Carl Friedrich Gauss, needed lots of arithmetic in their work and did many calculations by hand. I sometimes wonder if, when I die and go to heaven, I will meet one of those masters. I can imagine the conversation turning to computation. At some point, I suspect, I will blurt out that we constructed a machine capable of 10 teraflops. I can envision the frown on the sage's face, and the gentle question: "So, my son, what did you do with all that computational power?"

That's the real issue and the real challenge. As we construct machines that rival the mental capability of humans, will our analytical skills atrophy? Will we come to rely too much on the ability to do brute-force simulations in a very short time, rather than subject problems to careful analysis? Will we run to the computer before thinking a problem through?

Humans have had centuries to come to grips with machines that outperform us in physical endeavors. A major challenge for the future of humanity is whether we can also learn to master machines that outperform us mentally. One day a supercomputer will mean a computer with superhuman intelligence. Maybe one day the computers will run a Web site that lists the 500 smartest humans.

Hassan Aref is dean of the College of Engineering and a professor of engineering science and mechanics at Virginia Tech.

Section: The Chronicle Review
Volume 50, Issue 26, Page B14

Copyright © 2004 by The Chronicle of Higher Education