Point, Click, Supercompute

A new installation coming to the San Diego Supercomputer Center wants to bring accessible, high-powered data crunching to a bigger audience of researchers. By Chris Oakes.

It will be big, it will be powerful, and like many things these days, it will have a Web site.

A new installation coming to the San Diego Supercomputer Center next year includes a first for supercomputers hooked to the Net: a Web interface.

"It's a way to tap a new community of users," said Wayne Pfeiffer, the deputy director of the San Diego center, where the new IBM mega-machine will be set up in late 1999. "Users wouldn't even have to log on. They could just define the problem and say 'go' or 'solve' and have the Web interface interact with the supercomputer."

By users, Pfeiffer means research scientists. The project's backers hope the Web connection encourages more researchers to surf into the power of the computer's 1,000-plus processors.

When completed, the IBM RS/6000 SP will have more than 1,000 microprocessors and be the largest computer ever constructed using IBM's Power3 chips. The RS/6000 can rip through data at a rate of nearly four trillion calculations per second, the company says. The fastest machine at the center now contains 256 processors.

"Machines with a thousand processors have a thousand times as much memory as on the desktop computer," Pfeiffer said. "That changes completely the character of the problem you can solve."

"[The new project] makes very large-scale computer power available to the academic community, in addition to the government," added Lou Bifano, vice president of the RS/6000 project. "Problems of a global scale can now be addressed by the academic community."

The National Partnership for Advanced Computational Infrastructure, which oversees the nation's scarce supercomputing hardware, selected the next-generation IBM supercomputer for installation at the San Diego center.

Pfeiffer thinks academia is underutilizing this resource. Biologists and social scientists, he said, have as much to gain from using supercomputers as physicists and chemists do.

"There are lots of challenging problems in those fields that could make productive use of high-end computers."

Most supercomputers can be accessed remotely using various networks, including the Internet. But this is the first specimen to use the Web as a front end for users.

A friendlier interface means researchers can circumvent the supercomputer's Unix-based operating system, Pfeiffer said, and the arcane programming environment that is standard for supercomputer use today.
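The article doesn't say how the interface will be built. As a purely illustrative sketch, a Web front end could translate a researcher's form entries into the job script a batch system expects, hiding the Unix environment entirely. All names here (the script format, queue parameters, command) are hypothetical, not details of the SDSC system:

```python
# Hypothetical sketch only: render web-form parameters into a batch-style
# job script. The real SDSC interface is not described in the article.

def make_job_script(job_name, nodes, hours, command):
    """Turn form-style parameters into a simple shell job script."""
    return "\n".join([
        "#!/bin/sh",
        f"# job: {job_name}",
        f"# requested nodes: {nodes}",
        f"# wall-clock limit: {hours}h",
        command,
        "",  # trailing newline
    ])

# A researcher fills in a form; the front end writes the script for them.
script = make_job_script("galaxy-sim", nodes=64, hours=12,
                         command="./simulate --input problem.dat")
print(script)
```

The researcher never sees the script; the front end would hand it to the machine's scheduler and report back results.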

Widespread access to increasingly potent supercomputers is not just important to specialized research, says supercomputer builder Michael Warren. All science, he argues, is advanced by intensive computing power.

"For most of history there were two kinds of scientific investigation. You could do scientific experiments or you could do theory," he said. "But now there's actually a third form of science, which is being able to do simulations. [You can] effectively experiment inside the computer which would be impossible to do in the real world."

One well-known experiment simulates a nuclear blast, eliminating the need for actually blowing up a desert or an atoll. The same intensive numerical simulation, Warren says, can be applied to anything.

"The work that I do on these computers is effectively simulating the evolution of the universe," he said. "I can understand how galaxies evolve over a time span of 15 billion years. And my colleagues down the hall can use the same computer to study the motion of atoms over time-scales of picoseconds."
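Warren's production codes use sophisticated tree algorithms across hundreds of processors; as a toy sketch of the kind of computation he describes, here is a direct-summation gravitational step for a handful of bodies (units and values are arbitrary, for illustration only):

```python
# Toy N-body sketch: direct O(n^2) gravity plus one leapfrog timestep.
# Real cosmological codes are far more sophisticated; this only
# illustrates the idea of "experimenting inside the computer."

def accelerations(positions, masses, G=1.0, soft=1e-3):
    """Gravitational acceleration on each body from every other body."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + soft * soft  # softened distance
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

def leapfrog_step(positions, velocities, masses, dt):
    """Advance all bodies by one kick-drift-kick timestep, in place."""
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]
            positions[i][k] += dt * velocities[i][k]
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]

# Two equal masses at rest on the x-axis: they should fall toward
# each other after one step.
pos = [[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
vel = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
mass = [1.0, 1.0]
leapfrog_step(pos, vel, mass, dt=0.01)
```

Scaled up to millions of bodies and billions of years of simulated time, this is the shape of the galaxy-evolution problem; his colleagues' molecular-dynamics runs swap gravity for interatomic forces and picosecond timesteps.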

Warren led a pioneering project to bring down the price tag of supercomputers while working at Los Alamos National Laboratory. His team produced the Avalon supercomputer earlier this year, which cost US$150,000, a pittance in the supercomputing world. It was built using 68 off-the-shelf, high-end personal computers, connected by network switches.
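The principle behind a machine like Avalon is simple: split one big computation into slices, hand each slice to an ordinary processor, and combine the results. Avalon did this with message passing across 68 separate machines; the sketch below mimics the same divide-and-combine pattern on a single machine with Python threads, purely to show the shape of the idea:

```python
# Illustrative sketch of the cluster principle: divide work, compute
# slices concurrently, combine the partial results. A real cluster like
# Avalon distributes slices across separate machines via message passing.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(lo, hi):
    # Each "node" handles one slice of the problem.
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Sum of squares below n, computed as combined partial results."""
    step = (n + workers - 1) // workers
    bounds = [(w * step, min(n, (w + 1) * step)) for w in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda b: partial_sum(*b), bounds)
    return sum(parts)

print(parallel_sum_of_squares(1000))  # same answer as the serial sum
```

The payoff on a real cluster is that the slices run on physically separate, cheap processors, so capacity scales with the number of boxes you can afford to plug into the switch.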

Again, the point was simple: Cheaper supercomputers mean more users.

The Web interface planned for the San Diego center is probably something researchers will take to, Warren said. "One of the problems has been that you can generate huge amounts of data on one of these machines -- but to analyze it you have to have them send you tapes through the mail. Or you have to have a very fast network to do any sort of graphics remotely."

Warren expects the project to be successful. But only time will tell, he says.

"If you go back in two years and look at science articles and breakthroughs generated by the new machines ... then you can really judge their worth."