The idea was to use a zillion computers connected to the Net to handle jobs that normally require supercomputers. But Adam Beberg and the Distributed.net project have been there and done that.
That's why Beberg left Distributed.net Friday to pursue his own plans for taking so-called "distributed computing" to the next level.
The new project is called Cosm.
"Cosm is protocols designed to allow really large-scale distributed computing over the Internet," said Beberg, who will run the project. "Basically, the goal is to get distributed computing big."
Distributed.net coordinates large-scale computing projects, leveraging the idle processing power of thousands of computers linked through the Internet.
High-profile success came when Distributed.net won a challenge to find the right key out of 72 quadrillion that would unlock data shackled using the DES encryption method. The encryption challenge was what motivated enthusiasts of distributed computing to work together.
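The key-search challenge works because the keyspace splits cleanly into independent chunks that volunteer machines can test in parallel. A minimal sketch of that partitioning, with hypothetical names and unit sizes (not Distributed.net's actual client code):

```python
# Illustrative sketch: splitting a brute-force key search into work
# units that independent clients can test in parallel.
# KEYSPACE_SIZE reflects DES's 56-bit keys; UNIT_SIZE is an assumption.

KEYSPACE_SIZE = 2**56   # DES uses 56-bit keys (~72 quadrillion)
UNIT_SIZE = 2**28       # keys checked per work unit (hypothetical)

def work_units():
    """Yield (start, end) key ranges for clients to claim."""
    for start in range(0, KEYSPACE_SIZE, UNIT_SIZE):
        yield (start, min(start + UNIT_SIZE, KEYSPACE_SIZE))

def search_unit(start, end, try_key):
    """Test every key in one unit; try_key returns True on a match."""
    for key in range(start, end):
        if try_key(key):
            return key
    return None
```

Because no unit depends on any other, thousands of machines can each grab a range, grind through it, and report back, which is exactly the property that made the challenge a natural first target.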
But now they want to bring the idea to a wider range of computing-intensive applications, such as chip design, rendering of computer-animated movies, weather modeling, and simulations for bridge construction. All of these applications are super processor-intensive, yet the researchers and companies that use them don't always have the resources at hand.
Enter distributed computing.
"The purpose of what we're doing here and what Adam's doing is to let any task be applied to [the distributed computing] model," said David McNett, the president of Distributed.net. "Nobody has taken this to a more general application. We do have similar goals -- we just disagree fundamentally on how to get from here to there."
Beberg's departure was amicable, and creates what amounts to a parallel path of development, much like two flavors of the Unix operating system.
To let distributed computing work more easily with a wider range of applications, it needs a set of shared software nuts and bolts. These connectors -- in the form of underlying software protocols designed specifically for distributed computing -- will make the system a more viable option when researchers and companies need to carry out a processor-intensive job.
"Basically what it needs to be able to do is allow people to write code that they can put into the system for projects, and then Cosm will handle the distribution work and [retrieving] the results," Beberg said.
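The coordinator role Beberg describes -- take in a project's work units, hand them to volunteer clients, collect the results -- can be sketched in a few lines. The class and function names here are assumptions for illustration, not part of Cosm's actual protocol:

```python
# Minimal sketch of a distributed-computing coordinator: hands out
# work units and gathers results. Hypothetical API, not Cosm's.
from collections import deque

class Coordinator:
    def __init__(self, units):
        self.pending = deque(units)   # work not yet handed out
        self.results = {}             # unit -> result from a client

    def checkout(self):
        """Give the next unit to a volunteer client, or None if done."""
        return self.pending.popleft() if self.pending else None

    def submit(self, unit, result):
        """Record a finished unit's result."""
        self.results[unit] = result

# A volunteer client loop: fetch a unit, run the project's own code on
# it, return the answer. `project_code` stands in for whatever the
# project wrote -- a renderer, a simulator, a key tester.
def client_loop(coord, project_code):
    while (unit := coord.checkout()) is not None:
        coord.submit(unit, project_code(unit))
```

The point of the protocol layer is that `project_code` is the only part a researcher has to write; distribution and result retrieval are handled the same way for every project.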
The result: Big jobs can be done a lot faster.
"Say the code is meant to render an animated movie," Beberg said. "The [animation team] writes the code they need to have to do this. They put it out there and basically they ask people to help them."
But right now distributed computing doesn't have enough flexibility to enable such a come-and-get-it scenario, he said.
"The protocols really are the keystone to distributed computing," McNett said. "[They are] the language that the different computers speak."
Both projects will rely on volunteers working together with the help of contributing developers.
"There's no one answer to distributed computing. People will find that one model is more appropriate for them," said McNett. "There's plenty of room for both efforts to coexist. The field has gotten large enough to justify two efforts."