
Processing for Science

"@Home" projects band together and proliferate

Fans of the spacetime continuum can now uncover gravitational ripples at their desks thanks to the February launch of Einstein@Home. The project is one of the latest of at least 60 "@home" projects now on the Internet, in which personal-computer users can donate spare processor power to help solve scientific problems. And no need to choose one mission over another: @home software can now multitask, and enough microchip muscle exists to handle many more distributed-computing projects.

A typical modern PC performs at least one billion floating-point operations per second, or one gigaflops (true of most home computers built since about 2000), yet save for computationally intense tasks such as rendering graphics, it almost never uses its full power. Distributed computing takes advantage of this spare capacity, dividing large tasks into smaller pieces and sending them over the Internet for usually idle computers to work on. The result is unparalleled processing muscle: IBM's BlueGene/L, currently the most powerful supercomputer, cranks out about 70 trillion flops; SETI@home, by a conservative estimate, runs on roughly 500,000 PCs at more than 100 trillion flops, says SETI@home director David P. Anderson.
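
(The arithmetic is plausible: 500,000 machines averaging a sustained 200 million floating-point operations per second each would together deliver 100 trillion flops.) To make the pattern concrete, here is a minimal, purely illustrative Python sketch of the divide-and-distribute idea; it is not any project's actual code, and all names are hypothetical. A local process pool stands in for the volunteers' idle PCs.

```python
from concurrent.futures import ProcessPoolExecutor

# Illustrative divide-and-distribute pattern (hypothetical, not any
# project's real code). A real @home system ships work units over the
# Internet; here a local process pool stands in for volunteers' PCs.

def work_unit(chunk):
    """Stand-in for a scientific kernel: a sum of squares over one chunk."""
    return sum(x * x for x in chunk)

def split(task, n_units):
    """Divide one large task into smaller, independent work units."""
    size = max(1, len(task) // n_units)
    return [task[i:i + size] for i in range(0, len(task), size)]

if __name__ == "__main__":
    big_task = list(range(1_000_000))
    units = split(big_task, n_units=50)
    with ProcessPoolExecutor() as volunteers:   # stand-ins for idle PCs
        partial_results = list(volunteers.map(work_unit, units))
    print(sum(partial_results))                 # "server" reassembles the answer
```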

Since the first public distributed-computing project--the Great Internet Mersenne Prime Search--was launched in 1996 to look for large prime numbers, virtual supercomputing projects have emerged that range from the serious (testing potential drugs with FightAIDS@home) to the sublime (the Monkey Shakespeare Simulator). Anderson expects hundreds of @home projects to appear in the next few years and the number of participating CPUs to grow from roughly 1.3 million today to 30 million.


A key development in the surge is the formation of distributed-computing platforms that can host multiple projects. Among the biggest is the Berkeley Open Infrastructure for Network Computing (BOINC), which hosts SETI@home and Einstein@Home as well as the formerly independent Climateprediction.net, which joined in August. In coming months BOINC partners will include FightAIDS@home, PlanetQuest and Orbit@home. Other umbrella platforms include Grid.org, which is running two projects, one to find compounds against cancer and another to predict three-dimensional protein structures from amino acid sequences, and Find-a-Drug.org, which currently has nine projects looking for drugs against ailments such as malaria and Creutzfeldt-Jakob disease, the human relative of mad cow disease.

Such @home hosts are also time-savers for scientists. BOINC, for instance, offers open-source infrastructure code so researchers do not have to write their own. Developing that software can take several person-years, because it must run unobtrusively on different operating systems across up to a million computers while protecting against erroneous results and malicious attacks. "We want to make it easy for scientists to get access to millions of computers' worth of processing power," says Anderson, who also directs BOINC.
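
One common defense against erroneous or malicious results in systems of this kind is redundancy: the same work unit goes to several volunteers, and an answer is accepted only when a quorum agrees. A hedged sketch of that idea (hypothetical code, not BOINC's actual implementation):

```python
from collections import Counter

def validate(results, quorum=2):
    """Accept a work unit's answer only if at least `quorum` independent
    volunteers returned the same value; otherwise signal a reissue.
    Hypothetical sketch of redundant checking, not BOINC's actual code."""
    value, count = Counter(results).most_common(1)[0]
    return value if count >= quorum else None

# Three volunteers computed the same work unit; one result is corrupt.
print(validate([42, 42, 17]))  # -> 42   (a quorum of two agrees)
print(validate([42, 17, 99]))  # -> None (no agreement; reissue the unit)
```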

Anderson estimates that, for a typical computer, the practical upper limit is roughly 12 @home projects; beyond that, its processing power is spread so thin that projects consider it useless. He adds that a service that automatically rotates a PC among projects may come in the future. Still, umbrella platforms might interfere with one another if they run simultaneously on the same computer. But with roughly 200 million privately owned computers in the world, notes Ed Hubbard, president of United Devices in Austin, Tex., which runs Grid.org, "there's plenty of room for everybody."
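
The multitasking described above amounts to time-slicing the processor among attached projects. A minimal illustrative sketch of such round-robin scheduling, with project names taken from the article and an assumed fixed time slice (all details hypothetical):

```python
import itertools

# Illustrative round-robin scheduler (hypothetical). The host cycles
# through its attached projects, granting each a fixed time slice; with
# too many projects, each one's share of the machine becomes negligible,
# which is the practical ceiling Anderson describes.

SLICE_HOURS = 1.0

def schedule(projects, total_hours):
    """Yield (project, hours) slices until the time budget is spent."""
    budget = total_hours
    for project in itertools.cycle(projects):
        if budget <= 0:
            return
        grant = min(SLICE_HOURS, budget)
        yield project, grant
        budget -= grant

for project, hours in schedule(
        ["SETI@home", "Einstein@Home", "Climateprediction.net"], total_hours=5):
    print(f"{project}: {hours:.1f} h")
```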