How you can help the LHC and 12 other ways to donate your PC's spare processing power

Contribute to projects including the LHC, climate forecasting, research that benefits humanity, disease-cure efforts and more

  • BURP


    With an acronym like that, how can you go wrong? BURP stands for Big and Ugly Rendering Project, and is designed to render 3D animations. The [[xref:http://burp.renderfarming.net/|project|BURP]] started in 2007.
  • IBM World Community Grid


    The [[xref:http://www.worldcommunitygrid.org/|World Community Grid's|World Community Grid - Home]] mission is "to create the world's largest public computing grid to tackle projects that benefit humanity". The grid supports research into cures for muscular dystrophy, influenza, AIDS and childhood cancer, among other things. The grid, which was headed for half a million members as of November 2010, runs on BOINC, open source software whose development is funded by the NSF.
    A number of other volunteer computing projects also target disease cures, largely via protein research. For example, [[xref:http://boinc.bakerlab.org/rosetta/|Rosetta@home's|Rosetta@home]] pitch is that it "needs your help to determine the three-dimensional shapes of proteins in research that may ultimately lead to finding cures for some major human diseases," including malaria and Alzheimer's. The project, started in 2005, is run by the [[xref:http://depts.washington.edu/bakerpg/|Baker Laboratory|The Baker Laboratory, University of Washington Department of Biochemistry]] at the University of Washington. Docking@home and Folding@home are among other worthy volunteer computing projects that tackle protein research to help fight diseases.
  • SETI@home


    SETI stands for Search for Extraterrestrial Intelligence. This [[xref:http://setiathome.ssl.berkeley.edu/|project|SETI@home]], hosted at UC Berkeley, celebrated its 10th anniversary in 2009. The SETI@home team has expanded on the traditional SETI@home search for narrowband signals via the Arecibo Observatory radio telescope in Puerto Rico to also look for broader-band, short-time pulses via its Astropulse application. At last count, SETI@home had about 180,000 active volunteers and nearly 290,000 active computers doing work.
  • NQueens@home


    [[xref:http://nqueens.ing.udec.cl/|NQueens@home]] expands on the classic eight queens problem, in which you try to place eight queens on a chess board in such a way that none of them can attack any of the others. The project attempts to count the solutions when the board size and number of queens are increased to N, most recently N = 26. Really, you wouldn't want to try to figure that out without the help of a distributed computing network. This isn't the only project devoted to figuring out chess problems: Chess960@home focuses on Fischer Random Chess, a twist on classical chess in which the pieces start in different positions.
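    The brute-force difficulty is easy to see in miniature. Here is a minimal backtracking counter in Python — an illustration of the underlying search, not NQueens@home's actual solver:

    ```python
    def count_solutions(n):
        """Count placements of n queens on an n x n board with no two attacking."""
        def backtrack(row, cols, diag1, diag2):
            if row == n:
                return 1
            total = 0
            for col in range(n):
                # a queen attacks along its column and both diagonals
                if col in cols or (row - col) in diag1 or (row + col) in diag2:
                    continue
                total += backtrack(row + 1, cols | {col},
                                   diag1 | {row - col}, diag2 | {row + col})
            return total
        return backtrack(0, set(), set(), set())

    print(count_solutions(8))  # the classic board has 92 solutions
    ```

    Each added queen multiplies the work: this naive search handles N = 8 instantly but becomes hopeless long before N = 26, which is why the project farms the search out to volunteers.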
  • LHC@home


    [[xref:http://lhcathome.cern.ch/athome/index.shtml|LHC@home]] "enables you to contribute idle time on your computer to help physicists develop and exploit particle accelerators, such as CERN's [[xref:http://www.networkworld.com/news/2008/042208-large-hadron-collider.html?page=1|Large Hadron Collider]]." One application, SixTrack, generates results that are "essential for verifying the long-term stability of the high energy particles in the LHC". For CERN, volunteer computing resources are seen as useful for tasks that require lots of computing power but not so much data transfer (at the time we created this slideshow, LHC@home did not have any work for its volunteer computing force). CERN got LHC@home up and running in 2004 to celebrate the European Organization for Nuclear Research's 50th anniversary.
  • Climateprediction.net


    [[xref:http://climateprediction.net/|Climateprediction.net,|Climateprediction.net : The world's largest climate forecasting experiment for the 21st century.]] based at Oxford University and the Open University in the U.K., describes itself as "the world's largest climate forecasting experiment for the 21st century". This distributed computing project is designed to produce predictions of the Earth's climate up to 2080 and to test the accuracy of climate models. Experiments include estimating the possible effects of climate change mitigation strategies and an investigation of the possible impact of human activity on extreme weather risk.
So you'd like to do more volunteering but can't find the time? Here's an easy way to do it: Donate the spare processing power on your computer via one of the dozens of ongoing volunteer computing projects, many based on open source software called [[xref:http://www.networkworld.com/news/2009/083109-boinc.html|BOINC]]. You know, like SETI@home, the well-documented project that uses otherwise idle computing cycles to help "search for extraterrestrial intelligence".

Here's a look at 12 cool projects, with thanks to volunteer computing enthusiast Jonathan Brier and UC Berkeley's David Anderson for their insights. The Web sites for the various projects typically include stats on how much processing power they're using, who is volunteering their processors, etc.

  • Einstein@home


    Albert Einstein recognized that we live in a universe of gravitational waves. This [[xref:http://einstein.phys.uwm.edu/|project|Einstein@Home]] searches for spinning neutron stars, or pulsars, using data from the LIGO and GEO gravitational wave detectors. It also seeks radio pulsars in binary systems.
  • Enigma@home


    [[xref:http://www.enigmaathome.net/|Enigma@home|Enigma@Home]] is a distributed computing project based on the [[xref:http://www.bytereef.org/m4_project.html|M4 Project|M4 Message Breaking Project]], designed to break three original Enigma messages intercepted in the North Atlantic in 1942. (The project gets its name from the four-rotor Enigma M4 machine presumed to have been used by the Germans to encipher the signals during wartime.) The project, which started in January 2006, succeeded in breaking the first two messages (the first one read in part "Forced to submerge during attack") within a couple of months, but is still working on No. 3. For more on this project, go [[xref:http://www.networkworld.com/news/2009/083109-nazi-enigma-messages.html|here]].
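    The real attack combines exhaustive searches over rotor settings with hill-climbing on plugboard settings, but the core brute-force idea can be sketched with a toy polyalphabetic cipher. This is an illustration only, not an Enigma simulation; `toy_encipher` and the crib word are invented for the example:

    ```python
    def toy_encipher(text, key):
        """Toy cipher over uppercase A-Z: the shift advances one step per
        letter, loosely like Enigma's fast rotor stepping."""
        return "".join(
            chr((ord(ch) - 65 + key + i) % 26 + 65) for i, ch in enumerate(text)
        )

    def brute_force(ciphertext, crib):
        """Try every starting position; keep keys whose decryption contains the crib."""
        hits = []
        for key in range(26):
            plain = "".join(
                chr((ord(ch) - 65 - key - i) % 26 + 65)
                for i, ch in enumerate(ciphertext)
            )
            if crib in plain:
                hits.append((key, plain))
        return hits

    ct = toy_encipher("FORCEDTOSUBMERGE", 7)
    print(brute_force(ct, "SUBMERGE"))  # recovers key 7 and the plaintext
    ```

    A real Enigma M4 has roughly 26^4 rotor positions times rotor orders times plugboard pairings, which is what makes thousands of volunteered CPUs worthwhile.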
  • MilkyWay@home


    The goal of [[xref:http://milkyway.cs.rpi.edu/milkyway/|MilkyWay@home]] is "creating a highly accurate three dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey. This project enables research in both astroinformatics and computer science." Among other things, the project is designed to help figure out how galaxies are formed. MilkyWay@Home is a joint effort between Rensselaer Polytechnic Institute's departments of Computer Science and Physics, Applied Physics and Astronomy.
  • [[xref:http://qcn.stanford.edu/index.php|Quake-Catcher Network]]

    This one is different from the other projects in that it doesn't really exploit processing power; instead, it uses the built-in accelerometers in laptops as a distributed seismograph. The idea is to provide a better understanding of earthquakes and to give early warning to schools, emergency response systems and others (a big emphasis of the project is getting K-12 science teachers involved). While desktop systems don't have accelerometers, they can be outfitted with USB-based sensors to partake in the network. The smallest earthquake detected by the network so far measured magnitude 3.1 in southern California, and the largest was a magnitude-6.4 quake in Japan. The project, which has about 1,000 sensors in action (though the number varies week to week), is the brainchild of researchers from the University of California, Riverside and Stanford University.
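    Quake-Catcher's own triggering code isn't described here, but a standard way seismic networks turn raw sensor samples into event triggers is the STA/LTA ratio: a short-term average of signal amplitude divided by a long-term background average. A minimal sketch, with window lengths and threshold chosen for illustration rather than taken from the project:

    ```python
    def sta_lta_triggers(samples, sta_len=5, lta_len=50, threshold=4.0):
        """Return sample indices where the short-term average amplitude
        jumps well above the long-term background average."""
        triggers = []
        for i in range(lta_len, len(samples)):
            sta = sum(abs(x) for x in samples[i - sta_len:i]) / sta_len
            lta = sum(abs(x) for x in samples[i - lta_len:i]) / lta_len
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)
        return triggers

    # quiet background noise, then a sudden shake
    trace = [0.01] * 100 + [1.0] * 10 + [0.01] * 10
    print(sta_lta_triggers(trace)[0])  # first trigger fires just after the shake begins
    ```

    In a real network, each machine would run a detector like this locally and report only triggers to a central server, which then looks for many machines triggering in the same area at the same time.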
  • GPUGrid.net


    [[xref:http://www.gpugrid.net/|GPUGrid.net|GPUGRID]] stands apart from a lot of older volunteer computing projects in that it relies on graphics processing units from NVIDIA graphics cards and PlayStation 3 systems. The project uses that processing power to perform simulations aimed at better understanding proteins and other molecular events.
  • AQUA@home


    This [[xref:http://aqua.dwavesys.com/|project|AQUA@home]], launched in 2008, is run by [[xref:http://www.dwavesys.com/|D-Wave|Welcome to D-Wave Systems]], a Canadian company trying to build a quantum computer. D-Wave's stated goal is "to predict the performance of superconducting adiabatic quantum computers on a variety of hard problems arising in fields ranging from materials science to machine learning". The company's current focus is trying to determine how an adiabatic quantum computer's running time scales with the size of the input problem, says Dr. Kamran Karimi, a D-Wave algorithms researcher. "We want to go to 200-qubit and 240-qubit problems," Karimi says.