Forecasting a warming world via thousands of PCs
Citizen-scientists, volunteers donate computing time for climate research
Computerworld - Cloud computing can be expensive if you need tens of thousands of CPU hours. As an alternative, some scientists are using low-cost resources, such as home computers, to conduct research.
For example, thousands of PCs played a role in a recently released study that found global temperatures may be rising faster than expected: by 2.5 to 5.4 degrees Fahrenheit by 2050.
As many as 50,000 PCs were used to run thousands of simulations to help researchers examine various changes to a climate model originally written for a high-performance computing (HPC) system but adapted to run on home computers.
Daniel Rowlands, a climate scientist at Oxford University and the study's lead author, said the research effort took about 5,000 CPU years of compute time. A CPU year is an approximate measure of the amount of work one CPU can accomplish in a year; it doesn't account for differences in processing capability, but it does give an idea of the scale of the effort.
Rowlands said it could have cost more than $1 million to do the work on a public, commercial cloud. Instead, the researchers did their work on ClimatePrediction.net, which uses the Berkeley Open Infrastructure for Network Computing (BOINC) framework for distributed computing. It's the same framework used by the Search for Extraterrestrial Intelligence, or SETI@Home, system. The SETI project is the largest such system, with more than 3 million registered hosts, according to BoincStats.com.
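For a sense of what that cost estimate implies, here is a rough back-of-the-envelope conversion of the reported 5,000 CPU years into CPU hours and dollars. The hourly cloud rate used below is an assumption for illustration only, not a figure from the article:

```python
# Rough, illustrative arithmetic (not from the study itself): converting
# the reported ~5,000 CPU years into CPU hours, then estimating cloud cost
# at an ASSUMED price per CPU hour.

HOURS_PER_YEAR = 365 * 24   # 8,760 hours in a non-leap year

cpu_years = 5_000           # figure reported by Rowlands
cpu_hours = cpu_years * HOURS_PER_YEAR

assumed_rate_usd = 0.05     # hypothetical $/CPU-hour; real prices varied widely

estimated_cost = cpu_hours * assumed_rate_usd

print(f"CPU hours: {cpu_hours:,}")                       # CPU hours: 43,800,000
print(f"Estimated cloud cost: ${estimated_cost:,.0f}")   # Estimated cloud cost: $2,190,000
```

At that assumed rate the total lands comfortably above the $1 million mark Rowlands cited, which is consistent with his "more than $1 million" estimate.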
ClimatePrediction has more than 500,000 registered hosts, although the number of active hosts is about 33,000. The registered figure is the total number of machines that have ever signed up; the active figure is the number of machines running calculations at any given time.
Rowlands said ClimatePrediction is the only distributed network for climate change research, although NASA is developing its own distributed computing platform called Climate@Home.
Similar to ClimatePrediction, NASA's system will eventually enable volunteers to run climate simulations on their home computers during idle times. The NASA effort, announced more than a year ago, is in beta testing and is not publicly available. The release date has not been announced.
The BOINC framework has been widely adopted among researchers. Other projects built on it include Rosetta@home, which investigates protein folding; PrimeGrid, which conducts mathematical research; and MilkyWay@home, which creates 3D models of our galaxy.
These efforts are called citizen science or volunteer computing. "There are certain types of problems that the public wants to participate in," said Rom Walton, BOINC release manager.
The BOINC software is written in C and C++, and all of its components assume a SQL database back end. There are no minimum PC requirements; Walton said some participants run the software on Pentium III processors, which were introduced in 1999.
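The overall pattern that frameworks like BOINC implement is simple: a central server hands out independent work units, and volunteer machines fetch them during idle time, compute, and report results back. The following is a minimal Python sketch of that pattern; the class and function names are invented for clarity, and real BOINC is C/C++ with a SQL back end and a far more elaborate scheduler:

```python
# Illustrative sketch only: names and structure here are invented to show
# the volunteer-computing pattern, not actual BOINC code.
from collections import deque
from dataclasses import dataclass

@dataclass
class WorkUnit:
    id: int
    params: dict  # e.g. perturbed climate-model parameters

class ProjectServer:
    """Hands out work units and collects results (the SQL-backed server's role)."""
    def __init__(self, units):
        self.pending = deque(units)
        self.results = {}

    def fetch_work(self):
        return self.pending.popleft() if self.pending else None

    def report(self, unit_id, result):
        self.results[unit_id] = result

def run_model(params):
    # Stand-in for the adapted climate model: a trivial computation.
    return params["seed"] * 2

def volunteer_client(server):
    """One volunteer PC: repeatedly fetch work while idle, compute, report back."""
    while (unit := server.fetch_work()) is not None:
        server.report(unit.id, run_model(unit.params))

server = ProjectServer(WorkUnit(i, {"seed": i}) for i in range(4))
volunteer_client(server)
print(server.results)  # {0: 0, 1: 2, 2: 4, 3: 6}
```

Because each work unit is independent, thousands of clients can run in parallel with no coordination beyond the fetch/report cycle, which is what makes the model a good fit for ensembles of climate simulations.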
Climate applications that run on HPC systems can produce simulations at a much higher resolution than those delivered via a PC. They may be able to model, for instance, the climate changes for a city. But the research in Rowlands' paper, which was published this week in the journal Nature Geoscience, was focused on continentwide changes across the globe, and the PCs allowed it to run thousands more simulations than would otherwise have been practical.
The temperature rises projected in this research were compared with temperatures from 1961 through 1990. The increase is within the range of warming predicted by the Intergovernmental Panel on Climate Change, but the study concludes that warming may be higher than earlier estimates if nothing is done to mitigate greenhouse gases.
"We are completely indebted to our volunteers," Rowlands said of the effort.
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to Patrick's RSS feed . His email address is firstname.lastname@example.org.