Home PCs Predict Hotter Earth

By Stephen Leahy
Wired News, Jan. 31, 2005

Global warming may ramp up average temperatures by as much as 20 degrees Fahrenheit in less than 50 years, according to the first climate-prediction experiment to rely on the distributed computing power of 90,000 personal computers. The startling results were published this week in the journal Nature.

The PCs, located in 150 countries, allowed British scientists to run more than 50,000 simulations of the future global climate, many more than the “best ever” 128 simulations using supercomputers, said Myles Allen, chief scientist of climateprediction.net and a physicist at Oxford University. A distributed-computing project, climateprediction.net involves several British universities and the Hadley Centre for Climate Prediction and Research.

Prior climate simulations have used “nothing close to the kind of computing power we were able to use in this experiment,” said Allen.

The climateprediction.net experiment was designed to find the range of possible 21st-century climate changes due to global warming. The results were both surprising and “very worrying,” he said.

The team found that global temperatures could rise between 4 and 20 degrees Fahrenheit if atmospheric greenhouse-gas concentrations double from pre-industrial levels. At current emission rates, that doubling is expected around 2050. Previous best estimates had put the increase at between 2 and 8 degrees Fahrenheit.
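For a sense of what that range means, the doubling scenario can be turned into a back-of-envelope calculation using the standard logarithmic approximation for CO2 radiative forcing (the 5.35 W/m² coefficient). This is a textbook shortcut, not the paper's own method, and the two sensitivity values below are illustrative choices that happen to bracket the reported range:

```python
import math

# Standard logarithmic approximation for CO2 radiative forcing
# (the widely used 5.35 W/m^2 coefficient) -- a textbook shortcut,
# not the climateprediction.net model itself.
def co2_forcing(c_ratio):
    """Radiative forcing (W/m^2) for a CO2 concentration ratio c/c0."""
    return 5.35 * math.log(c_ratio)

def warming_f(forcing_wm2, sensitivity):
    """Equilibrium warming in degrees Fahrenheit for a given climate
    sensitivity parameter (kelvin per W/m^2)."""
    return forcing_wm2 * sensitivity * 9.0 / 5.0

f2x = co2_forcing(2.0)      # forcing from a doubling, about 3.7 W/m^2
for lam in (0.6, 3.0):      # illustrative low and high sensitivities
    print(f"lambda={lam}: {warming_f(f2x, lam):.1f} F")
    # prints roughly 4.0 F and 20.0 F, matching the reported range
```

The point of the sketch is that the huge spread in predicted warming comes almost entirely from uncertainty in the sensitivity parameter, which is exactly what the ensemble experiment set out to pin down.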

The massive personal-computer data crunch revealed that the Earth’s climate may be much more sensitive to greenhouse gas emissions than scientists previously believed. Even if all man-made greenhouse gas emissions ended today, a high risk of serious climate-related problems still exists, said Allen.

“There is real urgency here. We need to explore the uncertainties to rule out the possibilities of an extreme temperature rise,” he said.

Climateprediction.net is based on the successful SETI@home public-computing model. SETI, or Search for Extra-Terrestrial Intelligence, involves several million people who have downloaded software that analyzes data from distant galaxies for signs of alien life. The program does its work while their computers are idle and then uploads the results every few days or weeks.
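The fetch-crunch-upload cycle described above can be sketched in a few lines. Everything here is a stand-in: the server and work-unit classes are hypothetical mock objects, not the real SETI@home or BOINC APIs.

```python
import time

class _FakeUnit:
    """Stand-in work unit: three compute steps, then a result."""
    def __init__(self):
        self.steps_left = 3
    def done(self):
        return self.steps_left == 0
    def step(self):
        self.steps_left -= 1
    def result(self):
        return "simulated-climate-output"

class _FakeServer:
    """Stand-in project server (hypothetical, not a real API)."""
    def __init__(self):
        self.uploaded = []
    def fetch_work_unit(self):
        return _FakeUnit()
    def upload(self, result):
        self.uploaded.append(result)

def run_one_unit(server, is_idle, poll_seconds=0.01):
    """Fetch a unit, crunch it only while the machine is idle,
    then upload: the cycle described in the paragraph above."""
    unit = server.fetch_work_unit()
    while not unit.done():
        if is_idle():
            unit.step()
        else:
            time.sleep(poll_seconds)  # yield to the user's own work

    server.upload(unit.result())

server = _FakeServer()
run_one_unit(server, is_idle=lambda: True)
print(server.uploaded)  # the finished result reaches the project server
```

The real clients are far more elaborate (checkpointing, screensaver graphics, network retry), but the shape of the loop is the same: the science happens only in cycles the owner isn't using.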

Not long after SETI@home was launched in 1999, Allen thought something similar could be used to meet the exponentially growing need for computer power to run increasingly complex global climate models. The models try to simulate as many climate factors as possible, including incoming and outgoing radiation, the way air moves, how clouds form and precipitation falls, the way ice sheets grow or shrink and so on.

It proved difficult to adapt the SETI@home model until the Berkeley Open Infrastructure for Network Computing, or BOINC, was released last year, according to David Anderson, director of the SETI@home project at the Space Sciences Laboratory at the University of California at Berkeley. Anderson and his colleagues developed BOINC to let users participate in many internet computing projects and tell their computers how much time to devote to each.
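That per-project time allocation can be sketched as a proportional-share rule: always work on the project furthest behind the share its owner promised it. This is a loose simplification of what BOINC does, not its actual scheduler, and the project names and numbers are illustrative.

```python
def pick_project(shares, cpu_time_used):
    """Pick the project furthest behind its promised resource share.
    A simplified sketch of BOINC-style allocation, not the real
    BOINC scheduler. shares maps project -> promised fraction;
    cpu_time_used maps project -> seconds actually received."""
    total = sum(cpu_time_used.values()) or 1.0
    # "debt" = promised fraction minus fraction actually received
    debt = {p: shares[p] - cpu_time_used.get(p, 0.0) / total
            for p in shares}
    return max(debt, key=debt.get)

shares = {"climateprediction.net": 0.75, "SETI@home": 0.25}
used = {"climateprediction.net": 40.0, "SETI@home": 60.0}
print(pick_project(shares, used))  # climateprediction.net is behind
```

Run repeatedly, a rule like this drives each project's actual share of CPU time toward the fraction the user asked for.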

Now, in addition to climate modeling, several other science projects are taking advantage of the enormous computing power of the world’s 200 million-plus PCs, very few of which are being tapped, said Anderson.

Climate simulations are ideal for distributed computing because so many simulations need to be run in parallel to test all the variables, he said. When a participating PC is idle, it chugs away on a simulation and after two or three months, it uploads the results to the climateprediction.net server.
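What makes this workload so parallel is that each volunteer PC runs one complete simulation with its own slightly perturbed model parameters. A minimal sketch of generating such an ensemble, with parameter names and values chosen for illustration rather than taken from the actual experiment:

```python
import itertools

def make_work_units(param_grid):
    """Yield one simulation configuration per combination of
    perturbed model parameters: the kind of sweep that makes
    climate ensembles embarrassingly parallel."""
    names = sorted(param_grid)
    for values in itertools.product(*(param_grid[n] for n in names)):
        yield dict(zip(names, values))

# Illustrative parameters and scaling factors, not the real list.
grid = {
    "cloud_ice_fall_speed": [0.5, 1.0, 2.0],
    "entrainment_rate": [0.6, 1.0, 1.5],
    "ocean_heat_diffusivity": [0.5, 1.0, 2.0],
}
units = list(make_work_units(grid))
print(len(units))  # 27 independent simulations, one per volunteer PC
```

Because no simulation ever needs to talk to another, the server can hand each configuration to a different PC and simply wait for the results to trickle back over the following months.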

However, no U.S. climate researchers are currently using distributed computing to run U.S.-designed global climate models, he said. Some have tried to get funding but were turned down, even though setup costs are only about $200,000. Instead, investments in super-expensive supercomputers such as IBM's ASCI White at Lawrence Livermore National Laboratory, as well as the forthcoming ASC Purple and Blue Gene/L, which have a combined contract price of $290 million, are touted as the technology needed for climate predictions.

“We can do the same thing on a shoestring with public computing,” said Anderson.

While Blue Gene/L, if it works later this year as billed, will have the processing power of 400,000 PCs, Anderson and Allen hope 400,000 or more people will join climateprediction.net. Windows, Linux and Mac users are welcome; the modest technical requirements are listed on the group’s website.

“With more people, we can get results faster. Instead of taking six months, it might only take one,” said Anderson.

Much more computing work must be done to improve future climate predictions, Allen acknowledges. “We don’t know how likely or unlikely (it is) that temperatures will hit the top end of the range,” he said. Later this year, simulations using new global climate models with better data from the world’s oceans will be available on climateprediction.net.

How quickly that work proceeds is up to computer users around the world, who can choose to download BOINC and participate in any of the affiliated internet computing projects.

“(Climateprediction.net) is the one that runs on my computer,” said Anderson.

First published Jan. 31, 2005 on Wired News

Contact: writersteve AT gmail . com (no spaces)
