Friday, February 18, 2011


I knew this was coming: crowdsourcing climate data.
Data.Rescue@Home is an internet-based attempt to digitize historical weather data from all over the globe and make the digitized data available to everybody. Two projects are currently online: German radiosonde data from the Second World War and meteorological station data from Tulagi (Solomon Islands) for the first half of the 20th century.
You log in, look at a scanned image of an old weather record, and enter the data as numbers on a form. 

Not much progress yet: the site has been up and running since October 2010, but only about 150 of the 2,000 scanned images have been coded. Where are the masses when you need them?

New climate variability results: models and data, again

The New York Times reported yesterday on two new Nature papers on climate change, which link extreme precipitation events to anthropogenic global warming through computer simulation and are expected to stir up the debate again.

Meanwhile, a few weeks ago the 20th-Century Reanalysis Project reported results from the longest-term weather-data reanalysis yet, collecting every scrap of available weather data from 1871 to 2008 and running it through a weather forecast model to "fill in the blanks" for what's missing.

A salient finding from this study: changes in the North Atlantic Oscillation (also see the North Atlantic Oscillation theme site) appear to be driven throughout the study period primarily by natural variability. In other words, the reanalysis isn't seeing an effect of global warming on variability in the NAO.

The reanalysis data go back to 1871 — but as they go back in time, they get thinner and thinner. Most data prior to the 1950s are from the surface only. The reanalysis model fills in the missing data. So the large majority of data in the pre-1950s reanalysis are created by the model.

Meanwhile, the Nature studies are looking at an entirely different kind of variability, i.e. frequency of extreme precipitation events in the UK (one study) and the Northern Hemisphere (the second study). (It's worth jumping to the actual articles from the links given on the Nature news page.) These studies compare observational data with results from simulation models with and without anthropogenic forcing (i.e. greenhouse gases and other human influences on climate). The results: (a) natural variability alone can't account for the increased northern hemisphere precipitation in the second half of the 20th century, and (b) anthropogenic factors, added to the simulation models, doubled the risk of the floods experienced in the UK in 2000.
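The "doubled the risk" result can be made concrete with the fraction-of-attributable-risk calculation commonly used in event-attribution studies: compare the probability of exceeding some flood threshold in a natural-only ensemble versus an ensemble with anthropogenic forcing. A minimal sketch — the distributions and threshold below are invented illustrative numbers, not the papers' data:

```python
import random

random.seed(42)

# Toy ensembles of seasonal rainfall (mm): a "natural-only" world and a
# world with anthropogenic forcing shifting the distribution slightly.
# Both distributions and the flood threshold are hypothetical.
natural = [random.gauss(500, 80) for _ in range(100_000)]
anthro = [random.gauss(530, 80) for _ in range(100_000)]
threshold = 650  # hypothetical flood-triggering rainfall

p_nat = sum(x > threshold for x in natural) / len(natural)
p_ant = sum(x > threshold for x in anthro) / len(anthro)

risk_ratio = p_ant / p_nat
far = 1 - p_nat / p_ant  # fraction of attributable risk

print(f"P(exceed | natural only)  = {p_nat:.4f}")
print(f"P(exceed | anthropogenic) = {p_ant:.4f}")
print(f"risk ratio = {risk_ratio:.2f}, FAR = {far:.2f}")
```

A risk ratio of 2, as reported for the UK floods, corresponds to a fraction of attributable risk of 0.5: in that framing, half the risk of the event is attributable to the anthropogenic forcing.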

This, combined with the comments on the two Nature pieces, makes for a lovely skeptic paradox. The skeptics are very happy with the results from the model-driven reanalysis data which (they think) confirm their views. (Another nail in the coffin of AGW, one wrote.) But they roundly reject the idea that simulation models could explain the significant increase in extreme precipitation.

By the way, Piers Corbyn, mentioned in the Kevin Crean comment on the Nature news page, runs a commercial long-term weather prediction service in the UK using his own "solar/lunar" model, whose details he will not reveal and which has never been peer reviewed. He has had some notable successes forecasting major storms months in advance. He places bets on his own forecasts (and sometimes wins). He's a skeptic in the Christopher Monckton vein. (Monckton, by the way, claims to be a hereditary member of the House of Lords, but the Lords are having none of it.)

I'm going to be working on an op-ed about this over the weekend. Comments welcome.