Saturday, August 27, 2011

Natural thermometers

Yesterday I gave a talk on communicating with non-scientists about climate change at the 2011 Stephen Schneider Symposium in Boulder. (My slides from the talk.) Preparing the talk got me thinking about metaphors, images, and ideas that can convey the nature and severity of climate change to people who aren't experts and don't know what to make of numbers and graphs. Today's post is the first in a series on this.

In 2007, the Extreme Ice Survey set up 38 time-lapse cameras on 22 glaciers around the world. These are mostly very cold, remote places, so the cameras are automatic.

Just four years later, the results are astonishing.

Some of the cameras had to be moved, multiple times, as the glacier faces retreated miles upstream. We are talking about colossal rivers of ice that took centuries to form diminishing, and even disappearing completely, in the space of a few decades.

Here's a 1-minute video of Alaska's Columbia Glacier (Kadin camera site), 2007-2010:


AK-01 Columbia Kadin Narrated from Extreme Ice Survey on Vimeo.

Since the 1990s, 95 percent of the world's glaciers have been in retreat. Better than any human instrument, they show directly, and dramatically, the state of planetary warming and the difference just a couple of degrees can make.

Passing through the Denver airport, I saw videos from the Extreme Ice Survey playing on the intra-terminal train platform. What a great way to put this out there for everyone to see!

Monday, May 2, 2011

Tornadoes and climate change

One of my friends wanted to know what I thought about tornadoes and global warming. Here's my basic line; this applies to any kind of extreme weather.

Temperature is a measure of energy. So higher average temperatures mean that there's more energy in the global system. The function of the climate system is to balance incoming solar radiation and outgoing heat. It does that by moving energy from the equator (where the most radiation comes in) to the poles (where the most heat goes out). Winds — including the winds that cause tornadoes — are one manifestation of moving energy.
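The balance between incoming solar radiation and outgoing heat can be made concrete with a standard textbook calculation. The sketch below is a back-of-the-envelope estimate (not something from the post above): it equates absorbed sunlight with blackbody emission to get Earth's effective temperature. The solar constant and albedo are round values from introductory climate texts.

```python
# Back-of-the-envelope planetary energy balance:
# absorbed solar = emitted infrared  =>  S*(1-a)/4 = sigma * T^4
S = 1361.0        # solar constant, W/m^2 (approximate)
a = 0.30          # planetary albedo (approximate)
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4

# The factor of 4: sunlight is intercepted by a disk (pi*r^2),
# but Earth radiates from its whole spherical surface (4*pi*r^2).
T_eff = (S * (1 - a) / (4 * sigma)) ** 0.25
print(f"Effective temperature: {T_eff:.0f} K ({T_eff - 273.15:.0f} C)")
```

This gives roughly 255 K, about 33 degrees colder than the observed surface average near 288 K; greenhouse gases account for the difference, which is why adding more of them loads extra energy into the system.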

Now - you can never attribute any particular weather event, or even a series of apparently related events like these tornadoes, directly to climate change, because weather is so highly variable and chaotic. Only changes in averages are actually climatic.

The relationship between tornadoes and climate is especially hard to analyze since they're highly local phenomena (unlike hurricanes, which occur on a much larger physical scale). This year's tornado series may or may not be the beginning of a trend — we don't know, and we can't know for a number of years. On the other hand, it's certainly plausible that they are part of a trend. Same thing for hurricanes, snowstorms, and other extreme phenomena: we don't know for sure yet, but there's an obvious intuitive connection between higher numbers of more energetic events and increasing energy in the overall climate system.

Laura Gottesdiener did a nice report on this at Huffington Post Green.  Here's an extract:
A 2008 report from the U.S. Global Change Research Program, a federal interagency research program overseen by the White House Office of Science and Technology Policy, found that more greenhouse gases in the atmosphere could lead to an increase in severe storm conditions that make tornadoes possible. "We can’t say there is a correlation between a specific tornado and global change," said program director Thomas Armstrong. "But the reports do indicate that there is a positive correlation between climate change and the frequency of conditions favorable to the formation" of tornadoes, he said, while stressing that the research is still preliminary.
Sounds about right to me. Basically, if you like making bets on long odds, this could be a good one to get in on.

Tuesday, March 22, 2011

Climate smackdown: data versus models?

For decades, skeptics have tried to boil every climate change debate down to this: good, hard data vs. bad, fuzzy models. This caricature lets them attack every model-based climate projection while waiting for data to confirm the reality of human-induced global warming.

Now, here we go again. On February 10, the Wall Street Journal’s professional global warming skeptic, Anne Jolis, trumpeted recent data showing that certain global climate patterns haven’t changed much since 1871. “The weather isn’t getting weirder. The latest research belies the idea that storms are getting more extreme," she wrote, to loud applause from other skeptics. “Another nail in the coffin of anthropogenic global warming,” crowed one.

Less than a week later, the scientific journal Nature presented two papers suggesting that human greenhouse gas emissions have increased the likelihood of heavy rains. One linked the devastating UK floods of the year 2000 to global warming [see it here]. The other identified greenhouse gas emissions as a likely contributor to increases in exceptionally heavy “precipitation events” across the northern hemisphere [see it here]. These results confirm the obvious: a warmer climate leads to more evaporation, and hence more precipitation overall. But they went further, predicting where this extra precipitation would occur and linking it specifically to human emissions.
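The "warmer climate leads to more evaporation" step has a standard quantitative rule of thumb behind it: by the Clausius-Clapeyron relation, the atmosphere's moisture-holding capacity rises roughly 6-7 percent per degree Celsius of warming. The sketch below illustrates that rule using Bolton's empirical formula for saturation vapor pressure; it is a generic textbook calculation, not anything taken from either Nature paper.

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Saturation vapor pressure in hPa (Bolton 1980 empirical approximation)."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# How much more water vapor can air hold per extra degree of warmth?
for t in (10.0, 20.0, 30.0):
    increase = saturation_vapor_pressure(t + 1) / saturation_vapor_pressure(t) - 1
    print(f"{t:.0f} C -> {t + 1:.0f} C: +{100 * increase:.1f}% water vapor capacity")
```

More moisture in the air means more water available to fall out in any given storm, which is the physical intuition behind expecting heavier extreme precipitation in a warmer world.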

Both Nature papers relied on climate models, computer simulations of the global atmosphere. When researchers left human greenhouse-gas emissions out of these simulations — simulating the climate as it might be without industrial societies — their models projected fewer heavy precipitation events than observed. When they put them back in, the likelihood of intense precipitation went up.
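The attribution logic described above can be sketched in a few lines: generate many simulated "years" with and without human emissions, count how often precipitation exceeds some extreme threshold in each ensemble, and compare the two probabilities. Everything in the sketch below is invented for illustration, including the threshold, ensemble size, and the Gaussian stand-ins for model output; the real studies used full climate-model ensembles, not these toy distributions.

```python
import random

random.seed(42)

THRESHOLD = 95.0   # hypothetical "extreme precipitation" threshold (mm/day)
N_RUNS = 10_000    # illustrative ensemble size

# Synthetic stand-ins for two model ensembles: the "with emissions" world
# is slightly wetter on average (shifted mean, same spread).
natural_only = [random.gauss(70, 12) for _ in range(N_RUNS)]
with_emissions = [random.gauss(76, 12) for _ in range(N_RUNS)]

p_natural = sum(x > THRESHOLD for x in natural_only) / N_RUNS
p_forced = sum(x > THRESHOLD for x in with_emissions) / N_RUNS

print(f"P(extreme) without emissions: {p_natural:.3f}")
print(f"P(extreme) with emissions:    {p_forced:.3f}")
print(f"Risk ratio: {p_forced / p_natural:.1f}x")
```

Note how a modest shift in the average produces a large change in the odds of exceeding a fixed extreme threshold; that is why attribution studies focus on changed probabilities rather than on blaming individual events.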

The skeptic response? “No real data supporting their claims,” one wrote on Andy Revkin's Dot Earth blog. “Just climate models. GIGO [garbage in, garbage out].”

It’s a familiar refrain in the climate change wars. Climate models, goes the tune, are insubstantial fantasies. Tweak their knobs and you can make ‘em say anything. Climate data, on the other hand, are solid, substantial. “Sound science” equals “data, not models.”

But wait — about those data that made Jolis so happy... where exactly did they come from? Here’s a hint: the investigators were awarded over 3 million hours of supercomputer time to do their work. The project is called the 20th-Century Reanalysis Project (20CR, for short). “Reanalysis” is a technique for re-processing past weather data into a consistent climate dataset.

Here's the paper that got her so jazzed. 20CR began with a comprehensive collection of surface pressure readings covering the period 1871-2008. The project then spent some of those millions of supercomputer hours to pipe those data through a computer forecast system — a simulation model.

That forecast model uses a 3-dimensional grid to represent the atmosphere. The grid mesh contains well over 1 million points, and every one of those points must be assigned a value. Yet the surface pressure readings used as input came from a relative handful of locations — for 1871, the study’s first year, only 62 land stations worldwide. In reanalyses of this type, the vast majority of data are calculated by the forecast model, not measured by instruments.
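The arithmetic of that sparsity is worth spelling out. The grid dimensions below are illustrative round numbers for a reanalysis-style model (roughly 2-degree spacing, 28 vertical levels, a handful of variables); treat them as assumptions for the sketch, not specifications from the 20CR paper.

```python
# Illustrative round numbers (assumptions, not figures from the 20CR paper):
LAT_POINTS = 91     # ~2-degree spacing, pole to pole
LON_POINTS = 180    # ~2-degree spacing around each latitude circle
LEVELS = 28         # vertical levels
VARIABLES = 4       # e.g. pressure, temperature, two wind components

grid_values = LAT_POINTS * LON_POINTS * LEVELS * VARIABLES
observed_values = 62   # land stations reporting surface pressure in 1871

print(f"Grid values to fill:        {grid_values:,}")
print(f"Observed values:            {observed_values}")
print(f"Fraction directly observed: {observed_values / grid_values:.2e}")
```

On numbers like these, fewer than one grid value in ten thousand is pinned down by an instrument; the model interpolates and extrapolates everything else.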

So the “data” that had Jolis gloating were in fact largely generated by a computer simulation — the same type of model (though not the same model) used in the Nature studies. According to some skeptics’ own tenets, then, the 20CR data can’t be much more than a scientific fantasy.

True? Of course not. Getting a scientific grip on something as big and complicated as the global atmosphere simply can’t happen without computer modeling. Today, every credible global dataset, without exception, is processed, filtered, corrected, and/or partially generated by computer models. Those who think it’s data versus models — hard evidence vs. squishy algorithms — are living in a long-vanished world, where “science” meant laboratory experiments on highly simplified systems.

So who’s right? Are human greenhouse emissions altering the chances of extreme weather, as the Nature papers suggest? Or does natural climate variability remain unchanged, as 20CR seems to show? Unfortunately, that question can’t yet be answered, because 20CR and the Nature studies addressed different climate patterns that can’t be directly compared. One thing is sure, though: it’s going to take both observations and computer models to find out. Everything we know about the climate — past, present, and future — depends upon our ability to simulate its operation.

The idea that it’s “models bad, data good” just won’t work. We can’t let the skeptics set the terms of the debate. They don’t even understand what the terms mean.

Coda:

As for the 20CR scientists, they responded to Jolis on Feb. 23. Mild-mannered creatures that they are, they wrote that her opinion "does not accurately reflect our views."
As for the statement that the Twentieth Century Reanalysis Project... shows 'little evidence of an intensifying weather trend': We did not look at weather specifically, but we did analyze three weather and climate-related patterns that drive weather, including the North Atlantic Oscillation. And while it is true that we did not see trends in the strength of these three patterns, severe weather is driven by many other factors.

The lack of a trend in these patterns cannot be used to state that our work shows no trend in weather. Many researchers have found evidence of trends in storminess and extreme temperature and precipitation in other weather data over shorter periods.

Finally, the article notes that the findings are 'contrary to what models predict.' But models project forward, while our analysis looked back at historical observations. We see no conflict between the 100-year-projection of changes in weather extremes resulting from additional carbon dioxide and the fact that our look back at three indicators showed no historical trend.
What they fail to point out is that their analysis is itself the product of a model.

Tuesday, March 8, 2011

Fact-free science

Been meaning to write a post about Judith Warner's great piece "Fact-Free Science," in the NY Times Magazine of February 27.
“This is our generation’s Sputnik moment,” President Obama declared in his State of the Union address last month.

It would be easier to believe in this great moment of scientific reawakening, of course, if more than half of the Republicans in the House and three-quarters of Republican senators did not now say that the threat of global warming, as a man-made and highly threatening phenomenon, is at best an exaggeration and at worst an utter “hoax,” as James Inhofe of Oklahoma, the ranking Republican on the Senate Environment and Public Works Committee, once put it. These grim numbers, compiled by the Center for American Progress, describe a troubling new reality: the rise of the Tea Party and its anti-intellectual, anti-establishment, anti-elite worldview has brought both a mainstreaming and a radicalization of antiscientific thought.
In a 2008 Gallup Poll, 63 percent of Americans said they thought the effects of global warming were already visible. In 2010, that number went down to 53 percent. It's now an article of faith in the Republican Party that climate change is either not happening, or not caused by human activity.

For the last couple of months, I've been subscribing to Google News Alerts on the terms "global warming" and "climate change." If you want to get depressed, do this; I would say that at least 40 percent of this "news" consists of articles skeptical of anthropogenic global warming. These now come from all over the world, with a substantial percentage emanating from India. Almost none of these articles cite new scientific findings; they recycle the same old, long-debunked claims about cosmic rays, solar influence, Little Ice Age "recovery" (an idea with no physical basis at all), and so on.

To avoid having to shoot yourself after reading this stuff, look at skepticalscience.com, the best clear, simple, well-evidenced source for evaluation of skeptic claims.

Interview with me in Rorotoko

The intellectual book review Rorotoko published an online "interview" with me yesterday morning (March 7).

My attempt to boil down some of the main points of A Vast Machine to a conversational format.

Friday, February 18, 2011

Data.Rescue@Home

I knew this was coming: crowdsourcing climate data.
Data.Rescue@Home is an internet-based attempt to digitize historical weather data from all over the globe and make the digitised data available to everybody. Two projects are currently online: German radiosonde data from the Second World War and meteorological station data from Tulagi (Solomon Islands) for the first half of the 20th century.
You log in, look at a scanned image of an old weather record, and enter the data as numbers on a form. 

Not much progress yet. Up and running since October 2010, but only about 150 of 2000 scanned images have been coded. Where are the masses when you need them?

New climate variability results: models and data, again

The New York Times reported yesterday on two new Nature papers on climate change (linking extreme precipitation events to anthropogenic global warming through computer simulation), which are sure to stir up the debates again.

Meanwhile, a few weeks ago the 20th-Century Reanalysis Project reported on recent results of the longest-term weather data reanalysis project yet, collecting every scrap of available weather data from 1871-2008 and running them through a weather forecast model to "fill in the blanks" for what's missing.

A salient finding from this study: changes in the North Atlantic Oscillation (also see the North Atlantic Oscillation theme site) appear to be driven throughout the study period primarily by natural variability. In other words, the reanalysis isn't seeing an effect of global warming on variability in the NAO.

The reanalysis data go back to 1871 — but as they go back in time, they get thinner and thinner. Most data prior to the 1950s are from the surface only. The reanalysis model fills in the missing data. So the large majority of data in the pre-1950s reanalysis are created by the model.

Meanwhile, the Nature studies are looking at an entirely different kind of variability, i.e. frequency of extreme precipitation events in the UK (one study) and the Northern Hemisphere (the second study). (It's worth jumping to the actual articles from the links given on the Nature news page.) These studies compare observational data with results from simulation models with and without anthropogenic forcing (i.e. greenhouse gases and other human influences on climate). The results: (a) natural variability alone can't account for the increased northern hemisphere precipitation in the second half of the 20th century, and (b) anthropogenic factors, added to the simulation models, doubled the risk of the floods experienced in the UK in 2000.

This, combined with the comments on the two Nature pieces, makes for a lovely skeptic paradox. The skeptics are very happy with the results from the model-driven reanalysis data which (they think) confirm their views. (Another nail in the coffin of AGW, one wrote.) But they roundly reject the idea that simulation models could explain the significant increase in extreme precipitation.

By the way, Piers Corbyn, mentioned in the Kevin Crean comment on the Nature news page, runs a commercial long-term weather prediction service in the UK using his own "solar/lunar" model, whose details he will not reveal and which has never been peer reviewed. He's had some notable successes in forecasting major storms long in advance (months). He places bets on his own forecasts (and sometimes wins). He's a skeptic in the Christopher Monckton vein. (Monckton, by the way, claims to be a hereditary member of the House of Lords, but the Lords are having none of it.)

I'm going to be working on an op-ed about this over the weekend. Comments welcome.