Climate models don't prove anything. They are hypothesis-testing tools for understanding how the intrinsically complex system that is our climate works. As such, they must be tested against the available data. Gerald Meehl and colleagues at NCAR have conducted another such test, published in the October issue of the Journal of Climate.
It's frequently argued that the models are the best tools we have for understanding what might happen in the 21st century as we increase the level of greenhouse gases in the atmosphere, because the physical "experiment" would take a hundred years to run, with considerable risks riding on the outcome. But in fact, we're reaching the point where greenhouse gases and the other climate drivers have varied enough over the last 100 years that we can begin to run the experiment retrospectively, through a sort of hindcasting. That's what Meehl et al. have done in their latest Journal of Climate paper.
They plugged five climate forcings into the model, two natural (solar variability and the cooling effect of stuff spewed out by volcanoes) and three man-made (greenhouse gases, ozone, and sulfate aerosols). They then ran simulations with various combinations of these to see which could reasonably explain the variation actually seen in 20th-century climate records. The result:
The late-twentieth-century warming can only be reproduced in the model with anthropogenic forcing (mainly GHGs), while the early twentieth-century warming is mainly caused by natural forcing in the model (mainly solar). However, the signature of globally averaged temperature at any time in the twentieth century is a direct consequence of the sum of the forcings.
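That last point, that the temperature signature is the sum of the forcings, is a statement about (approximate) linearity. A toy sketch of the idea, using a linear one-box energy-balance model; every parameter value and forcing history below is made up for illustration and has nothing to do with the actual NCAR model or Meehl et al.'s numbers:

```python
# Toy one-box energy-balance model: C * dT/dt = F(t) - lambda * T.
# Because the model is linear in the forcing F, the temperature response
# to the combined forcing equals the sum of the individual responses.
# All numbers here are hypothetical, chosen only to mimic the rough shape
# of the 20th-century story (early solar rise, late GHG ramp, volcanic dips).

def response(forcing, lam=1.2, heat_cap=8.0, dt=1.0):
    """Euler-step the box model; returns a temperature anomaly per year (degC)."""
    temp, out = 0.0, []
    for f in forcing:
        temp += dt * (f - lam * temp) / heat_cap
        out.append(temp)
    return out

years = list(range(1900, 2001))
# Hypothetical forcing histories (W/m^2):
solar = [0.25 * min(1.0, (y - 1900) / 40) for y in years]       # early-century rise
ghg = [0.035 * max(0, y - 1950) for y in years]                 # late-century ramp
volcanic = [-2.0 if y in (1902, 1963, 1982, 1991) else 0.0 for y in years]

total = [s + g + v for s, g, v in zip(solar, ghg, volcanic)]

combined = response(total)
summed = [a + b + c for a, b, c in
          zip(response(solar), response(ghg), response(volcanic))]

# Additivity: run-with-everything matches the sum of the single-forcing runs.
assert max(abs(x - y) for x, y in zip(combined, summed)) < 1e-9
```

In this sketch, as in the paper's result, the late-century warming only shows up when the greenhouse ramp is included, and you can attribute it by differencing runs precisely because the responses add.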
And that provides increased confidence that the models have something useful to say about what might happen in the future.
Posted by John Fleck at October 09, 2004 10:34 AM