Global Warming More Moderate Than Worst-Case Models

That’s the headline from a press release earlier this week from Duke University’s highly regarded Nicholas School of the Environment. Here’s the lede:

DURHAM, N.C. – A new study based on 1,000 years of temperature records suggests global warming is not progressing as fast as it would under the most severe emissions scenarios outlined by the Intergovernmental Panel on Climate Change (IPCC).

“Based on our analysis, a middle-of-the-road warming scenario is more likely, at least for now,” said Patrick T. Brown, a doctoral student in climatology at Duke University’s Nicholas School of the Environment. “But this could change.”

“This could change”? I thought everything was settled!  97 percent and all that.  And why might it change?

On closer inspection it appears the underlying study, “Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise,” published in Nature, is actually an attempt to minimize the potential consensus-wrecking effect of the current temperature pause.  But the concessions necessary to preserve The Climate Narrative are rather significant.

The abstract is typically opaque (the complete study is almost impossible to decode, as it is a very thick statistical analysis), and I’ll take a longer stab at decoding it below, but the nub of it is this:

We find that the empirical EUN [Envelope of Unforced Noise] is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate-of-increase of the forced signal. The empirical EUN also indicates that the reduced GMT [Global Mean Surface Air Temperature] warming over the past decade or so is still consistent with a middle emission scenario’s forced signal, but is likely inconsistent with the steepest emission scenario’s forced signal.

In other words, natural climate variability is bigger than we thought, and may not be adequately accounted for in current climate models.

So this appears to be mixed news for the climatistas. On the one hand, it suggests the current “pause” of the last 18 years is not inconsistent with greenhouse gas “forcing,” because, as the press release says, natural “wiggles” can overwhelm greenhouse gas effects:

Under the IPCC’s middle-of-the-road scenario, there was a 70 percent likelihood that at least one hiatus lasting 11 years or longer would occur between 1993 and 2050, Brown said.  “That matches up well with what we’re seeing.” . . .

But stay tuned:

“Statistically, it’s pretty unlikely that an 11-year hiatus in warming, like the one we saw at the start of this century, would occur if the underlying human-caused warming was progressing at a rate as fast as the most severe IPCC projections,” Brown said. “Hiatus periods of 11 years or longer are more likely to occur under a middle-of-the-road scenario.”
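The logic behind that claim can be made concrete with a toy simulation. The sketch below is not the study’s method; it just layers a steady forced warming trend on top of autocorrelated “unforced noise” and asks how often an 11-year span with a flat-or-negative trend (a “hiatus”) shows up anyway. All the numbers (trend rates, noise size, persistence) are made-up illustrative values, not anything from the paper:

```python
import random

random.seed(42)

def hiatus_fraction(trend_per_year, n_runs=2000, years=58, window=11,
                    noise_sd=0.1, ar1=0.5):
    """Fraction of simulated temperature records (1993-2050, i.e. 58 years)
    containing at least one `window`-year span with a non-positive
    least-squares trend -- a "hiatus" despite steady forced warming."""
    hits = 0
    for _ in range(n_runs):
        # Forced signal (linear trend) plus AR(1) unforced noise.
        noise = 0.0
        series = []
        for t in range(years):
            noise = ar1 * noise + random.gauss(0.0, noise_sd)
            series.append(trend_per_year * t + noise)
        # Scan every window for a non-positive OLS slope.
        for start in range(years - window + 1):
            ys = series[start:start + window]
            xs = range(window)
            xbar = sum(xs) / window
            ybar = sum(ys) / window
            slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
                     / sum((x - xbar) ** 2 for x in xs))
            if slope <= 0:
                hits += 1
                break
    return hits / n_runs

# Assumed forced warming rates, deg C per year: "middle" vs. "steep".
print(hiatus_fraction(0.02))  # middle scenario: hiatuses fairly common
print(hiatus_fraction(0.04))  # steep scenario: hiatuses much rarer
```

Run it and the middle-scenario fraction comes out well above the steep-scenario one, which is the shape of Brown’s argument: the slower the underlying forced trend, the more often natural wiggles can mask it for a decade at a time.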

So on the other hand, the more extreme warming scenarios beloved of Goreacles everywhere are less likely. That’s what a lot of people have been saying for a while now, though saying so gets you branded as a gay marriage skeptic or cake-baking denier or something. But what fun is non-catastrophic or slow or moderate global warming when you need to whip up panic and empower the Thermocrats to commandeer the world’s energy supply?

Another question: did the Koch brothers or some other nefarious outfit sponsor this research? Nope: it was sponsored by the National Science Foundation and the National Institutes of Health. (Yeah, I don’t get the latter either.)

Okay, here’s the longer explanation.  Pop an Advil and take in this paragraph from the body of the complete study:

Several recent studies have compared observed GMT anomalies (and trends) with the forced signal and EUN produced by an ensemble of CGCM [Coupled Atmosphere-Ocean General Circulation Models] runs. It should be noted that when different CGCMs are incorporated into the ensemble, the spread of GMT values samples uncertainty in model parameters and structure as well as uncertainty in the state of unforced variability. The observed GMT anomaly in 2013 was near the lower boundary of the 5–95% EUN simulated by the CGCMs (Supplementary Fig. S1). This has been interpreted as evidence that the CGCM-simulated forced signal may be increasing too rapidly, possibly because the increase in external forcings have been overestimated, and/or because the CGCMs are oversensitive to external forcings. However, it has also been noted that when the CGCMs’ EUN is considered, the recently observed rate of warming may still be consistent with the forced signal produced from the CGCMs. Hypothetically, if the observed GMT anomaly were to fall below the CGCM-produced EUN, it would not necessarily indicate that the forced signal was increasing too rapidly. Instead, it could indicate that the CGCM-produced EUN is not large enough (i.e., that CGCMs underestimate the magnitude of unforced noise but not the magnitude of the forced rate of warming). On the other hand, if the CGCM-produced EUN is too large (i.e., if CGCMs overestimate the magnitude of unforced noise compared to reality), then recent observations may already confirm that the forced signal over the 21st century is increasing too rapidly. . .

The unforced noise produced by CGCMs is an emergent property of the simulations and is not guaranteed to accurately represent empirical observations. Indeed, GMT variability on interannual timescales is heavily influenced by the El-Nino/Southern Oscillation (ENSO) and many CGCMs still struggle with the precise magnitude and spectral characteristics of ENSO variability. Additionally, several studies have suggested that CGCMs may systematically underestimate the magnitude of interdecadal unforced variability compared to the real climate system.  (Emphasis added.)

This is a fairly damaging comment on current computer climate models.  For those of you still reading, here is one possible translation into English: the current temperature pause falls within the likely range of natural variability, but is also consistent (barely) with the climate models that predict the “forcing” effect of human greenhouse gas emissions. Keep hope alive!

But here’s the problem: the climatistas keep saying that the sharp intermittent warming periods we have seen over the last century are due to a human cause and are beyond the bounds of natural variability, but that the pauses (or even the decline in global temps from roughly 1940 to the late 1970s) are the result of natural climate variability after all. Very convenient if you are a climatista. But also inconsistent and unpersuasive. The bumper sticker should read: “If it warms, it’s our fault; if it doesn’t warm, it’s Nature at Work!”

Responses