Climate alarmism is not based on empirical observation; rather, it is entirely predicated on computer models that are manipulated to generate predictions of significant global warming as a result of increased concentrations of CO2. But a model in itself is evidence of nothing. The model obeys the dictates of its creator. In the case of climate models, we know they are wrong: they don’t accurately reproduce the past, which should be the easy part; they fail to account for many features of the Earth’s present climate; and to the extent that they have generated predictions, those predictions have proven to be wrong. There is therefore no reason why anyone should rely on predictions of future climate that are generated by the models.
The realities of climate models are much more prosaic. They don’t and can’t work, because the data, the knowledge of atmospheric, oceanographic, and extraterrestrial mechanisms, and the computer capacity are all totally inadequate. Computer climate models are a waste of time and money.
These inadequacies are confirmed by the complete failure of all forecasts, predictions, projections, prognostications, or whatever they choose to call them. It is one thing to waste time and money playing with climate models in a laboratory, where they don’t meet minimum scientific standards; it is quite another to use their results as the basis for public policies whose economic and social ramifications are devastating. Equally disturbing and unconscionable is the silence of scientists involved in the IPCC who know the vast difference between the scientific limitations and uncertainties and the certainties produced in the Summary for Policymakers (SPM).
IPCC scientists knew of the inadequacies from the start. Kevin Trenberth’s response to a report on the inadequacies of weather data by the US National Research Council said:
It’s very clear we do not have a climate observing system…. This may come as a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.
This was in response to the February 3, 1999 Report that said,
Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.
Before leaked emails exposed its climate science manipulations, the Climatic Research Unit (CRU) issued a statement that said,
[General circulation models] are complex, three dimensional computer-based models of the atmospheric circulation. Uncertainties in our understanding of climate processes, the natural variability of the climate, and limitations of the GCMs mean that their results are not definite predictions of climate.
Phil Jones, Director of the CRU at the time of the leaked emails, and former director Tom Wigley, both IPCC members, said,
Many of the uncertainties surrounding the causes of climate change will never be resolved because the necessary data are lacking.
Stephen Schneider, a prominent part of the IPCC from the start, said,
Uncertainty about feedback mechanisms is one reason why the ultimate goal of climate modeling – forecasting reliably the future of key variables such as temperature and rainfall patterns – is not realizable.
[Ed.: Steve Schneider is notorious for having been a hysterical advocate of human-caused global cooling before he became a hysterical advocate of human-caused global warming.] Schneider also set the tone and raised eyebrows when he said in Discover magazine:
Scientists need to get some broader based support, to capture the public’s imagination…that, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified dramatic statements, and make little mention of any doubts we may have…each of us has to decide what the right balance is between being effective and being honest.
The IPCC achieved his objective with devastating effect, because they chose effective over honest.
Dr. Ball goes on to explain how general circulation models are constructed, and why they are so unreliable:
The surface [of the Earth] is covered with a grid and the atmosphere divided into layers. Computer models vary in the size of the grids and the number of layers. They claim a smaller grid provides better results. It doesn’t! If there is no data, a finer grid adds nothing. The model needs more real data for each cube, and it simply isn’t available. There are no weather stations for at least 70% of the surface and virtually no data above the surface. There are few records of any length anywhere; the models are built on virtually nothing. The grid is so large and crude they can’t include major weather features like thunderstorms, tornadoes, or even small cyclonic storm systems.
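The data problem Ball describes can be put in rough numbers. Everything below is an illustrative assumption (square cells, round figures, a 250 km grid with 20 layers), not taken from any particular model; the point is only that refining the grid multiplies the cells to be filled while the observation network stays fixed.

```python
# Back-of-the-envelope grid arithmetic. All numbers here are illustrative
# assumptions (square cells, round figures), not taken from any real GCM.

EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of the Earth

def grid_cells(cell_side_km: float, layers: int) -> int:
    """Cells needed when the surface is tiled with squares of side
    cell_side_km and the atmosphere is split into `layers` levels."""
    surface_cells = EARTH_SURFACE_KM2 / cell_side_km ** 2
    return round(surface_cells * layers)

coarse = grid_cells(250, 20)  # 250 km grid, 20 layers -> 163,200 cells
fine = grid_cells(125, 20)    # halve the cell side -> 652,800 cells

# Each cell needs real observations to initialize and constrain it;
# refining the grid quadruples the surface cells but adds no new data.
print(coarse, fine, fine // coarse)
```

Halving the cell side quadruples the surface cells, which is why a finer grid without more observations only multiplies the gaps that must be filled by assumption.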
One thing I had not realized is that climate models are so complex that they require an unrealistic amount of computer time to perform a single run. As a result, the climateers simply leave out lots of variables:
Caspar Ammann said that GCMs (General Circulation Models) took about 1 day of machine time to cover 25 years. On this basis, it is obviously impossible to model the Pliocene-Pleistocene transition (say the last 2 million years) using a GCM as this would take about 219 years of computer time.
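The quoted figure checks out arithmetically. A minimal sketch of the calculation, taking only the 25-simulated-years-per-machine-day rate from the quote:

```python
# Verifying the machine-time arithmetic in the quote: at 1 day of
# computer time per 25 simulated years, how long is 2 million years?

SIM_YEARS_PER_MACHINE_DAY = 25   # rate attributed to Ammann in the quote
span_years = 2_000_000           # Pliocene-Pleistocene transition

machine_days = span_years / SIM_YEARS_PER_MACHINE_DAY  # 80,000 days
machine_years = machine_days / 365.25                  # ~219 years

print(round(machine_years))  # 219
```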
So you can only run the models if you reduce the number of variables. O’Keefe and Kueter explain:
As a result, very few full-scale GCM projections are made. Modelers have developed a variety of short cut techniques to allow them to generate more results. Since the accuracy of full GCM runs is unknown, it is not possible to estimate what impact the use of these short cuts has on the quality of model outputs.
Omitting variables allows shorter runs, but it also invites manipulation and moves the model further from reality. Which variables do you include? For the IPCC, only those that create the results they want.
The alarmists’ models cannot withstand scrutiny by qualified scientists who are not in on the scam:
Most don’t understand models or the mathematics on which they are built, a fact exploited by promoters of human-caused climate change. They are also a major part of the IPCC work not yet investigated by people who work outside climate science. Whenever outsiders investigate, as with statistics and the hockey stick, the gross and inappropriate misuses are exposed.
There is much more, but let’s close with this:
The IPCC chapter on climate models appears to justify the use of the models by saying they show an increase in temperature when CO2 is increased. Of course they do; that is how they’re programmed. Almost every individual component of the model has, by their own admission, problems: lack of data, lack of understanding of the mechanisms, or outright omission because of inadequate computer capacity or priorities. The only possible conclusion is that the models were designed to prove the political position that human CO2 was a problem.
Scientists involved with producing this result knew the limitations were so severe they precluded the possibility of proving the result. This is clearly set out in their earlier comments and the IPCC Science Report they produced. They remained silent when the [Summary for Policy Makers] claimed, with high certainty, they knew what was going on with the climate. They had to know this was wrong. They may not have known about the political agenda when they were inveigled into participating, but they had to know when the 1995 [Summary for Policy Makers] was published, because Benjamin Santer exploited the SPM bias by rewriting Chapter 8 of the 1995 Report in contradiction to what the members of his chapter team had agreed. The gap widened in subsequent [Summaries for Policy Makers], but they remained silent and therefore complicit.
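The "that is how they’re programmed" point can be illustrated with a toy zero-dimensional sketch. The logarithmic CO2 forcing expression (ΔF = 5.35 ln(C/C0) W/m²) is the widely used simplified fit; the sensitivity value below is an arbitrary assumption for illustration only. Whatever sensitivity is plugged in, more CO2 yields warming by construction:

```python
import math

# Toy zero-dimensional sketch, NOT any actual GCM: warming follows
# directly from the programmed-in logarithmic CO2 forcing relation.
# The 0.8 sensitivity figure is an arbitrary illustrative assumption.

def delta_T(c_ppm: float, c0_ppm: float = 280.0,
            sensitivity: float = 0.8) -> float:
    """Temperature change (deg C) for a CO2 change, given a climate
    sensitivity parameter in deg C per (W/m^2)."""
    forcing = 5.35 * math.log(c_ppm / c0_ppm)  # W/m^2, simplified fit
    return sensitivity * forcing

# Doubling CO2 from the assumed 280 ppm baseline produces warming
# no matter what positive sensitivity is chosen.
print(round(delta_T(560), 2))
```

The sign of the output is fixed by the formula itself, which is the structural point being made: the CO2-warming link is an input to such a sketch, not a finding of it.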
We are witnessing the greatest scandal in the history of science. Someday before long, the discreditable role played by Benjamin Santer, Michael Mann and others will be universally recognized. Until then, governments will continue to funnel billions of dollars to alarmist scientists to reward them for leading the charge for expanded government power.