With a hat-tip to our reader and commenter, the indefatigable JK, I offer you a detailed essay by Dr. Roy Spencer about climate modeling. (Dr. Spencer’s CV is here. Keep it in mind next time somebody tells you the scientific consensus among Actual Climate Scientists is “settled”.)
One of the key ideas in this article is that for the Earth’s average surface temperature to vary over time, the energy lost to space must be greater or less than the energy received from the Sun. For the temperature to remain constant, solar input and radiative heat loss must be in perfect balance. This state of equilibrium is assumed to be Earth’s default condition, despite obvious (and enormous) fluctuations in the past that could not possibly have been caused by human activity.
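In symbols, this is just a statement about the global energy budget. A minimal sketch, with my own notation (none of these symbols appear in Spencer’s essay): let T be the global average surface temperature, C the climate system’s effective heat capacity, and Q_in and Q_out the absorbed solar and outgoing infrared fluxes:

```latex
% Global energy budget: temperature changes only when the fluxes are unbalanced.
% C      : effective heat capacity of the climate system (J / m^2 K)
% T      : global average surface temperature (K)
% Q_in   : absorbed solar flux (W / m^2)
% Q_out  : outgoing infrared flux (W / m^2)
C \, \frac{dT}{dt} = Q_{\mathrm{in}} - Q_{\mathrm{out}}
```

A steady temperature (dT/dt = 0) requires Q_in and Q_out to be exactly equal, which is the equilibrium assumption at issue here.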
We read:
The average rate of energy gain by the global climate system from sunlight is variously estimated to be 235 to 245 watts per square meter (W/m²), so, for purposes of discussion, the assumption is 240 W/m². For global temperatures to remain approximately constant over time, the rate of energy loss by the system to outer space, which occurs through infrared (IR) “heat” radiation, must also be approximately 240 W/m².
But just how well do climate researchers know these numbers, and what is the evidence that there is a natural balance between them? The best satellite measurements, from the National Aeronautics and Space Administration’s (NASA’s) Clouds and the Earth’s Radiant Energy System (CERES) instruments, are accurate only to a few W/m² (about 1 percent of the average energy flows). To estimate the global energy imbalance, researchers instead use long-term measurements of the gradual warming of the oceans. From the observed rates of warming of the deep ocean it is straightforward to compute that the current energy imbalance is only about 0.6 W/m², a tiny fraction of the approximately 240 W/m² natural energy flows. This imbalance is thus considerably smaller (by about a factor of four) than the accuracy with which the global average rates of energy gain and loss in and out of the climate system can be measured by satellite.
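To make the scale of the problem concrete, here is the arithmetic from the passage above written out; a back-of-the-envelope sketch using only the figures Spencer quotes:

```python
# Back-of-the-envelope check of the figures quoted above.
mean_flux = 240.0                    # W/m^2: approximate solar input and IR output
ceres_accuracy = 0.01 * mean_flux    # "about 1 percent" of the flows, i.e. ~2.4 W/m^2
imbalance = 0.6                      # W/m^2, inferred from deep-ocean warming

print(f"imbalance as a fraction of the mean flows: {imbalance / mean_flux:.4f}")  # 0.0025
print(f"measurement accuracy vs. imbalance:        {ceres_accuracy / imbalance:.1f}x")  # 4.0x
```

The quantity being attributed to fossil fuels is about a quarter the size of the stated measurement accuracy, which is the point of the passage.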
This is important because it means that some portion of recent warming could be natural. But climate researchers do not understand the natural sources of climate change, such as those that caused the Roman Warm Period of about 2,000 years ago, the Medieval Warm Period of about 1,000 years ago, and the Little Ice Age several centuries ago, so most simply assume that no similar event is happening today.
Instead of admitting that natural processes could be at work in causing climate change, the mainstream climate research community assumes “energy equilibrium” to be the natural state of a climate system unaffected by humans. The members of this community assume that the rate of energy input into the climate system from the Sun is, on average, exactly equal to the rate of energy loss to outer space from IR radiation when averaged globally and over many years. The current small imbalance of roughly 0.6 W/m² in the approximately 240 W/m² energy flows in and out of the climate system is then blamed entirely on the burning of fossil fuels.
But this energy balance assumption for the Earth is a statement of faith, not science.
Climate models, then, begin with this assumption. But this means it has to be built, somehow, into each model: in order to make sure that the CO2 concentration is the only variable that can actually tip this otherwise-static balance, the model has to be designed so that all the other effects add up to a net warming of zero. But because the complexities of the climate system are not fully understood, these simplified models require some pieces, “fudge factors,” that are simply put in by hand to make everything neatly cancel out.
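To see what “put in by hand” looks like in practice, here is a deliberately toy version of the tuning step. This is entirely my own illustration, not Spencer’s, and not how any real general-circulation model is written; but the logic of choosing a residual term so that the assumed baseline state balances exactly is the same:

```python
# Toy energy-balance tuning: pick a residual "fudge factor" so that the
# modeled baseline climate sits in exact equilibrium by construction.

SOLAR_IN = 240.0            # W/m^2, absorbed sunlight (assumed constant)

def outgoing_ir(temp_k, fudge=0.0):
    """Crude outgoing-IR parameterization (illustrative only): a linearized
    greybody term around 288 K plus a hand-tuned residual constant."""
    return 2.0 * (temp_k - 288.0) + 239.3 + fudge

# Step 1: at the assumed baseline temperature (288 K), the raw model does
# NOT balance; all the poorly known processes leave a residual flux.
residual = SOLAR_IN - outgoing_ir(288.0)    # 0.7 W/m^2 left over

# Step 2: absorb that leftover into a tuned constant, so the energy budget
# closes exactly at the baseline state.
FUDGE = residual
assert abs(SOLAR_IN - outgoing_ir(288.0, FUDGE)) < 1e-9   # equilibrium by fiat
```

Once equilibrium has been imposed this way, any added CO2 forcing necessarily shows up as warming, because nothing else in the model is permitted to move the baseline.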
Even once this is done, the various models don’t agree:
The large number of climate models produce global warming rates which vary by about a factor of three between them (1.8°C to 5.6°C) in response to a doubling of atmospheric CO2 (2 x CO2). In 2023, Earth’s atmosphere was about 50 percent of the way to 2 x CO2. Amazingly, this factor-of-three range of warming projections has not changed in the more than 30 years of climate-model improvements. This proves that climate-model forecasts are not, as is often claimed, based on proven physics. If they were, they would all produce about the same amount of warming.
This disagreement arises because the Earth is not simply a ball absorbing and radiating heat according to the concentration of atmospheric greenhouse gases; it is, rather, a chaotic system that involves all sorts of positive and negative feedbacks. The very essence of chaotic, nonlinear dynamic systems is that they are “algorithmically incompressible”: there is no model of lower complexity than the system itself that can accurately predict its future state. So every climate model is necessarily based on guesses and intuitions in the places where, in a God’s-eye view, the chaos would be. And so they vary in their predictions. Nearly all of them have overpredicted warming so far, and of course the direst predictions are the ones that make the news, and that our institutions seize upon for fearmongering when seeking money and power.
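The standard textbook illustration of this sensitivity is the Lorenz ’63 system (my example, not Spencer’s). It is a three-variable caricature of atmospheric convection, not a climate model, but it shows how an immeasurably small difference in initial conditions destroys long-range predictability:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    """Advance the Lorenz '63 system by one forward-Euler step."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # differ by one part in a billion

for _ in range(3000):                # 30 model time units
    a, b = lorenz_step(a), lorenz_step(b)

print(np.abs(a - b))                 # differences of order the attractor itself:
                                     # all memory of the initial closeness is gone
```

After thirty model time units the two trajectories, which began one part in a billion apart, bear no resemblance to each other; no refinement of the model fixes this, because the divergence is a property of the system being modeled.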
There is much more. Read the whole thing here.
Bob Zimmerman of the space-sciences website Behind The Black comments here.
Meanwhile, here is a report that claims there is good reason to suspect a significant warming bias in our surface-temperature measurement stations.
Also: not mentioned in Spencer’s essay is natural variation in solar output. I’ve mentioned over the years a book called The Neglected Sun, which discusses this in detail.
Finally, here’s a ten-year-old post of my own, outlining my view of this “crisis” at the time. All that’s changed in 2024 is that I think the ulterior motives are vastly more obvious now, and the social and economic havoc wrought by this fanaticism has only accelerated.