20th century global warming - "There is nothing new under the Sun" - Part I

With the exception of those who have been living in a dark cave, it is well known that Earth warmed over the 20th century. Anyone who reads the papers or watches TV knows that it is humans who are responsible for this warming. According to most media sources, and unfortunately according to many academics, there is no longer any room for discussion of the nature of the warming and its causes. The consensus is simply that we face a catastrophe, and that the discussion should now concentrate on how to diminish the inevitable damage expected from the inexorable future warming. But are the underlying facts really correct?

Before jumping to conclusions, and in particular before deciding on the actions presumably required to solve the problem at hand, we should carefully analyze the basic climate questions. We will then find that the full picture is significantly more complicated than the one presented in the media, and, luckily for humanity, significantly less bleak. If we are to ascertain future climate variations, we should first understand the past ones: What is their origin? How large are they? Only then will we be able to understand present climate change and even predict the expected change over the 21st century.

A significant amount of evidence indicates that the global temperature did increase during the 20th century. For example, direct thermometer measurements indicate that the temperature increased by perhaps 0.8°C. However, before jumping to conclusions, which we will soon do, we should consider the following.

First, measuring the actual temperature is in many cases far from trivial, and even the 0.8°C value should be considered cautiously. There are many effects which introduce systematic errors that are often hard to account for and which may mimic an apparent heating. The classic example is the “urban heat island effect”, whereby many ground stations located in populated areas measure warming not because of global warming itself, but because of the proximity of the stations to human heat sources (such as air conditioners), or simply because the larger amounts of concrete or asphalt absorb more solar radiation. Measurements of the tropospheric temperature using satellite data over the past 30 years in fact reveal less warming than the surface stations detect.

In addition, many alleged indicators of global warming are not necessarily indicators of warming at all, let alone of human-induced warming. For example, there are claims that hurricane activity has increased due to global warming. Not only is there no clear evidence for this, it is not clear at all whether hurricane activity should in fact increase under warmer conditions. (Note that on a warmer Earth, hurricane activity should increase with the elevated ocean temperatures but decrease because of the smaller latitudinal temperature differences. Today, it is not clear which of the two effects dominates!) Another example is Mt. Kilimanjaro. Its ice cap may be melting, but this is probably due to processes not directly related to 20th century warming. (For example, see Cullen, N.J. et al. 2006. “Kilimanjaro glaciers: Recent areal extent from satellite data and new interpretation of observed 20th century retreat rates”, Geophys. Res. Lett. 33: 10.1029/2006GL027084, who show that most of the glacial melt took place in the first half of the 20th century, as an adjustment to a previous drying of the average climate state.)

In any case, the burning question is not the exact size of the total 20th century warming, but how large the human-induced component is, and of course its future effect. One should note that even if there is evidence for some warming (and there is ample evidence), this does not necessarily imply that the warming is due to anthropogenic greenhouse gases. The linkage between the observed warming and humans is neither proven nor a logical necessity, and conflating the two is one of the most common mistakes in the current public debate. In Al Gore’s movie “An Inconvenient Truth”, for example, we are shown many pieces of evidence pointing to the occurrence of global warming, but not a single indicator that this warming is due to greenhouse gases, or in fact to any anthropogenic activity whatsoever. This of course neither proves nor disproves the existence of such a link, but it does say that we have to be extra careful.

So, is it all a figment of the media? What is the evidence supporting the claim that most of the warming is anthropogenic? It turns out that there is no direct evidence supporting this link! There is no fingerprint which proves that the warming is caused primarily by CO2 or other anthropogenic greenhouse gases. In fact, the two primary “proofs” are circumstantial, and as we shall soon see, very problematic.

The first claim proceeds as follows. We emit greenhouse gases. We know that greenhouse gases should cause some warming. We also know that there was some warming over the past century. Since we do not have any other satisfactory explanation, the warming should be the result of the elevated greenhouse gas levels. In fact, the proponents of this claim also point out that only if the human contribution to the global energy budget is included in the numerical climate models is it possible to explain the observed warming. With “natural causes” alone, the same numerical models cannot explain the observations; they underpredict the warming.

The second claim is simple. Given that the temperature increase over the 20th century is apparently unusually fast, it cannot have been the result of natural causes alone. Since we have not seen any similar rise over the past millennium, neither in terms of rate nor of absolute temperature change, it is clear that something unnatural is the cause of this “unprecedented” warming, and therefore it must be due to humans.

Both these claims, even if they were true, still could not convict CO2 as the culprit responsible for the warming. First, we should probably emphasize the obvious. The fact that no other explanation is known for the warming does not prove that an alternative does not exist. As it happens, there is one which is as clear as the light of day: the sun. However, addressing this possibility would take the teeth out of the first claim above. As a result, this alternative explanation has been pushed aside as much as possible by the anthropogenic warming protagonists.

Second, the temperature increase over the 20th century is not unique at all. The temperature increase between 1970 and 2000, for example, is very similar to the increase measured between 1910 and 1940, both in terms of rate and of absolute size (e.g., see fig. 1). Moreover, we know of past periods during which, without human intervention, it was as warm as in the latter half of the 20th century, and perhaps even warmer. Significant evidence indicates, for example, that during the Middle Ages it was as warm as today. With the presently receding ice on Greenland, it is possible to find Viking graves which were until recently under permafrost. Similarly, at different places in the Alps where glacial ice now recedes, it is possible to find traces of human activity dated to Roman times. Clearly, climate has always changed.

[collapse title="Figure 1"]

Figure 1: Global circulation models can fit the long term trend of the 20th century warming, as can be seen in this graph taken from the IPCC TAR. The gray region depicts different model results and the red line the actual surface temperature measurements. This fit is not surprising given the large uncertainties in model sensitivities and in the net anthropogenic radiative forcing changes over the 20th century, which imply that almost any warming could have been explained. Nevertheless, the fit reveals troublesome inconsistencies. Following large volcanic eruptions, the sensitive GCMs predict large temperature decreases which are absent in the observational data. Note that a similar IPCC AR4 graph exists, but it commences in 1900, thus covering up the large discrepancy following Krakatoa’s eruption. [/collapse]
Using borehole measurements it is possible to reconstruct long term temperature variations. Such measurements reveal that the global temperature in the Middle Ages was as high as at the end of the 20th century, or even higher. During the 17th century, on the other hand, the global temperature was notably lower than the 20th century average. (For example, see Huang et al., Geophys. Res. Lett. 24, 1947, 1997, who used more than 6,000 global borehole heat flux measurements to reconstruct the average global temperature over the past 20,000 years. They found that the mid-Holocene, 8,000 years ago, was warmer than today by of order 0.5°C, as was the Medieval Warm Period, while the Little Ice Age was cooler than present temperatures by a similar amount.)

Thus, the arguments supposedly proving that the warming is anthropogenic are problematic, to say the least. But besides lacking any “teeth”, the standard anthropogenic explanation turns out to have several cardinal problems. That is, not only is it impossible to prove that most of the 20th century warming is due to CO2; the claim can be shown to be inconsistent with the observational evidence when scrutinized in detail.

The theoretical predictions for the greenhouse effect of CO2 are not just for the average global temperature increase; they include predictions as to where the temperature rise will be larger or smaller. The interesting point is that the temperature is generally predicted to increase rather uniformly up to a height of about 15 km (with a warming at higher altitudes that is somewhat larger than the warming near the surface). In reality, the warming over the past 30 years extends only up to an altitude of about 10 km, and it primarily takes place near the surface (see fig. 2). The observed latitudinal dependence does not agree with the model predictions either. While the equatorial regions are predicted to have warmed more than the sub-tropical ones, in reality the opposite happened. In other words, if there is anything which could have been a CO2 fingerprint, it points in another direction.

[collapse title="Figure 2"]

Figure 2: Temperature trends in the tropics (20°S to 20°N) for the satellite era, from Douglass et al., Int. J. Climatol. 28, 1693 (2008). Plotted in red is the altitudinal dependence (and the ±2σ variations) obtained by averaging the results of 22 different climate models, which were tuned to fit the observed 20th century temperature variations. The blue, green and purple data sets are four different radiosonde results. The yellow symbols on the right denote different satellite-based warming estimates for the lower troposphere (T2LT) or averaged over the whole troposphere (T2). More information can be found in the above reference. Evidently, present climate models grossly fail to describe the altitudinal dependence of the warming over the tropics.

[/collapse]
A central problem with the theory of anthropogenic warming is that in order to attribute the observed temperature change to the relatively small human-induced changes in the energy budget, Earth’s climate needs to be very sensitive to changes in that budget. However, different empirical indications reveal that, in contrast to the numerical models, the real climate sensitivity is on the low side. Already a decade ago, the physicist Richard Lindzen of MIT used the example of volcanoes to demonstrate that the sensitivity is small.

Massive volcanic eruptions, such as those of Krakatoa in 1883 or Pinatubo in 1991, inject large amounts of dust into the stratosphere (at the bottom of which commercial planes fly). Because the stratosphere is stable and does not mix with the lower atmosphere, this dust can reside there for as long as two years, blocking some of the sunlight. In other words, such massive eruptions should temporarily decrease the energy budget of Earth. As mentioned before, the numerical models which explain the 20th century warming as a consequence of anthropogenic activity require a high temperature sensitivity in response to variations in the energy budget. The same models therefore predict relatively large temperature reductions in response to massive volcanic eruptions, typically up to half a degree. In reality, the average temperature reduction following the six largest eruptions since (and including) Krakatoa is only about 0.1°C (see fig. 1). Namely, Earth’s climate sensitivity must be small, but then one cannot explain the 20th century temperature increase primarily as a result of anthropogenic activity.
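The scaling argument above can be put as a back-of-envelope calculation. The 0.5°C predicted and 0.1°C observed coolings are the figures quoted in the text; the 3°C-per-doubling model sensitivity is an assumed, typical value for the sensitive models, and the linear scaling is a deliberate oversimplification (it ignores ocean thermal inertia and the short duration of the forcing):

```python
# Back-of-envelope sketch: if the transient cooling after a large eruption
# scales roughly linearly with climate sensitivity, the ratio of observed
# to predicted cooling rescales the sensitivity. Linearity is an assumption.
predicted_cooling = 0.5   # deg C, high-sensitivity model prediction (from text)
observed_cooling = 0.1    # deg C, mean after the six largest eruptions (from text)

model_sensitivity = 3.0   # deg C per CO2 doubling -- assumed typical model value

implied = model_sensitivity * observed_cooling / predicted_cooling
print(f"implied sensitivity: {implied:.1f} C per CO2 doubling")
# -> implied sensitivity: 0.6 C per CO2 doubling
```

The point of the sketch is only the factor-of-five rescaling; a serious estimate would have to model the ocean's thermal lag explicitly.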

Because the question of Earth’s sensitivity to changes in the radiative budget is the key question for understanding future climate change, let us mention further evidence indicating that the sensitivity is on the low side, significantly lower than the anthropogenic global warming protagonists claim.

On a time scale of tens of millions of years, there were large variations in the amount of atmospheric CO2. These variations arise from a varying deposition rate of limestone on the ocean floor and a varying emission rate of CO2 through volcanic activity. As a consequence, there were periods during which there was much more CO2 in Earth’s atmosphere than today. For example, there was probably 10 times more CO2 450 million years ago than there is today. Yet during that time it was as cold as it is presently! If CO2 has (or had) a large effect on the global temperature, Earth back then should have been significantly warmer, but it wasn’t. In other words, there is no correlation on long time scales between the atmospheric CO2 level and the average global temperature (see fig. 3).
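To see why a tenfold CO2 level is such a strong constraint, note that CO2's radiative forcing grows only logarithmically with concentration. The sketch below uses the standard simplified fit of Myhre et al. (1998), ΔF = 5.35 ln(C/C0) W/m², which is not from the text; the 1.5°C-per-doubling upper limit is the one quoted in the Figure 3 caption:

```python
import math

# Standard simplified CO2 forcing expression (Myhre et al. 1998):
# dF = 5.35 * ln(C/C0) W/m^2. A tenfold CO2 increase is log2(10) doublings.
def co2_forcing(c, c0):
    return 5.35 * math.log(c / c0)

doublings = math.log2(10)             # ~3.32 doublings for 10x CO2
forcing_10x = co2_forcing(10.0, 1.0)  # ~12.3 W/m^2

# At the 1.5 C/doubling upper limit inferred from Fig. 3, even 10x CO2
# corresponds to at most ~5 C of warming; at higher claimed sensitivities
# the cold climate 450 Myr ago becomes very hard to reconcile.
warming_upper = 1.5 * doublings
print(f"{doublings:.2f} doublings, forcing {forcing_10x:.1f} W/m^2, "
      f"<= {warming_upper:.1f} C at 1.5 C/doubling")
```

This is only the forcing arithmetic; it says nothing about the other factors (solar luminosity, continental configuration) that also differed 450 million years ago.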

[collapse title="Figure 3"]

Figure 3: Top: Reconstructed (the GEOCARB III model; Berner and Kothavala, 2001) and paleosol-based CO2 variations (all measurements with less than a factor of 3 total error in the Berner compilation) over the past 500 million years. Bottom: 18O/16O isotope-ratio-based temperature reconstruction of Veizer et al., 2001. The lack of correlation between the CO2 variations and the climate can be used to place an upper limit on the effect of CO2 (ΔT < 1.5°C per CO2 doubling). See also fig. 9. [/collapse]
Note that in the more recent past there were variations of tens of percent in the amount of atmospheric CO2. These variations are due to the emission and absorption of CO2 by the oceans. On this shorter time scale, of tens of thousands of years, there is a clear correlation between the varying CO2 and variations in the global temperature, as can be reconstructed from ice cores (and as was shown, for example, in Al Gore’s movie). However, this correlation is a result of the complex atmospheric/oceanic CO2 equilibrium (i.e., the solubility of CO2 in the oceans), which is temperature dependent. This is clearly supported by the fact that when the ice cores have sufficient temporal resolution, one sees that the CO2 variations lag behind the temperature variations by several hundred years (see fig. 4).
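The temperature dependence of the oceanic CO2 equilibrium mentioned above can be sketched with a van 't Hoff form for the Henry's-law solubility. The constants below are approximate textbook values for CO2 in water, not figures from the article:

```python
import math

# Van 't Hoff temperature dependence of Henry's-law solubility for CO2:
# H(T) = H_ref * exp(C * (1/T - 1/T_ref)). Warmer water holds less CO2,
# so a warming ocean outgasses -- consistent with CO2 lagging temperature.
H_REF = 3.3e-4   # mol/(m^3 Pa), CO2 solubility at T_ref (approximate)
T_REF = 298.15   # K
C = 2400.0       # K, van 't Hoff coefficient for CO2 (approximate)

def henry_solubility(T):
    return H_REF * math.exp(C * (1.0 / T - 1.0 / T_REF))

cold, warm = henry_solubility(278.15), henry_solubility(288.15)
# solubility drops by roughly a quarter over a 10 K warming
print(f"relative drop: {(cold - warm) / cold:.0%}")
```

The sketch captures only the solubility term; the real ocean–atmosphere exchange also involves carbonate chemistry and circulation, which is why the observed lag is centuries rather than instantaneous.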

[collapse title="Figure 4"]

Figure 4a: Al Gore uses pyrotechnics to lead his audience to the wrong conclusion. If CO2 affected the temperature as this graph supposedly demonstrates, then the 20th century CO2 rise should have caused a temperature rise larger than that seen from the last ice age to today’s interglacial. This is of course wrong. All the graph shows is that the dissolution balance of CO2 in the oceans is temperature dependent. If we were to stop burning fossil fuels (which is a good thing in general, but totally irrelevant here), the large CO2 increase would turn into a CO2 decrease, returning back toward pre-industrial levels.

Figure 4b: Analysis of ice core data from Antarctica by Indermühle et al. (GRL, vol. 27, p. 735, 2000), who find that CO2 lags behind the temperature by 1200±700 years. Other analyses find the same. Fischer et al. (Science, vol. 283, p. 1712, 1999) report a time lag of 600±400 yr during early deglacial changes in the last 3 glacial–interglacial transitions. Siegenthaler et al. (Science, vol. 310, p. 1313, 2005) find a best-fit lag of 1900 years in the Antarctic data. Monnin et al. (Science, vol. 291, p. 112, 2001) find that the onset of the CO2 increase at the beginning of the last interglacial lagged the onset of the temperature increase by 800 years. [/collapse]
In summary, there is no direct evidence showing that CO2 caused the 20th century warming, or, as a matter of fact, any warming. The question to ask, therefore, is whether we can point to some other culprit. If humans are not the only ones responsible for climate change, what else is?

Continue to Part II