Orbital wobbles, carbon dioxide and dust all seem to contribute

An expanded version of my recent Times column on ice ages:

Record cold in America has brought temperatures as low as minus 44°C in North Dakota, frozen sharks in Massachusetts and iguanas falling from trees in Florida. Al Gore blames global warming, citing one scientist to the effect that this is “exactly what we should expect from the climate crisis”. Others beg to differ: Kevin Trenberth, of America’s National Center for Atmospheric Research, insists that “winter storms are a manifestation of winter, not climate change”.

Forty-five years ago a run of cold winters caused a “global cooling” scare. “A global deterioration of the climate, by order of magnitude larger than any hitherto experienced by civilised mankind, is a very real possibility and indeed may be due very soon,” read a letter to President Nixon in 1972 from two scientists reporting the views of 42 “top” colleagues. “The cooling has natural causes and falls within the rank of the processes which caused the last ice age.” The administration replied that it was “seized of the matter”.

In the years that followed, newspapers, magazines and television documentaries rushed to sensationalise the coming ice age. The CIA reported a “growing consensus among leading climatologists that the world is undergoing a cooling trend”. The broadcaster Magnus Magnusson pronounced on a BBC Horizon episode that “unless we learn otherwise, it will be prudent to suppose that the next ice age could begin to bite at any time”.

Newsweek ran a cover story that read, in part: “The central fact is that, after three quarters of a century of extraordinarily mild conditions, the Earth seems to be cooling down. Meteorologists disagree about the cause and extent of the cooling trend, as well as over its specific impact on local weather conditions. But they are almost unanimous in the view that the trend will reduce agricultural productivity for the rest of the century.”

This alarm about global cooling has largely been forgotten in the age of global warming, but it has not entirely gone away. Valentina Zharkova of Northumbria University has suggested that a quiescent sun presages another Little Ice Age like that of 1300-1850. I’m not persuaded. Yet the argument that the world is slowly slipping back into a proper ice age after 10,000 years of balmy warmth is in essence true. Most interglacial periods, or times without large ice sheets, last about that long, and ice cores from Greenland show that each of the past three millennia was cooler than the one before.

However, those ice cores, and others from Antarctica, can now put our minds at rest. They reveal that interglacials start abruptly, with sudden and rapid warming, but end gradually, with many thousands of years of slow and erratic cooling. They have also begun to clarify the cause. It is a story that reminds us how vulnerable our civilisation is. If we aspire to keep the show on the road for another 10,000 years, we will have to understand ice ages.

The oldest explanation for the coming and going of ice was based on carbon dioxide. In 1895 the Swede Svante Arrhenius, one of the scientists who first championed the greenhouse theory, suggested that the ice retreated because carbon dioxide levels rose, and advanced because they fell. If this were true, he thought, then industrial emissions could head off the next ice age.

Burning coal, Arrhenius said, was therefore a good thing: “By the influence of the increasing percentage of carbonic acid in the atmosphere, we may hope to enjoy ages with more equable and better climates.”

There is indeed a correlation in the ice cores between temperature and carbon dioxide: there is less CO2 in the air when the world is colder and more when it is warmer. The Vostok ice core from Antarctica showed in the late 1990s that CO2 moves in lock-step with temperature: more CO2, warmer; less CO2, colder. As Al Gore put it sarcastically in his 2006 film An Inconvenient Truth, looking at the Vostok graphs: “Did they ever fit together? Most ridiculous thing I ever heard.” So was Arrhenius right? Is the CO2 level the driver of ice ages?

Well, not so fast. Inconveniently, the correlation implies causation the wrong way round: at the end of an interglacial, such as the Eemian period over 100,000 years ago, carbon dioxide levels remained high for many thousands of years while temperature fell steadily. Eventually CO2 followed temperature downward. Here is a chart showing that. If carbon dioxide were a powerful cause, it would not show such a pattern: the world could not cool down while CO2 remained high.

In any case, what causes the carbon dioxide levels to rise and fall? In 1990 the oceanographer John Martin came up with an ingenious explanation. During ice ages, there is lots of dust blowing around the world, because the continents are dry and glaciers are grinding rocks. Some of that dust falls in the ocean, where its iron-rich composition fertilises plankton blooms, whose increased photosynthesis draws down the carbon dioxide from the air. When the dust stops falling, the plankton blooms fail and the carbon dioxide levels rise, warming the planet again.

Neat. But almost certainly too simplistic. We now know, from Antarctic ice cores, that in each interglacial rapid warming began when CO2 levels were very low. Temperature and carbon dioxide rise together, and there is no evidence for a pulse of CO2 before any warming starts; if anything, the reverse. Well, all right, said scientists, but carbon dioxide is a feedback factor, an amplifier: something else starts the warming, but carbon dioxide reinforces it. Yet the ice cores show that in each interglacial cooling returned when CO2 levels were very high, and they remained high for tens of thousands of years as the cooling continued. Even as a feedback, carbon dioxide looks feeble.

Here is an essay by Willis Eschenbach discussing this issue. He comes to five conclusions as to why CO2 cannot be the main driver and why the feedback effect is probably small:

1. The correspondence with log(CO2) is slightly worse than that with CO2.
2. The CO2 change is about what we’d expect from oceanic degassing.
3. CO2 lags temperature in the record.
4. Temperature Granger-causes CO2, not the other way round.
5. (Proof by contradiction) If CO2 were controlling temperature, the climate sensitivity would be seven degrees per doubling, for which there is no evidence.
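That last point is back-of-envelope arithmetic, and it can be checked. A minimal sketch, assuming a round-number glacial-to-interglacial warming of 4.5°C and a CO2 swing from 185 to 280 ppm; both input values are illustrative assumptions, not figures taken from this article:

```python
import math

# If CO2 alone drove the glacial-interglacial temperature swing, the
# implied sensitivity is the warming divided by the number of CO2 doublings.
delta_t = 4.5              # assumed global glacial-interglacial warming, deg C
co2_glacial = 185.0        # assumed glacial CO2, ppm
co2_interglacial = 280.0   # assumed pre-industrial interglacial CO2, ppm

doublings = math.log2(co2_interglacial / co2_glacial)
implied_sensitivity = delta_t / doublings
print(f"{doublings:.2f} doublings -> {implied_sensitivity:.1f} deg C per doubling")
```

With these inputs the implied sensitivity comes out around 7.5°C per doubling, in the region of Eschenbach’s figure and far above mainstream estimates, which is the point of his reductio.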

Now, the standard response from AGW supporters is that the CO2, when it comes along, is some kind of positive feedback that makes the temperature rise more than it would be otherwise. Is this possible? I would say sure, it’s possible … but that we have no evidence that that is the case. In fact, the changes in CO2 at the end of the last ice age argue that there is no such feedback. You can see in Figure 1 that the temperatures rise and then stabilize, while the CO2 keeps on rising. The same is shown in more detail in the Greenland ice core data, where it is clear that the temperature fell slightly while the CO2 continued to rise.

As I said, this does not negate the possibility that CO2 played a small part. Further inquiry into that angle is not encouraging, however. If we assume that the CO2 is giving 3° per doubling of warming per the IPCC hypothesis, then the problem is that it raises the rate of thermal outgassing to 17 ppmv per degree of warming instead of 15 ppmv. This is in the wrong direction, given that the cited value in the literature is lower, at 12.5 ppmv.

So what does cause ice ages to come and go?

In 1941 the Serbian scientist Milutin Milankovich published a lengthy book called “Canon of Insolation of the Earth and Its Application to the Problem of the Ice Ages”. He argued that ice ages and interglacials were caused by changes in the orbit of the Earth around the sun. These changes, known as eccentricity, obliquity and precession, sometimes combined to increase the relative warmth of northern hemisphere summers, melting ice caps in North America and Eurasia and spreading warmth worldwide. This, said Milankovich, was “the hitherto missing link between celestial mechanics and geology”.

The northern hemisphere matters because no matter how warm the southern summer gets, Antarctica, being at much higher latitude, stays cold, white and reflective.
In 1976 Nicholas Shackleton, a Cambridge physicist, and his colleagues published a paper called “Variations in the Earth’s Orbit: Pacemaker of the Ice Ages”, presenting evidence from deep-sea cores of cycles in the warming and cooling of the Earth over the past half million years which fitted Milankovich’s orbital wobbles.
In a brilliant insight, Shackleton had realised that sediments taken from the ocean floor and analysed for different isotopes of oxygen could serve as a proxy for climate. The lighter isotopes of oxygen evaporate more readily from the sea, and were therefore more likely to fall as snow and get stuck on ice caps in cold periods, returning to the sea when the ice melted. So the relative concentration of the lighter isotopes in sea-floor sediments was a sort of thermometer.
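Shackleton’s proxy rests on the standard delta notation for isotope ratios, which expresses a sample’s 18O/16O ratio as a per-mil deviation from a reference standard. A minimal sketch, in which the two sample ratios are invented values for illustration:

```python
# Delta notation for the oxygen-isotope proxy: per-mil deviation of a
# sample's 18O/16O ratio from a reference standard.
R_STANDARD = 0.0020052  # 18O/16O ratio of the VSMOW reference standard

def delta_18o(r_sample: float) -> float:
    """Return delta-18O in per mil relative to the standard."""
    return (r_sample / R_STANDARD - 1.0) * 1000.0

# In cold periods light 16O is locked up in ice sheets, so sediments are
# enriched in heavy 18O and the delta value shifts positive.
print(delta_18o(0.0020092))  # hypothetical "glacial" sample
print(delta_18o(0.0020032))  # hypothetical "interglacial" sample
```

The made-up “glacial” sample returns a more positive delta than the “interglacial” one, which is the direction of the signal Shackleton read from the sea floor.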
Precession, which decides whether the Earth is closer to the sun in July or in January, is on a 23,000-year cycle; obliquity, which decides how tilted the axis of the Earth is and therefore how warm the summer is, is on a 41,000-year cycle; and eccentricity, which decides how rounded or elongated the Earth’s orbit is and therefore how close to the sun the planet gets, is on a 100,000-year cycle. When these combine to make a “great summer” in the north, the ice caps shrink.
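The way the three cycles beat against each other can be illustrated with a toy model: summing three sine waves at the periods given above. The equal amplitudes and zero phases are arbitrary assumptions, so this shows only the qualitative idea of rare alignments, not real insolation:

```python
import math

# Periods of the three orbital cycles, in thousands of years (from the text).
PERIODS_KYR = {"precession": 23.0, "obliquity": 41.0, "eccentricity": 100.0}

def forcing(t_kyr: float) -> float:
    """Sum of three unit-amplitude sinusoids at time t (thousands of years)."""
    return sum(math.sin(2 * math.pi * t_kyr / p) for p in PERIODS_KYR.values())

# Scan 400 kyr for the strongest alignment: a "great summer".
t_max = max(range(400), key=forcing)
print(f"strongest alignment at ~{t_max} kyr, forcing {forcing(t_max):.2f} of 3")
```

Because the three periods share no common factor, near-perfect alignments are rare and irregularly spaced, which is why “great summers” only come along occasionally.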
Game, set and match to Milankovich? Not quite. The Antarctic ice cores, going back 800,000 years, then revealed that there were some great summers when the Milankovich wobbles should have produced an interglacial warming, but did not. To explain these “missing interglacials”, a recent paper in Geoscience Frontiers by Ralph Ellis and Michael Palmer argues we need carbon dioxide back on the stage, not as a greenhouse gas but as plant food.
The argument goes like this. Colder oceans evaporate less moisture and rainfall decreases. At the depth of the last ice age, Africa suffered long mega-droughts; only small pockets of rainforest remained. Crucially, the longer an ice age lasts, the more carbon dioxide is dissolved in the cold oceans. When the level of carbon dioxide in the atmosphere drops below 200 parts per million (0.02 per cent), plants struggle to grow at all, especially at high altitudes. Deserts expand. Dust storms grow more frequent and larger. In the Antarctic ice cores, dust increased markedly whenever carbon dioxide levels got below 200 ppm. The dust would have begun to accumulate on the ice caps, especially those of Eurasia and North America, which were close to deserts. Next time a Milankovich great summer came along, and the ice caps began to melt, the ice would have grown dirtier and dirtier, years of deposited dust coming together as the ice shrank. The darker ice would have absorbed more heat from the sun and a runaway process of collapsing ice caps would have begun.
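The dirty-ice runaway is, at bottom, albedo arithmetic. A back-of-envelope sketch, with assumed round-number values for summer insolation and for the albedo of clean versus dust-darkened ice (none of these figures come from the Ellis and Palmer paper):

```python
# Absorbed solar energy is (1 - albedo) * insolation, so darkening the
# surface raises absorption sharply.
summer_insolation = 450.0  # assumed high-latitude summer insolation, W/m^2

for label, albedo in [("clean snow/ice", 0.8), ("dust-darkened ice", 0.4)]:
    absorbed = (1.0 - albedo) * summer_insolation
    print(f"{label}: absorbs {absorbed:.0f} W/m^2")
```

With these assumed numbers, halving the albedo triples the energy absorbed, from roughly 90 to 270 W/m², which is why exposing accumulated dust as the ice melts can turn a slow retreat into a collapse.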
Here is an extract from the paper:

A more logical explanation for the inverse correlation between dust and CO2 can be seen through the effect that CO2 concentrations have on plant life. Fig. 8 also shows that CO2 levels during each ice-age came all the way down to 190–180 ppm, and that is approaching dangerously low levels for C3 photosynthesis-pathway plant life. CO2 is a vital component of the atmosphere because it is an essential plant food, and without CO2 all plants die. In her comprehensive analysis of plant responses to reduced CO2 concentrations, Gerhart says of this fundamental issue:

It is clear that modern C3 plant genotypes grown at low CO2 (180–200 ppm) exhibit severe reductions in photosynthesis, survival, growth, and reproduction … Such findings beg the question of how glacial plants survived during low CO2 periods … Studies have shown that the average biomass production of modern C3 plants is reduced by approximately 50% when grown at low (180–220 ppm) CO2, when other conditions are optimal … (The abortion of all flower buds) suggested that 150 ppm CO2 may be near the threshold for successful completion of the life cycle in some C3 species (Gerhart and Ward, 2010 Section II).

It is clear that a number of plant species would have been under considerable stress when world CO2 concentrations reduced to 200 or 190 ppm during the glacial maximum, especially if moisture levels in those regions were low (Gerhart and Ward, 2010; Pinto et al., 2014). And palaeontological discoveries at the La Brea tar pits in southern California have confirmed this, where oxygen and carbon isotopic analysis of preserved juniperus wood dating from 50 kyr ago through to the Holocene interglacial has shown that: ‘glacial trees were undergoing carbon starvation’ (Ward et al., 2005). And yet these stresses and biomass reductions do not appear to become lethal until CO2 concentrations reach 150 ppm, which the glacial maximums did not achieve – unless we add altitude and reducing CO2 partial pressures into the equation.

All of human civilisation happened in an interglacial period, with a relatively stable climate, plentiful rainfall and high enough levels of carbon dioxide to allow the vigorous growth of plants. Agriculture was probably impossible before then, and without its hugely expanded energy supply, none of the subsequent flowering of human culture would have happened.
That interglacial will end. Today the northern summer sunshine is again slightly weaker than the southern. In a few tens of thousands of years, our descendants will probably be struggling with volatile weather, dust storms and air that cannot support many crops. But that is a very long way off, and by then technology should be more advanced, unless we prevent it developing. The key will be energy. With plentiful and cheap energy our successors could thrive even in a future ice age, growing crops, watering deserts, maintaining rainforests and even melting ice caps.

By Matt Ridley | Tagged:  rational-optimist  the-times