A history of failed predictions of doom

I have a long article on apocalyptic predictions in Wired Magazine. Here’s a version with about 70
links to sources. I have also added in a few paragraphs on falling
sperm counts and on species extinction; these were edited out of the
published version of the article for space reasons.

 

“Who or what will cause the 2012 apocalypse?” This is the
question posed by the website 2012apocalypse.net. “Super volcanos?
Pestilence and disease? Asteroids? Comets? Antichrist? Global
warming? Nuclear war?” The site’s authors are impressively
open-minded about the cause of the catastrophe that is coming at
11:11 pm on December 21 this year. But they have no doubt it will
happen. After all, not only does the Mayan Long Count calendar end
that day, but “the sun will be aligned with the center of the Milky
Way for the first time in about 26,000 years.”

Case closed: Sell your possessions and live for today.

When the sun rises on December 22, as it surely will, do not
expect apologies or even a rethink. No matter how often apocalyptic
predictions fail to come true, another one soon arrives. And the
prophets of apocalypse always draw a following, from the 100,000
Millerites who took to the hills in 1843, awaiting the end of the
world, to the thousands who believed in Harold Camping, the
Christian radio broadcaster who forecast the final rapture in both
1994 and 2011.

Religious zealots hardly have a monopoly on apocalyptic
thinking. Consider some of the environmental cataclysms that so
many experts promised were inevitable. Best-selling economist
Robert Heilbroner in 1974: “The outlook for man, I believe, is
painful, difficult, perhaps desperate, and the hope that can be
held out for his future prospects seem to be very slim indeed.” Or
best-selling ecologist Paul Ehrlich in 1968: “The battle to feed
all of humanity is over. In the 1970s [“and 1980s” was added in a
later edition] the world will undergo famines - hundreds of millions
of people are going to starve to death in spite of any crash
programs embarked on now … nothing can prevent a substantial
increase in the world death rate.” Or Jimmy Carter in a televised
speech in 1977: “We could use up all of the proven reserves of oil
in the entire world by the end of the next decade.”

Predictions of global famine and the end of oil in the 1970s
proved just as wrong as end-of-the-world forecasts from
millennialist priests. Yet there is no sign that experts are
becoming more cautious about apocalyptic promises. If anything, the
rhetoric has ramped up in recent years. Echoing the Mayan calendar
folk, the Bulletin of the Atomic Scientists moved its Doomsday Clock one minute closer to
midnight at the start of 2012, commenting: “The global community
may be near a point of no return in efforts to prevent catastrophe
from changes in Earth’s atmosphere.”

Over the five decades since the success of Rachel
Carson’s Silent Spring in 1962 and the four decades since
the success of the Club of Rome’s The Limits to Growth in
1972, prophecies of doom on a colossal scale have become routine.
Indeed, we seem to crave ever-more-frightening predictions; we are
now, in writer Gary Alexander’s word, apocaholic. The past half century has
brought us warnings of population explosions, global famines,
plagues, water wars, oil exhaustion, mineral shortages, falling
sperm counts, thinning ozone, acidifying rain, nuclear winters, Y2K
bugs, mad cow epidemics, killer bees, sex-change fish,
cell-phone-induced brain-cancer epidemics, and climate
catastrophes.

So far all of these specters have turned out to be exaggerated.
True, we have encountered obstacles, public-health emergencies, and
even mass tragedies. But the promised Armageddons (the thresholds
that cannot be uncrossed, the tipping points that cannot be
untipped, the existential threats to Life as We Know It) have
consistently failed to materialize. To see the full depth of our
apocaholism, and to understand why we keep getting it so wrong, we
need to consult the past 50 years of history.

The classic apocalypse has four horsemen, and our modern version
follows that pattern, with the four riders being chemicals (DDT,
CFCs, acid rain), diseases (bird flu, swine flu, SARS, AIDS, Ebola,
mad cow disease), people (population, famine), and resources (oil,
metals). Let’s visit them each in turn.

The first horseman: chemicals

Silent Spring, published 50 years ago this year,
was instrumental in the emergence of modern environmentalism.
“Without this book, the environmental movement might have been long
delayed or never have developed at all,” Al Gore wrote in his introduction to the 1994 edition.
Carson’s main theme was that the use of synthetic pesticides, DDT in
particular, was causing not only a massacre of wildlife but an
epidemic of cancer in human beings. One of her chief inspirations
and sources for the book was Wilhelm Hueper, the first director of
the environmental arm of the National Cancer Institute. So obsessed
was Hueper with his notion that pesticides and other synthetic
chemicals were causing cancers (and that industry was covering this
up) that he strenuously opposed the suggestion that tobacco-smoking
take any blame. Hueper wrote in a 1955 paper called “Lung Cancers
and Their Causes,” published in CA: A Cancer Journal for
Clinicians, “Industrial or industry-related atmospheric pollutants
are to a great part responsible for the causation of lung cancer …
cigarette smoking is not a major factor in the causation of lung
cancer.”

In fact, of course, the link between smoking and lung cancer was
found to be ironclad. But the link between modern chemicals and
cancer is sketchy at best. Even DDT, which clearly does pose health
risks to those unsafely exposed, has never been definitively linked
to cancer. In general, cancer incidence and death rates, when
corrected for the average age of the population, have been falling now for 20 years.

By the 1970s the focus of chemical concern had shifted to air
pollution. Life magazine set the scene in January 1970: “Scientists
have solid experimental and theoretical evidence to support … the
following predictions: In a decade, urban dwellers will have to
wear gas masks to survive air pollution … by 1985 air pollution
will have reduced the amount of sunlight reaching earth by one
half.” Instead, driven partly by regulation and partly by
innovation, both of which dramatically cut the pollution coming
from car exhaust and smokestacks, ambient air quality improved
dramatically in many cities in the developed world over the
following few decades. Levels of carbon monoxide, sulphur dioxide,
nitrogen oxides, lead, ozone, and volatile organic compounds fell and continue to fall.

In the 1980s it was acid rain’s turn to be the source of
apocalyptic forecasts. In this case it was nature in the form of
forests and lakes that would bear the brunt of human pollution. The
issue caught fire in Germany, where a cover story in the news
magazine Der Spiegel in November 1981 screamed: “THE FOREST DIES.” Not to be
outdone, Stern magazine declared that a third of
Germany’s forests were already dead or dying. Bernhard Ulrich, a
soil scientist at the University of Göttingen, said it was already too late for the country’s
forests: “They cannot be saved.” Forest death,
or waldsterben, became a huge story across
Europe. “The forests and lakes are dying. Already the damage may be
irreversible,” journalist Fred Pearce wrote in New Scientist in 1982. It
was much the same in North America: Half of all US lakes
were said to be becoming dangerously acidified, and forests from
Virginia to central Canada were thought to be suffering mass
die-offs of trees.

Conventional wisdom has it that this fate was averted by prompt
legislative action to reduce sulphur dioxide emissions from power
plants. That account is largely false. There was no net loss of
forest in the 1980s to reverse. In the US, a 10-year
government-sponsored study involving some 700 scientists and
costing about $500 million reported in 1990 that “there is no evidence of
a general or unusual decline of forests in the United States and
Canada due to acid rain” and “there is no case of forest decline in
which acidic deposition is known to be a predominant cause.” In
Germany, Heinrich Spiecker,
director of the Institute for Forest Growth, was commissioned by a
Finnish forestry organization to assess the health of European
forests. He concluded that they were growing faster and healthier
than ever and had been improving throughout the 1980s. “Since we
began measuring the forest more than 100 years ago, there’s never
been a higher volume of wood … than there is now,” Spiecker said. (Ironically, one of the chief
ingredients of acid rain, nitrogen oxide, breaks down naturally to
become nitrate, a fertilizer for trees.) As for lakes, it turned out that their rising acidity was likely
caused more by reforestation than by acid rain; one study suggested
that the correlation between acidity in rainwater and the pH in the
lakes was very low. The story of acid rain is not of catastrophe
averted but of a minor environmental nuisance somewhat abated.

The threat to the ozone layer came next. In the 1970s scientists
discovered a decline in the concentration of ozone over Antarctica
during several springs, and the Armageddon megaphone was dusted off
yet again. The blame was pinned on chlorofluorocarbons, used in
refrigerators and aerosol cans, reacting with sunlight. The
disappearance of frogs and an alleged rise of melanoma in people
were both attributed to ozone depletion. So too was a supposed rash
of blindness in animals: Al Gore wrote in 1992 about blind salmon and rabbits
[“hunters now report finding blind rabbits; fisherman catch blind
salmon.”], while The New York Times reported “an increase
in Twilight Zone-type reports of sheep and rabbits with cataracts”
in Patagonia. But all these accounts proved incorrect. The frogs were dying of a
fungal disease spread by people; the sheep had viral pinkeye; the
mortality rate from melanoma actually leveled off during the growth
of the ozone hole; and as for the blind salmon and rabbits, they
were never heard of again.

There was an international agreement to cease using CFCs by
1996. But the predicted recovery of the ozone layer never happened:
The hole stopped growing before the ban took effect, then failed to
shrink afterward. The ozone hole still grows every Antarctic
spring, to roughly the same extent each year. Nobody quite knows
why. Some scientists think it is simply taking longer than expected
for the chemicals to disintegrate; a few believe that the cause of the hole was
misdiagnosed in the first place. Either way, the ozone hole cannot
yet be claimed as a looming catastrophe, let alone one averted by
political action.

[The next chemical scare was “endocrine disruptors,” chemicals
that mimic sex hormones. In a book entitled Our Stolen
Future, published in 1996, many plastics, pesticides, and
other man-made chemicals stood accused of changing the sex of fish,
shrinking the penises of alligators and depressing the sperm counts
of men. “Chemicals that disrupt hormone messages have the power to
rob us of rich possibilities that have been the legacy of our
species and, indeed, the essence of our humanity. There may be
fates worse than extinction,” warned the three authors melodramatically.

In 1992, Danish researchers reported that human sperm counts had
fallen by 50% in 50 years, but they did so by comparing different
studies in different places at different times. Other studies
failed to replicate the results and by 2011 the sperm-count fall
had been laid to rest as a myth following a 15-year study of Danish
national-service recruits, which found “no indication that semen quality has
changed”. It also noted that “there is only very limited
epidemiologic evidence to support the broader endocrine disruption
hypothesis”. Few researchers now believe there was ever much of an
issue here.]

The second horseman: disease

Repeatedly throughout the past five decades, the
imminent advent of a new pandemic has been foretold. The 1976 swine
flu panic was an early case. Following the death of a single
recruit at Fort Dix, the Ford administration vaccinated more than 40 million Americans, but
more people probably died from adverse reactions to the vaccine
than died of swine flu.

A few years later, a fatal virus did begin to spread at an
alarming rate, initially through the homosexual community. AIDS was
soon, rightly, the focus of serious alarm. But not all the dire
predictions proved correct. “Research studies now project that one
in five - listen to me, hard to believe - one in five heterosexuals
could be dead from AIDS at the end of the next three years. That’s
by 1990. One in five,” Oprah Winfrey warned in 1987. (Quoted in
Bernard Goldberg, Bias, Regnery Publishing, 2002.)

Bad as AIDS was, the broad-based epidemic in the Americas,
Europe, and Asia never materialized as feared, though it did in
Africa. In 2000 the US National Intelligence Council predicted that HIV/AIDS would worsen in the
developing world for at least 10 years and was “likely to aggravate
and, in some cases, may even provoke economic decay, social
fragmentation and political destabilization in the hardest hit
countries in the developing and former communist worlds.”

Yet the peak of the epidemic had already passed in the late
1990s, and today AIDS is in slow retreat throughout the world. New
infections were 20 percent lower in 2010 than in 1997, and the
lives of more than 2.5 million people have been saved since 1995 by
antiretroviral treatment. “Just a few years ago, talking about
ending the AIDS epidemic in the near term seemed impossible, but
science, political support, and community responses are starting to
deliver clear and tangible results,” UNAIDS executive director
Michel Sidibé wrote last year.

The emergence of AIDS led to a theory that other viruses would
spring from tropical rain forests to wreak revenge on humankind for
its ecological sins. That, at least, was the implication of Laurie
Garrett’s 1994 book, The Coming Plague: Newly Emerging
Diseases in a World Out of Balance. The most prominent candidate
was Ebola, the hemorrhagic fever that starred in Richard
Preston’s The Hot Zone, published the same year. Writer
Stephen King called the book “one of the most horrifying
things I’ve ever read.” Right on cue, Ebola appeared again in the
Congo in 1995, but it soon disappeared. Far from being a harbinger,
HIV was the only new tropical virus to go pandemic in 50 years.

In the 1980s British cattle began dying from mad cow disease,
caused by an infectious agent in feed that was derived from the
remains of other cows. When people, too, began to catch this
disease, predictions of the scale of the epidemic quickly turned
terrifying: Up to 136,000 would die, according to one study. A pathologist warned that the British “have to prepare for
perhaps thousands, tens of thousands, hundreds of thousands, of
cases of vCJD [new variant Creutzfeldt-Jakob disease, the human
manifestation of mad cow] coming down the line.” Yet the total
number of deaths so far in the UK has been 176, with just five
occurring in 2011 and none so far in 2012.

In 2003 it was SARS, a virus from civet cats, that ineffectively
but inconveniently led to quarantines in Beijing and Toronto amid
predictions of global Armageddon. SARS subsided within a year, after killing just 774
people. In 2005 it was bird flu, described at the time by a United Nations
official as being “like a combination of global warming and
HIV/AIDS 10 times faster than it’s running at the moment.” The
World Health Organization’s official forecast was 2 million to 7.4 million dead. In fact, by
late 2007, when the disease petered out, the death toll was roughly
200. In 2009 it was Mexican swine flu. WHO director general
Margaret Chan said: “It really is all of humanity that is
under threat during a pandemic.” The outbreak proved to be a normal
flu episode.

The truth is, a new global pandemic is growing less likely, not
more. Mass migration to cities means the opportunity for viruses to
jump from wildlife to the human species has not risen and has
possibly even declined, despite media hype to the contrary. Water-
and insect-borne infections (generally the most lethal) are declining
as living standards slowly improve. It’s true that casual-contact
infections such as colds are thriving, but only by being mild enough
that their victims can soldier on with work and social engagements,
thereby allowing the virus to spread. Even if a lethal virus does
go global, the ability of medical science to sequence its genome
and devise a vaccine or cure is getting better all the time.

The third horseman: people

Of all the cataclysmic threats to human
civilization envisaged in the past 50 years, none has drawn
such hyperbolic language as people themselves. “Human beings are a
disease, a cancer of this planet,” says Agent Smith in the film The
Matrix. Such rhetoric echoes real-life activists like Paul Watson of the Sea Shepherd
Conservation Society: “We need to radically and intelligently
reduce human populations to fewer than one billion … Curing a body
of cancer requires radical and invasive therapy, and therefore,
curing the biosphere of the human virus will also require a radical
and invasive approach.”

On a “stinking hot” evening in a taxi in Delhi in 1966, as Paul
Ehrlich wrote in his best seller, The Population
Bomb, “the streets seemed alive with people. People eating, people
washing, people sleeping. People visiting, arguing, and screaming.
People thrusting their hands through the taxi window, begging.
People defecating and urinating. People clinging to buses. People
herding animals. People, people, people, people.” Ehrlich’s
conclusion was bleak: “The train of events leading to the
dissolution of India as a viable nation” was already in progress.
And other experts agreed. “It is already too late to avoid mass
starvation,” said Denis Hayes, organizer of the first Earth Day in
1970. Sending food to India was a mistake and only postponed the
inevitable, William and Paul Paddock wrote in their best
seller, Famine 1975!

What actually happened was quite different. The death rate fell.
Famine became rarer. The population growth rate was cut in half,
thanks chiefly to the fact that as babies stop dying, people stop
having so many of them. Over the past 50 years, worldwide food
production per capita has risen, even as the global population has
doubled. Indeed, so successful have farmers been at increasing
production that food prices fell to record lows in the early 2000s
and large parts of western Europe and North America have been
reclaimed by forest. (A policy of turning some of the world’s grain
into motor fuel has reversed some of that decline and driven
prices back up.)

Meanwhile, family size continues to shrink on every continent. The world
population will probably never double again, whereas it quadrupled
in the 20th century. With improvements in seeds, fertilizers,
pesticides, transport, and irrigation still spreading across
Africa, the world may well feed 9 billion inhabitants in 2050, and
from fewer acres than it now uses to feed 7 billion.

The fourth horseman: resources

In 1977 President Jimmy Carter went on television
and declared: “World oil production can probably
keep going up for another six or eight years. But sometime in the
1980s, it can’t go up anymore. Demand will overtake production.” He
was not alone in this view. The end of oil and gas had been
predicted repeatedly throughout the 20th century. In 1922 President
Warren Harding created the US Coal Commission, which undertook an
11-month survey that warned, “Already the output of [natural] gas
has begun to wane. Production of oil cannot long maintain its
present rate.” (Quoted in R. L. Bradley, Capitalism at Work,
Scrivener Press, 2007, p. 206.) In 1956, M. King Hubbert, a Shell
geophysicist, forecast that gas production in the US would
peak at about 14 trillion cubic feet per year sometime around
1970.

All these predictions failed to come true. Oil and gas
production have continued to rise during the past 50 years. Gas
reserves took an enormous leap upward after 2007, as engineers
learned how to exploit abundant shale gas. In 2011 the
International Energy Agency estimated that global gas resources would last
250 years. Although it seems likely that cheap sources of oil may
indeed start to peter out in coming decades, gigantic quantities of
shale oil and oil sands will remain available, at least at a price.
Once again, obstacles have materialized, but the apocalypse has
not. Ever since Thomas Robert Malthus, doomsayers have tended to
underestimate the power of innovation. In reality, driven by price
increases, people simply developed new technologies, such as the
horizontal drilling technique that has helped us extract more oil
from shale.

It was not just energy but metals too that were supposed to run
out. In 1970 Harrison Brown, a member of the National Academy of
Sciences, forecast in Scientific American that
lead, zinc, tin, gold, and silver would all be gone by 1990. The
best-selling book The Limits to Growth was published 40
years ago by the Club of Rome, a committee of prominent
environmentalists with a penchant for meeting in Italy. The book
forecast that if use continued to accelerate exponentially, world
reserves of several metals could run out by 1992 and help
precipitate a collapse of civilization and population in the
subsequent century, when people no longer had the raw materials to
make machinery. These claims were soon being repeated in
schoolbooks. “Some scientists estimate that the world’s known
supplies of oil, tin, copper, and aluminum will be used up within
your lifetime,” one read. In fact, as the results of a famous wager between Paul Ehrlich and economist
Julian Simon later documented, the metals did not run out. Indeed,
they grew cheaper. Ehrlich, who claimed he had been
“goaded” into the bet, growled, “The one thing we’ll never run out of
is imbeciles.”

[Far from being congratulated for this feat, Simon was widely
attacked. So he offered one of his critics, William Conway of
the New York Zoological Society, a bet on species extinction: “I’ll
bet that the number of scientifically-proven species extinctions in
the world in the year 2000 is not even one-hundredth as large as
the 40,000 as conventionally forecast; any other year will be fine,
too.”

The estimate of 40,000 species going extinct a year came from the conservationist Norman Myers in
1979, though it was originally more an assumption than a
measurement: “Let us suppose that, as a consequence of this
man-handling of natural environments, the final one quarter of this
century witnesses the elimination of one million species – a far
from unlikely prospect. This would work out, during the course of
25 years, at an average extinction rate of 40,000 species per
year.”

Not that Myers’s number was much different from those being
suggested by others. The Harvard biologist E. O. Wilson has regularly
spoken of 27,000 species going extinct each year, a number reached
by calculating how much habitat is being lost and applying a
mathematical formula called the species-area curve. However, a
recent study by Stephen Hubbell and Fangliang He, of the University
of California at Los Angeles, found that these “estimated”
extinction rates are “almost always much higher than those actually
observed”: loss of forest habitat does not result in species loss
at the rate predicted by the theory.
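The species-area extrapolation behind such figures can be made concrete. In its standard form the relationship is S = cA^z, so the predicted fraction of species surviving after habitat shrinks is (A_remaining/A_original)^z. A minimal sketch (the function name, the z value of 0.25, and the 50% habitat-loss figure are illustrative assumptions, not numbers from the studies discussed):

```python
# Species-area relationship: S = c * A**z, where S is species count,
# A is habitat area, and z is an empirical exponent (often ~0.25).
# The classic extrapolation predicts the fraction of species lost
# when habitat shrinks from area A0 to A1 as 1 - (A1/A0)**z.

def predicted_species_loss(area_remaining_fraction: float, z: float = 0.25) -> float:
    """Fraction of species predicted extinct after habitat loss,
    per the species-area extrapolation."""
    return 1 - area_remaining_fraction ** z

# Illustration: losing half the habitat predicts ~16% of species lost.
print(round(predicted_species_loss(0.5), 3))  # 0.159
```

Hubbell and He's point, on this sketch, is that observed extinctions consistently come in well below what the formula yields.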

This may explain why actual recorded extinction rates, though
bad enough, are so much lower than predicted. Whereas Wilson’s
27,000 annual extinctions should be producing 26 bird and 13 mammal
extinctions a year, in fact, on a comprehensive list kept by the
American Museum of Natural History, extinctions of bird and mammal
species peaked at 1.6 a year around 1900 and have since
dropped to about 0.2 a year. So far 1.6% of mammals (69/4428) and
1.4% of birds (129/8971) have gone extinct in four centuries.

Each extinction is a tragedy. But this is a far cry from the
extinction rates forecast by Dillon Ripley, secretary of the
Smithsonian Institution (75-80% of species by 1995), Paul and Anne
Ehrlich (50% by 2005) and Thomas Lovejoy for the Global 2000 Report
to President Carter (15-20% by 2000).]

Conclusion

Over the past half century, none of our threatened
eco-pocalypses have played out as predicted. Some came partly true;
some were averted by action; some were wholly chimerical. This
raises a question that many find discomforting: With a track record
like this, why should people accept the cataclysmic claims now
being made about climate change? After all, 2012 marks the
apocalyptic deadline of not just the Mayans but also a prominent
figure in our own time: Rajendra Pachauri, head of the
Intergovernmental Panel on Climate Change, who said in 2007 that “if there’s no action before
2012, that’s too late … This is the defining moment.”

So, should we worry or not about the warming climate? It is far
too binary a question. The lesson of failed past predictions of
ecological apocalypse is not that nothing was happening but that
the middle-ground possibilities were too frequently excluded from
consideration. In the climate debate, we hear a lot from those who
think disaster is inexorable if not inevitable, and a lot from
those who think it is all a hoax. We hardly ever allow the moderate
“lukewarmers” a voice: those who suspect that the net positive
feedbacks from water vapor in the atmosphere are low, so that we
face only 1 to 2 degrees Celsius of warming this century; that the
Greenland ice sheet may melt but no faster than its current rate of
less than 1 percent per century; that net increases in rainfall
(and carbon dioxide concentration) may improve agricultural
productivity; that ecosystems have survived sudden temperature
lurches before; and that adaptation to gradual change may be both
cheaper and less ecologically damaging than a rapid and brutal
decision to give up fossil fuels cold turkey.

We’ve already seen some evidence that humans can forestall
warming-related catastrophes. A good example is malaria, which was
once widely predicted to get worse as a result of climate change.
Yet in the 20th century, malaria retreated from large parts of the
world, including North America and Russia, even as the world
warmed. Malaria-specific mortality plummeted in the first decade of
the current century by an astonishing 25 percent. The weather may
well have grown more hospitable to mosquitoes during that time. But
any effects of warming were more than counteracted by pesticides,
new antimalarial drugs, better drainage, and economic development.
Experts such as Peter Gething at Oxford argue that these trends will continue,
whatever the weather.

Just as policy can make the climate crisis worse (mandating
biofuels has not only encouraged rain forest destruction, releasing
carbon, but driven millions into poverty and hunger), technology can
make it better. If plant breeders boost rice yields, then people
may get richer and afford better protection against extreme
weather. If nuclear engineers make fusion (or thorium fission)
cost-effective, then carbon emissions may suddenly fall. If gas
replaces coal because of horizontal drilling, then carbon emissions
may rise more slowly. Humanity is a fast-moving target. We will
combat our ecological threats in the future by innovating to meet
them as they arise, not through the mass fear stoked by worst-case
scenarios.

By Matt Ridley | Tagged:  rational-optimist