What keeps scientists accurate is rivals’ scepticism, not their own

My latest Mind and Matter column for the Wall Street Journal:

If, as I argued last week, scientists are just as prone as
everybody else to confirmation bias (looking for evidence to
support rather than test their ideas), then how is it that science,
unlike cults and superstitions, does change its
mind and find new things?

The answer was spelled out by the psychologist Raymond Nickerson
of Tufts University in a paper written in 1998: “It is not so much
the critical attitude that individual scientists have taken with
respect to their own ideas that has given science the success it
has enjoyed… but more the fact that individual scientists have
been highly motivated to demonstrate that hypotheses that are held
by some other scientist(s) are false.”

Most scientists do not try to disprove their ideas; rivals do it
for them. Only when those rivals fail is the theory bomb-proof. The
physicist Robert Millikan (who showed minor confirmation bias in
his own work on the charge of the electron by omitting outlying
observations that did not fit his hypothesis) devoted more than 10
years to trying to disprove Einstein’s theory that light consisted
of particles (photons). His failure convinced almost everybody but
himself that Einstein was right.

The solution to confirmation bias in science, then, is not to
try to teach it out of people, for that goes too much against the
grain of human nature. Dr. Nickerson points out that the history of
science is replete not only with examples of great scientists
tenaciously persisting with theories “long after the evidence
against them had become sufficiently strong to persuade others
without the same vested interests to discard them” but also with
brilliant people who remained wedded to their pet hates.

Galileo rejected Kepler’s lunar explanation of tides; Huygens
objected to Newton’s concept of gravity; Humphry Davy detested
John Dalton’s atomic theory; Einstein denied quantum theory.

No, the reason that science progresses despite confirmation bias
is partly that it makes testable predictions, but even more that it
prevents monopoly. By dispersing its incentives among many
different centers, it allows scientists to check each other’s
prejudices. When a discipline defers to a single authority, and
demands adherence to a set of beliefs, then it becomes a cult.
Medicine did this with Galen and psychoanalysis with Freud.

A recent example is the case of malaria and climate. In the
early days of global-warming research, scientists argued that
warming would worsen malaria by increasing the range of mosquitoes.
“Malaria and dengue fever are two of the mosquito-borne diseases
most likely to spread dramatically as global temperatures head
upward,” said the Harvard Medical School’s Paul Epstein
in Scientific American in 2000, in a warning typical of many.

Carried away by confirmation bias, scientists modeled the future
worsening of malaria, and the Intergovernmental Panel on Climate
Change accepted this as a given. When Paul Reiter, an expert on
insect-borne diseases at the Pasteur Institute, begged to
differ, pointing out that malaria’s range was shrinking and was
limited by factors other than temperature, he had an uphill
struggle. “After much effort and many fruitless discussions,” he
said, “I … resigned from the IPCC project [but] found that my
name was still listed. I requested its removal, but was told it
would remain because ‘I had contributed.’ It was only after strong
insistence that I succeeded in having it removed.”

Yet Dr. Reiter has now been vindicated. In a recent paper, Peter Gething of Oxford
University and his colleagues concluded that widespread claims that
rising mean temperatures had already worsened malaria mortality
were “largely at odds with observed decreasing global trends” and
that proposed future effects of rising temperatures are “up to two
orders of magnitude smaller than those that can be achieved by the
effective scale-up of key control measures.”

The IPCC, in other words, learned the hard way the value of
letting mavericks and gadflies challenge confirmation bias.

By Matt Ridley | Tagged:  rational-optimist  wall-street-journal