6 May 2011

Science versus closed minds

RM sent this interesting article [pdf] on why people don't believe science (or evidence that contradicts their world view).

Here are some relevant bits:
"A man with conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point."


We're not driven only by emotions, of course -- we also reason, deliberate. But reasoning comes later, works slower -- and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.*


We may think we're being scientists, but we're actually being lawyers.


Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation.


...one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues... These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right -- and so their minds become harder to change.


If you want someone to accept new evidence, make sure to present it to them in a context that doesn't trigger a defensive, emotional reaction... you don't lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.
On the "accept new evidence" front, I always say that higher water prices are necessary if we want greater reliability in water services. Some people hear that (reliability good!) before their knee-jerk reaction against paying more kicks in. Some people don't.

Bottom Line: It takes work to really be objective; many people avoid it, and our politics and policies are weakened as a result.
* An example on gun control.


  1. There is a theory in social psychology called cognitive dissonance theory, which says that, when presented with "dissonant" or contrary evidence, a person's beliefs get stronger, not weaker.

    Some examples: a prophecy that a messianic religious figure will return to earth on a certain day turns out to be wrong, but the adherents of this prophecy become stronger in their beliefs and their numbers grow. The same sort of thing happens with flying-saucer cults and with end-of-the-world environmental prophecies.

    Sociologist Peter Berger has advanced the observation that religious Fundamentalists are the other side of the coin to secular Relativists, and vice versa -- both seek cognitive certainty. This cuts both ways: against religionists who believe with certainty that they know when life starts, and against those who assert with great certainty that they know global warming has started.

    As for how this applies to water policy: we don't fully know whether the seas are going to rise or what the consequences of a Delta Plan might be. We can't model all of reality. We have an idea that a freshwater Delta would be better than a saltwater Delta, but the environment would thrive either way -- one with salmon, the other with catfish. It's a value judgment, not a scientific finding, as to which is best. Those who say with great certainty that the Delta should be restored should be believed with great caution. Restored to what? The inland sea it was in the 1800s? We can let the sea reclaim the land, as it did recently in Sendai, Japan, and split California in two in wet years, or we can reclaim the land for human, plant, and animal use. Science can't say with any certainty which is best -- that is a cultural value.

  2. We may think we're being scientists, but we're actually being lawyers.

    Loved that quote… I'd add engineers along with scientists.

  3. 1. Modern neuroscience predicts these behaviors.
    2. Negotiation practices help to get past them.

  4. Speaking of neuroscience, I attended a lecture by Walter Freeman, Jr., interestingly the son of the “ice pick lobotomist”, where he said that sensory input is not even allowed directly into the brain. Instead, the input is compared against pre-existing perception and then discarded. The act of changing perception requires “unlearning,” which is very different from forgetting. Unlearning happens most quickly under duress or highly emotional conditions, when perceptions become malleable. Examples include Stockholm syndrome, some religious rituals, hazing, military training, and PTSD. In the end, higher water prices will hopefully change minds, but if they don't, thirst and deprivation will certainly change perceptions on a large scale.

  5. @Richard,
    At least to us, 'directly into the brain', when considered at the molecular level, is a tricky concept. Which neurons are part of 'the brain' and which are not and why not? On the other hand, we seem to be fast at pattern recognition because many patterns are already preprocessed and can be recognized quickly.
    Extending this thought to water issues and to negotiation: given the preprocessing of each person's mind in the domain of water control, to reach and change a person we have to present our argument either so that preprocessing does not throw it out or so that the preprocessing must adapt to the new realities of water scarcity caused by drought or by inefficient usage. Some of these adaptations over the last 15,000 years of human development are covered in Morris' excellent book "Why the West Rules--for Now."

  6. In addition to cognitive dissonance, there's the Dunning-Kruger effect:

    The Dunning-Kruger effect is a cognitive bias in which unskilled people make poor decisions and reach erroneous conclusions, but their incompetence denies them the metacognitive ability to appreciate their mistakes.[1] The unskilled therefore suffer from illusory superiority, rating their ability as above average, much higher than it actually is, while the highly skilled underrate their own abilities, suffering from illusory inferiority. Actual competence may weaken self-confidence, as competent individuals may falsely assume that others have an equivalent understanding. As Kruger and Dunning conclude, "the miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others" (p. 1127).[2]

    The effect is about paradoxical defects in cognitive ability, in oneself and others.


  7. "I always say that higher water prices are necessary if we want greater reliability in water services." Case in point: I recently had a conversation with a visiting Guatemalan musician who says he only has water service to his home 5 hours per day, 3 am to 8 am. Insufficient investment in the infrastructure for reliability.

  8. @ Wayne: Even within the environmental community, we see philosophical differences between moralist preservationists and utilitarian aggregators. The question is whether we preserve nature as it would be without us or manage resources for optimal human consumption and exploitation by society.

