“Forget the science — I still don’t believe it!”

When Mother Jones published “The Science of Why We Don’t Believe Science” recently, it was another of those moments when one side of a debate struggles to understand how arguments that are so clear and convincing to them can be so obscure and unconvincing to those on the other side of the issue.

What’s wrong with those people? Why can’t they see the obvious?

These questions are not confined to one side of the political divide, and articles equivalent to this one, which comes from the left, can be found in equal numbers on the right.

We have trouble believing that others don’t believe what we believe when they’re in possession of the same information.

The answer to this mystery has much to do with the fact that we don’t all have the same information, even when we all have the same information. More precisely, while we all perceive and process the same information in the same ways, the output of that perception and processing greatly depends on the initial emotional and rational conditions: What did we already believe? How did we already feel about this issue? What’s our opinion of the people presenting the information?

Much has been written here lately about the central cognitive role played by emotion, which precedes and underlies rational thought; rational thought, in turn, is often a post hoc explanation of ourselves to ourselves.

The same sorts of processes are at work here.

In “The Science of Why We Don’t Believe Science,” author Chris Mooney puts it this way:

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Mooney quotes Jonathan Haidt, who wrote that “We may think we’re being scientists, but we’re actually being lawyers.” In this view, we don’t reason to determine what we will believe; we reason to make the case for what we already believe. Mooney writes:

This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else.

No one should fail to notice that the list of issues Mooney presents is itself typically skewed. Here and elsewhere in the article, Mooney’s examples are a catalogue of issues on which the right has it wrong, and the left has it right: climate change, “death panels,” the “Birther” crusade, attitudes towards gay marriage, gun control, and Saddam’s WMD.

He does include one “typically left” case of science denial, the discredited link between vaccines and autism, just to show that he’s being “balanced,” but the great majority of the issues he names on which the evidence is “unequivocal” are those on which the left has the “correct” position. There’s no way of telling how aware he was of his own bias when he was writing the article, but it is a striking feature, and the fact that I happen to agree with him on each of these issues does nothing to lessen the tendency itself.

The point is, of course, that we are all susceptible to the same biases and blind spots, because we all share the same cognitive structures and emotional contexts. And while a growing number of studies show consistent differences in the ways the left and the right respond to issues — “conservatives” valuing stability and authority, while “progressives” value change and diversity — all that these studies really show is that all of us are influenced by our core values. We end up in different places, but we get there on the same highways.

First we feel, then we think, recalling memories that fit our preconceptions and preparing arguments that justify those beliefs. This isn’t a voluntary process, nor is it usually conscious, except on later self-reflection. The physical structures and evolved mechanisms of our brains operate on us before we’re aware that they’ve been triggered, and function whether or not we ever become aware of them. As Mooney points out, feeling and thinking are inextricably linked:

Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it.

Mooney writes that “we apply fight-or-flight reflexes not only to predators, but to data itself.” We respond quickly to threats, emotional as well as physical; incongruity and mental disorder are threatening to us. So we have devised ways to ignore, recast, or deny input that clashes with our worldview — and with our self-image.

If our first response is typically a defensive, protective one, then what does that mean for the kinds of strategies we should adopt when we try to persuade others to see things the way we see them? Mooney explains:

People rejected the validity of a scientific source because its conclusion contradicted their deeply held views … and that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect.

So how important can the article’s insight be? Haven’t people always acted like this? Yes, we have, but there is a key difference now, Mooney warns, in the newly fragmented ways in which we get our information.

We use the media, and increasingly the social media, to reinforce our pre-existing positions on issues. There is less and less rounded discussion, less and less real debate. Many of us seek and select only those sources of information — and, importantly, affirmation — that mirror and strengthen the way we already feel and think. Growth, change, and insight all dwindle, and we threaten to fragment into ever smaller and ever more isolated and self-referential subgroups. If this goes on long enough and strongly enough, the postmodernists may yet be right, and the post-capitalist consumer gulag will indeed come to pass.

*-*

There is a great deal of research on this topic, and one online article cannot do justice to the issues involved. Next time, then, we’ll look at a more complete treatment, with a review of the most relevant sections of Thomas Gilovich’s book, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life.

After that, since it’s the conceptual elephant in the room, we’ll take a brief look at the contentious topic of memetics: the nature and dynamics of the social transmission of attitudes and ideas.


