What makes us misread information?


Last time, we looked briefly at the question of why different people see the "same" information differently, coming to different conclusions about the truth, and even the content, of identical facts.

This part of the human cognitive arsenal is a big topic, so today let’s examine it in more detail by outlining some of the key ideas in Thomas Gilovich’s How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life.

Gilovich’s book summarizes psychological research into belief and persuasion. All of his points are backed by experimental results, but for reasons of space I won’t go into specific studies here. So while some of what follows may read as mere assertion or speculative hypothesis, each point rests on research outcomes.

In Part I, Gilovich reviews some of the contexts in which we misjudge the information with which we are presented (by experience, by others, by third-hand reports, etc.).

Gilovich surveys the many ways in which we misapply the rules of chance, with the result that we see patterns and trends where none exist — everything from a gambler’s losing streak to the idea that a tossed coin that has come up tails six times in a row is “due” to come up heads. Gilovich explains: “Human nature abhors a lack of predictability and the absence of meaning. As a consequence, we tend to ‘see’ order where there is none, and we spot meaningful patterns where only the vagaries of chance are operating.” This tendency is part of our innate cognitive equipment, and in other circumstances (identifying food resource patterns on the savannah, for instance) it is adaptively advantageous. Once we have discerned a false pattern or trend, we quickly find ways to explain and justify it. As Gilovich writes, “People are extraordinarily good at ad hoc explanation.”
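(As an aside not drawn from the book: a minimal simulation sketch, assuming a fair coin and using Python’s standard random module, shows why the “due for heads” intuition fails. The estimated chance of heads immediately after a run of six tails stays at roughly 0.5.)

```python
import random

def prob_heads_after_streak(streak_len=6, trials=1_000_000):
    """Estimate P(heads on the next toss | the previous `streak_len` tosses were tails)
    for a fair coin. Under independence this is ~0.5, not higher."""
    tails_streak = 0   # current run of consecutive tails
    follow_ups = 0     # tosses observed immediately after a qualifying tails run
    heads_after = 0    # how many of those follow-up tosses came up heads
    for _ in range(trials):
        toss = random.choice("HT")
        if tails_streak >= streak_len:
            follow_ups += 1
            if toss == "H":
                heads_after += 1
        tails_streak = tails_streak + 1 if toss == "T" else 0
    return heads_after / follow_ups if follow_ups else float("nan")

if __name__ == "__main__":
    print(f"P(heads | six tails just occurred) ~= {prob_heads_after_streak():.3f}")
```

Running this prints a value close to 0.500, which is the point of the gambler’s fallacy: the coin has no memory of the streak.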

Another key feature of our cognitive habits is our tendency to pay more attention, and give more credit, to confirmatory information. That is, we notice and highlight information that fits the false pattern we’ve perceived more than information that doesn’t fit or contradicts it. The same tendency operates when we gather our own information, rather than merely evaluate information from other sources; in either case, positives are given more credence than negatives. The tendency holds in related situations as well: Gilovich reports that study subjects seek out and value similarities when testing a hypothesis of similarity, and seek out differences when testing a hypothesis of difference. The result is that “people establish an insufficient threshold of what constitutes adequate support for a belief, and they run the risk of believing things that are not true.”

A related problem arises when an important kind of data is hidden or missing. If our confirmatory bias is operating strongly, we will have a hard time noticing that some information is unavailable, or that the information we know to be missing is important to evaluating the situation at hand.

A major area of conceptual error is the operation of bias. In fact, in the popular media, a substantial majority of what’s published on the dynamics of human judgement is focused on bias, especially its effects on political debate and policy. Gilovich writes:

People are inclined to see what they expect to see, and conclude what they expect to conclude. Information that is consistent with our pre-existing beliefs is often accepted at face value, whereas evidence that contradicts them is critically scrutinized and discounted. Our beliefs may thus be less responsive than they should be to the implications of new information.

Gilovich also repeats a humorous but meaningful “slip of the tongue” attributed to psychologist Thane Pittman: “I’ll see it when I believe it.”

Bias is complicated, Gilovich notes, for the simple reason that it’s not always unjustified. When someone says that there’s a purple alien living in his basement, it’s reasonable to place a very high “proof threshold” on a claim that so thoroughly contradicts our previous experience and knowledge. Gilovich writes that “we are justified in allowing our beliefs and theories to influence our assessments of new information in direct proportion to how plausible and well-substantiated they are in the first place.” He continues:

Well-supported beliefs and theories have earned a bit of inertia, and should not be easily modified or abandoned because of isolated antagonistic “facts.” In marked contrast, many ethnic, gender, and occupational stereotypes are particularly troublesome because they often rest on such flimsy or non-existent evidence to begin with.

This leads naturally to a consideration of bias and science. Gilovich acknowledges that scientists are no less susceptible to the effects of bias than are other people, but he also notes that “scientists utilize a set of formal procedures to guard against … sources of bias and error. … Much of the scientific enterprise can be construed as the use of formal procedures for determining when to throw out bad ideas.” These procedures are not perfect, of course, but they do go far beyond the standards demanded of “truth” in other forms of investigation and debate.

(In passing, not wishing to miss a chance to slag relativism: I think that because postmodernist “social scientists” do not experience in their own work the kind of empirical rigour to which physical scientists are subjected, they underestimate the objectivity, and hence the “truth,” of experimental results.)

Part II focuses on the unsurprising truth that we believe what we want to believe — among other things, that we are more convinced by arguments that support positions we already hold than we are by those that make us change what we think or how we behave. There is a certain logical economy here, of course, for if we have invested considerable time and energy in acquiring information and beliefs, we won’t want to discard them at the first sign of opposition.

Believing what we want to believe is most evident, Gilovich writes, in assessments of ourselves:

Most of the evidence indicating that people tend to believe what they want to believe comes from research on people’s assessments of their own abilities, and their explanations of their own actions. … People are also prone to self-serving assessments when it comes to apportioning responsibility for their successes and failures. … If a person tries to succeed at something, then any success is at least partly due to his or her efforts and thus warrants some internal attributional credit. Failure, on the other hand, generally defies one’s efforts and intentions, and therefore necessitates looking elsewhere, often externally, for its cause.

One of the main means by which we seek confirmation for those beliefs we already hold is “the ways in which we cognitively process information relevant to a given belief. What evidence do we consider? How much of it do we consider? What criteria do we use as sufficient evidence for a belief? Cognition and motivation collude to allow our preferences to exert influence over what we believe.”

A simple but powerful way that we shape the facts to fit our existing beliefs is the way we ask questions. When we prefer to believe something, we typically ask ourselves, “What evidence is there to support this thesis?” Since most propositions have at least some truth to them, we will seldom fail to find supporting evidence. Conversely, if we prefer not to believe something, we search more diligently for contradictory evidence. A less obvious instance of this tactic emerges when we have a choice of information sources: we tend to seek information from sources we already hold in high regard, and those sources will typically be confirmatory.

Another subtle factor in assessing the truth of a proposition is related to the truth threshold we apply to it:

People’s preferences influence not only the kind of information they consider, but also the amount they examine. When the initial evidence supports our preferences, we are generally satisfied and terminate our search; when the initial evidence is hostile, however, we often dig deeper, hoping to find more comforting information, or to uncover reasons to believe that the original evidence was flawed.

When we add the dynamics of communication to the mix, things become even more complicated. When we tell stories, including “factual” accounts, we tend to sharpen the more dramatic details and level or delete the more mundane. Gilovich notes, “Information about the person and the action tends to be sharpened, whereas information about the surrounding context and various mitigating circumstances tends to be leveled.” This is particularly true in mass media reports of complex scientific findings, where context and qualifications are crucial to a clear and complete understanding of the issue.

Gilovich adds, “The desire to be informative can also lead people to stretch the facts to make sure the audience gets the point.”

Finally, there is the “infotainment” factor when considering the impact of communication on the information conveyed:

The possibility of inaccuracy obviously increases enormously when the worth of the message is measured by how well it entertains rather than how well it informs. … The desire to entertain often creates a conflict for the speaker between satisfying the goal of accuracy and the goal of entertainment. The desire to entertain can sometimes be the stronger of the two, putting the truth in jeopardy. … Often the speaker’s desire to entertain is matched by a listener’s desire to be entertained, and an implicit understanding develops. … More ominously, the desire to entertain can also lead a speaker to take liberties with the facts without any tacit agreement on the part of the listener.

Taking note of all of the cognitive tendencies in this article should make us more aware of the non-factual components of our “rational” evaluations of new information. Being more aware won’t eliminate these innate tendencies, but it may give us enough pause to try, at least, to illuminate some of our blind spots.

The truth may not make us free, but a little more freedom from our biases may bring the truth a little bit closer to us.

One thought on “What makes us misread information?”

  1. In a related essay reported in Arts and Letters recently, Julian Baggini discusses the complexity of truth in itself. Two people may make a series of true, non-overlapping statements on the same topic (e.g. in describing a scene), and both may appear credible. He concludes, however, that truth requires the person delivering it to consider the listeners or readers, with some understanding of how what is being said or read will be taken. Thus there is the lawyer’s truth, which may be misleading in what it does not say or where it places its emphasis, and there is truth delivered to meet a perceived need – “the honest truth”?