It’s hard to be rational about irrationality

Last time, I wrote a rather frustrated little piece about how hard it is for a species as habitually irrational as we are to have real democracy.

The more I thought about what I’d written, and about the many books and articles that had prompted it, the more I appreciated an article that I had read way back in April. (In online terms, that’s a couple of decades ago, not just a couple of months.)

In a Scientific American blog piece titled “The Irrationality of Irrationality: The Paradox of Popular Psychology,” Samuel McNerney cautions us to tread lightly when we draw conclusions from the recent flood of popularized psychology explanations of how and why we’re not really rational creatures at all — at least, not often, and never entirely.

McNerney doesn’t join the loud complaints of some critics, who object that much of contemporary neuropsychology is little more than scanner games. McNerney’s point is more analytical. He cautions that the materials that inform us about our irrational biases are themselves both instances of and subject to those very biases. Hence the “paradox” of his title: we can’t eliminate our own irrationality from our assessment of the accuracy and utility of explanations of irrationality.

McNerney prominently cites a 1996 study by Brenner, Koehler, and Tversky, which showed that even when participants are aware that they are being given only one side of the story in a dispute, they are nevertheless influenced to favour the side they have heard. In other words, even when I know that I’m getting only limited information, I’m still influenced by that information.

The key seems to be narrativity. If the one-sided story makes sense as a narrative, it exerts an impact on us. Other research has shown that we strongly favour information that can be construed as a straightforward narrative. We appear to be cognitively drawn to stories that organize and simplify the complexities we encounter.

If we are susceptible to influence even when we know that influence is being exerted, can we ever be confident that we are making a rational decision, or an informed choice? If we are so thoroughly preconditioned to accept bias that any well-wrought narrative carries weight, how are we to mitigate our tendency to accept what we’ve heard? Can we escape our bias toward biased narrative?

This is McNerney’s core point. He writes that “we only require a small amount of information, often times a single factoid, to confidently form conclusions and generate new narratives to take on new, seemingly objective, but almost entirely subjective and inaccurate, worldviews.”

And, according to McNerney, narrative biases are at work even as we read that last sentence. His claim may be entirely wrong, but if it sounds right — if it’s narratively consistent — it influences us.

It’s hard to read with the kind of restrained objectivity that minimizing narrative bias requires. It may be impossible. According to McNerney, “Narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes.”

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions, they form new opinions about how the brain works from the little information they have recently acquired.

McNerney concludes that, ironically, readers of books about irrationality “jump to conclusions about how the brain jumps to conclusions and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.”

To some extent, the explanation of irrationality we believe will be the last narratively consistent version to which we have been exposed. This preference for the recent is, of course, an irrational bias in itself. Around and around we go.

How do we minimize the problem? McNerney urges: “let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece to the puzzle. This same approach could also help correct the problem of being too swayed by narratives.”

Does this mean that my strong preference for rationality is itself irrational? That is, if I hold a set of internalized narratives that are consistent with an impulse toward rationality, then my choice of rationality has itself been irrationally motivated.

If rationality is essentially irrational, then reason tells me that I should reject rationality. But the reason that tells me to reject rationality is itself irrational, so on what rational basis should I follow its advice? That would make it irrational to reject rationality on the grounds of its irrationality. And that makes no sense.

I have to go now. I’m getting a headache.


One thought on “It’s hard to be rational about irrationality”

  1. But do we develop a more rational narrative over time, as science claims to do? Ah. The wisdom of the elders, of course.
