Heuristics and Biases

Introduction

The Heuristics and Biases research program has been tremendously influential in all fields that involve decision making. Its classic framework was established more than four decades ago, with Amos Tversky & Daniel Kahneman’s publication of their findings in “Judgment under Uncertainty: Heuristics and Biases” (1974). This classic study has its critics and revisionary defenders (see, for example, here, here, here, and here).

Regardless, it will be helpful to know about common heuristics and biases. It’s also important to recognize that we all operate with them in one way or another and, for that reason, often have our decisions unknowingly influenced by marketing and sales tricks, including those employed by politicians and their strategists. So, heuristics & biases literacy is vital to preserving our rational agency (not a mere illusion thereof).

Also, keep in mind that we all have bias blind spots: one person is not more or less biased than another; it’s just that we are biased in different ways. Taking oneself to be less biased than others is likely a case of the Dunning-Kruger Effect, as illustrated in this video:

For further details, David Dunning’s “We Are All Confident Idiots” (2014) is worth a read. It’s very accessible, with a lot of interesting studies and stories. Particularly worth mentioning is the section “Motivated Reasoning,” in which Dunning talks about how education can produce illusory confidence and how best to eradicate misbeliefs in today’s Wild West setting of internet and news media. He ends with this sobering note:

“The built-in features of our brains, and the life experiences we accumulate, do in fact fill our heads with immense knowledge; what they do not confer is insight into the dimensions of our ignorance. As such, wisdom may not involve facts and formulas so much as the ability to recognize when a limit has been reached. Stumbling through all our cognitive clutter just to recognize a true “I don’t know” may not constitute failure as much as it does an enviable success, a crucial signpost that shows us we are traveling in the right direction toward the truth.”

Another way to describe the situation is that we are all subject to the illusion of understanding or the knowledge illusion, as Steven Sloman and Philip Fernbach demonstrate in their book The Knowledge Illusion: Why We Never Think Alone (reviewed here). Here is Sloman discussing the basic ideas with the Data Skeptic podcast host:

Here is Fernbach explaining the illusion of understanding (or illusion of explanatory depth) in a way that highlights the importance of making both ourselves and others cognizant of this illusion, especially in a polarized, passion-driven society like ours:

And here is a neat video illustration, which provides a good model for how you may help others to recognize that they are less knowledgeable than they think and, hopefully, to develop a sense of intellectual humility:

Some Common Examples

You can get short and accessible lessons about basic heuristics and biases from The Cognitive Bias Podcast by David Dylan Thomas, which is regularly updated. Here we sample just a few.

1. Representativeness heuristic, which often leads to stereotyping.

2. Availability heuristic, which can lead us to exaggerate the likelihood and magnitude of an outcome. This effect is most salient with media exposure.

3. Affect heuristic, the role of which in voter decisions is well understood by political strategists.

4. Anchoring and adjustment heuristic,[1] as neatly explained in this video:

5. Framing effect, as explained here:

Anchoring and framing, thanks to their powerful joint effect on consumer behavior, are commonly used together, sometimes also paired with loss aversion, in sales & marketing strategy and business consulting. They are also reflected in savvy restaurant menu designs, whether to entice you to choose expensive meals or to nudge you into making healthier choices. Some career advisors recommend them for effective salary negotiation. Others are exploring their use in news reporting, in building support for climate policies, and in bringing about social change more generally.

Here is a presentation about heuristics and biases based on Kahneman’s Thinking, Fast and Slow:

Here is one that speaks more to our times:

This one talks about heuristics and biases in connection with implicit bias:

And here is one from Khan Academy that connects various heuristics with decision making:


[1] In Thinking, Fast and Slow, Kahneman mentions two hypotheses about the causal mechanisms of anchoring. He writes: “There is a form of anchoring that occurs in a deliberate process of adjustment, an operation of System 2. And there is anchoring that occurs by priming effect, an automatic manifestation of System 1.” (120) Anchoring as adjustment is described as “a strategy for estimating uncertain quantities: start from an anchoring number, assess whether it is too high or too low, and gradually adjust your estimate by mentally ‘moving’ the anchor” (120). By contrast, anchoring as priming effect works by “suggestion” (priming) and “associative coherence” (122-23).
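The adjustment mechanism can be illustrated with a toy simulation (a hypothetical model for illustration only, not one from the book): an estimator starts at the anchor, moves toward the true value in small steps, and stops as soon as the estimate enters a range that feels plausible. Because adjustment halts at the near edge of that plausible range, the final estimate remains biased toward the anchor.

```python
def adjust_from_anchor(anchor, true_value, tolerance, step=1.0):
    """Toy model of anchoring-as-adjustment: move the estimate from the
    anchor toward the true value, stopping as soon as it falls within
    `tolerance` of the true value (the 'plausible' range). Stopping at
    the near edge of that range leaves the estimate biased toward the
    anchor. This is an illustrative sketch, not an empirical model."""
    estimate = anchor
    direction = 1 if true_value > anchor else -1
    while abs(estimate - true_value) > tolerance:
        estimate += direction * step
    return estimate

# Two people estimate the same quantity (true value 100) from
# different anchors; each stops short, on their own side of the truth.
low = adjust_from_anchor(anchor=60, true_value=100, tolerance=15)
high = adjust_from_anchor(anchor=140, true_value=100, tolerance=15)
print(low, high)  # → 85.0 115.0
```

Note the design choice: the simulated estimator never crosses the true value; it merely stops once its estimate is no longer obviously wrong, which is what produces the characteristic anchor-ward bias.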

“The main moral of priming research,” Kahneman states, “is that our thoughts and behavior are influenced, much more than we know or want, by the environment of the moment.” For priming effects work in such a way that you may be “influenced by stimuli to which you pay no attention at all, and even by stimuli of which you are completely unaware” (128). Sounds plausible, right? For a long time, many researchers indeed bought into the priming theory. Here is a video illustration:

Unfortunately, the studies that Kahneman used to support the priming theory turned out to be deeply problematic: it was one of the theories at the center of the replication crisis in social psychology. This article, with its telling title “Reconstruction of a Train Wreck: How Priming Research Went off the Rails,” offers a compelling analysis of Kahneman’s blunder in putting so much weight on the priming theory. Kahneman made a similar mistake with the theory of ego depletion. So, keep this in mind: when it comes to any theory in social psychology, don’t take anybody’s word for it, not even a Nobel Prize winner’s.