Lazy Thinking: Cognitive Ease
This framework for how our minds function was introduced by Kahneman in Thinking, Fast and Slow; it divides the mind into two processing systems, System 1 and System 2. According to Kahneman, “System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control,”1 while System 2 “allocates attention to the effortful mental activities that demand it, including complex computations.”2 While System 1 is highly energy-efficient and helpful for drawing conclusions from “old” information, our brain runs into trouble when we face unfamiliar challenges, and we need to summon System 2. Without our realizing it, most of our life runs on System 1, in a low-power cognitive mode, because we naturally turn to ideas or concepts that let the associative mind run as smoothly as possible. For example, we prefer to watch our favourite TV show at the end of the day rather than a historical documentary because it demands less of our energy resources. Kahneman calls this subconscious decision-making process cognitive ease.
System 2 is required to apply problem-solving techniques and allocates more concentration to a particular task. The System 1 and 2 theory provides a simple lens through which we can view our own thought processes and use discomfort to our advantage. It is important to be aware of the naturally lazy cognitive process that leads us to think, communicate, and act in a comfortable way. Just as a daily routine makes us act systematically, automatically, almost robot-like, cognitive ease confines our way of thinking and lulls our intellectual curiosity and creativity to sleep. Hence, it is particularly relevant in an academic community of people from different backgrounds to combat our tendency to retreat into old biases that constrain our thinking. There is clearly a trade-off between cognitive effort and creativity; what is best is up to each one of us.
Confirmation Bias
Confirmation bias is an ever-present force in our thinking, lurking beneath what we believe to be our rational interpretation of the world around us. Confirmation bias occurs when people privilege their preconceived notions over actual evidence. As Caleb Lack describes in “Why Can’t We Trust Our Brains?,” “if information confirms what we already believe… we unreservedly accept it, and are happy to have been shown it.”3 If, on the other hand, “the information contradicts our beliefs… we nitpick any possible flaw in the information, even though the same flaw would not get a mention if the information confirmed our beliefs.”4 While cognitive ease is the pitfall of lazy thinking, confirmation bias is an active tendency to seek out confirming evidence to justify one’s preexisting beliefs.
Understanding and being aware of confirmation bias is critical to navigating our complex and divisive times, and it is necessary if one hopes to be an informed media consumer. Today, it is particularly easy to seek out information that corroborates existing views; social media, news apps, and TV channels only compound this problem. The use of algorithms to curate and customize information for individual consumers prevents people from seeing the complete picture. Even in the rare cases that people are exposed to opposing information, confirmation bias encourages viewers to ignore these diverse arguments and accept only those they already agree with. By extension, in the political sphere, confirmation bias is the reason people are reluctant to switch viewpoints despite compelling evidence. Thus, it serves as a fundamental component of the ever-growing divide between parties and other groups, as people are quick to denounce information that threatens their rigid worldviews.
For students, confirmation bias is particularly relevant. In order to learn and develop, one needs first to set aside confirmation bias. If people allow confirmation bias to influence the learning experience, they will never be able to adapt their views of the world. When one is closed off from exploring viewpoints that contradict one’s own, true learning becomes impossible. In academic research, awareness of confirmation bias is particularly important, because students often find sources to fit their argument rather than adapting their argument to fit the sources. Thus, an understanding of confirmation bias is key to establishing intellectual humility and flexibility.
Understanding the Mind: Interdependence of Thoughts, Feelings, and Desires
The mind serves three primary functions: thinking, feeling, and desiring. Whenever one of these functions is present, the other two are present as well. This implies that thoughts influence emotions and desires, emotions influence our thoughts and desires, and desires influence our thoughts and emotions.5 Thus, whenever an emotion or desire arises, one can trace the line of thinking that prompted it. This insight is extremely useful because it allows us to better understand our mind and behavior. If we find ourselves bombarded with negative emotions, we can stop and analyze the thinking pattern that led to them. If the thoughts seem irrational, we can replace them with more rational, positive thoughts, which should, in turn, lead to more desirable feelings and desires. We cannot immediately alter our feelings and desires, but we do have a degree of control over our thoughts. Thus, we can use this knowledge of the mind to our advantage and increase our general wellbeing. Belief drives behavior, and this insight can allow us to craft the life experiences that we want.
Sample Framework for Changing Your Behaviour:
- You notice a questionable emotion: You feel extremely lonely one night
- Identify the precise thinking that is leading to this feeling: You believe that you have a weak social life
- Analyze whether this thinking is justified: It isn’t, because you have many friends and family in your life
- Develop reasonable thinking: You have a thriving social life; this is the one night you’re not hanging out with someone
- Actively attack the unreasonable thinking with reasonable thinking: Every time the emotion arises, think to yourself “I have many friends and family who love me.”
With time, your new, reasonable thoughts should result in more reasonable feelings and desires.
WYSIATI/Availability Heuristic/Egocentric Immediacy
The availability heuristic, covered by Lack in “Why Can’t We Trust Our Brains?,” describes how people are more likely to rely on the information they can recall most quickly from past experience.6 This phenomenon can decrease the accuracy of our thinking, since we prioritize answers that “sound right” instead of taking an unbiased perspective. Lack points out that instinctive answers to multiple-choice questions can be affected by the availability heuristic; it is a fitting issue to raise with students, since it can easily skew test results by biasing students toward a familiar-sounding answer. Don’t bubble in the first answer that contains words you know; think through each option to make sure your heuristics aren’t being manipulated.
WYSIATI—what you see is all there is—is the notion that humans treat the limited information they have as if it were all there is to know.7 Similar to the availability heuristic, people build stories out of the information available to them, and as long as the narrative makes sense, they believe it. This concept can also involve a number of other cognitive biases, such as the halo effect, in which learning one good thing about a person or organization triggers an assumption that everything about them is good; outcome bias, in which a person judges a decision based on its outcome rather than on whether it was rationally chosen; and hindsight bias, in which a person gains overconfidence in their powers of prediction by convincing themselves they “knew” a certain situation would occur after it already had. All of these biases entail a degree of ignorance about the information available to a person, and they all contribute to narrative fallacies and can lead to poor thinking. WYSIATI is important to understand because it can help people recognize weaknesses and inaccuracies in their thinking. Rather than assuming that the facts you have are all the facts there are, adopt the view that you could always find out more; on that point, at least, you’ll almost always be right.
“Egocentric immediacy” refers to the tendency of the mind to overgeneralize beyond one’s actual experience. As Jean Piaget noted, children especially tend to use their immediate feelings to form a warped view of their lives: if something good happens to them, the entire world looks good to them, and vice versa.8 This behavior tends to decrease somewhat in adolescence but often resurfaces in situations where one has limited information on which to form assessments. This reaction pattern can lead to an irrational “broad-based pessimism or…foolish optimism.”9 Understanding how this cognitive fallacy works builds the capacity to fight it: recognizing when one’s beliefs are based on very little experience or information can help us give them less weight and be more ready to reassess them when new information comes along.
Egocentric thinking also occurs when a person fails to see a situation from someone else’s point of view. When a conversation turns into a heated discussion, people generally become more defensive and more inclined to subjectivity. Deconstructing a past experience step by step can help render a clearer, more objective image of the situation. Being able to pinpoint the moment egocentrism kicks in, and a separation from critical thinking occurs, is essential to improving one’s future interactions. This doesn’t mean we can’t be emotionally invested in a discussion, or that we must always be willing to compromise on our beliefs; it means only that any debate over them must acknowledge the personhood and position of the “other.”
Group members: Gonzalo Aguirre, Kevin Chen, Flannery Murphy, Sander Parson
Notes
1 Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2015), 20.
2 Ibid., 21.
3 Caleb W. Lack and Jacques Rousseau, “Why Can’t We Trust Our Brains?” in Critical Thinking, Science, and Pseudoscience: Why We Can’t Trust Our Brains (New York: Springer Publishing Company, 2016), 73.
4 Ibid., 74.
5 Richard Paul and Linda Elder, “Strategic Thinking,” in Critical Thinking: Tools for Taking Charge of Your Professional and Personal Life, 2nd ed. (Foundation for Critical Thinking, 2014), 315–19.
6 Lack and Rousseau, “Why Can’t We Trust Our Brains?,” 77.
7 Kahneman, Thinking, Fast and Slow, 201.
8 Paul and Elder, “Strategic Thinking,” 331–35.
9 Paul and Elder, “Strategic Thinking,” 339.