Heuristics
Heuristics are “mental shortcuts or rules of thumb that significantly decrease the mental effort required to solve problems or make decisions” (Lack 72; paraphrasing Kahneman, Slovic & Tversky). While you might not have heard the term “heuristics” before, you have likely been exposed to many of them throughout your life. Two examples you may have encountered in previous classes are confirmation bias, where your brain favors new information that aligns with your existing beliefs, and hindsight bias, where your brain draws conclusions about past events using information only available in the present. Another example from day-to-day life is the availability heuristic, where the easier something is to remember or recall, the greater the importance your brain places on it. This can be seen when trying to assess whether car accidents or lung cancer cause more deaths per year: while car accidents are easier to recall due to their greater news coverage, lung cancer kills three times as many people per year (Lack 77).
So are heuristics bad? Heuristics are often helpful when attempting to draw conclusions from incomplete information and can help us arrive at decisions that are “good enough” given what we know (Lack 74). This is useful when we don’t have enough information to form an objective conclusion. However, heuristics can also “lead to an oversimplification of reality” and can “cause us to make systematic errors that can then become biases” (Lack 73). Every day our brains must interpret huge amounts of information, from readings for classes to news stories on current events. Our heuristics play a large role in deciding what information we hold on to and what we forget, often retaining information that strengthens our pre-existing beliefs and dismissing information that contradicts them. If we fail to recognize how our heuristics decide what we retain, we can become entrenched in our biases and make decisions using only part of the information instead of examining everything available.
Being aware of when we use heuristics in our thinking can help us to think critically. By recognizing when we are interpreting information based on our heuristics we can ensure we re-evaluate the information from a more objective view, assessing each piece of information regardless of how it lines up with our previous beliefs or experiences. This can help us recognize our own biases and reach more objective conclusions about information, thereby letting us make more informed decisions in our lives.
Cognitive Ease
A bat and a ball cost $1.10 together, and the bat costs $1 more than the ball. How much does the ball cost? Without having seen this problem before, our brains immediately jump to an answer: 10 cents! Upon review, this is clearly false, and it takes only elementary arithmetic to find the correct value (5 cents). Why, then, are we so inclined to answer incorrectly? Daniel Kahneman posits that our mind functions with two basic systems: one controls automatic and seamless functions (called System 1), while the other performs tasks that require attention and focus (called System 2). We have a preference toward System 1 functioning because of what Kahneman calls “cognitive ease”: a state in which the mind’s stimuli are neither challenging nor uncomfortable (Kahneman 59). When we experience cognitive ease, System 1 goes unchecked, and we often wind up fully believing we have answered a simple question correctly when in fact we are fundamentally wrong. The human mind is by default a lazy apparatus, and without constant checking it is prone to faulty conclusions, often with repercussions more severe than feeling slightly dull (Kahneman 46). By recognizing the dangers of trusting cognitive ease and remaining on System 1 autopilot, we gain an awareness that prompts us to become more active thinkers who reach more correct conclusions.
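The bat-and-ball arithmetic can be checked directly, which is exactly the kind of deliberate System 2 work the intuitive answer skips. A minimal sketch (working in cents to avoid rounding; the variable names are illustrative):

```python
# Bat-and-ball problem, in cents: bat + ball = 110 and bat - ball = 100.
# Substituting bat = ball + 100 gives 2 * ball + 100 = 110, so ball = 5.
ball = (110 - 100) // 2  # 5 cents, not the intuitive 10
bat = ball + 100         # 105 cents

assert bat + ball == 110  # together they cost $1.10
assert bat - ball == 100  # the bat costs $1.00 more
print(ball)  # 5
```

Note that the intuitive answer (10 cents) fails the second check: a 10-cent ball would make the bat $1.10, and the pair $1.20.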
Inert Information and Activated Knowledge
Inert information is “information that, though memorized, is not understood, and, hence, cannot be used”, while activated knowledge is “information that is not only true but, when insightfully understood, leads the thinker to…more knowledge, deeper understandings, and rational actions” (Paul 104, 107). Students of all levels are probably all too familiar with the first concept. “The mitochondria is the powerhouse of the cell”, “the government is for the people, of the people, and by the people”, “a^2+b^2=c^2”: all this information has become readily available from years of schooling, yet I bet most of us could not explain these concepts further if prompted. We assume that any information we consume is knowledge, but shallow facts and definitions do not suffice, since they lead to neither deeper understandings nor purposeful actions. When we take for granted the merit of information without pursuing activated knowledge, we risk having a cognitive bank of only shallow information to pick from.
The largest manufacturer of inert information is the education system. The cycle of memorizing, testing, then forgetting reflects the majority of our experiences in school. Most assignments test how well we can cram facts into our mind rather than how we can apply these concepts to gain a more complex understanding of the world. Assessments that allow students to develop ideas rather than regurgitate them (such as essays and projects) would help combat inert information. Through these types of assessments, we understand influential concepts deeply, rather than consume a multitude of basic facts. This learning experience then leads to activated knowledge, as long as the student is engaged. We can use activated knowledge to change our mindset or actions, ultimately affecting our worldview.
Activated Ignorance
Activated ignorance is “taking into the mind and actively using information that is false, though it is mistakenly taken to be true” (Paul 105). Activated ignorance often leads people to believe that they genuinely understand other people, events, etc. when in reality they do not (Paul 105). Sometimes activated ignorance operates on a large scale, such as when a dictator acts on internalized ignorance, leading to discrimination against an entire group of people. However, activated ignorance is more often a part of our daily lives.
One example of a college student’s activated ignorance demonstrates how this type of ignorance impairs judgment and decision-making, with potentially detrimental impacts further down the road. A college student bought a Juul (an electronic cigarette), and after being told he should be careful because it is addictive, he said, “I won’t get addicted because I don’t have an addictive personality.” In reality, addiction develops because the chemical substance has addictive qualities, not because of one’s personality traits. A few months later the student was clearly addicted to Juuling, and at that point hopefully realized that what he said about his personality preventing addiction was false.
By identifying and acknowledging our activated ignorance, we can change the way we think and act, ultimately paving the way to activated knowledge. This will not only allow us to re-evaluate what we already know and hopefully improve our knowledge base outside of the classroom, but will also enable us to make informed and rational decisions in our daily lives.
Group members: Erik Schofer, Max Thomas, Stephanie Leow, Jessie Powers
Bibliography
Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
Lack, Caleb W., and Jacques Rousseau. Critical Thinking, Science, and Pseudoscience: Why We Can’t Trust Our Brains. Springer Publishing Company, 2016.
Paul, Richard, and Linda Elder. Critical Thinking: Tools for Taking Charge of Your Professional and Personal Life. Pearson Education, 2014.