Cognitive Biases

We will perceive it when we believe it

Author: Stijn Dejongh
Published on: 2024-07-28

Definition

A cognitive bias is an “error in thinking” that affects how we perceive the world, make decisions, and interact with others. Biases are systematic patterns of deviation from norm or rationality in judgment, where individuals create their own subjective reality from their perception of the input. These biases often stem from the brain’s attempt to simplify information processing, leading to errors in decision-making and judgment.

Key Components

Evolutionary psychologists argue that our brains are not designed to be perfect, but to be good enough. Their reasoning is that we survived as a species not because we were the smartest or the strongest, but because we were the most adaptable. We were able to make quick decisions based on incomplete information and act on them in ways that benefited our survival. This is why we tend to avoid pain, seek pleasure, and make decisions based on emotion rather than reason. These shortcuts, or heuristics, were critically important in the past, when we lived in a world of scarcity, danger, and uncertainty. In our modern, information-rich environment, however, they often lead to significant errors in judgment. Here is a non-exhaustive list of the types of biases you are likely to encounter:

Social biases

These biases are based on how we perceive and interact with others. Examples include:

  • Halo Effect: Our overall impression of a person influences our judgment of their specific traits.
  • Bandwagon Effect: We adopt beliefs because others do.
  • Fundamental Attribution Error: We overemphasize personal characteristics and ignore situational factors in judging others’ behavior.

Memory biases

These biases are related to how we remember past events. Examples include:

  • Availability Heuristic: We judge the likelihood of events based on how easily examples come to mind.
  • Recency Bias: We give undue weight to recent events.
  • Hindsight Bias: We see events as having been predictable after they have occurred.

Emotional biases

These biases are based on how we feel about certain situations. Examples include:

  • Sunk Cost Fallacy: We continue an endeavour because of previously invested resources.
  • Loss Aversion: We fear losses more than we value gains.
  • Status Quo Bias: We prefer things to stay the same.

Attention and Decision-making biases

These biases are related to how we pay attention to information and make decisions. Examples include:

  • Anchoring Effect: We rely too heavily on the first piece of information we receive.
  • Confirmation Bias: We seek out information that confirms our beliefs.
  • Survivorship Bias: We focus on successful examples and overlook failures.

Background

Origin

The concept of cognitive biases has roots in psychology and behavioural economics. It gained prominence in the 1970s through the work of psychologists Amos Tversky and Daniel Kahneman, who identified and described many of these biases in their research on judgment and decision-making.

Application

In practice, recognizing cognitive biases is crucial for improving decision-making in personal and professional contexts. By being aware of these biases, individuals and organizations can implement strategies to minimize their impact, such as seeking diverse perspectives, relying on data-driven decision-making, and practicing critical thinking.

Comparisons

Bias or Fallacy?

Logical fallacies are errors in reasoning and argumentation. While cognitive biases are systematic deviations from rationality in judgment, logical fallacies are flaws in the structure of an argument itself. Both can lead to flawed conclusions, but they arise in different ways: a bias distorts the thinking itself, while a fallacy breaks the logic of the argument.

Psychological Priming

Psychological priming occurs when exposure to one stimulus influences the response to a subsequent stimulus, often without conscious guidance or intention. Priming can shape perceptions, behaviours, and attitudes by activating certain associations in memory. For instance, people primed with words related to old age may walk more slowly afterward, demonstrating how subtle cues can affect behaviour. The distinction with cognitive biases is one of mechanism: biases are unconscious errors in thinking that distort our decision-making processes, whereas priming unconsciously alters our reactions and behaviours based on prior stimuli. Both illustrate the powerful, often hidden forces that shape human thought and behaviour.

Examples

WWII Fighter Planes

During World War II, the US Navy was analysing the damage to returning fighter planes to determine where to add armour. Initially, they considered reinforcing the areas most frequently hit by enemy fire, based on the damage observed on the planes that made it back.

However, Abraham Wald, a statistician, pointed out that this approach was flawed due to survivorship bias. The planes that returned safely did so despite the damage. Thus, the damage on these planes indicated areas where planes could sustain hits and still survive. Wald argued that the Navy should instead reinforce the areas where the returning planes showed no damage, as planes hit in these areas likely did not return.

By acknowledging and addressing survivorship bias, the military was able to make more informed decisions. This strategic change is believed to have saved many lives and aircraft, showcasing the importance of considering unseen failures and not just visible successes.
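
Wald's reasoning lends itself to a quick simulation. The sketch below is a toy Monte Carlo model in Python, using invented numbers rather than historical data: hits land uniformly across three sections of a plane, but engine hits are far more likely to bring the plane down. Counting damage only on the planes that return makes the engine look like the safest place to be hit, when it is in fact the most dangerous.

```python
import random
from collections import Counter

random.seed(42)

SECTIONS = ["fuselage", "wings", "engine"]
# Probability that a hit to this section downs the plane (invented values).
LOSS_PROBABILITY = {"fuselage": 0.1, "wings": 0.1, "engine": 0.8}

all_hits = Counter()        # every hit, including planes that never return
surviving_hits = Counter()  # the only damage the analysts get to inspect

for _ in range(100_000):
    section = random.choice(SECTIONS)  # hits land uniformly across sections
    all_hits[section] += 1
    if random.random() > LOSS_PROBABILITY[section]:
        surviving_hits[section] += 1

total_all = sum(all_hits.values())
total_survivors = sum(surviving_hits.values())
for section in SECTIONS:
    print(f"{section:>8}: {all_hits[section] / total_all:.0%} of actual hits, "
          f"{surviving_hits[section] / total_survivors:.0%} of damage on survivors")
```

Roughly a third of all hits strike the engine, yet engine damage makes up only about a tenth of what the returning planes show, precisely because those are the hits that keep planes from coming back.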

Workplace Pigeonholing

Imagine you believe that people from a particular country are generally late to meetings. One day, a colleague from that country arrives late to a meeting. Your confirmation bias kicks in, reinforcing your stereotype. You recall other instances when people from that country were late, while conveniently forgetting times they were punctual. This selective memory and interpretation strengthen your existing belief, making you more confident that your stereotype is correct. As a result, you become more prone to noticing and remembering instances that confirm your bias, while ignoring evidence that contradicts it.

This confirmation bias can lead to significant misunderstandings and hinder collaboration. For example, if you consistently expect certain colleagues to be late, you might start scheduling meetings without their input or fail to communicate important details in advance. This creates a negative work environment where individuals are judged based on stereotypes rather than their actual performance. Over time, this bias can damage professional relationships and reduce the overall effectiveness of the team.

Who caused the plague?

During the time of the plague, different cultures interpreted the cause of the devastating disease in ways that reflected their own beliefs and values, often committing the fundamental attribution error. In Europe, the plague was seen as divine punishment for the sins of the people. This attribution to moral failing overlooked the actual cause: fleas carried by rats that spread the disease along trade routes. Similarly, in the Middle East the plague was viewed as a test of faith, in parts of Africa it was seen as a curse, and when epidemics later reached the Americas, indigenous peoples interpreted them as a sign of the end of the world.

These interpretations exemplify the fundamental attribution error, where people attributed the cause of the plague to internal, moral, or spiritual failings rather than external, situational factors. This bias led to a widespread misunderstanding of the disease and hindered effective responses to the crisis. By recognizing this error, we can learn to consider broader situational factors in our judgments, leading to more accurate and compassionate understanding of events and behaviours.

A Very Expensive Crater

In the late nineties, a simple misunderstanding caused a NASA orbiter to crash into Mars. Engineering teams at NASA and its contractor were collaborating to guide the spacecraft into orbit around the red planet. But because one team's software worked in imperial units while the other's expected metric, they ended up creating a very expensive new crater on Mars. Luckily, most misunderstandings do not come with a price tag running into the hundreds of millions. Still, it is best to avoid them if we can.
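
The fix for this class of error is mechanical rather than psychological: make the units part of the data instead of an unspoken assumption. Below is a minimal sketch in Python (the names and values are hypothetical, not NASA's actual interfaces) showing how wrapping a measurement in a unit-aware type forces every conversion to be explicit.

```python
from dataclasses import dataclass

# One pound-force second expressed in newton-seconds.
LBF_S_TO_N_S = 4.448222

@dataclass(frozen=True)
class Impulse:
    """Thruster impulse, stored internally in newton-seconds."""
    newton_seconds: float

    @classmethod
    def from_pound_force_seconds(cls, value: float) -> "Impulse":
        return cls(value * LBF_S_TO_N_S)

def plan_correction_burn(impulse: Impulse) -> float:
    # Navigation code only ever deals in newton-seconds.
    return impulse.newton_seconds

# The team working in imperial units must convert explicitly:
reading = Impulse.from_pound_force_seconds(2.5)
print(plan_correction_burn(reading))  # ~11.12, unambiguously newton-seconds

# Passing a bare float, as in plan_correction_burn(2.5), raises an
# AttributeError at runtime and is flagged by static type checkers,
# instead of silently steering the spacecraft off course.
```

A bare float carries no unit information, which is exactly how the original mismatch went unnoticed; a dedicated type turns the hidden assumption into a visible, checkable contract.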