Notes
1. Role of Heuristics:
Heuristics are mental shortcuts that simplify decision-making but can lead to errors:
○ Representativeness Heuristic: Judging the likelihood of an event based on
how similar it is to a prototype, often leading to ignoring base rates or
statistical reality.
○ Availability Heuristic: Estimating frequency or likelihood based on how
easily examples come to mind, which can be biased by recent or familiar
events.
○ Anchoring and Adjustment Heuristic: Making decisions by starting with an
initial anchor and insufficiently adjusting away from it.
2. Framing Effect:
Decisions are influenced by how options are presented:
○ People tend to avoid risks when potential gains are highlighted but seek risks
when losses are emphasized.
3. Overconfidence:
Individuals often overestimate their knowledge, predictions, or abilities, leading to
errors in judgment and planning (e.g., the planning fallacy).
4. Hindsight Bias:
After an event occurs, people overestimate their ability to have predicted the
outcome ("I knew it all along").
Heuristics in Decision-Making
Definition:
Heuristics are mental shortcuts or rules of thumb that simplify decision-making by reducing
cognitive effort. While generally effective, heuristics can sometimes lead to systematic errors
or biases.
1. Representativeness Heuristic
● Definition:
Decisions are based on how similar an event or object is to a prototype or stereotype
rather than considering statistical probabilities.
● Example:
A coin toss sequence of HHTHTT is judged as more random than HHHHHH, even
though both have the same probability.
● Research Insight:
○ Kahneman and Tversky (1972) demonstrated how people judge probabilities
based on representativeness rather than base rates.
○ Tversky and Kahneman (1983) studied the conjunction fallacy, where
people incorrectly believe that the probability of two events (e.g., "bank teller"
and "feminist") is higher than the probability of one event alone.
● Biases Associated:
○ Base Rate Neglect: Ignoring statistical probabilities in favor of stereotypes.
○ Small Sample Fallacy: Assuming that small samples represent the
population.
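Both points above are easy to check numerically. Below is a minimal Python sketch (illustrative, not taken from the original studies): it estimates by simulation that the two specific coin sequences are equally likely, each with probability (1/2)^6 = 1/64 ≈ 0.0156. The same arithmetic underlies the conjunction fallacy: the probability of "A and B" can never exceed the probability of A alone.

```python
import random

def sequence_probability(target, trials=200_000):
    """Estimate the chance of observing one exact 6-flip sequence."""
    hits = 0
    for _ in range(trials):
        seq = "".join(random.choice("HT") for _ in range(6))
        if seq == target:
            hits += 1
    return hits / trials

# Both specific sequences are equally likely: (1/2)**6 = 1/64 ≈ 0.0156.
print(sequence_probability("HHTHTT"))  # ≈ 0.0156
print(sequence_probability("HHHHHH"))  # ≈ 0.0156
```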
2. Availability Heuristic
● Definition:
Decisions are influenced by how easily examples or instances come to mind. Events
that are recent, vivid, or emotionally charged tend to be judged as more frequent or
likely.
● Example:
After hearing about a plane crash in the news, individuals might overestimate the risk
of air travel.
● Research Insight:
○ Tversky and Kahneman (1973) highlighted that availability impacts judgments
of frequency and probability.
○ MacLeod and Campbell (1992) showed that recalling positive or negative past
events can shape expectations for future outcomes.
● Biases Associated:
○ Recency Effect: Recent events are perceived as more likely.
○ Familiarity Effect: Frequently mentioned items are perceived as more
common (e.g., media coverage distorting population estimates).
3. Anchoring and Adjustment Heuristic
● Definition:
Decisions are influenced by an initial reference point (anchor), and subsequent
adjustments are often insufficient to reach a correct answer.
● Example:
When asked if the Mississippi River is longer or shorter than 500 miles, estimates are
anchored near 500 miles, even though the actual length is over 2,300 miles.
● Research Insight:
○ Tversky and Kahneman (1974) showed that arbitrary anchors influence
numerical judgments.
○ Englich and Mussweiler (2001) demonstrated anchoring in courtroom
sentencing; judges' decisions were swayed by anchors provided by
inexperienced prosecutors.
● Biases Associated:
○ Over-reliance on the anchor: Even irrelevant or arbitrary anchors affect
decisions.
○ Confidence Intervals: People fail to adjust sufficiently, leading to overly
narrow estimates.
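"Insufficient adjustment" can be made concrete with a toy model (an illustrative assumption, not a model from the research cited): the estimate moves only part of the way from the anchor toward the true value, and the adjustment fraction of 0.4 is arbitrary.

```python
def anchored_estimate(anchor, true_value, adjustment=0.4):
    """Toy anchoring model: adjust only partway (adjustment < 1)
    from the anchor toward the true value."""
    return anchor + adjustment * (true_value - anchor)

# Mississippi River example: anchored at 500 miles; actual length is about 2,340.
print(anchored_estimate(500, 2340))   # 1236.0 (a large underestimate)
print(anchored_estimate(5000, 2340))  # 3936.0 (a high anchor overshoots)
```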
4. Affect Heuristic
● Definition:
Decisions are influenced by immediate emotional responses rather than logical
analysis. Positive or negative feelings shape judgments of risks and benefits.
● Example:
People may avoid nuclear energy due to fears of radiation, even if statistics show it is
safer than other energy sources.
● Research Insight:
○ Slovic et al. (2002) demonstrated how emotional reactions guide perceptions
of risk and reward, often overriding factual information.
● Biases Associated:
○ Negativity Bias: Negative emotions (fear, anxiety) have a stronger influence
than positive ones.
○ Overgeneralization: Quick emotional judgments overshadow detailed
analysis.
5. Recognition Heuristic
● Definition:
When comparing two items, the one that is recognized is often judged as having
higher value or likelihood.
● Example:
If asked which city has a larger population, Milan or Modena, most people would
choose Milan because it is more recognizable.
● Research Insight:
○ Goldstein and Gigerenzer (2002) found that the recognition heuristic often
leads to accurate judgments in real-world scenarios.
● Biases Associated:
○ Familiarity Bias: Recognized options are favored over unrecognized ones,
even if irrelevant.
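Because the recognition heuristic is an explicit decision rule, it can be written directly as code. A minimal sketch of the rule described by Goldstein and Gigerenzer (the city names and the single-city "knowledge state" are illustrative):

```python
def recognition_choice(option_a, option_b, recognized):
    """Recognition heuristic: if exactly one option is recognized,
    infer it has the higher criterion value (e.g., population)."""
    a_known, b_known = option_a in recognized, option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return None  # heuristic does not apply; fall back on other cues

recognized_cities = {"Milan"}  # hypothetical knowledge state
print(recognition_choice("Milan", "Modena", recognized_cities))  # Milan
```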
Applications of Heuristics
1. Healthcare:
○ Doctors using the availability heuristic may overdiagnose conditions recently
seen in other patients.
2. Finance:
○ Investors may anchor their decisions on previous stock prices, ignoring new
market information.
3. Policy and Marketing:
○ Framing effects (related to heuristics) influence consumer and voter behavior
by presenting choices as gains or losses.
Mitigating the Effects of Heuristics
1. Awareness Training:
Educating individuals about common heuristics and their potential biases.
2. Use of Statistical Analysis:
Encouraging reliance on data over intuition.
3. De-Biasing Techniques:
Methods like the "crystal-ball technique" (Cannon-Bowers & Salas, 1998) or
considering alternative scenarios to reduce overconfidence.
Understanding heuristics helps in recognizing their utility in simplifying decisions and the
necessity of caution to avoid biases in critical contexts.
Biases in Decision-Making
Definition:
Biases are systematic deviations from rational judgment or logical decision-making. They
occur due to the influence of cognitive shortcuts, emotions, social pressures, and
environmental factors, often leading individuals to make flawed or suboptimal decisions.
1. Confirmation Bias
● Definition:
The tendency to seek, interpret, and remember information that confirms pre-existing
beliefs while ignoring evidence that contradicts them.
● Example:
A person who believes in astrology might remember instances where their horoscope
predictions were accurate but overlook times when they were not.
● Research Insight:
○ Wason (1960) demonstrated this bias with his "2-4-6" rule-discovery task, in
which participants tended to confirm their hypotheses rather than test for
disconfirmation; his later selection task showed the same tendency.
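The logic of the selection task can be spelled out in code. This sketch uses the classic card set (E, K, 4, 7) and the rule "if a card has a vowel on one side, it has an even number on the other"; testing for disconfirmation means turning only the cards that could falsify the rule.

```python
def can_falsify(visible):
    """A card can falsify the rule only if it might hide a vowel/odd pair."""
    if visible.isalpha():
        return visible in "AEIOU"  # a vowel: the hidden number might be odd
    return int(visible) % 2 == 1   # an odd number: the hidden letter might be a vowel

cards = ["E", "K", "4", "7"]
print([c for c in cards if can_falsify(c)])  # ['E', '7'], the cards to turn
```

Most participants instead turn E and 4, seeking confirmation rather than disconfirmation.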
2. Anchoring Bias
● Definition:
The tendency to rely heavily on the first piece of information encountered (the
"anchor") when making decisions, even if it is irrelevant.
● Example:
In a negotiation, the first offer serves as an anchor, influencing subsequent
discussions and the final agreement.
● Research Insight:
○ Tversky and Kahneman (1974) showed that arbitrary numbers, such as a
random wheel spin, influenced participants’ numerical estimates in unrelated
tasks.
3. Availability Bias
● Definition:
The tendency to judge the likelihood of an event based on how easily examples
come to mind, often influenced by recent, vivid, or emotional experiences.
● Example:
After hearing about a plane crash in the news, people might overestimate the risk of
flying, despite statistics showing its relative safety.
● Research Insight:
○ Tversky and Kahneman (1973) highlighted how this bias skews perceptions
of risk and frequency judgments.
4. Representativeness Bias
● Definition:
The tendency to judge probabilities or categories based on how closely an instance
resembles a stereotype or prototype, while neglecting base rates or statistical
probabilities.
● Example:
Believing someone with a quiet demeanor is more likely to be a librarian than a
salesperson, despite the larger number of salespeople overall.
● Research Insight:
○ Kahneman and Tversky (1972) showed how stereotypes overshadow
base-rate information in probability judgments.
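A short Bayesian calculation shows why neglecting base rates misleads. The numbers below are illustrative assumptions, not figures from Kahneman and Tversky: librarians are assumed to be rare but usually quiet, salespeople common but rarely quiet.

```python
# Hypothetical base rates and likelihoods (illustrative only).
p_librarian = 0.01                  # 1 in 100 workers is a librarian
p_salesperson = 0.99
p_quiet_given_librarian = 0.90
p_quiet_given_salesperson = 0.10

# Bayes' rule: P(librarian | quiet).
p_quiet = (p_quiet_given_librarian * p_librarian
           + p_quiet_given_salesperson * p_salesperson)
p_librarian_given_quiet = p_quiet_given_librarian * p_librarian / p_quiet
print(round(p_librarian_given_quiet, 3))  # 0.083: still probably a salesperson
```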
5. Overconfidence Bias
● Definition:
The tendency to overestimate one's knowledge, abilities, or predictions about future
events.
● Example:
Students frequently underestimate the time needed to complete assignments, a
phenomenon known as the planning fallacy.
● Research Insight:
○ Kahneman and Tversky (1995) demonstrated overconfidence in estimating
probabilities of events, even in situations with known uncertainties.
6. Hindsight Bias
● Definition:
The inclination to perceive events as having been predictable after they have
occurred, often referred to as the "knew-it-all-along" effect.
● Example:
After a stock market crash, investors claim they always knew it was going to happen.
● Research Insight:
○ Fischhoff (1975) identified this bias and showed how it distorts individuals'
understanding of past events.
7. Framing Effect
● Definition:
Decisions are influenced by how information is presented, particularly as gains or
losses, even if the underlying facts remain the same.
● Example:
People are more likely to opt for a treatment with a "90% survival rate" than one with
a "10% mortality rate."
● Research Insight:
○ Tversky and Kahneman (1981) showed that the framing of outcomes affects
whether people take risks (gain vs. loss scenarios).
8. Loss Aversion
● Definition:
The tendency to prefer avoiding losses over acquiring equivalent gains, often leading
to risk-averse behavior.
● Example:
Losing $100 feels more painful than the pleasure of gaining $100.
● Research Insight:
○ Kahneman and Tversky (1979) proposed this as a cornerstone of their
Prospect Theory, explaining how losses weigh more heavily than gains in
decision-making.
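The asymmetry can be written as the prospect-theory value function. A minimal sketch using the parameter estimates Tversky and Kahneman reported in later work (exponent ≈ 0.88, loss-aversion coefficient ≈ 2.25); treat the exact numbers as illustrative.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: diminishing sensitivity in both
    directions, with losses scaled up by the loss-aversion coefficient."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Losing $100 looms larger than gaining $100.
print(round(prospect_value(100), 1))   # 57.5
print(round(prospect_value(-100), 1))  # -129.5
```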
9. Cognitive Dissonance
● Definition:
The discomfort experienced when holding contradictory beliefs or behaviors, often
leading individuals to justify or rationalize their choices.
● Example:
A smoker who knows smoking causes cancer might downplay the risks to reduce
discomfort.
● Research Insight:
○ Festinger (1957) introduced the theory of cognitive dissonance, showing how
individuals seek consistency in their beliefs and behaviors.
10. Status Quo Bias
● Definition:
The preference for maintaining the current state of affairs, avoiding change even
when it could lead to better outcomes.
● Example:
Choosing the default option in retirement plans instead of exploring alternatives that
may offer higher returns.
● Research Insight:
○ Samuelson and Zeckhauser (1988) found this bias prevalent in scenarios
involving complex decisions.
11. Self-Serving Bias
● Definition:
The tendency to attribute positive outcomes to personal efforts and negative
outcomes to external factors.
● Example:
A student attributing good grades to hard work but blaming poor grades on unfair
exams.
● Research Insight:
○ Miller and Ross (1975) studied this bias and its role in maintaining
self-esteem.
12. Halo Effect
● Definition:
The tendency for an overall impression of a person or entity to influence judgments
about their specific traits or abilities.
● Example:
Assuming a physically attractive person is also more competent or intelligent.
● Research Insight:
○ Thorndike (1920) documented this effect in his studies on human judgment.
Applications and Implications of Biases
1. Everyday Life:
○ People rely on biases to navigate complex environments quickly but often at
the expense of accuracy.
○ Example: Overconfidence bias can lead to underpreparedness in personal or
professional tasks.
2. Policy and Leadership:
○ Loss aversion and status quo bias influence policymaking, as leaders avoid
risky changes even when evidence supports them.
○ Example: Framing effects in public health campaigns influence vaccine
uptake.
3. Business and Marketing:
○ Anchoring and framing are widely exploited in advertising and pricing
strategies to influence consumer behavior.
○ Example: Presenting a product as "50% off" rather than "half-price" increases
perceived value.
The Framing Effect
Definition
The framing effect refers to the phenomenon where people's decisions are influenced by
how information is presented, rather than the content itself. Depending on whether an option
is framed as a gain or a loss, individuals may exhibit risk-averse or risk-seeking behavior.
Key Study: Tversky and Kahneman (1981)
● Study: Participants were asked to choose between two public health programs
designed to combat an epidemic.
● Scenarios:
○ In the gain frame, participants chose between:
1. Saving 200 people with certainty.
2. A one-third chance of saving 600 people and a two-thirds chance of
saving no one.
Result: 72% preferred the certainty of saving 200 lives (risk-averse).
○ In the loss frame, participants chose between:
1. 400 people dying with certainty.
2. A one-third chance that no one would die and a two-thirds chance that
600 would die.
Result: 78% preferred the risky option (risk-seeking).
● Conclusion:
The framing of the problem (saving lives vs. avoiding deaths) strongly influenced
decision-making, even though the outcomes were statistically identical.
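The equivalence is a short expected-value calculation; the sketch below simply makes the arithmetic explicit.

```python
# Gain frame: Program A saves 200 for sure; Program B saves 600 with p = 1/3.
expected_saved_a = 200
expected_saved_b = (1 / 3) * 600   # = 200.0

# Loss frame: Program C lets 400 die for sure; Program D: 600 die with p = 2/3.
expected_deaths_c = 400
expected_deaths_d = (2 / 3) * 600  # = 400.0

# Saving 200 of 600 lives is the same outcome as 400 of 600 deaths.
print(expected_saved_a, expected_saved_b, expected_deaths_c, expected_deaths_d)
```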
● Subsequent research showed how the framing effect shapes public opinion in political contexts.
● Example: The same policy can receive differing support based on whether it is
framed as "tax relief" (positive connotation) or "tax cuts" (potentially negative).
Mechanisms Behind the Framing Effect
1. Emotional Reactions:
○ Gains are associated with positive emotions (e.g., relief), while losses trigger
negative emotions (e.g., fear or anger). These emotional reactions drive risk
preferences.
2. Loss Aversion (Kahneman & Tversky, 1979):
○ People experience losses more intensely than equivalent gains, making loss
frames more compelling.
3. Cognitive Processing:
○ Gain frames align with risk-aversion tendencies, encouraging cautious
behavior.
○ Loss frames activate risk-seeking tendencies, as people are more willing to
gamble to avoid a loss.
Applications of the Framing Effect
1. Healthcare Decisions:
○ Treatments framed in terms of survival rates are chosen more often than
identical treatments framed in terms of mortality rates (e.g., "90% survival"
vs. "10% mortality").
2. Politics and Policy:
○ Policies are framed to emphasize either gains (e.g., "job creation") or losses
(e.g., "job cuts") to sway public opinion.
○ Example: Framing environmental policies as "protecting future generations"
(gain frame) versus "preventing climate disasters" (loss frame).
3. Legal Contexts:
○ Lawyers and negotiators use framing to influence judgments about settlements
and penalties. For example, a defense attorney may frame a plea deal as
avoiding a "20-year sentence" rather than focusing on the reduced "5-year
punishment."
Limitations and Criticism of the Framing Effect
1. Individual Differences:
○ Not all individuals respond equally to framing; factors like personality traits,
cultural norms, and decision-making experience can moderate the effect.
2. Cognitive Load:
○ Under high cognitive load or stress, people may rely more heavily on frames,
exacerbating their susceptibility to the framing effect.
3. Consistency of Influence:
○ While the framing effect is robust, its impact can diminish when individuals are
aware of the manipulation or when incentives for accuracy are high.
Mitigating the Framing Effect
1. Critical Thinking:
○ Encourage decision-makers to focus on the underlying facts, not the framing.
○ Example: Reframing loss scenarios into equivalent gain terms to assess if
preferences change.
2. Awareness and Education:
○ Educating individuals about cognitive biases can reduce susceptibility to
framing.
3. Structured Decision-Making:
○ Using decision trees or algorithms to compare options without the influence of
emotional framing.
4. Neutral Framing:
○ Presenting information in a neutral and balanced manner reduces the
likelihood of biased decisions.
Process of Decision-Making
1. Identifying the Problem or Goal
● Recognizing that a decision is required and defining the desired outcome.
2. Gathering Information
● Collecting relevant data or insights about possible alternatives, risks, and benefits.
● Information may come from memory, external sources, or sensory cues.
Research Insight:
Simon (1957) proposed the concept of bounded rationality, suggesting that humans cannot
process all available information due to cognitive limitations. Instead, they settle for a
"satisficing" solution—good enough but not optimal.
3. Evaluating Alternatives
● Weighing the pros and cons of different choices based on factors such as feasibility,
risks, and potential benefits.
● This stage involves both rational (Type 2) and intuitive (Type 1) thinking.
Research Insight:
Tversky and Kahneman (1974) studied the anchoring effect, showing that initial impressions
or reference points heavily influence evaluations, even when they are arbitrary.
4. Making the Decision
● Selecting an option that aligns with priorities, values, and the evaluation of
alternatives.
● Emotional and cognitive biases may influence this step.
Research Insight:
Kahneman (2011) distinguished between fast, intuitive decisions (System 1) and deliberate,
analytical decisions (System 2), emphasizing their interplay in human choice.
5. Taking Action
● Implementing the chosen option.
6. Evaluating the Outcome
● Assessing the effectiveness of the decision. Was the goal achieved? Were there
unexpected consequences?
● Feedback from this step helps refine future decision-making processes.
Research Insight:
Janis and Mann (1977) explored decision evaluation frameworks, emphasizing the
importance of reviewing outcomes to avoid post-decisional regret or "buyer's remorse."
Applications of Decision-Making Research
1. Healthcare:
○ Improving patient choices through shared decision-making models.
○ Example: Reducing bias in treatment decisions using decision aids (Elwyn et
al., 2012).
2. Policy-Making:
○ Understanding framing effects to design better public health campaigns.
○ Example: Nudges for vaccination uptake (Thaler and Sunstein, 2008).
3. Education:
○ Teaching critical thinking and awareness of biases in decision-making (Levy,
2010).
4. Business and Management:
○ Using anchoring effects in pricing strategies and negotiations.
○ Example: Englich and Mussweiler (2001) showed that arbitrary anchors
influenced courtroom sentencing.