Notes

Decision-making is a cognitive process involving the evaluation of alternatives to achieve goals, often influenced by heuristics and biases. Key heuristics include representativeness, availability, and anchoring, which can simplify decisions but also lead to systematic errors. Biases such as confirmation bias, overconfidence, and the framing effect further impact decision-making, highlighting the need for awareness and strategies to mitigate their influence.

Definition

Decision-making is a cognitive process where individuals evaluate and choose among alternatives to achieve a goal or resolve uncertainty. Unlike reasoning, decision-making lacks a definitive set of rules and often involves incomplete information or emotional influences.

Key Elements in Decision-Making

1. Role of Heuristics:
Heuristics are mental shortcuts that simplify decision-making but can lead to errors:
○ Representativeness Heuristic: Judging the likelihood of an event based on
how similar it is to a prototype, often leading to ignoring base rates or
statistical reality.
○ Availability Heuristic: Estimating frequency or likelihood based on how
easily examples come to mind, which can be biased by recent or familiar
events.
○ Anchoring and Adjustment Heuristic: Making decisions by starting with an
initial anchor and insufficiently adjusting away from it.
2. Framing Effect:
Decisions are influenced by how options are presented:
○ People tend to avoid risks when potential gains are highlighted but seek risks
when losses are emphasized.
3. Overconfidence:
Individuals often overestimate their knowledge, predictions, or abilities, leading to
errors in judgment and planning (e.g., the planning fallacy).
4. Hindsight Bias:
After an event occurs, people overestimate their ability to have predicted the
outcome ("I knew it all along").

Heuristics in Decision-Making

Definition:
Heuristics are mental shortcuts or rules of thumb that simplify decision-making by reducing
cognitive effort. While generally effective, heuristics can sometimes lead to systematic errors
or biases.

Key Types of Heuristics in Decision-Making

1. Representativeness Heuristic

● Definition:
Decisions are based on how similar an event or object is to a prototype or stereotype
rather than considering statistical probabilities.
● Example:
A coin toss sequence of HHTHTT is judged as more random than HHHHHH, even
though both specific sequences have the same probability (see the worked calculation after this list).
● Research Insight:
○ Kahneman and Tversky (1972) demonstrated how people judge probabilities
based on representativeness rather than base rates.
○ Tversky and Kahneman (1983) studied the conjunction fallacy, where
people judge the conjunction of two events (e.g., "bank teller and
feminist") as more probable than one of those events alone.
● Biases Associated:
○ Base Rate Neglect: Ignoring statistical probabilities in favor of stereotypes.
○ Small Sample Fallacy: Assuming that small samples represent the
population.
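The coin-toss and conjunction examples rest on two standard probability facts, worked out below for illustration (these identities are general, not specific to these notes):

```latex
% Any specific sequence of six fair coin tosses is equally likely:
P(\text{HHTHTT}) = P(\text{HHHHHH}) = \left(\tfrac{1}{2}\right)^{6} = \tfrac{1}{64}

% Conjunction rule: a joint event can never be more probable than
% either component, which is why the conjunction fallacy is an error:
P(A \cap B) \le \min\{P(A),\, P(B)\}
```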

2. Availability Heuristic

● Definition:
Decisions are influenced by how easily examples or instances come to mind. Events
that are recent, vivid, or emotionally charged tend to be judged as more frequent or
likely.
● Example:
After hearing about a plane crash in the news, individuals might overestimate the risk
of air travel.
● Research Insight:
○ Tversky and Kahneman (1973) highlighted that availability impacts judgments
of frequency and probability.
○ MacLeod and Campbell (1992) showed that recalling positive or negative past
events can shape expectations for future outcomes.
● Biases Associated:
○ Recency Effect: Recent events are perceived as more likely.
○ Familiarity Effect: Frequently mentioned items are perceived as more
common (e.g., media coverage distorting estimates of event frequency).

3. Anchoring and Adjustment Heuristic

● Definition:
Decisions are influenced by an initial reference point (anchor), and subsequent
adjustments are often insufficient to reach a correct answer.
● Example:
When asked if the Mississippi River is longer or shorter than 500 miles, estimates are
anchored near 500 miles, even though the actual length is over 2,300 miles.
● Research Insight:
○ Tversky and Kahneman (1974) showed that arbitrary anchors influence
numerical judgments.
○ Englich and Mussweiler (2001) demonstrated anchoring in courtroom
sentencing; judges' decisions were swayed by anchors provided by
inexperienced prosecutors.
● Biases Associated:
○ Over-reliance on the anchor: Even irrelevant or arbitrary anchors affect
decisions.
○ Confidence Intervals: People fail to adjust sufficiently, leading to overly
narrow estimates.
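Insufficient adjustment can be made concrete with a simple illustrative model (a sketch, not a model from these notes; the adjustment factor \theta is hypothetical):

```latex
\hat{x} = a + \theta\,(x - a), \qquad 0 \le \theta < 1
% a = anchor, x = true value, \hat{x} = final estimate.
% Mississippi example: with a = 500, x \approx 2{,}340 and \theta = 0.5,
% \hat{x} = 500 + 0.5\,(2{,}340 - 500) = 1{,}420 miles, still far short.
```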

4. Affect Heuristic

● Definition:
Decisions are influenced by immediate emotional responses rather than logical
analysis. Positive or negative feelings shape judgments of risks and benefits.
● Example:
People may avoid nuclear energy due to fears of radiation, even if statistics show it is
safer than other energy sources.
● Research Insight:
○ Slovic et al. (2002) demonstrated how emotional reactions guide perceptions
of risk and reward, often overriding factual information.
● Biases Associated:
○ Negativity Bias: Negative emotions (fear, anxiety) have a stronger influence
than positive ones.
○ Overgeneralization: Quick emotional judgments overshadow detailed
analysis.

5. Recognition Heuristic

● Definition:
When comparing two items, the one that is recognized is often judged as having
higher value or likelihood.
● Example:
If asked which city has a larger population, Milan or Modena, most people would
choose Milan because it is more recognizable.
● Research Insight:
○ Goldstein and Gigerenzer (2002) found that the recognition heuristic often
leads to accurate judgments in real-world scenarios.
● Biases Associated:
○ Familiarity Bias: Recognized options are favored over unrecognized ones,
even if irrelevant.
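The recognition heuristic is simple enough to state as a decision rule. A minimal sketch, assuming a set of recognized city names (the data and function name here are illustrative, not from Goldstein and Gigerenzer's study):

```python
import random

# Cities this decision-maker happens to recognize (illustrative data).
RECOGNIZED = {"Milan", "Rome", "Naples"}

def recognition_heuristic(option_a: str, option_b: str) -> str:
    """Judge which of two cities is larger using recognition alone.

    If exactly one option is recognized, choose it; if both or neither
    are recognized, the heuristic is silent, so guess at random (in
    practice, other knowledge would take over).
    """
    a_known = option_a in RECOGNIZED
    b_known = option_b in RECOGNIZED
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return random.choice([option_a, option_b])

print(recognition_heuristic("Milan", "Modena"))  # -> "Milan"
```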

General Observations About Heuristics

1. Advantages:
○ Heuristics are efficient, often yielding quick and reasonably accurate
decisions.
○ They work well in familiar or low-stakes situations.
2. Disadvantages:
○ Heuristics can lead to systematic errors or biases, particularly in complex or
unfamiliar situations.
○ Over-reliance on heuristics can result in suboptimal or irrational decisions.
3. Dual-Process Theory:
○ Heuristics (System 1: fast and intuitive thinking) often contrast with deliberate,
analytical reasoning (System 2: slow and logical thinking), as described by
Kahneman (2011).

Applications of Heuristics

1. Healthcare:
○ Doctors using the availability heuristic may overdiagnose conditions recently
seen in other patients.
2. Finance:
○ Investors may anchor their decisions on previous stock prices, ignoring new
market information.
3. Policy and Marketing:
○ Framing effects (related to heuristics) influence consumer and voter behavior
by presenting choices as gains or losses.

Strategies to Mitigate Heuristic Biases

1. Awareness Training:
Educating individuals about common heuristics and their potential biases.
2. Use of Statistical Analysis:
Encouraging reliance on data over intuition.
3. De-Biasing Techniques:
Methods like the "crystal-ball technique" (Cannon-Bowers & Salas, 1998) or
considering alternative scenarios to reduce overconfidence.

Understanding heuristics helps in recognizing their utility in simplifying decisions and the
necessity of caution to avoid biases in critical contexts.

Biases in Decision-Making

Definition:
Biases are systematic deviations from rational judgment or logical decision-making. They
occur due to the influence of cognitive shortcuts, emotions, social pressures, and
environmental factors, often leading individuals to make flawed or suboptimal decisions.

Biases are a by-product of our reliance on heuristics, the mental shortcuts used to simplify complex decision-making processes.

Key Types of Biases

1. Confirmation Bias

● Definition:
The tendency to seek, interpret, and remember information that confirms pre-existing
beliefs while ignoring evidence that contradicts them.
● Example:
A person who believes in astrology might remember instances where their horoscope
predictions were accurate but overlook times when they were not.
● Research Insight:
○ Wason (1960) demonstrated this bias with his "2-4-6" rule-discovery
task, in which participants tended to generate tests that confirmed
their hypotheses rather than tests that could disconfirm them.

2. Anchoring Bias

● Definition:
The tendency to rely heavily on the first piece of information encountered (the
"anchor") when making decisions, even if it is irrelevant.
● Example:
In a negotiation, the first offer serves as an anchor, influencing subsequent
discussions and the final agreement.
● Research Insight:
○ Tversky and Kahneman (1974) showed that arbitrary numbers, such as a
random wheel spin, influenced participants’ numerical estimates in unrelated
tasks.

3. Availability Bias

● Definition:
The tendency to judge the likelihood of an event based on how easily examples
come to mind, often influenced by recent, vivid, or emotional experiences.
● Example:
After hearing about a plane crash in the news, people might overestimate the risk of
flying, despite statistics showing its relative safety.
● Research Insight:
○ Tversky and Kahneman (1973) highlighted how this bias skews perceptions
of risk and frequency judgments.

4. Representativeness Bias

● Definition:
The tendency to judge probabilities or categories based on how closely an instance
resembles a stereotype or prototype, while neglecting base rates or statistical
probabilities.
● Example:
Believing someone with a quiet demeanor is more likely to be a librarian than a
salesperson, despite the larger number of salespeople overall.
● Research Insight:
○ Kahneman and Tversky (1972) showed how stereotypes overshadow
base-rate information in probability judgments.

5. Overconfidence Bias

● Definition:
The tendency to overestimate one's knowledge, abilities, or predictions about future
events.
● Example:
Students frequently underestimate the time needed to complete assignments, a
phenomenon known as the planning fallacy.
● Research Insight:
○ Kahneman and Tversky (1995) demonstrated overconfidence in estimating
probabilities of events, even in situations with known uncertainties.

6. Hindsight Bias

● Definition:
The inclination to perceive events as having been predictable after they have
occurred, often referred to as the "knew-it-all-along" effect.
● Example:
After a stock market crash, investors claim they always knew it was going to happen.
● Research Insight:
○ Fischhoff (1975) identified this bias and showed how it distorts individuals'
understanding of past events.

7. Framing Effect
● Definition:
Decisions are influenced by how information is presented, particularly as gains or
losses, even if the underlying facts remain the same.
● Example:
People are more likely to opt for a treatment with a "90% survival rate" than one with
a "10% mortality rate."
● Research Insight:
○ Tversky and Kahneman (1981) showed that the framing of outcomes affects
whether people take risks (gain vs. loss scenarios).

8. Loss Aversion

● Definition:
The tendency to prefer avoiding losses over acquiring equivalent gains, often leading
to risk-averse behavior.
● Example:
Losing $100 feels more painful than the pleasure of gaining $100.
● Research Insight:
○ Kahneman and Tversky (1979) proposed this as a cornerstone of their
Prospect Theory, explaining how losses weigh more heavily than gains in
decision-making.
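Prospect Theory expresses this asymmetry with a value function that is steeper for losses than for gains. For illustration, using the median parameter estimates from Tversky and Kahneman's (1992) follow-up work:

```latex
v(x) =
\begin{cases}
x^{\alpha} & x \ge 0 \\
-\lambda\,(-x)^{\beta} & x < 0
\end{cases}
\qquad \alpha = \beta = 0.88,\quad \lambda = 2.25
```

With these values, v(100) ≈ 57.5 while v(−100) ≈ −129, so losing $100 carries roughly twice the subjective weight of gaining $100, matching the example above.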

9. Cognitive Dissonance

● Definition:
The discomfort experienced when holding contradictory beliefs or behaviors, often
leading individuals to justify or rationalize their choices.
● Example:
A smoker who knows smoking causes cancer might downplay the risks to reduce
discomfort.
● Research Insight:
○ Festinger (1957) introduced the theory of cognitive dissonance, showing how
individuals seek consistency in their beliefs and behaviors.

10. Status Quo Bias

● Definition:
The preference for maintaining the current state of affairs, avoiding change even
when it could lead to better outcomes.
● Example:
Choosing the default option in retirement plans instead of exploring alternatives that
may offer higher returns.
● Research Insight:
○ Samuelson and Zeckhauser (1988) found this bias prevalent in scenarios
involving complex decisions.

11. Self-Serving Bias

● Definition:
The tendency to attribute positive outcomes to personal efforts and negative
outcomes to external factors.
● Example:
A student attributing good grades to hard work but blaming poor grades on unfair
exams.
● Research Insight:
○ Miller and Ross (1975) studied this bias and its role in maintaining
self-esteem.

12. Halo Effect

● Definition:
The tendency for an overall impression of a person or entity to influence judgments
about their specific traits or abilities.
● Example:
Assuming a physically attractive person is also more competent or intelligent.
● Research Insight:
○ Thorndike (1920) documented this effect in his studies on human judgment.

How Biases Impact Decision-Making

1. Everyday Life:
○ People rely on biases to navigate complex environments quickly but often at
the expense of accuracy.
○ Example: Overconfidence bias can lead to underpreparedness in personal or
professional tasks.
2. Policy and Leadership:
○ Loss aversion and status quo bias influence policymaking, as leaders avoid
risky changes even when evidence supports them.
○ Example: Framing effects in public health campaigns influence vaccine
uptake.
3. Business and Marketing:
○ Anchoring and framing are widely exploited in advertising and pricing
strategies to influence consumer behavior.
○ Example: Presenting a product as "50% off" rather than "half-price" increases
perceived value.

Mitigating Biases in Decision-Making

1. Awareness and Education:
○ Teaching individuals about common biases helps them recognize and
counteract these tendencies.
2. Structured Decision-Making:
○ Using decision matrices, statistical models, or decision-support systems can
minimize heuristic-driven errors (see the sketch after this list).
3. Encouraging Type 2 Thinking:
○ Emphasizing slow, deliberate, and analytical processing (as per Kahneman’s
Thinking, Fast and Slow) can help overcome biases.
4. Seeking Diverse Perspectives:
○ Collaborating with individuals from different backgrounds reduces groupthink
and self-serving bias.
5. Critical Evaluation:
○ Actively questioning assumptions and seeking disconfirming evidence
combats confirmation bias and overconfidence.
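The decision matrices mentioned in point 2 can be sketched in a few lines of code; the criteria, weights, and scores below are invented purely for illustration:

```python
# Weighted decision matrix: score each option on each criterion,
# weight by the criterion's importance, and sum (illustrative numbers).
WEIGHTS = {"cost": 0.5, "risk": 0.3, "speed": 0.2}

OPTIONS = {
    "option_a": {"cost": 7, "risk": 4, "speed": 9},
    "option_b": {"cost": 5, "risk": 8, "speed": 6},
}

def weighted_score(scores: dict) -> float:
    """Sum of criterion scores, each weighted by its importance."""
    return sum(WEIGHTS[criterion] * s for criterion, s in scores.items())

for name, scores in OPTIONS.items():
    print(name, round(weighted_score(scores), 2))   # option_a 6.5, option_b 6.1

best = max(OPTIONS, key=lambda name: weighted_score(OPTIONS[name]))
print("choose:", best)                              # -> choose: option_a
```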

Understanding biases is crucial for improving decision-making quality. By recognizing their influence, individuals and organizations can implement strategies to minimize their impact, leading to more rational and effective choices.

Framing Effect: Understanding Its Role in Decision-Making

Definition

The framing effect refers to the phenomenon where people's decisions are influenced by
how information is presented, rather than the content itself. Depending on whether an option
is framed as a gain or a loss, individuals may exhibit risk-averse or risk-seeking behavior.

Key Principles of the Framing Effect

1. Gain vs. Loss Framing:
○ When outcomes are framed as gains, people tend to avoid risks (risk-averse).
○ When outcomes are framed as losses, people tend to seek risks (risk-seeking).
○ Example: In healthcare, a treatment described as having a "90% survival
rate" is preferred over one with a "10% mortality rate," even though both
describe the same outcome.
2. Emphasis on Context:
○ The context or "frame" shapes how individuals perceive the stakes involved in
a decision.
○ This effect is consistent with Prospect Theory (Kahneman & Tversky, 1979),
which highlights that people weigh losses more heavily than equivalent gains.
3. Surface Structure Over Deep Structure:
○ People are often influenced by the superficial presentation of information
rather than its underlying logical equivalence.

Research and Evidence on the Framing Effect

1. Tversky and Kahneman (1981): Framing and Risk Preferences

● Study: Participants were asked to choose between two public health programs
designed to combat an epidemic.
● Scenarios:
○ In the gain frame, participants chose between:
1. Saving 200 people with certainty.
2. A one-third chance of saving 600 people and a two-thirds chance of
saving no one.
Result: 72% preferred the certainty of saving 200 lives (risk-averse).
○ In the loss frame, participants chose between:
1. 400 people dying with certainty.
2. A one-third chance that no one would die and a two-thirds chance that
600 would die.
Result: 78% preferred the risky option (risk-seeking).
● Conclusion:
The framing of the problem (saving lives vs. avoiding deaths) strongly influenced
decision-making, even though the outcomes were statistically identical.
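The claim that the outcomes are statistically identical is easy to verify with a standard expected-value calculation (worked out here for illustration):

```latex
% Gain frame, risky option:
E[\text{saved}] = \tfrac{1}{3}(600) + \tfrac{2}{3}(0) = 200
% Loss frame, certain option: 400 of 600 die, i.e. 200 saved.
% Loss frame, risky option:
E[\text{deaths}] = \tfrac{1}{3}(0) + \tfrac{2}{3}(600) = 400
\quad\Rightarrow\quad E[\text{saved}] = 200
```

Every option therefore saves 200 lives in expectation; only the description changes.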

2. Levin, Schneider, and Gaeth (1998): Types of Framing

● Identified three types of framing effects:
1. Risky Choice Framing: How risk preferences change with gain/loss framing
(e.g., the epidemic problem).
2. Attribute Framing: Positive vs. negative descriptions of an attribute (e.g.,
"90% lean meat" vs. "10% fat meat").
3. Goal Framing: Emphasizing the benefits of an action versus the
consequences of inaction (e.g., "You will improve health by exercising" vs.
"You risk poor health if you don’t exercise").

3. Druckman (2001): Political Framing

● Showed how the framing effect shapes public opinion in political contexts.
● Example: The same policy can receive differing support based on whether it is
framed as "tax relief" (positive connotation) or "tax cuts" (potentially negative).
Mechanisms Behind the Framing Effect

1. Emotional Reactions:
○ Gains are associated with positive emotions (e.g., relief), while losses trigger
negative emotions (e.g., fear or anger). These emotional reactions drive risk
preferences.
2. Loss Aversion (Kahneman & Tversky, 1979):
○ People experience losses more intensely than equivalent gains, making loss
frames more compelling.
3. Cognitive Processing:
○ Gain frames align with risk-aversion tendencies, encouraging cautious
behavior.
○ Loss frames activate risk-seeking tendencies, as people are more willing to
gamble to avoid a loss.

Real-World Applications of the Framing Effect

1. Healthcare Decisions

● Gain Frame Example: "This treatment has a 90% survival rate."
● Loss Frame Example: "This treatment has a 10% mortality rate."
● Research shows gain-framed messages are more effective in promoting preventative
behaviors (e.g., vaccinations), while loss-framed messages are better at motivating
detection behaviors (e.g., cancer screenings).

2. Marketing and Consumer Behavior

● Advertisements use attribute framing to influence consumer choices:
○ Positive Frame: "80% of users reported satisfaction."
○ Negative Frame: "20% of users reported dissatisfaction."

3. Public Policy and Campaigns

● Policies are framed to emphasize either gains (e.g., "job creation") or losses (e.g.,
"job cuts") to sway public opinion.
● Example: Framing environmental policies as "protecting future generations" (gain
frame) versus "preventing climate disasters" (loss frame).

4. Legal and Negotiation Contexts

● Lawyers and negotiators use framing to influence judgments about settlements and
penalties. For example:
○ A defense attorney may frame a plea deal as avoiding a "20-year sentence"
rather than focusing on the reduced "5-year punishment."

Limitations and Criticism of the Framing Effect

1. Individual Differences:
○ Not all individuals respond equally to framing; factors like personality traits,
cultural norms, and decision-making experience can moderate the effect.
2. Cognitive Load:
○ Under high cognitive load or stress, people may rely more heavily on frames,
exacerbating their susceptibility to the framing effect.
3. Consistency of Influence:
○ While the framing effect is robust, its impact can diminish when individuals are
aware of the manipulation or when incentives for accuracy are high.

Strategies to Mitigate the Framing Effect

1. Critical Thinking:
○ Encourage decision-makers to focus on the underlying facts, not the framing.
○ Example: Reframing loss scenarios into equivalent gain terms to assess if
preferences change.
2. Awareness and Education:
○ Educating individuals about cognitive biases can reduce susceptibility to
framing.
3. Structured Decision-Making:
○ Using decision trees or algorithms to compare options without the influence of
emotional framing.
4. Neutral Framing:
○ Presenting information in a neutral and balanced manner reduces the
likelihood of biased decisions.

Process of Decision-Making

Decision-making involves a series of cognitive steps where individuals evaluate options, assess outcomes, and choose a course of action. The process can be broken down into the following key stages:

1. Identifying the Problem or Goal

● Recognizing a need or desire to make a choice, whether resolving a conflict,
achieving a goal, or addressing uncertainty.
● Example: Deciding whether to pursue a career opportunity or higher education.

2. Gathering Information
● Collecting relevant data or insights about possible alternatives, risks, and benefits.
● Information may come from memory, external sources, or sensory cues.

Research Insight:
Simon (1957) proposed the concept of bounded rationality, suggesting that humans cannot
process all available information due to cognitive limitations. Instead, they settle for a
"satisficing" solution—good enough but not optimal.

3. Evaluating Alternatives

● Weighing the pros and cons of different choices based on factors such as feasibility,
risks, and potential benefits.
● This stage involves both rational (Type 2) and intuitive (Type 1) thinking.

Research Insight:
Tversky and Kahneman (1974) studied the anchoring effect, showing that initial impressions
or reference points heavily influence evaluations, even when they are arbitrary.

4. Making the Choice

● Selecting an option that aligns with priorities, values, and the evaluation of
alternatives.
● Emotional and cognitive biases may influence this step.

Research Insight:
Kahneman (2011) distinguished between fast, intuitive decisions (System 1) and deliberate,
analytical decisions (System 2), emphasizing their interplay in human choice.

5. Taking Action

● Implementing the chosen decision. This could involve communication, resource
allocation, or initiating tasks related to the decision.

6. Reviewing the Outcome

● Assessing the effectiveness of the decision. Was the goal achieved? Were there
unexpected consequences?
● Feedback from this step helps refine future decision-making processes.
Research Insight:
Janis and Mann (1977) explored decision evaluation frameworks, emphasizing the
importance of reviewing outcomes to avoid post-decisional regret or "buyer's remorse."

Key Research and Insights in Decision-Making

1. Herbert Simon (1957):
○ Concept of bounded rationality.
○ Humans simplify decision-making due to limited cognitive resources.
2. Amos Tversky and Daniel Kahneman (1974):
○ Identified heuristics like representativeness, availability, and anchoring.
○ Demonstrated how cognitive shortcuts can lead to systematic biases.
3. Tversky and Kahneman (1981):
○ Explored the framing effect, showing how decisions are influenced by how
information is presented (e.g., lives saved vs. lives lost).
4. Janis and Mann (1977):
○ Developed the conflict model of decision-making.
○ Highlighted how stress and time pressure affect decision outcomes.
5. Daniel Kahneman (2011):
○ Distinguished between intuitive and analytical thinking in Thinking, Fast and
Slow.
○ Explained the role of biases like overconfidence and hindsight.
6. Buehler, Griffin, and Ross (1994):
○ Empirically documented the planning fallacy.
○ Found that people consistently underestimate the time needed to complete
tasks.
7. Lavie (1995):
○ Developed the Perceptual Load Theory.
○ Demonstrated how attentional capacity influences decision-making in high-
vs. low-load tasks.
8. Gigerenzer and Goldstein (1996):
○ Studied fast and frugal heuristics.
○ Showed how simple decision rules often perform as well as complex models
in everyday life.
9. Johnson and Tversky (1983):
○ Investigated the role of emotions in decision-making.
○ Found that moods can alter risk perception, making individuals more or less
cautious.
10. Slovic, Finucane, Peters, and MacGregor (2002):
○ Proposed the affect heuristic.
○ Decisions are influenced by immediate emotional reactions to stimuli.

The Role of Neuroscience in Decision-Making

1. Damasio (1994):
○ Proposed the Somatic Marker Hypothesis.
○ Emotional signals (somatic markers) guide decision-making, especially in
uncertain or risky situations.
2. Hillyard et al. (1973):
○ Studied the role of event-related potentials (ERPs).
○ Showed how attention affects neural processing during decision-making.
3. Bechara et al. (1997):
○ Investigated decision-making in individuals with brain damage.
○ Found that damage to the prefrontal cortex impairs emotional regulation and
leads to poor choices.

Applications of Research in Decision-Making

1. Healthcare:
○ Improving patient choices through shared decision-making models.
○ Example: Reducing bias in treatment decisions using decision aids (Elwyn et
al., 2012).
2. Policy-Making:
○ Understanding framing effects to design better public health campaigns.
○ Example: Nudges for vaccination uptake (Thaler and Sunstein, 2008).
3. Education:
○ Teaching critical thinking and awareness of biases in decision-making (Levy,
2010).
4. Business and Management:
○ Using anchoring effects in pricing strategies and negotiations.
○ Example: Englich and Mussweiler (2001) showed that arbitrary anchors
influenced courtroom sentencing.
