Group 9 BE Report
Group Project
Subject: Understanding Behavioural Biases in Decision-Making: An Empirical Analysis
of Cognitive Heuristics and Their Real-World Implications
Submitted by Group - 9
AADARSH RAGHUVANSHI 230601001
KAVISH CHOUDHARY 230601042
P AKHIL KUMAR 230601048
PAYAL SWAR 230601049
RISHAB AGARWAL 230601058
USHAB SETHIYA 230601084
Distribution of Data
Age Group: The majority of respondents were aged 22–26, with 24 being the most
frequently observed age.
Educational Background: Most participants held a master's degree, while others had
qualifications ranging from bachelor's and high school diplomas to doctorates.
Survey Structure: The questionnaire was divided into six sections, each
corresponding to a key behavioural economics theme.
Research Background
Behavioural economics examines the decision-making processes that deviate from traditional
economic theories of rationality. By incorporating psychological insights, it explains why
individuals often rely on heuristics and exhibit biases. Research in this domain has provided
crucial insights into consumer behaviour, financial decision-making, negotiation strategies,
and policymaking. The six behavioural games designed in this study aim to test and validate
findings from established research, such as Nash Equilibrium in strategic games, the impact
of framing effects, and the influence of default choices on decision-making.
Game Design and Theoretical Framework
Each game in this study was designed based on fundamental behavioural economic theories,
incorporating real-world decision-making elements to assess how individuals make choices in
different strategic settings. The games were structured with specific scenarios, choices, and
incentives to elicit behavioural responses that align with—or deviate from—rational decision-
making principles.
Theories Involved:
Nash Equilibrium: A game theory concept where no player can improve their
outcome by unilaterally changing their strategy, assuming the other players keep their
strategies unchanged. It represents a stable state where every participant's choice is
optimal given the choices of others.
Bounded Rationality: A concept in behavioral economics that suggests that people
do not make perfectly rational decisions because of cognitive limitations, lack of
information, and time constraints. Instead of finding the optimal solution, individuals
use simplified decision-making processes, often relying on heuristics to make choices
that are "good enough" rather than perfect.
Findings:
About 72% of participants initially chose to cooperate, showing that people tend to
trust strangers in the absence of prior negative experiences.
When cooperation was met with cooperation, nearly 80% continued to cooperate,
reinforcing the idea that positive actions encourage mutual trust.
The Prisoner's Dilemma is a classic game theory scenario where two individuals must choose
between cooperating or betraying each other.
If one betrays while the other cooperates, the betrayer gets ₹2 lakh, while the
cooperator gets ₹0.
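The survey states only the betray-versus-cooperate payoffs; the sketch below fills in the remaining cells with illustrative values (chosen purely to preserve the standard dilemma ordering) and checks which strategy pair forms a Nash Equilibrium as defined above.

```python
# Minimal sketch: best-response check for the Prisoner's Dilemma above.
# Only the betray-vs-cooperate payoffs are stated in the survey (Rs. 2 lakh vs Rs. 0);
# the mutual-cooperation and mutual-defection payoffs below are illustrative
# assumptions, chosen only to preserve the standard dilemma ordering.

# payoffs[(my_move, their_move)] = my payoff in lakh rupees
payoffs = {
    ("C", "C"): 1.5,  # assumed reward for mutual cooperation
    ("C", "D"): 0.0,  # stated: the cooperator gets Rs. 0 when betrayed
    ("D", "C"): 2.0,  # stated: the betrayer gets Rs. 2 lakh
    ("D", "D"): 0.5,  # assumed punishment for mutual defection
}

def best_response(their_move):
    """Return the move that maximizes my payoff, given the other player's move."""
    return max(("C", "D"), key=lambda my_move: payoffs[(my_move, their_move)])

# A strategy pair is a Nash Equilibrium if each move is a best response to the other.
for mine in ("C", "D"):
    for theirs in ("C", "D"):
        stable = best_response(theirs) == mine and best_response(mine) == theirs
        print(mine, theirs, "Nash Equilibrium" if stable else "not stable")
```

Under these assumed numbers, mutual defection is the only Nash Equilibrium, which is precisely the prediction the repeated rounds below put to the test.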
Round 1
First Impressions: You are paired with a stranger for the first time. Without any prior
information, do you:
Round 2
Positive Feedback: In the previous round, you learned that your partner chose to cooperate.
Now, do you:
Round 3
A Breach of Trust: Your partner unexpectedly betrayed in Round 2 while you had
cooperated. In this round, do you:
Round 4
Opportunity to Reconcile: After Round 3, you sense that your partner might be trying to
apologize through subtle cues in their behavior. Do you:
Round 5
Rebuilding Trust: In the final round, your partner sends a clear signal (a direct apology or
promise) that they wish to cooperate moving forward. Do you:
1. Round 1: Axelrod’s (1984) seminal work on the Evolution of Cooperation often found
that a sizable fraction of subjects start with cooperation in repeated PD settings, hoping
for reciprocal cooperation. Fehr & Fischbacher (2002) also show that a substantial
proportion of individuals exhibit “conditional cooperation,” starting cooperatively unless
betrayed.
2. Round 2: Nowak & Sigmund (2005) found that “tit-for-tat” or similar reciprocal
strategies are highly successful in repeated dilemmas. Our data align with the idea that
cooperation begets cooperation (positive reciprocity).
3. Round 3: Fehr & Gächter (2002) show that punishment (or withholding cooperation) is
common when trust is broken. Our split result reflects the heterogeneity in punishment vs.
forgiveness—some use “grim trigger” or “tit-for-tat,” others may use “win–stay, lose–
shift.”
4. Round 4: Studies of repeated PD (e.g., Ostrom, Gardner, & Walker, 1994) suggest that
communication or conciliatory signals often boost cooperation. Our finding that three‐
quarters forgave underscores the power of an apology or cooperative cue to restore trust.
5. Round 5: Axelrod (1984) notes that in finite repeated PDs, a “backward induction”
argument would predict end‐game defection, but real participants often don’t fully adopt
that logic. Our data confirm that many remain cooperative despite it being the last round.
Fischbacher & Gächter (2010) also found that unconditional cooperators persist longer
than pure rational game‐theory might predict.
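The reciprocal strategies cited above (tit-for-tat, grim trigger, win-stay, lose-shift) reduce to one-line decision rules. A minimal sketch, using our own "C"/"D" move encoding:

```python
# Minimal sketches of the repeated-game strategies referenced above.
# "C" = cooperate, "D" = defect; histories are lists of past moves, oldest first.

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move (Axelrod, 1984)."""
    return "C" if not opponent_history else opponent_history[-1]

def grim_trigger(opponent_history):
    """Cooperate until the opponent defects once, then defect forever."""
    return "D" if "D" in opponent_history else "C"

def win_stay_lose_shift(my_history, opponent_history):
    """Repeat my last move after a 'win' (the opponent cooperated), switch after a 'loss'."""
    if not my_history:
        return "C"
    if opponent_history[-1] == "C":               # win: keep doing what worked
        return my_history[-1]
    return "C" if my_history[-1] == "D" else "D"  # loss: switch moves
```

The near even split observed in Round 3 is consistent with some respondents behaving like grim-trigger or tit-for-tat players and others like the more forgiving win-stay, lose-shift rule.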
After receiving cooperation, even more participants (almost 80%) stayed cooperative.
This reinforces the reciprocity principle: people tend to match positive actions with
positive actions.
Responses are almost evenly split. About half tried to re‐establish cooperation, while
the other half defected in response. This suggests a tension: some participants are
more “forgiving,” while others punish betrayal right away.
A strong majority (about 74%) were open to reconciling. This indicates that many
participants are willing to move past a single betrayal if they receive a genuine sign of
cooperation.
A strong majority (about 74%) were open to reconciling. This indicates that many
participants are willing to move past a single betrayal if they receive a genuine sign of
cooperation.
Game 1 Conclusion
The survey findings indicate that most individuals initially opt for collaboration,
demonstrating a general sense of goodwill when interacting with an unfamiliar person. When
cooperation is reciprocated, even more participants continue to engage positively,
emphasizing the tendency to mirror constructive behavior. However, when faced with
deception, responses diverge, with some striving to restore harmony while others take a more
defensive stance. A significant portion of respondents, however, remain open to mending
relationships, suggesting a preference for resolution over prolonged conflict. Overall, the
results highlight a mix of trust, cautiousness, and adaptability, shaping how people navigate
strategic decision-making in uncertain social situations.
Theories Involved:
Default Bias: Default Bias (also known as the status quo bias) is a cognitive bias in
which people tend to stick with pre-selected or default options rather than actively
making a different choice. This happens because defaults are seen as the easiest, safest, or
most socially accepted choice, even when alternatives might be better.
Choice Architecture: Choice Architecture refers to the way in which options are
presented, structured, and designed to influence decision-making. It is a core concept in
behavioral economics, emphasizing how subtle changes in the framing, ordering, and
defaults of choices can nudge people toward specific decisions without restricting their
freedom.
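As a concrete, purely hypothetical illustration of these two ideas, the sketch below shows how an ordering screen can combine a pre-selected default with a "Most Popular" label, nudging the user while leaving every option available (all menu items are invented for this example):

```python
# Hypothetical ordering screen illustrating the two ideas above: a pre-selected
# default and a social-proof label. Every menu item here is invented; no option
# is removed, only the presentation changes.

menu = [
    {"name": "Grilled veg bowl", "default": True,  "label": None},            # healthy default
    {"name": "Cheese burger",    "default": False, "label": "Most Popular"},  # social proof
    {"name": "Pasta alfredo",    "default": False, "label": None},
]

def render(menu):
    """Print the options exactly as a user would see them."""
    for item in menu:
        tags = []
        if item["default"]:
            tags.append("pre-selected")
        if item["label"]:
            tags.append(item["label"])
        suffix = "  [" + ", ".join(tags) + "]" if tags else ""
        print("(x)" if item["default"] else "( )", item["name"] + suffix)

render(menu)
```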
Findings:
More than 56% of participants stuck with the default pre-selected choice, showing
that people tend to follow the path of least resistance when making decisions.
Around 59% of participants were swayed by "Most Popular" labels, reaffirming that
highlighting popularity significantly influences consumer behavior.
About 33% of respondents acted on limited-time offers, showing that scarcity can
drive decisions, but a majority still prefer to evaluate options without pressure.
Policy Implications:
Governments and businesses should set healthier, safer, or more sustainable options as
defaults, making positive decisions easier for consumers.
Policies should ensure transparency in marketing tactics, preventing excessive
reliance on urgency or social proof strategies that pressure consumers into hasty
decisions.
Default Decision
Visual Emphasis
Ordering Effect
Limited Time Offer
Social Proof Influence
Default Decision: When ordering a meal, the system pre-selects a “healthy option” by
default. Do you:
Social Proof Influence: A product option carries a “Most Popular” label next to it. When all
else is equal, do you:
Visual Emphasis: In a list of several offers, one option is highlighted with a bold color and
larger font. Do you:
1. Default Decision: Johnson & Goldstein (2003) found opt‐in vs. opt‐out defaults
drastically change organ donation rates. Our finding (a modestly strong default effect)
is consistent with the broader literature: a significant, though not overwhelming,
portion of participants accept the default.
2. Social Proof Influence: Cialdini’s (2001, 2007) work on social influence repeatedly
shows that people often conform to what they perceive as the norm. Our ratio (about
60–40) aligns well with typical findings where a substantial fraction of people follow
recommended or “most popular” choices.
3. Visual Emphasis: Many marketing studies (e.g., Chandon, Hutchinson, & Young,
2009) show that salience, color, and shelf position in stores can significantly affect
product choice. Actual effect sizes vary. The fact that ~30% admitted to leaning
toward emphasized items is still meaningful; real‐world effects could be larger.
4. Limited Time Offer: Ariely & Simonson (2003) showed that scarcity tactics (e.g.,
“Only 2 seats left!”) can significantly raise purchase rates. Our data show that a
sizeable minority (33%) are indeed influenced, which is in line with studies
suggesting strong but not absolute effects of scarcity.
5. Ordering Effect: Mantonakis et al. (2009) found that people tend to select earlier-
presented wine options, even if later ones better match their preferences. Häubl &
Trifts (2000) demonstrated that product sequence in online shopping impacts
purchasing decisions, favoring top-listed items. Thaler & Sunstein’s (2008) "choice
architecture" highlights how ordering acts as a nudge, subtly guiding consumer
behavior. In our data, the effect is not overwhelming, with a near 50–50 split,
suggesting some preference for the first option but also a willingness to explore
alternatives. This aligns with past studies, where primacy effects exist but do not
entirely dominate consumer decisions, especially as users become more experienced
or skeptical.
Over half chose to remain with the healthy default. This indicates that defaults can be
a fairly strong nudge.
A clear majority (59%) succumbed to social proof. This reaffirms how highlighting
popularity can steer consumer decisions.
A smaller but still notable group (~31%) reported being swayed by visual emphasis.
The majority (69%) claimed to evaluate all options on content alone—but keep in
mind stated behavior may not always match actual behavior in real choice settings.
One‐third of respondents are swayed by urgency, while two‐thirds resist or want more
thorough evaluation. This suggests urgency can be an effective nudge but not
universally so.
Nearly half of our participants (about 49%) opted for the top‐listed, mid‐priced item.
This reveals a mild to moderate primacy effect: while slightly fewer than half
defaulted to the first item, it is still a notable proportion.
Real World Applications
E-Commerce & Online Shopping – Websites use default selections, social proof (e.g.,
"Best Seller"), and urgency tactics ("Only 2 left!") to influence purchasing decisions.
Subscription Services – Platforms like Netflix and Spotify set default plans to
encourage users to stick with pre-selected options.
Game 2 Conclusion
Default Effect, Social Proof, Visual Emphasis, Limited‐Time Offer, and Ordering all
show that contextual cues significantly shape behavior—though not everyone is
influenced to the same degree.
The Ordering Effect in our results is a classic example of how many participants take
the “path of least resistance” when faced with an attractive first choice, reflecting
well‐documented primacy biases in consumer and behavioral research.
Theories Involved:
Anchoring Effect: The Anchoring Effect is a cognitive bias in which people rely too
heavily on the first piece of information (the "anchor") they receive when making
decisions. This initial reference point influences their subsequent judgments and
choices, even if it is arbitrary or unrelated.
Heuristics & Biases: A framework in behavioral economics and cognitive
psychology that explains how people make decisions quickly and efficiently using
mental shortcuts (heuristics), which often lead to systematic errors (biases) in
judgment.
Findings:
A majority of participants estimated prices close to the given anchor, showing that
initial reference points significantly shape judgments, even when arbitrary.
More participants preferred "Save ₹200" over "20% off", highlighting that absolute
savings feel more impactful than percentage discounts.
Most participants believed celebrity scandals were more common than everyday
events, demonstrating how media exposure influences perception.
Policy Implications:
Anchoring Effect
Availability Heuristic
Intuition vs. Analysis
Framing a Discount
Representativeness
Anchoring Effect: Before guessing the average price of a new gadget, you are told that
similar gadgets typically cost around 500 Rs. What is your estimate?
Framing a Discount: You see a product advertised in two ways: one version states “Save
200 Rs.” while another says “20% off” (with both being equivalent discounts). Which
presentation makes you more likely to purchase?
Availability Heuristic: Which event feels more common to you, even if statistically it isn’t?
Representativeness: You meet someone who is quiet and detail oriented. Which role do you
assume they are more likely to have?
Intuition vs. Analysis: When you have to make a split-second decision, do you:
1. Anchoring Effect: Tversky & Kahneman (1974) originally demonstrated that people’s
numerical estimates remain close to the initial anchor even when they know it may be
arbitrary. Mussweiler & Strack (1999) further showed that anchoring effects persist across
various domains (e.g., real estate pricing, population estimates). Our findings align well
with these studies: most participants gravitate to the anchor despite the possibility that the
actual price could be lower or higher.
2. Framing a Discount: Chen, Marmorstein, & Tsiros (2008) in consumer research found
that absolute currency discounts are often more impactful when the numeric difference
“feels” significant. Tversky & Kahneman (1981) (and subsequent framing research) show
that people’s preferences switch based on how the choice is posed (gain vs. loss, absolute
vs. relative). Our results confirm that how you phrase a discount can sway consumer
perception—even when the actual economic value is the same.
3. Availability Heuristic: Tversky & Kahneman (1973) originally defined the availability
heuristic, showing that events recounted more vividly or frequently in media are
perceived as more probable. More recent work (e.g., Combs & Slovic, 1979) found that
disproportionate media attention to certain events (e.g., plane crashes, high‐profile
scandals) leads people to overestimate their frequency. Our findings echo these
conclusions: vivid, news‐worthy events inflate perceived prevalence.
4. Representativeness: Kahneman & Tversky (1972, 1973) showed that people frequently
neglect base rates and focus on how well a description fits a prototype. This is
reminiscent of the “Linda problem,” where participants choose a more representative but
statistically less likely option. Our result strongly aligns with classic findings on
representativeness, underscoring that stereotypes can override logical/statistical
considerations.
5. Intuition vs. Analysis: Kahneman (2011) and others (e.g., Evans & Stanovich, 2013)
show that while many decisions occur via quick System 1 processes, individuals can
override these with deliberation (System 2) if they are motivated and have time. Our
results underscore a pragmatic middle ground—many people do try to incorporate “a little
bit of thinking” before finalizing their snap judgments, aligning with the notion that real‐
world decision‐making often mixes both systems.
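A quick check of the Framing a Discount comparison above: since the survey treats "Save ₹200" and "20% off" as equivalent, the implied list price is ₹1,000; at any other price the two frames differ in rupee terms. A minimal sketch with illustrative price points:

```python
# "Save Rs. 200" and "20% off" describe the same discount only when
# 0.20 * price == 200, i.e. at a list price of Rs. 1,000. Below that price
# the percentage frame is worth less in rupees; above it, more.

def discount_in_rupees(price, flat=200, pct=0.20):
    """Rupee value of each frame at a given list price (prices are illustrative)."""
    return {"flat_frame": flat, "percent_frame": pct * price}

for price in (500, 1000, 2000):      # illustrative price points
    print(price, discount_in_rupees(price))
```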
Game 3 Conclusion
The responses show that judgments are shaped less by objective information than by how that
information is presented and retrieved. Initial anchors pull price estimates toward arbitrary
reference points, equivalent discounts are judged differently depending on framing, vividly
reported events feel more common than they are, and stereotypes often override statistical
reasoning. At the same time, many participants report combining quick intuition with at least
some deliberate analysis, consistent with dual-process accounts of decision-making.
Theories Involved:
Expected Utility Theory: Proposed by John von Neumann and Oskar Morgenstern,
it suggests that individuals make decisions by choosing the option that maximizes their
expected utility rather than expected monetary value. Utility is a subjective measure
of value, reflecting personal preferences and risk tolerance.
Prospect Theory: Developed by Kahneman and Tversky (1979), it explains how people
perceive gains and losses differently, introducing the concepts of loss aversion and
probability weighting. Unlike EUT, it shows that decisions are influenced more by
potential losses than by equivalent gains.
Findings:
Most participants preferred a sure reward over a higher expected value risky option,
aligning with Prospect Theory’s finding that people tend to avoid risk when dealing
with potential gains.
A significant minority reported being swayed by a peer’s past success, indicating that
observational learning and herd behavior can influence personal financial decisions.
Many participants found a 10% chance to win a large reward appealing, supporting
the "possibility effect", where people overestimate low-probability events.
Policy Implications:
Safe vs. Risky Choice: You can either receive a guaranteed 2000 Rs. or take a 50% chance
to win 6,000 Rs. (with a 50% chance to win nothing). Which do you choose?
Lottery Appeal: You’re offered a lottery ticket with a 10% chance to win 5,00,000 Rs. How
attractive is this opportunity?
Influence of a Peer: A friend shares that they recently took a significant risk which paid off
well. Does this influence you to take risks?
Decision Strategy: When faced with uncertainty, do you tend to rely more on:
Investment Style: Given two investment options—one with low risk and modest returns, and
another with high risk but the potential for high returns—which are you more likely to
choose?
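For the Safe vs. Risky Choice and Lottery Appeal questions above, the expected values follow directly from the stated amounts; the concave value function and the "felt" probability in the sketch below are illustrative assumptions in the spirit of Prospect Theory, not parameters estimated from our data.

```python
import math

# Expected monetary values of the Game 4 gambles, from the stated amounts.
ev_sure    = 2000
ev_risky   = 0.5 * 6000 + 0.5 * 0        # = 3,000, higher than the sure 2,000
ev_lottery = 0.10 * 500000               # = 50,000

# Illustrative risk-averse value function (square root) -- an assumption used
# only to show how concavity can make the sure Rs. 2,000 preferable.
v = math.sqrt
value_risky = 0.5 * v(6000) + 0.5 * v(0)  # about 38.7
value_sure  = v(2000)                     # about 44.7 -> the sure option wins

# Illustrative probability weighting: if a 10% chance "feels" like 18%,
# the lottery's subjective appeal is inflated relative to its expected value.
felt_probability = 0.18                   # assumed, for illustration only
felt_ev_lottery  = felt_probability * 500000

print(ev_sure, ev_risky, value_sure, value_risky, ev_lottery, felt_ev_lottery)
```

Even though the risky option has the higher expected value (₹3,000 versus the sure ₹2,000), a mildly concave value function already reverses the preference, matching the risk aversion for gains reported below, while an inflated subjective probability makes the lottery feel more attractive than its ₹50,000 expected value.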
Comparison with Published Research
1. Safe vs. Risky Choice: According to Expected Utility Theory (EUT), a rational decision-
maker should choose the option with the highest expected value. However, Prospect
Theory (Kahneman & Tversky, 1979) explains why most people prefer the guaranteed
₹2,000, as they are risk-averse for gains, even when the risky option has a higher
expected value.
2. Lottery Appeal: Prospect Theory’s probability distortion shows that people tend to
overestimate low-probability events, making a 10% chance to win ₹5,00,000 seem more
attractive than its actual expected value. Cumulative Prospect Theory (Tversky &
Kahneman, 1992) further supports this by showing how individuals overweight small
probabilities and underweight high probabilities.
3. Influence of a Peer: Social Influence Theory (Cialdini, 2001) and Herd Behavior
(Banerjee, 1992) suggest that observing a friend’s successful risky decision can increase
personal risk-taking, as individuals often rely on social proof rather than objective
probabilities when making uncertain decisions.
4. Decision Strategy: Statistics vs. Intuition – Dual-Process Theory (Kahneman, 2011)
explains the difference between System 1 (fast, intuitive decision-making) and System 2
(slow, analytical thinking). Participants who rely on gut feelings are engaging System 1,
whereas those who prefer statistical reasoning use System 2.
5. Investment Style: Modern Portfolio Theory (Markowitz, 1952) suggests that rational
investors balance risk and return efficiently. However, Prospect Theory explains why
some investors are risk-seeking for potential high gains, while others prefer low-risk,
modest returns to avoid losses, even if the expected return is lower.
Risk Aversion for Sure Gains: Most participants prefer a guaranteed reward over a
risky option with higher expected value, aligning with Prospect Theory’s finding of
risk aversion in the gain domain.
Real World Applications
Stock Market & Investments – Investors often prefer low-risk assets (bonds, mutual
funds) due to risk aversion, but herd behavior drives speculative bubbles (e.g., crypto,
meme stocks).
Gambling & Lotteries – People overestimate small probabilities, making lotteries and
casino games attractive despite poor odds.
Insurance Industry – Loss aversion leads people to buy insurance even when the
expected payout is lower than the premium.
Marketing & Sales Tactics – Urgency-based promotions (e.g., “Flash Sale!”) exploit
risk perception to drive impulsive purchases.
Entrepreneurship – Entrepreneurs take high risks, but sunk cost fallacy makes some
persist in failing ventures.
Peer Influence in Finance – Social proof and herd behavior lead people to chase
trendy investments (e.g., Bitcoin, GameStop).
Game 4 Conclusion
Game 4 highlights how individuals perceive and respond to risk, often deviating from rational
decision-making models. Prospect Theory explains why people prefer certainty over risk for
gains but take higher risks to avoid losses. Probability distortion makes lotteries seem more
attractive than they are, while social influence and herd behavior impact financial decisions.
The study also reveals that some rely on statistical reasoning, while others depend on
intuition when faced with uncertainty. These insights are crucial for understanding
investment behavior, consumer choices, and financial decision-making in real-world
scenarios.
Game 5: The Fair Split Challenge
The game examines how individuals make decisions about fairness, equity, and personal gain
in resource allocation scenarios. It explores whether people accept unfair offers (Ultimatum
Game Theory), willingly sacrifice their own rewards to promote fairness (Inequity Aversion
Theory), or prioritize merit-based vs. equal distribution (Meritocratic vs. Egalitarian
Preferences). The game also tests social preferences, such as generosity in dictator-style
decision-making and responses to perceived unfairness.
Theories Involved:
Findings:
Policy Implications:
Ultimatum Dilemma
Dictator Decision
Merit vs. Equality
Response to Unfairness
Personal Sacrifice for Equality
Ultimatum Dilemma: You and a randomly chosen stranger are given a chance to win a cash
prize. The stranger will propose a split of ₹5000, and you can either accept or reject it.
However, if you reject, neither of you gets any money. The stranger offers you ₹500 while
keeping ₹4500. What do you do?
Dictator Decision: You are a manager of a company and receive a ₹100,000 bonus. You
have the option to share a portion of this bonus with your team. They have no say in how
much you give. How much will you give to your team?
Merit vs. Equality: Imagine you are part of a team where rewards can be allocated based
on individual contribution. Do you think it’s fair to award more to those who contribute more,
even if it means unequal shares?
1. Ultimatum Dilemma: The results match Güth et al. (1982), showing that people
reject unfair offers even at a personal cost, supporting inequity aversion. Camerer
(2003) found that offers below 20% are often rejected, similar to those who refused
₹500 out of ₹5000, emphasizing fairness over self-interest.
2. Dictator Decision: Consistent with Forsythe et al. (1994), many participants shared a
portion of their bonus, showing social preferences over pure self-interest. Fehr &
Schmidt (1999) suggest this reflects inequity aversion, where people dislike extreme
pay gaps.
3. Merit vs. Equality: Most favoured merit-based rewards, aligning with Adams (1965)
and Bolton & Ockenfels (2000), who found that equity theory drives fairness
perceptions. This confirms that people accept inequality if seen as deserved.
4. Response to Unfairness: The majority chose to act against unfairness, supporting
Fehr & Gächter (2002) on altruistic punishment, where individuals penalize unfair
behavior to maintain cooperation, even at personal cost.
5. Personal Sacrifice for Equality: A large portion sacrificed personal gain for fairness,
mirroring Charness & Rabin (2002), who found that social welfare preferences drive
such behavior, and Fehr & Schmidt (1999), who noted strong inequity aversion in
decision-making.
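The rejection behaviour in point 1 can be made precise with the Fehr & Schmidt (1999) inequity-aversion utility cited in point 2. A minimal sketch for the ₹500 / ₹4,500 split; the parameter values are illustrative assumptions, not estimates from our sample:

```python
# Fehr & Schmidt (1999) utility for the responder facing a Rs. 5,000 split:
#   U = own - alpha * max(other - own, 0) - beta * max(own - other, 0)
# Rejecting leaves both players with 0, so the responder accepts only if U > 0.

def responder_utility(own, other, alpha, beta=0.25):   # beta value is illustrative
    return own - alpha * max(other - own, 0) - beta * max(own - other, 0)

offer, keeper = 500, 4500                # the split described in the survey
for alpha in (0.0, 0.1, 0.5, 1.0):       # illustrative "envy" parameters
    u = responder_utility(offer, keeper, alpha)
    print(f"alpha={alpha}: U={u:.0f} -> {'accept' if u > 0 else 'reject'}")
```

Any responder whose envy parameter exceeds 500 / 4,000 = 0.125 prefers rejection, which is one way to rationalize the refusals observed in the survey.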
Most participants shared part of their bonus, reflecting social preferences and fairness
concerns over self-interest. This suggests that people naturally consider equity, even
without obligation.
The preference for merit-based rewards highlights equity theory, where people believe
rewards should match effort rather than being distributed equally, reinforcing
meritocratic ideals.
A majority were willing to give up some rewards for fairness, showing that social
welfare concerns influence decisions, though some still prioritized maximizing
personal gain.
Real World Applications
Business & Salary Negotiations: The Ultimatum Game reflects salary negotiations,
where employees reject unfair pay despite personal loss. Fair wages improve
retention, while unfair offers lead to dissatisfaction.
Leadership & Team Management: The Dictator Game applies to bonus distribution.
Leaders who share rewards fairly build trust and motivation, while selfish decisions
harm workplace morale.
Public Policy & Wealth Distribution: The merit vs. equality debate shapes tax
policies and welfare. Governments must balance rewarding effort with ensuring
fairness for economic stability.
Social Justice & Protests: People protest inequality and corporate greed as a form of
altruistic punishment. Boycotts and activism push for fairer systems.
Group Dynamics & Cooperation: Sacrificing personal gain for fairness drives
charity, social movements, and teamwork, ensuring group harmony and shared
success.
Game 5 Conclusion
The game highlights that fairness, reciprocity, and social norms play a crucial role in
decision-making. The results show that people are willing to reject unfair deals, share
resources, and punish inequity, even at a personal cost. While many favor merit-based
rewards, there is also a strong inclination toward equitable outcomes and group welfare.
These findings align with behavioural economics research, showing that human decisions are
not purely self-interested but shaped by fairness concerns and social expectations.
Theories Involved:
Sunk Cost Fallacy: The sunk cost fallacy is a cognitive bias where individuals
continue investing time, money, or effort into a failing endeavor because they have
already invested resources, even when rational analysis suggests stopping. Instead of
making decisions based on future outcomes, people irrationally consider past,
irrecoverable costs.
Findings:
Policy Implications:
Time Investment
Awareness Check
Q1: Initial Investment Decision: You have invested 1,00,000 Rs. in a project. New
evidence suggests the project is unlikely to succeed. Do you:
Q2: Doubling Down Scenario: After further investment, you face another setback. Do you:
Q3: Time Investment: Imagine you’ve spent many hours on a project that is failing. Would
you continue working on it simply because you’ve already spent so much time?
Q4: Rational vs. Emotional Commitment: When deciding whether to continue a failing
project, do you feel that your past investments should influence your decision even if future
prospects are poor?
Q5: Awareness Check: Before making your next decision, how aware are you of the “sunk
cost fallacy” (the tendency to continue investing in something due to past investments)?
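The Initial Investment and Doubling Down questions reduce to a forward-looking comparison: the ₹1,00,000 already spent is identical under both choices and therefore cancels out. A minimal sketch with assumed future figures (the survey does not specify them):

```python
# Forward-looking comparison for the Rs. 1,00,000 project. The sunk amount is
# the same whether we continue or stop, so only the incremental cost and the
# expected future payoff matter. All future figures below are assumed.

sunk = 100_000                           # already spent; irrecoverable either way

def value_of_continuing(extra_cost, success_prob, payoff_if_success):
    """Expected future payoff of continuing, deliberately ignoring the sunk amount."""
    return success_prob * payoff_if_success - extra_cost

continue_value = value_of_continuing(extra_cost=50_000,
                                     success_prob=0.2,        # "unlikely to succeed"
                                     payoff_if_success=150_000)
stop_value = 0                           # stopping costs nothing more, earns nothing more

print("continue" if continue_value > stop_value else "stop", continue_value)
```

Adding the sunk ₹1,00,000 to both sides would not change which option is larger; the fallacy lies in letting it tip the comparison anyway.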
Comparison with Published Research
Escalation of commitment makes people persist in failing projects due to loss aversion
and the hope of eventual success. This shows how psychological commitment can
override logical decision-making, making failure harder to accept.
Time investments feel more valuable than money, making them harder to abandon.
Personal attachment drives persistence, showing that non-monetary sunk costs
influence decisions in work, education, and personal life.
Some people let past investments influence future choices, while others focus only on
expected outcomes. Emotional ties make detaching from past losses difficult, but
doing so leads to better decision-making.
Even when aware of the sunk cost fallacy, people still act irrationally due to ego,
emotions, or social pressures. Awareness alone isn't enough—actively training for
rational decision-making is key.
Real World Applications
Stock Market & Trading: Investors hold on to losing stocks hoping they will
recover, instead of reallocating funds to better opportunities. Recognizing sunk cost
bias helps in rational portfolio management.
Career & Education: People stay in unfulfilling jobs or degrees because they have
already spent years pursuing them. The best approach is to focus on future
opportunities rather than past commitments.
Game 6 Conclusion
The game highlights how sunk cost fallacy and escalation of commitment influence decision-
making in financial, professional, and personal contexts. Many participants struggled to
abandon failing investments, showing how past commitments cloud rational judgment.
However, some recognized the need to cut losses and focus on future outcomes, indicating
awareness of decision biases. The results align with behavioural economics research, showing
that logical reasoning often competes with emotional attachment and loss aversion.
Overcoming these biases requires a shift toward forward-looking decision-making rather than
being trapped by past investments.
Conclusion
The study reveals that decision-making is often influenced by cognitive biases, social
influences, and psychological tendencies. While rational models like Nash Equilibrium and
Expected Utility Theory provide a foundation, behavioral economics highlights deviations
driven by heuristics, emotions, and social expectations. Understanding these tendencies has
significant implications for business strategies, public policy, and personal decision-making.
Future research could explore interventions to mitigate biases and improve decision-making
outcomes.