Lecture 13 Cognitive Bias Part 2

This document provides an overview of cognitive biases, including the anchoring effect, base rate fallacy, and law of small numbers. It discusses how our fast, automatic thinking can lead us astray through priming and neglect of sample sizes. Statistical reasoning is challenging, and we tend to substitute easier questions that neglect important population-level information.


WEEK 13

COGNITIVE BIAS PART 2


CCHU9021 Critical Thinking in Contemporary Society
Dr. Arthur Chin
Department of Philosophy
2023/24 Semester Two

1
LECTURE OVERVIEW

1) Cognitive biases (cont’d)


• Anchoring effect
• Base rate fallacy
• Law of small numbers
• Dunning-Kruger effect
2) Concluding remarks + Midterm Q&A

2
What comes to your mind?
(Thinking Fast and Slow (2011) p.20)

3
REVIEW: TWO SYSTEMS OF OUR MIND

• System 1
• Fast and automatic
• Effortless
• Feels involuntary
• System 2 (What is the product of 128 x 29?)
• Slow and involves attention
• Effortful (and lazy)
• Feels deliberate
4
ANCHORING EFFECT

5
A QUICK QUESTION

• Namibia is a country in Africa.


• Do you think the population of Namibia
is over or under 400 million?
• Give a number as a rough guess.

6
A QUICK QUESTION

• Namibia is a country in Africa.


• Do you think the population of Namibia
is over or under 40 million?
• Give a number as a rough guess.

7
NOT A ONE-OFF COINCIDENCE

• Observation: The estimate given by people who answer the 400 million
question is much higher than that given by people answering the 40 million
question.
• Another example
• Do you think Gandhi was older than 124 when he died? If not, how old do you think he was when he died?
• Do you think Gandhi was older than 36 when he died? If not, how old do you think he was when he died?
• How would you expect the estimates to differ depending on which question was asked?

8
ANCHORING EFFECT

• Anchoring effect occurs when people’s estimation of an unknown quantity is


unduly influenced by an “anchor” – a value that they considered prior to
making the estimation.
• How does it work?
1) System 2: Start from the anchor, assess whether it is too high or too low, then
deliberately adjust by moving away from it; stop when you are no longer certain
you should move further. Adjustment typically stops prematurely.
2) System 1: Primed, or stimulated, by the anchor to search for evidence in favor
of, not against, it.

9
ANCHORING EFFECT

• Priming is the mechanism through which System 1’s exposure to one idea or
image or feeling triggers it to become more susceptible to other associated
ideas, images or feelings.
• “Fill in the missing letter: SO_P.”
• Our minds can be primed without our even being conscious of it.
• Study: University students divided into 2 groups. Task: assemble sentences out
of a set of words. Some sets had a theme of old age: “forgetful”; “bald”.
• Subjects were then asked to walk to another room.
• Researchers measured the time each group took to walk down the corridor.
• Subjects just acquainted with words associated with “old” walked more slowly!

10
EXPLANATION: NAMIBIA CASE

Example. Start with the 400 million “anchor”

1) S2: Sounds high, but I probably haven’t been asked a crazy question, so it’s not going to be less than 200 million; I’ll stop there.
2) S1: Hearing 400 million primes me to think of Africa as a really large continent; some of the countries are huge, so 200 million is probably a good guess.

Example. Start with the 40 million “anchor”

1) S2: Sounds low, but probably not a crazy question. 80 million?
2) S1: Hearing 40 million primes my memory that there are lots of countries in Africa; some of them are probably pretty small. I don’t know much about Namibia, so it’s probably one of the small ones. 80 million?

11
ACTUAL ANSWER

• 2.5 million!

12
ANCHORING EFFECT

• Possible query: Presumably the number given is not picked out of the blue;
hence it seems reasonable to rely on the given value as an “anchor”.
• But what about the following experiment done by Kahneman?
• Students were asked to spin a lucky wheel which stopped only at “10” or “65”.
• Students were then asked 2 questions:
o Is the % of African countries among UN members larger or smaller than the number on
which the wheel stopped?
o What is your best estimate of the % of African countries in the UN?
• The same anchoring effect appeared, even though the number on the lucky wheel is obviously irrelevant.
• The anchoring effect may work on us unknowingly!

13
ANCHORING EFFECT: PRACTICAL VALUE

• Anchor can be used for manipulating people into


spending more money
• A shop prices something high, then offers a reduction –
so it looks like a bargain.
• An apartment owner puts too high an asking price – acts
as an anchor.
• How to guard against the anchoring effect? Try to
consciously think of reasons why the anchor might be
completely wrong.
• Do your research ahead of time.
• Be aware of the effect.
14
HEURISTICS
AND BIAS
CONCERNING
STATISTICS

15
STATISTICS IS VERY
HARD

• Statistics is difficult.
• System 2 needs a lot of training to do
statistics.
• Even with training it is still difficult.
• System 2 is lazy.
• We tend to:
• Instead of answering the real statistical
question, System 1 often replaces it
with an easier question which is not
statistical.
• System 2 is prone to accept the substitute.
• Reminder: Linda Problem

16
EXAMPLE

• Fact: The US counties in which incidence of


kidney cancer is lowest are mostly rural,
sparsely populated, and located in
traditionally Republican states.

• Question: How would you explain this?

• Ans: Clean-living rural lifestyle, fresh food?

17
EXAMPLE

• Fact: The US counties in which incidence of


kidney cancer is highest are mostly rural,
sparsely populated, and located in
traditionally Republican states.

• Question: How would you explain this?

• Ans: Rural poverty, lack of access to advanced


medical facilities, tobacco, etc.?

18
EXAMPLE

• We cannot say:
• Incidence of kidney cancer is lowest in
rural, sparsely populated, Republican
states because of healthy lifestyle, and
• Incidence of kidney cancer is highest in
rural, sparsely populated, Republican
states because of unhealthy lifestyle.

• Key lies not in “rural” but in “sparsely


populated”
19
SO HOW TO EXPLAIN IT?

• We naturally looked for a causal explanation, but the real explanation is statistical.
• Statistical extremes (low and high) are much more likely in small samples (e.g. sparsely
populated states).
• E.g. The extreme of “ALL heads” occurs more often when you toss 2 coins (small sample)
than when you toss 10 coins.
• It is an (uninteresting) truism that results drawn from large samples are more
trustworthy. But often we do not fully understand its flip side.
• Large samples are reliable → small samples often produce extremes
• Kahneman calls this neglect of the unreliability of small samples the “law of small
numbers”

20
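The coin-toss comparison can be checked with a few lines of Python (a quick sketch, not part of the original slides; the function names are my own):

```python
import random

def prob_all_heads(n_coins: int) -> float:
    """Exact probability that n fair coin tosses all land heads."""
    return 0.5 ** n_coins

print(prob_all_heads(2))   # 0.25: the extreme is common in the small sample
print(prob_all_heads(10))  # 0.0009765625: and rare in the larger one

def frac_all_heads(n_coins: int, trials: int = 100_000) -> float:
    """Simulated fraction of trials in which every toss lands heads."""
    hits = sum(
        all(random.random() < 0.5 for _ in range(n_coins))
        for _ in range(trials)
    )
    return hits / trials

random.seed(0)
# The simulation agrees: the "all heads" extreme shows up far more often with 2 coins.
assert frac_all_heads(2) > frac_all_heads(10)
```

The same point explains the kidney-cancer counties: sparsely populated counties are small samples, so they dominate both the lowest-incidence and highest-incidence extremes.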
• John is a man who wears gothic inspired clothing, has
long black hair, and listens to death metal.
• Is he more likely a Christian or a Satanist?
• Common answer: More likely to be a Satanist.
• This provides a more “coherent” or “representative”
picture of a Satanist.

• But there are billions of Christians, and a much smaller


number of Satanists.
• This base rate information shows that John is much
more likely to be a Christian, despite his fashion
choices.

21
BASE RATE FALLACY

• Base rate fallacy: a common source of error in statistical reasoning


• Sometimes we have both:
• general information about some phenomenon (base rate, population-wide
information), and
• specific information (information about a single case).
• Fallacy arises when our reasoning uses only the specific information (easier
question about representativeness and coherence) rather than also incorporating
the general information (hard question).

22
BASE RATES

• Example: A test for HIV is highly reliable.


• People who are HIV positive test positive
99.7% of the time.
• People who are HIV negative test negative
98.5% of the time.
• This test appears to be incredibly accurate.

• Exercise: John tests positive. How likely is


John to be HIV positive?
• 99.7%?

23
BASE RATES

• To know the likelihood that John is HIV


positive, you need to know about the base rate.
• Only 0.04% of Hong Kong people have
HIV/AIDS.

• To calculate the odds that John’s test is


accurate, think through the example like this.
• Suppose everybody in Hong Kong takes the
test.

24
BASE RATES

• Take the population of Hong Kong to be 7,000,000


• 0.04% of these people have HIV/AIDS

• 2,800 people are infected


• 6,997,200 people are healthy
• Of the infected people that take the test, 0.3% of
them will receive a negative result.
• 8 false negatives
• 2,792 true positives
• Of the healthy people, 1.5% of them will receive a
false positive result.
• 104,958 false positives
• 6,892,242 true negatives

25
BASE RATES

• So, overall:
• 107,750 tests came back positive
• 6,892,250 tests came back negative

• Of those positive tests, only 2,792 (2.59%) of


the people have HIV/AIDS.

• If John is selected at random and tests positive,


there’s a 2.59% chance he has HIV/AIDS.

26
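The slide arithmetic above can be reproduced in a short script (a sketch; the variable names are my own, and the figures are taken from the slides):

```python
# Figures from the slides: Hong Kong population, HIV base rate, test accuracy.
population = 7_000_000
base_rate = 0.0004        # 0.04% of Hong Kong people have HIV/AIDS
sensitivity = 0.997       # infected people test positive 99.7% of the time
specificity = 0.985       # healthy people test negative 98.5% of the time

infected = population * base_rate              # 2,800 people
healthy = population - infected                # 6,997,200 people

true_positives = infected * sensitivity        # ~2,792
false_positives = healthy * (1 - specificity)  # ~104,958

# Of everyone who tests positive, what fraction is actually infected?
ppv = true_positives / (true_positives + false_positives)
print(f"{ppv:.2%}")  # 2.59%
```

So even an apparently very accurate test yields mostly false positives when the underlying condition is rare.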
UNDERSTANDING BASE RATES IS ESSENTIAL

• Not understanding false positives can have tragic results:
“Former Senator Lawton Chiles of Florida at an AIDS conference in 1987 reported that of 22 blood donors in Florida who were notified that they tested HIV-positive with the ELISA test, seven committed suicide… even if the results of both AIDS tests, the ELISA and WB (Western blot), are positive, the chances are only 50-50 that the individual is infected”
(quoted in Bishop and Trout, Epistemology and the Psychology of Human Judgment).

27
AVOIDING BASE RATE FALLACY

• To judge the probability of something, we should:


• Step 1: Try to figure out what the base rate is (i.e. how likely the event is in the
general population).
o E.g. % of HIV in population
• Step 2: See if we have any additional evidence related to the specific case.
o E.g. John tested positive
• Step 3: Change our probabilities by starting with the base rate, then moving in
the direction of the evidence according to how reliable it is.

28
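The three steps amount to applying Bayes’ rule. A minimal sketch (not from the slides; the function and parameter names are my own):

```python
def update_probability(base_rate: float,
                       p_evidence_if_true: float,
                       p_evidence_if_false: float) -> float:
    """Start from the base rate (Step 1), then move toward the
    case-specific evidence according to how reliable it is (Steps 2-3).
    This is just Bayes' rule."""
    p_evidence = (base_rate * p_evidence_if_true
                  + (1 - base_rate) * p_evidence_if_false)
    return base_rate * p_evidence_if_true / p_evidence

# The HIV example: 0.04% base rate, 99.7% sensitivity, 1.5% false positive rate.
posterior = update_probability(0.0004, 0.997, 0.015)
print(f"{posterior:.2%}")  # 2.59%
```

The function makes the role of the base rate explicit: with a 50-50 prior the same positive test would be near-conclusive, but with a 0.04% prior it moves the probability only to about 2.6%.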
TAKING STOCK

• We’re very bad at statistics.

• Instead of answering statistical questions, we often turn them into questions about:
• A causal explanation (kidney cancer case)
• A story about coherence and representativeness (Linda problem; Satanist case)
• Only the case-specific evidence in front of us, while ignoring base rates (HIV case).

• Each of these can lead to answers which are not just wrong, but irrational, and can result in very bad decisions.

29
OVERCONFIDENCE

30
BIASES RELATED TO THE SELF

• How do we see ourselves, and our relationship to others?
• How do you rate yourself in terms of critical thinking ability or sociability in comparison to your fellow HKU students?

31
ABOVE AVERAGE EFFECT

• Also called the Dunning-Kruger effect


• Low ability or knowledge in a certain domain
or task, combined with
• a high level of confidence regardless of one’s
low level of performance.
• Also known as Lake Wobegon effect
• Lake Wobegon is a fictional town where “all
the women are strong, all the men are good
looking, and all the children are above
average.”
Dunning and Kruger, “Unskilled and Unaware of It,”
Journal of Personality and Social Psychology (1999), Vol. 77
32
OVERCONFIDENCE: THE ABOVE AVERAGE EFFECT

• Psychologists: Most people believe they are above average.
• >50% of drivers think they drive better and safer than average
• Most students think they are more popular than average
• >50% of people think they have an above average sense of humor

33
OVERCONFIDENCE: THE ABOVE AVERAGE EFFECT

• Observation: A lot of people overestimate their abilities.
• How to explain this has been a subject of much psychological research.
• They are less good at judging other people’s abilities, so less able to learn from role models?
• Poor performers lack the proper incentive to evaluate their (in)competence in a more objective way? Or they are not given proper feedback?
o Apparently not, according to a 2008 study by Ehrlinger, Dunning and Kruger
o Might it be that their low abilities prevent them from engaging in meta-cognition?

34
OVER-OPTIMISM

• Over-confidence associated with over-optimism, and both can lead to costly


mistakes and bad decisions.
• Example: Newlyweds (almost) invariably think they won’t ever get divorced. But
the divorce rate in HK was about 48% in 2019.
• A fallacy of particular relevance to you: Planning fallacy
• “How long does it take you to write a good quality essay?”
• Kahneman’s textbook project (Thinking Fast and Slow pp.245-247)
• “How much and how long does it take to renovate the kitchen?”

35
OVER-OPTIMISM

• One strategy to mitigate the planning fallacy is to take both the inside view
and the outside view, a strategy similar to that for avoiding the base rate fallacy.
• Outside view: Try to identify the appropriate reference class and the base rate
• Kitchen renovation: “What is the percentage of cases in which expenditures
exceeded budgets?”
• Inside view: Start from that baseline, and adjust in light of the case-specific
information in front of you
• “How do you rate the reliability of your kitchen renovator?”

36
EXERCISE

• Read the paragraph below. What reasons do we have to suspect that Alex’s
assessment of his chances of admission is biased?
• Alex is in the final year of his undergraduate studies at HKU and has applied
for postgraduate studies at one of the most prestigious universities overseas.
The competition for admission has been known to be extremely keen.
Nonetheless Alex is confident and believes that there is a greater than 50%
chance that his application will be successful: he figures that he has a very good
GPA, that his writing sample has been lauded as “excellent” by a faculty
member at HKU, and that his performance in the group interview has been far
superior to that of most of other interviewees.

37
SUMMARY

• Cognitive biases affect everyone. It is difficult to completely avoid their influence.

• Two systems
• System 1 often works unconsciously and substitutes easy questions for difficult ones
• System 2 is lazy and often doesn’t check, and sometimes doesn’t know how.

• Specific biases and fallacies: (i) gambler’s fallacy; (ii) availability bias; (iii) anchoring effect; (iv) “law of small numbers”; (v) base rate fallacy; (vi) overconfidence

• Getting the right answers in the exam is not the most important thing; what matters is avoiding these mistakes in real life, when it counts.

38
FINAL REMARKS

39
CRITICAL THINKING AND CC

• What is “common core”?


• “The ‘Common’ indicates that the formal curriculum and its many complements
focus on the commonality of human experience, while the ‘Core’ gestures toward
the centrality of issues that are of deeply profound significance to humankind, as
well as the core intellectual, social, and imaginative skills that all HKU
undergraduates should acquire during their stay at the university.”
• Critical thinking is necessary for all of us to identify which issues are of
significance, why, and how to tackle problems pertaining to these core issues.
• Practice and internalize the norms so they become readily applicable in your life
outside the classroom!

40
WHAT MATTERS IN THE LONG RUN IS NOT THE “TERMS”,
BUT ABILITY TO USE CONCEPTS IN YOUR LIFE!!

• Mizrahi “Arguments from Expert Opinion and Persistent Bias” (2018


Argumentation Vol.32)
• “[I]f there were evidence that expert judgments are reliable, either significantly more
likely to be true than false or at least significantly more likely to be true than novice
judgments are, then that would have been a strong reason to think that expert
judgments are reliable, and hence that they can be a trustworthy source…of evidence
for the truth of such judgments. Since there is no evidence that expert judgments are
reliable (in fact, there is evidence to the contrary, i.e., that expert judgments are not
more reliable than novice judgments), there is no reason to think that expert judgments
are a trustworthy source of…evidence for the truth of such judgments.” (p.179)
• The writer, in an earlier article, cited a number of studies supporting his claim that there is
“no evidence” that expert judgments are reliable.
41
BE CRITICAL OF ONESELF

• Don’t be the Fallacy Man!

• At least as important is to apply the critical


thinking skills to yourself!!
• “I am already very critical of myself!”
• Overconfidence?

42
COURSE EVALUATION

• Student Feedback on Teaching and Learning (SFTL)

• Students can access the system at http://sftl.hku.hk/

• All evaluations will be saved anonymously, without any identification.


• There are separate forms for the course, teacher, tutor and demonstrator (if
appropriate).

43
FINAL TEST

• Date: April 24, 14:30-16:00
• Format: closed-book; short questions
• Scope: Everything, with questions on Weeks 1-5 carrying no more than 25% weight
• Second half of course:
1) Scientific reasoning
2) Causal reasoning
3) Evaluating evidence
4) Statistical reasoning
5) Cognitive biases
• The test will be conducted in two venues
• LE2 for students with last name A-L
• LE4 for students with last name M-Z
• Revision
• Lecture PPTs
• Final tutorial exercise (causal, statistical, evidence evaluation)
• Relevant chapters and exercises in Lau (2011)

44
MIDTERM

• Any questions?

45
Thank you for your
participation throughout the
semester!!

46
READING AND WEB
MODULES

• Relevant textbook sections:


• Lau (2011): Chapter 20 (only those
biases mentioned in class will be tested)

• Critical Thinking Web


http://philosophy.hku.hk/think
• [F08 Cognitive Biases]

47
