
Black Box Thinking, by Matthew Syed

“When we are confronted with evidence that challenges our deeply held beliefs, we are more likely to reframe the evidence than we are to alter our beliefs.”

Summary: this book was first recommended to me by Daniel Ek, who said “I use the principles almost every day.” If a brilliant billionaire uses something every day, that’s good enough for me. So what is this book about?

Embracing failure.

To the title: all airplanes must carry two black boxes to record 1) all electronic systems and 2) cockpit dialogue.

Syed compares the aviation industry—which embraces failure to extract learnings—to the healthcare industry—which often shuns failure due to ego and hierarchy.

Black Box Thinking is about shifting our mindset into objective forensics mode whenever something bad happens:

● A board meeting doesn’t go well
● A product launch fails to gain traction
● Sales misses their quarterly number
● You lose a large customer to a competitor
● A key executive quits unexpectedly

Don’t get mad. Get curious. For it is from the roots of failure
that all learning grows. Failure + learning = adaptation.

“Removing failure from innovation is like removing oxygen from a fire.”

Author bio: Matthew Syed is an author and speaker in the field of high performance. He has written six bestselling books on mindset and high performance: Rebel Ideas, Bounce, Black Box Thinking, The Greatest, and his celebrated children’s books, You Are Awesome and The You Are Awesome Journal.

His specialty: working with leading organisations to build a mindset of continuous improvement by destigmatizing failure.

Matthew is also an award-winning journalist for The Times and a regular contributor to television and radio.

In his previous career, Matthew was the England table tennis champion for almost a decade.

Syed resides in London with his wife Kathy, and their two children: a son and daughter.

2-5-10-20: distillation exercise via constraint.

This book in 2 words: Embrace failure.

This book in 5 words: Embrace failure to unlock learnings.

This book in 10 words: Embrace failure to unlock learnings; your ego is hampering growth.

This book in 20 words: Embrace failure to unlock learnings; your ego is hampering growth. Seek
marginal gains, a growth mindset, and evidence via experimentation.

Part 1: The Logic of Failure

“We will find that in all these instances the explanation for success hinges, in powerful and often
counterintuitive ways, on how we react to failure.”

Syed adeptly highlights the aviation industry, which has both a mindset and a system for learning from mistakes. For every one million flights on Western-built jets, there were only 0.41 accidents—a rate of one accident per 2.4 million flights.

Contrast aviation with healthcare and you get a starkly different mindset toward failure. According to the
American Institute of Medicine report ‘To Err is Human’ published in 1999, between 44,000 and 98,000
Americans die each year as a result of preventable medical errors. This is the equivalent of two jumbo
jets falling out of the sky every 24 hours.

The reasons healthcare is plagued with errors are not surprising given that:
1. The human body is complex, i.e. 12,420 diseases and protocols according to the WHO
2. Providers are resource-strapped
3. Providers need to make quick decisions with little time to consider all possibilities

“Society, as a whole, has a deeply contradictory attitude to failure.”


For a variety of reasons, the human brain is a fantastic regret-shunning machine.

“Experiments have demonstrated that we all have a sophisticated ability to delete failures from
memory, like editors cutting gaffes from a film reel. Far from learning from mistakes, we edit them
out of the official autobiographies we all keep in our own heads.”

Let’s talk about bloodletting

You know those swirly poles outside barber shops? They used to be just red and white, to let passersby know that bloodletting services were available. The red represented blood oozing out.

In the 1900s they added blue to be patriotic.

Bloodletting was thought to be a powerful cure for disease. But now we know the procedure was incredibly harmful.

Why were barbers and doctors effectively killing patients for the better part
of 1,700 years? “Not because they lacked intelligence or compassion, but
because they did not recognize the flaws in their own procedures.”

Bloodletting is an example of a closed-loop system. There was no feedback loop to objectively learn from mistakes, and no control group.

If the patient survived, confirmation bias set in: “It must have been the
bloodletting.” If the patient died, they were deemed too sick for
bloodletting to have helped.

Historically, healthcare institutions have not routinely collected data on how accidents happen, and so
cannot detect meaningful patterns, let alone learn from them.

United Airlines Flight 173

In the event of an accident, independent aviation investigators are given full rein to explore the wreckage and all evidence. Mistakes are not stigmatized, but considered to be learning opportunities.

Interestingly, “the evidence compiled by the accident investigation branch is inadmissible in court proceedings.” This legal safe harbor preempts people from concealing facts. Incentives are aligned for maximum disclosure.
There are mistakes where we:
1. Know the right answer
2. Don’t know the right answer

Psychologists often make a distinction between these two categories of error. However, in both
scenarios, failure is indispensable to the process of discovery.

Abraham Wald was a Hungarian mathematician who helped the U.S. Air Force analyze data on attacked bomber aircraft in WWII.

The question was where to reinforce bombers based on bullet-hole patterns in planes that had returned.

Wald’s insight: what about the planes that didn’t make it home, i.e. the data they didn’t have?

Takeaway: the data you already have can be misleading. Ask yourself: what data is missing? This approach, inspired by Wald’s insight, led the US Air Force to reinforce bombers in the areas that showed no hits on the bombers that returned (cockpit, tail). Once the cockpit and tail had been reinforced with more armor, the survival rate of bombers increased dramatically from the dire 52%.

Source: A Method of Estimating Plane Vulnerability Based on Damage of Survivors (declassified in 1980)
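To make the survivorship logic concrete, here is a toy sketch in Python (the hit counts are invented for illustration; Wald’s actual analysis was far more sophisticated): given bullet-hole tallies from returning planes only, the naive move is to armor where the holes cluster, while Wald’s move is to armor where they don’t.

```python
# Survivorship bias in miniature: bullet-hole counts recorded on RETURNING bombers only.
# (These numbers are invented for illustration.)
hits_on_survivors = {"fuselage": 380, "wings": 290, "engines": 60, "cockpit": 40, "tail": 45}

# Naive reading: armor the fuselage, where the holes cluster.
# Wald's reading: planes hit in the low-count sections never came home,
# so those are exactly the sections to reinforce.
reinforce = sorted(hits_on_survivors, key=hits_on_survivors.get)[:3]
print("Reinforce:", reinforce)  # ['cockpit', 'tail', 'engines']
```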

The Paradox of Success

“This is the paradox of success: it is built upon failure.”

Examples from aviation:
1. Checklists emerged from a series of plane crashes in the 1930s.
2. Ergonomic cockpit design was an attempt to learn from disastrous B-17 accidents.
3. Crew Resource Management came to fruition after the United Airlines 173 crash.

All of these successes—now standards of industry—were born out of calamity.

Compare the falsifiable, evidence-based world with a pseudoscience like astrology:

“This gives astrology a seductive strength: it is never ‘wrong’. But the price it pays for immunity from failure is high indeed: it cannot learn. Astrology has not changed in any meaningful way for over two centuries.”

“Most closed loops exist because people deny failure or try to spin it. With pseudoscience the problem is more structural. They have been designed, wittingly or otherwise, to make failure impossible. That is why, to their adherents, they are so mesmerizing. They are compatible with everything that happens. But that also means they cannot learn from anything.”

On feedback: “Feedback, when delayed, is considerably less effective in improving intuitive judgement.”

→ This insight compels me to share feedback with my management team and ICs as
quickly as possible (or appropriate). There is certainly a time and a place for feedback, but
the value depreciates with time.

Meet Dr. Peter Pronovost, a handsome doctor who has helped healthcare become much safer.

Peter J. Pronovost, M.D., Ph.D., F.C.C.M., is a world-renowned patient safety champion, innovator, critical care physician, prolific researcher (over 800 peer-reviewed publications), entrepreneur (founding a health care start-up that was acquired), and a global thought leader informing US and global health policy.

Impact: after instituting a five-point checklist and empowering nurses to speak up if surgeons didn’t comply, the 10-day line infection rate dropped from 11% to 0%. This one reform saved 1,500 lives and $100 million over the course of 18 months in the state of Michigan alone.

In 2008, Time magazine named Pronovost one of the 100 most influential people in the world due to the suffering he helped to avoid. Central line infections kill 30,000 to 60,000 patients per year.

“Healthcare by its nature is highly error-provoking — yet health carers stigmatise fallibility and have had
little or no training in error management or error detection.” —Professor James Reason

Part 2: Cognitive Dissonance

Wrongful convictions

“When we are confronted with evidence that challenges our deeply held beliefs, we are more likely to reframe the evidence than we are to alter our beliefs.”

Cognitive dissonance is a term Leon Festinger coined to describe the inner tension we feel when,
among other things, our beliefs are challenged by evidence.

In these ego-threatening scenarios, we have two choices:
1. Acceptance: question our own decisions and beliefs; get curious.
2. Denial: reframe the evidence; filter it, spin it, ignore it.

“It is only when we have staked our ego that our mistakes of judgement
become threatening. That is when we build defensive walls and deploy
cognitive filters.”

Disposition effect: a manifestation of cognitive dissonance in investing where the investor, after making
an investment that goes down, is more likely to keep the investment, regardless of the future prospects.
Why? Because we hate to crystallise a loss. Selling makes it real, and our egos don’t like that.

The scientific method acts as a corrective to our tendency to spend our time confirming what we think
we already know, rather than seeking to discover what we don’t know.

References I didn’t know but likely should:
● Leon Festinger: “arguably the most influential sociologist in the last half-century.”

Meet Trofim Lysenko, a Soviet biologist who rejected Mendelian genetics. Lysenko’s story offers a cautionary tale: when policies are adopted based on ideology vs. evidence, very bad things can happen. Lysenko believed in Lamarckism—the notion that organisms can pass on to their offspring certain physical characteristics they acquire during their lifetime. These beliefs have since been falsified or abandoned by the scientific community.

But back in 1940, Lysenko silenced his critics through his political
power and Stalin’s backing. By “silenced” I mean he had them
persecuted, imprisoned and executed. The damage to the Soviet
scientific community and education system is hard to overstate.

The absence of scientific peer criticism allowed Lysenko’s flawed principles to become USSR-sanctioned policies. This resulted in overseeding crops by over 400% (based on the flawed belief that same-species plants wouldn’t compete; spoiler: they do).

Communist China also adopted Lysenko’s principles whole-hog. Mao Zedong soon faced a grim reality: crop yields were decimated. The resulting famine killed 20-43 million people, one of the worst in history.

Lesson: bad ideas should be allowed to fail.

“Cognitive dissonance doesn’t leave a paper trail. There are no documents that can be pointed to when we reframe inconvenient truths.”

Neil deGrasse Tyson was once voted sexiest astrophysicist in the world. He is a prolific scientist, writer and media personality.

Tyson is highlighted in this book for remembering—and proliferating—something that didn’t happen. He claims George W. Bush said “Our God is the God who named the stars” in a speech days after the 9/11 attacks. He condemned Bush for these divisive words amidst the tragedy.

Except Bush didn’t say these words. It took three years of confusion and bickering for Tyson to finally issue a retraction.

The goal of this story is not to slam Tyson, but rather to illustrate
that even brilliant minds are susceptible to faulty memories.

Case in point: 75% of wrongful convictions involve a mistaken eyewitness identification.

When it comes to reforming criminal justice, The Innocence Project has campaigned for specific reforms
to reduce the risk of bias and error:
1. Line-ups should be administered by an officer who doesn’t know the identity of the suspect.
2. Line-ups should be sequential (one-by-one) instead of simultaneous.
3. To reduce the risk of false confessions, all interrogations must be videotaped.
4. Create Criminal Justice Reform Commissions, i.e. independent bodies mandated to investigate
wrongful convictions and recommend reforms (~11 states have adopted these).

Confronting Complexity

We dive head first into the wonderful world of Richard Dawkins, focusing first on his fantastic book The
Blind Watchmaker.

To illustrate a point about evolution, Dawkins asks us to imagine a monkey trying to type a single line of Hamlet: ‘Methinks it is like a weasel.’

The random keystrokes of our confused monkey have a 1 in 19,683 chance of getting just the first three letters right! Even on an evolutionary time scale, randomness takes a long time. But evolution isn’t random: it fails, learns, adapts.

So Dawkins built a computer program that could compare Hamlet attempts to the target phrase, keep the closest letters, and discard the others. When the “monkey” program was allowed to cumulatively learn, it got the phrase right after 43 attempts. Remarkable!
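The cumulative-selection idea is simple enough to reproduce. Below is a minimal Python sketch in the spirit of Dawkins’ “weasel” program (the mutation rate and offspring count are my own illustrative choices, not Dawkins’ exact parameters): keep the fittest mutant each generation, and the target phrase emerges in dozens of generations rather than the effectively infinite time pure randomness would need.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 27 characters: 27**3 = 19,683 ways
                                          # to type just the first three letters

def fitness(attempt: str) -> int:
    """Count characters that already match the target."""
    return sum(a == t for a, t in zip(attempt, TARGET))

def mutate(parent: str, rate: float = 0.05) -> str:
    """Copy the parent, flipping each character to a random one with probability `rate`."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

# Start from pure noise, then select the best of many mutated copies each generation.
current = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while current != TARGET:
    generation += 1
    # Breed 100 offspring, keep the fittest (parent included, so fitness never regresses).
    current = max([current] + [mutate(current) for _ in range(100)], key=fitness)

print(f"Reached the target in {generation} generations")
```

Selection, not randomness, does the work: each generation discards near-misses and keeps the closest attempt, which is exactly the fail-learn-adapt loop the book describes.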

“The process is so powerful that, in the natural world, it confers what has been called ‘the illusion of
design’: animals that look as if they were designed by a vast intelligence when they were, in fact, created
by a blind process.”

On creative destruction: approximately 10% of American companies go bankrupt every year. Economist Joseph Schumpeter referred to this as creative destruction.

On the beauty of trial and error: our author Matthew Syed highlights the invention of the first steam engine for pumping water. The inventor, Thomas Newcomen, was barely literate and had no scientific background. But he was a tinkerer. James Watt would go on to refine Newcomen’s invention. These tinkerers knew little of the scientific underpinnings (the engine broke the laws of physics as then understood), but they trial-and-errored their way into massive discoveries. Twenty-seven-year-old Sadi Carnot was inspired to figure out why these engines worked the way they did, and his work laid the foundation for thermodynamics (and later helped Rudolf Diesel make his famous engine).
“We are hardwired to think that the world is simpler than it really is. And if the world is simple, why bother
to conduct tests? If we already have all the answers, why would we feel inclined to challenge them?”

Narrative fallacy: refers to human propensity to create stories to explain what we see in hindsight, i.e.
after the event happens. Nassim Nicholas Taleb and Daniel Kahneman have done a lot of work on this.

“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity and intentions than to luck; and focus on a few striking events that happened rather than the countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative.” —Danny Kahneman

Toby Ord is an Oxford philosopher and an expert on philanthropy. He contends that many non-profits don’t do sufficient program evaluation [i.e. measuring the impact of their efforts] because they’re trying to keep the overhead ratio low (a popular metric with donors). To address this problem, Ord started the Effective Altruism movement and GivingWhatWeCan.org to help donors identify the charities with the best track records of effectiveness. You can also see how your income stacks up globally and what a 10% donation would give back to the world.

“The majority of our assumptions have never been subject to robust failure tests. Unless we do
something about it they never will be.”

Scared Straight was a youth development program designed to
expose kids to incarcerated prisoners in hopes that the kids decide to
avoid a life of crime. It was heralded as a monumental success and
replicated around the country. Except there was one big problem: they
never accurately tested the program.

The program’s organizers simply reported the glowingly high percentage of Scared Straight alums who led a non-criminal life (~90%). But without a randomized control trial, there was no control group to compare results against.

Subsequent testing uncovered that Scared Straight kids actually had increased criminal activity compared to a cohort of similar age (some tests showed as much as a 25-28% higher crime rate).

This flawed program offers valuable lessons in intellectual honesty, narrative fallacy, and the
ruthless objectivity of randomized control trials.
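To see why the ~90% headline figure proved nothing, it helps to run the comparison the organizers skipped. A toy simulation in Python (the offence rates are invented for illustration and not taken from the actual studies): a program that does mild harm can still post an impressive-looking “success rate” in isolation; only a control group reveals the direction of the effect.

```python
import random

random.seed(42)  # deterministic for the illustration

def share_clean(n: int, offence_rate: float) -> float:
    """Return the share of n kids who commit no offence, given a per-kid offence rate."""
    return sum(random.random() > offence_rate for _ in range(n)) / n

# Hypothetical rates, chosen only to illustrate the logic:
# the treated group offends slightly MORE, yet still posts a shiny headline number.
treated_clean = share_clean(1000, offence_rate=0.10)  # after the program
control_clean = share_clean(1000, offence_rate=0.08)  # no program

print(f"Scared Straight alums with no offence: {treated_clean:.0%}")  # ~90% -- sounds great
print(f"Control group with no offence:         {control_clean:.0%}")  # ~92% -- the program hurt
```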

Marginal Gains

My favorite chapter. Marginal gains, as an approach, is about having the intellectual honesty to see
where you are going wrong, and delivering improvements as a result.

This chapter shares strategies and techniques for small, incremental improvements that compound over
time. We start off by learning about athletic performance in the world of competitive cycling.

Meet Sir David Brailsford, a British cycling coach who implemented a marginal gains philosophy and went on to win 18 Olympic gold medals and the first British Tour de France victory in the race’s 109-year history (Bradley Wiggins in 2012, followed by Chris Froome and Team Sky in 2013).

It’s worth noting that Chris Froome would go on to win the Tour de France in 2013, 2015, 2016, and 2017—a remarkable feat.

“The whole principle came from the idea that if you broke
down everything you could think of that goes into riding a bike,
and then improved it by 1%, you will get a significant increase
when you put them all together.”
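The arithmetic behind that quote is worth spelling out: if performance is the product of many independent components, a 1% gain in each compounds multiplicatively. A quick back-of-the-envelope in Python (the component counts are illustrative):

```python
# Compounding marginal gains: a 1% improvement across n independent components
# multiplies overall performance by 1.01 ** n.
for n in (10, 50, 100):
    print(f"{n} components improved by 1% -> {1.01 ** n:.2f}x overall")
# 10  -> 1.10x
# 50  -> 1.64x
# 100 -> 2.70x
```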

Brailsford's approach involved the constant measuring and monitoring of key statistics, such as cyclists' power output, and training interventions targeting specific weaknesses.

Marginal gains applied to international development

Esther Duflo is a petite French economist. Esther advocates for applying scientific, evidence-based rigor to international development programs, e.g. healthcare and education.

“It’s easy to sit back and come up with grand theories about how to change the world. But often our intuitions are wrong. The world is too complex to figure everything out from your armchair.

The only way to be sure is to go out and test your ideas and programmes, and to realise that you will often be wrong.

But that is not a bad thing. It leads to progress.” —Esther Duflo

Mercedes-AMG Petronas F1 Team is the winningest team in the history of F1 racing. The team embraces “Black Box Thinking” in its endeavor to continuously capture marginal gains.

Mercedes F1 insight: iterate the data capture before you iterate the process.

“We use the same process for everything, not just pit stops. First of all, you need a decent understanding of the engineering problem. So, with the pit stops we came up with a strategy based on our blue-sky ideas. But this strategy was always going to be less than optimal, because the problem is complex. So we created sensors so we could measure what was happening and test our assumptions.

But the crucial thing is what happened next. Once you have gone through a practice cycle and the initial strategy, you immediately realize that there are miscellaneous items that you are not measuring. Just doing a pit stop practice run opens your eyes to data points that are relevant to the task, but that were absent from the initial blueprint. So the second stage of the cycle is about improving your measurement statistics, even before you start to improve the pit stop process.”
—James Vowles, Chief Strategist

“The basic proposition of this book is that we all have an allergic attitude to failure. We try to avoid it,
cover it up and airbrush it from our lives.”

We’ll try anything—cognitive dissonance, euphemisms, avoidance—to stop the pain we experience from
failure.

But top performers like Brailsford, Duflo and Mercedes F1 see failure differently: “Every error, every
flaw, every failure, however small, is a marginal gain in disguise. This information is regarded not
as a threat but as an opportunity.”

On a culture of experimentation:

“As of 2010, Google was carrying out 12,000 Randomized Control Trials every year [230 per week!]. This
is an astonishing amount of experimentation and it means that Google clocks up thousands of little
failures. Each RCT may seem like nit-picking, but the cumulative effect starts to look very different.”

According to Dan Cobley, Google UK’s managing director, a simple color-switch generated $200m in
additional annual revenue.
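At its core, each such RCT is a two-sample comparison. Here is a minimal sketch of how a color test might be scored (the click counts are made up, and the two-proportion z-test is a standard textbook approach, not necessarily Google’s internal method):

```python
from math import sqrt

# Hypothetical results from an A/B color test: (clicks, impressions) per variant.
a_clicks, a_n = 5_120, 100_000   # current color
b_clicks, b_n = 5_390, 100_000   # candidate color

p_a, p_b = a_clicks / a_n, b_clicks / b_n
p_pool = (a_clicks + b_clicks) / (a_n + b_n)

# Two-proportion z-test: is the observed lift larger than chance would explain?
se = sqrt(p_pool * (1 - p_pool) * (1 / a_n + 1 / b_n))
z = (p_b - p_a) / se

print(f"Lift: {p_b - p_a:+.2%}, z = {z:.2f}")  # |z| > 1.96 ~ significant at the 5% level
```

Individually each such test is nit-picking, as the quote says; the point is that thousands of small, honestly-scored experiments compound.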

On creativity
“People think of creativity as a mystical process. The idea is that creative insights emerge from the ether, through pure contemplation. The model conceives of innovation as something that happens to people, normally geniuses.

But this could not be more wrong. Creativity is something that has to be worked at, and it has specific characteristics. Unless we understand how it happens, we will not improve our creativity, as a society or as a world.”

—Sir James Dyson, founder of Dyson Ltd.

Dyson had been struggling with his vacuum cleaner’s suction. It was a lingering problem until one day: “I
just snapped.” There was enough frustration and anger to sustain his efforts for a while. During this
“hatred phase” he dissected and tested vacuums and realized the culprit was fine dust blocking the filter.

Insight at a timber merchant:

“As I stood there waiting [for his wood to be cut], I noticed this ducting going off the machines. It travelled along to this thing on the roof, thirty or forty feet tall. It was a cyclone [a cone-shaped device that changes the dynamics of airflow, separating the dust from the air via centrifugal force]. It was made of galvanized steel.” —Sir James Dyson

This insight—from another industry—would become a component of Dyson’s 58 products. As of July 2021, Dyson had amassed a net worth of $24.5 billion.

But the eureka moment was just the beginning. It took a jaw-dropping 5,127 prototypes before Dyson felt the cyclone was ready for the vacuum cleaner, including the addition of a dual cyclone to capture larger hair and dust particles. One of the first things Dyson did was to read two books on cyclone mathematics; he even visited one of the authors (academic R. G. Dorman).
What problem are you so obsessed with that you’d
visit the author of a book on the subject?

The hard work came after the “aha” moment, not before. In fact, the first cyclone vacuum cleaner patent
had already been lodged in 1928, 63 years prior to Dyson starting his company.

In a similar vein, Johannes Gutenberg invented mass printing by applying the pressing of wine to the pressing of pages.

“The original idea is only 2 percent of the journey.” —Sir James Dyson

On Pixar and creative failure

“All told it takes around 12,000 storyboard drawings to make one 90-minute feature, and because of the iterative process, story teams often create more than 125,000 storyboards by the time the film is actually delivered.”

Insight: At Pixar, this means that 90% of the storyboards (113,000 in total) are discarded!

The creative brilliance of Pixar that we’ve come to enjoy is quite literally the “top 10%” of their creativity.
“Early on, all of our movies suck. That’s a blunt assessment, I know, but I choose that phrasing because saying it in a softer way fails to convey how bad the first versions of our films really are. I’m not trying to be modest or self-effacing by saying this. Pixar films are not good at first, and our job is to make them go . . . from suck to non-suck.

“We are true believers in the power of bracing, candid feedback and the iterative process—reworking, reworking and reworking again, until a flawed story finds its throughline, or a hollow character finds its soul.”

—Ed Catmull, Pixar cofounder

The Blame Game

“We have to engage with the complexity of the world if we are to learn from it; we have to resist the hardwired tendency to blame instantly, and look deeper into the factors surrounding error if we are going to figure out what really happened and thus create a culture based upon openness and honesty rather than defensiveness and back-covering.” (p. 235)

Most failures are not blameworthy, but rather the result of complexity. According to one report by Harvard Business School, the executives interviewed shared that only 2-5% of failures in their organizations were truly blameworthy. But human nature got the better of them: they admitted to treating 70-90% of mistakes as blameworthy.

“In the world of business, politics, aviation and healthcare, people often make mistakes for subtle, situational reasons. The problem is often not a lack of focus, it is a consequence of complexity. Increasing punishment, in this context, doesn’t reduce mistakes, it reduces openness. It drives the mistakes underground.” (p. 243)

In other words, blame undermines the information vital for meaningful adaptation. It obscures the
complexity of our world, deluding us into thinking we understand our environment when we should be
learning from it.

Creating a growth culture: Author Matthew Syed brings us home with some great stories about David Beckham, Michael Jordan, and the importance of “mindset”.
This final section draws on the work of Jason Moser and Carol Dweck (DBT readers will recognize the
latter from her excellent book, Mindset, which we’ve summarized). To recap, there are two types of
mindset:

● Fixed: believes competence is static; failure is personalized and tarnishes identity
● Growth: believes competence can be cultivated through effort; failure is embraced

Moser even detected these differences in brain activity: when confronted with a mistake, those in the “growth mindset” group exhibited a Pe brain signal three times as great as the “fixed mindset” group. For the growth group, mistakes were interesting and drew full attention; for the fixed group, it was almost as if the mistakes were being ignored.

“Failure is simply the opportunity to begin again, this time more intelligently.” —Henry Ford

Interesting perspective: “Self-esteem, in short, is a vastly overvalued psychological trait. It can cause us to jeopardize learning if we think it might risk us looking anything less than perfect. What we really need is resilience: the capacity to face up to failure, and to learn from it. Ultimately, this is what growth is all about.”

In a closing coda, Syed credits the Greeks (and later Bacon) for eschewing dogma in favor of rational criticism.

In the words of Xenophanes:

The gods did not reveal, from the beginning,
All things to us, but in the course of time,
Through seeking we may learn and know things better.

How to put some of this into action, according to the author:

● Transform our notion of failure: from embarrassing → educative
● Teach our kids that avoiding failure leads to stagnation
● Praise their effort and resilience (especially in cases of failure/mistakes)
○ Don’t only praise success
● Watch Sidney Dekker’s video on how aviation has changed its approach to failure over the years
● Design business pilots to test assumptions, not confirm them
● Is there a part of your life you could be more evidence-based? e.g. personal healthcare,
business, habits, with customers?
● How could you create a Randomized Control Trial in your work?
● Before your next big initiative, could doing a pre-mortem help bring risks to the surface?
