Black Box Thinking - Matthew Syed
“When we are confronted with evidence that challenges our deeply held beliefs,
we are more likely to reframe the evidence than we are to alter our beliefs.”
Embracing failure.
Don’t get mad. Get curious. For it is from the roots of failure
that all learning grows. Failure + learning = adaptation.
Matthew Syed Author bio: Matthew Syed is an author and speaker in the field of
high performance. He has written six bestselling books on the
subject of mindset and high performance: Rebel Ideas, Bounce,
Black Box Thinking, The Greatest, and his celebrated children’s
books, You Are Awesome and The You Are Awesome Journal.
Syed resides in London with his wife Kathy and their two children, a son and a daughter.
2-5-10-20: distillation exercise via constraint.
This book in 10 words: Embrace failure to unlock learnings; your ego is hampering growth.
This book in 20 words: Embrace failure to unlock learnings; your ego is hampering growth. Seek
marginal gains, a growth mindset, and evidence via experimentation.
“We will find that in all these instances the explanation for success hinges, in powerful and often
counterintuitive ways, on how we react to failure.”
Syed adeptly highlights the aviation industry, which has a mindset and system to learn from mistakes.
Contrast aviation with healthcare and you get a starkly different mindset toward failure. According to the
Institute of Medicine report ‘To Err Is Human’, published in 1999, between 44,000 and 98,000
Americans die each year as a result of preventable medical errors. That is the equivalent of two jumbo
jets falling out of the sky every 24 hours.
It is not surprising that healthcare is plagued with errors, given that:
1. The human body is complex, i.e. 12,420 diseases and protocols according to the WHO
2. Providers are resource-strapped
3. Providers need to make quick decisions with little time to consider all possibilities
“Experiments have demonstrated that we all have a sophisticated ability to delete failures from
memory, like editors cutting gaffes from a film reel. Far from learning from mistakes, we edit them
out of the official autobiographies we all keep in our own heads.”
You know those swirly poles outside of barber shops? They used to be
just red and white to let passersby know that bloodletting services were
available. The red represented blood oozing out.
Why were barbers and doctors effectively killing patients for the better part
of 1,700 years? “Not because they lacked intelligence or compassion, but
because they did not recognize the flaws in their own procedures.”
If the patient survived, confirmation bias set in: “It must have been the
bloodletting.” If the patient died, they were deemed too sick for
bloodletting to have helped.
Historically, healthcare institutions have not routinely collected data on how accidents happen, and so
cannot detect meaningful patterns, let alone learn from them.
Psychologists often make a distinction between these two categories of error. However, in both
scenarios, failure is indispensable to the process of discovery.
Takeaway: the data you already have can be misleading. Ask yourself: what data is missing? This
approach, inspired by Abraham Wald’s insight, led the US Air Force to reinforce the bombers in the areas that
showed no damage on the bombers that returned (cockpit, tail), since planes hit there rarely made it back.
Once they had reinforced the cockpit and tail with more armor, the survival rate of bombers increased
dramatically from the dire 52%.
Source: A Method of Estimating Plane Vulnerability Based on Damage of Survivors (declassified in 1980)
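Wald’s survivorship effect can be illustrated with a toy simulation (all probabilities below are made up for illustration, not taken from Wald’s paper): planes hit in the cockpit or tail rarely return, so their damage is underrepresented in the data we actually get to see.

```python
import random

random.seed(42)

# Illustrative, made-up numbers: every plane takes one hit in a random
# section, and hits to the cockpit or tail are far more often fatal.
SECTIONS = ["fuselage", "wings", "cockpit", "tail"]
FATALITY = {"fuselage": 0.1, "wings": 0.1, "cockpit": 0.8, "tail": 0.8}

returned_hits = {s: 0 for s in SECTIONS}
for _ in range(10_000):
    hit = random.choice(SECTIONS)
    if random.random() > FATALITY[hit]:  # plane survives and returns home
        returned_hits[hit] += 1

# Among *returning* planes, cockpit/tail damage looks rare -- not because
# those sections are rarely hit, but because planes hit there rarely return.
print(returned_hits)
```

Looking only at the surviving planes would suggest armoring the fuselage and wings; the missing data (the planes that never came back) points the other way.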
On feedback: “Feedback, when delayed, is considerably less effective in improving intuitive judgement.”
→ This insight compels me to share feedback with my management team and ICs as
quickly as possible (or appropriate). There is certainly a time and a place for feedback, but
the value depreciates with time.
Meet Dr. Peter Pronovost, a handsome doctor who has helped healthcare become much safer.
His checklist reform for central line insertions saved an estimated 1,500 lives and $100 million over the
course of 18 months in the state of Michigan alone.
“Healthcare by its nature is highly error-provoking — yet health carers stigmatise fallibility and have had
little or no training in error management or error detection.” —Professor James Reason
Part 2: Cognitive Dissonance
Wrongful convictions
“When we are confronted with evidence that challenges our deeply held beliefs, we are more likely to
reframe the evidence than we are to alter our beliefs.”
Cognitive dissonance is a term Leon Festinger coined to describe the inner tension we feel when,
among other things, our beliefs are challenged by evidence.
“It is only when we have staked our ego that our mistakes of judgement
become threatening. That is when we build defensive walls and deploy
cognitive filters.”
Disposition effect: a manifestation of cognitive dissonance in investing where the investor, after making
an investment that goes down, is more likely to keep the investment, regardless of the future prospects.
Why? Because we hate to crystallise a loss. Selling makes it real, and our egos don’t like that.
The scientific method acts as a corrective to our tendency to spend our time confirming what we think
we already know, rather than seeking to discover what we don’t know.
Meet Trofim Lysenko, a Soviet biologist who rejected Mendelian genetics. Lysenko’s story offers a
cautionary tale: when policies are adopted based on ideology vs. evidence, very bad things can happen.
Lysenko believed in Lamarckism, the notion that organisms can
pass on to their offspring certain physical characteristics they acquire
during their lifetime. These beliefs have since been falsified or
abandoned by the scientific community.
But back in 1940, Lysenko silenced his critics through his political
power and Stalin’s backing. By “silenced” I mean he had them
persecuted, imprisoned and executed. The damage to the Soviet
scientific community and education system is hard to overstate.
“Cognitive dissonance doesn’t leave a paper trail. There are no documents that can be pointed to when we
reframe inconvenient truths.”
The goal of this story is not to slam Tyson, but rather to illustrate
that even brilliant minds are susceptible to faulty memories.
When it comes to reforming criminal justice, The Innocence Project has campaigned for specific reforms
to reduce the risk of bias and error:
1. Line-ups should be administered by an officer who doesn’t know the identity of the suspect.
2. Line-ups should be sequential (one-by-one) instead of simultaneous.
3. To reduce the risk of false confessions, all interrogations must be videotaped.
4. Create Criminal Justice Reform Commissions, i.e. independent bodies mandated to investigate
wrongful convictions and recommend reforms (~11 states have adopted these).
Confronting Complexity
We dive head first into the wonderful world of Richard Dawkins, focusing first on his fantastic book The
Blind Watchmaker.
“The process is so powerful that, in the natural world, it confers what has been called ‘the illusion of
design’: animals that look as if they were designed by a vast intelligence when they were, in fact, created
by a blind process.”
On creative destruction: approximately 10% of American companies go bankrupt every year. Economist
Joseph Schumpeter refers to this as creative destruction.
On the beauty of trial and error: our author Matthew Syed highlights the invention of the first steam
engine for pumping water. The inventor, Thomas Newcomen, was barely literate and
had no scientific background. But he was a tinkerer. James Watt would go on to refine Newcomen’s
invention. These tinkerers knew little of the scientific underpinnings (their machines seemed to defy the
physics of the time), but they trial-and-errored their way into massive discoveries. Twenty-seven-year-old
Sadi Carnot was inspired to figure out why these engines worked the way they did, and his work laid the
foundation for thermodynamics (and later helped Rudolf Diesel design his famous engine).
“We are hardwired to think that the world is simpler than it really is. And if the world is simple, why bother
to conduct tests? If we already have all the answers, why would we feel inclined to challenge them?”
Narrative fallacy: the human propensity to create stories to explain what we see in hindsight, i.e.
after the event has happened. Nassim Nicholas Taleb and Daniel Kahneman have done a lot of work on this.
Toby Ord is an Oxford philosopher and philanthropy expert. He contends that many non-profits don’t do
sufficient program evaluation [i.e. measuring the impact of their efforts] because they’re trying to keep the
overhead ratio low (a popular metric with donors). To address this problem, Ord started the Effective
Altruism movement and GivingWhatWeCan.org to help donors identify the charities with the best track
records of effectiveness. You can also see how your income stacks up globally and what a 10% donation
would give back to the world.
“The majority of our assumptions have never been subject to robust failure tests. Unless we do
something about it they never will be.”
Scared Straight was a youth development program designed to
expose kids to incarcerated prisoners in the hope that the kids would decide to
avoid a life of crime. It was heralded as a monumental success and
replicated around the country. Except there was one big problem: the program was
never rigorously tested.
This flawed program offers valuable lessons in intellectual honesty, narrative fallacy, and the
ruthless objectivity of randomized control trials.
Marginal Gains
My favorite chapter. Marginal gains, as an approach, is about having the intellectual honesty to see
where you are going wrong, and delivering improvements as a result.
This chapter shares strategies and techniques for small, incremental improvements that compound over
time. We start off by learning about athletic performance in the world of competitive cycling.
“The whole principle came from the idea that if you broke
down everything you could think of that goes into riding a bike,
and then improved it by 1%, you will get a significant increase
when you put them all together.”
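The arithmetic behind the 1% principle is simple compounding. As a hypothetical illustration (the count of fifty parts is made up, not Brailsford’s actual breakdown), many tiny gains multiply into a large one:

```python
# Hypothetical illustration of the 1% principle: fifty independent parts of a
# process, each improved by just 1%, multiply into a ~64% overall gain.
n_improvements = 50
gain_per_part = 1.01

total = gain_per_part ** n_improvements
print(f"{n_improvements} improvements of 1% compound to {total:.2f}x")
# → 50 improvements of 1% compound to 1.64x
```

The same arithmetic cuts both ways: fifty parts each made 1% worse (0.99 ** 50) compound to roughly 0.61x, which is why small unchecked errors are just as consequential.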
Esther Duflo is a petite French economist. Esther advocates for applying scientific, evidence-based rigor
to international development programs, e.g. healthcare, education, etc.
Mercedes-AMG Petronas F1 Team is the winningest team in the history of F1 racing. Their team
embraces “Black Box Thinking” in their endeavor to continuously capture marginal gains.
“We use the same process for everything, not just pit
stops. First of all, you need a decent understanding of
the engineering problem. So, with the pit stops we
came up with a strategy based on our blue-sky ideas.
But this strategy was always going to be less than
optimal, because the problem is complex. So we
created sensors so we could measure what was
happening and test our assumptions.”
We’ll try anything—cognitive dissonance, euphemisms, avoidance—to stop the pain we experience from
failure.
But top performers like Brailsford, Duflo and Mercedes F1 see failure differently: “Every error, every
flaw, every failure, however small, is a marginal gain in disguise. This information is regarded not
as a threat but as an opportunity.”
On a culture of experimentation:
“As of 2010, Google was carrying out 12,000 Randomized Control Trials every year [230 per week!]. This
is an astonishing amount of experimentation and it means that Google clocks up thousands of little
failures. Each RCT may seem like nit-picking, but the cumulative effect starts to look very different.”
According to Dan Cobley, Google UK’s managing director, a simple color-switch generated $200m in
additional annual revenue.
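A minimal sketch of the statistics behind one such trial, using made-up numbers rather than Google’s data: a two-proportion z-test comparing click-through rates between a control page and a variant.

```python
import math

# Hypothetical results of a single A/B test (made-up numbers):
control_users, control_clicks = 1000, 50   # 5.0% click-through
variant_users, variant_clicks = 1000, 70   # 7.0% click-through

p1 = control_clicks / control_users
p2 = variant_clicks / variant_users
pooled = (control_clicks + variant_clicks) / (control_users + variant_users)

# Standard error of the difference under the null hypothesis (no true difference)
se = math.sqrt(pooled * (1 - pooled) * (1 / control_users + 1 / variant_users))
z = (p2 - p1) / se

# |z| > 1.96 would indicate significance at the usual 5% level
print(f"z = {z:.2f}")
```

Here z comes out just under 1.96, so this hypothetical variant would not yet clear the conventional significance bar; a larger sample would be needed, which is exactly why running thousands of small trials matters.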
On creativity
“People think of creativity as a mystical
process. The idea is that creative insights
emerge from the ether, through pure
contemplation. The model conceives of
innovation as something that happens to
people, normally geniuses.”
Dyson had been struggling with his vacuum cleaner’s suction. It was a lingering problem until one day: “I
just snapped.” There was enough frustration and anger to sustain his efforts for a while. During this
“hatred phase” he dissected and tested vacuums and realized the culprit was fine dust blocking the filter.
But the eureka moment was just the beginning. It took a jaw-dropping 5,127 prototypes before
Dyson felt the cyclone was ready for the vacuum cleaner, including the addition of a dual cyclone to
capture larger hair and dust particles. One of the first things Dyson did was to read two books on
cyclone mathematics, and even visit one of the authors (the academic R. G. Dorman).
What problem are you so obsessed with that you’d
visit the author of a book on the subject?
The hard work came after the “aha” moment, not before. In fact, the first cyclone vacuum cleaner patent
had already been lodged in 1928, 63 years prior to Dyson starting his company.
In a similar vein, Johannes Gutenberg invented mass printing by applying the pressing of wine to the pressing of pages.
“The original idea is only 2 percent of the journey.” —Sir James Dyson
“We have to engage with the complexity of the world if we are to learn
from it; we have to resist the hardwired tendency to blame instantly,
and look deeper into the factors surrounding error if we are going to
figure out what really happened and thus create a culture based upon
openness and honesty rather than defensiveness and
back-covering.” (p. 235)
Most failures are not blameworthy, but rather the result of complexity. In one report by Harvard
Business School, executives interviewed shared that only 2-5% of failures in their organizations were
truly blameworthy. But human nature got the better of them: they admitted to treating 70-90% of
mistakes as blameworthy.
In other words, blame undermines the information vital for meaningful adaptation. It obscures the
complexity of our world, deluding us into thinking we understand our environment when we should be
learning from it.
Creating a growth culture: Author Matthew Syed brings us home with some great stories about David
Beckham, Michael Jordan, and the importance of “mindset”.
This final section draws on the work of Jason Moser and Carol Dweck (DBT readers will recognize the
latter from her excellent book, Mindset, which we’ve summarized). To recap, there are two types of
mindset: growth and fixed.
Moser even detected these differences in our brain activity: when confronted with a mistake, those
in the “growth mindset” group exhibited a Pe brain signal three times as large. Mistakes were interesting and
drew full attention. In the “fixed mindset” group, the signal was three times smaller, almost as if the
mistakes were being ignored.
“Failure is simply the opportunity to begin again, this time more intelligently.” —Henry Ford
Interesting perspective: “Self-esteem, in short, is a vastly overvalued psychological trait. It can cause
us to jeopardize learning if we think it might risk us looking anything less than perfect. What we really
need is resilience: the capacity to face up to failure, and to learn from it. Ultimately, this is what
growth is all about.”
In the closing coda, Syed credits the Greeks (and later Bacon) for eschewing dogma in favor of
rational criticism.