Summary: Black Box Thinking by Matthew Syed

Kindle | Hardcover | Audiobook

What Is Black Box Thinking?


Black Box Thinking is the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It’s about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.

In aviation, the black box records every flight so that, in the event of a disaster, investigators can thoroughly examine what went wrong. The industry then implements changes and takes precautionary action against future accidents. This is why air accidents have dropped dramatically over recent decades.

Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone, somewhere, died. These are lessons bought with blood. We have to preserve that institutional knowledge and pass it on to future generations. Forgetting these lessons once we have learned them would be a moral failure.

 

Aviation Vs. Healthcare


Unlike aviation, the healthcare industry keeps repeating the same mistakes, often with tragic consequences. Doctors tend to spawn cover-ups with generic explanations like “Something happened…”. They have unconsciously enforced a culture in which mistakes are seen as weaknesses that cannot be tolerated.

The reason is not usually laziness or unwillingness. More often, it is that the necessary knowledge is not being translated into a simple, usable, systematic form.

If all aviation did was issue page-long bulletins for every incident, pilots would be in the same position as doctors, who would need to study almost 7,000 medical journal articles per year just to keep up. That is why aviation practitioners distill the information into its practical essence.

You might be thinking: when doctors make a mistake, it results in someone else’s death, whereas when pilots make a mistake, it results in their own death, so pilots are better motivated than doctors to reduce errors. But this analysis misses the crucial point. Remember that pilots died in large numbers in the early days of aviation, not because they lacked the incentive to live, but because the system had so many flaws. Failure is inevitable in a complex world. This is precisely why learning from mistakes is so crucial.

Doctors aren’t supposed to make mistakes, let alone admit to them. This leads the system to ignore and deny rather than investigate and learn. But the system alone isn’t to blame. Society as a whole has a deeply contradictory attitude to failure: we are quick to blame others for their mistakes, yet keen to conceal our own. The result is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.

 

A Progressive Attitude to Failure


Mistakes stimulate growth, BUT only when we see them and use the feedback to improve.

A golfer gradually improves their game through trial-and-error practice. Now imagine they practiced in the dark. They could practice for years, even a lifetime, and they would never know where the ball landed. A closed loop is one in which failure doesn’t lead to progress because information about errors is misinterpreted or ignored altogether. An open loop leads to progress because feedback is shared generously and rigorously acted upon.

Traditionally, we punish errors. Punitive action leads to blame and to covering up valuable information that could be used to learn and to generate solutions that would prevent further mistakes. As a result, we end up practicing golf in the dark.

People with a fixed mindset ignore mistakes because they feel threatened: mistakes are a sign that they are inferior and always will be. People with a growth mindset, on the other hand, are interested in their mistakes because they picture errors differently: they believe they can develop their abilities through hard work.

 

 

Cognitive Dissonance


Ironically, the more famous the expert, the less accurate their predictions tended to be. Why? Cognitive dissonance is the answer. It is those who are most strongly associated with their predictions, whose egos are bound up with their expertise, who are most likely to reframe their mistakes and least likely to learn from them. The idea is that the learning advantage of adapting to a mistake is outweighed by the reputational disadvantage of admitting to it.

Again, the problem is not always the external incentive structure; it is the internal one. It is our sheer inability to admit our mistakes even when we are incentivized to do so.

After all, you would expect that the higher you go up in a company, the less you would see the effects of cognitive dissonance. Aren’t the people who get to the top supposed to be rational, forensic and clear-sighted? Aren’t those supposed to be their defining characteristics? In fact, the opposite is the case.

Why Do Doctors Rarely Admit Mistakes?

Cognitive dissonance has something to do with it. Doctors have spent years reaching high standards of performance. Their self-esteem is bound up with their clinical competence. They came into medicine to reduce suffering, not to increase it. Yet now they are confronted with having killed an otherwise healthy patient. Just think how desperate they would be to reframe the fatality.

Admitting to error becomes so threatening that in some cases, surgeons would rather risk killing a patient than admit they might be wrong. As a result, doctors end up hiding their mistakes from patients, other doctors and even from themselves.

Doctors Trusted in the Power of Bloodletting

When a patient died, doctors believed he had been so ill that not even bloodletting could save him. When a patient lived, that confirmed that bloodletting had saved him. Think how many success stories circulated in the medieval world. What patients and doctors never saw was what would have happened if the procedure had not been performed.

Sure, we can speculate about what would have happened, and we can make decent guesses, but we don’t really know. Now suppose patients are randomly divided into two groups. Provided the sample size is big enough, the two groups are likely to be similar. The only difference between them is that one gets the treatment and the other doesn’t. Those who don’t receive the treatment are called the control group.

This is known as a randomized controlled trial (RCT). Many of the patients who received bloodletting recovered, so it looks successful. But many more in the control group recovered. The reason is simple: the body has its own powers of recuperation. By comparing the two groups, it becomes possible to see that, on average, bloodletting kills rather than cures.
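To make the logic concrete, here is a minimal sketch of an RCT comparison in Python. The recovery probabilities are illustrative assumptions, not historical data; the point is only that random assignment makes the two groups comparable, so the difference in outcomes can be attributed to the treatment.

```python
import random

# Minimal sketch of a randomized controlled trial (RCT).
# The recovery probabilities are illustrative assumptions, not real data.
random.seed(42)

def run_trial(n_patients=10_000, p_recover_control=0.60, p_recover_treated=0.50):
    """Randomly assign patients to treatment or control, then compare outcomes."""
    treated_total = control_total = 0
    treated_recovered = control_recovered = 0

    for _ in range(n_patients):
        if random.random() < 0.5:   # random assignment makes the groups comparable
            treated_total += 1
            treated_recovered += random.random() < p_recover_treated
        else:
            control_total += 1
            control_recovered += random.random() < p_recover_control

    print(f"Treated (bloodletting): {treated_recovered / treated_total:.1%} recovered")
    print(f"Control (no treatment): {control_recovered / control_total:.1%} recovered")

run_trial()
```

Because assignment is random, hidden variables such as age or severity of illness are spread roughly evenly across both groups, which is exactly what looking only at success stories cannot achieve.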

In the same way, take the example of a website redesign. The problem is knowing whether the change in design increased sales or whether something else did. If you randomly direct users to either the new or the old design, you can then measure whether they buy more with one or the other. This filters out all the other variables, such as interest rates, the weather and the competition.
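As a rough illustration, here is a sketch of that A/B test in Python. The traffic volume and conversion rates are made-up assumptions; the two-proportion z-test at the end is one common way to check whether the observed difference is larger than chance alone would explain.

```python
import math
import random

# Sketch of an A/B test for a website redesign.
# Traffic volume and conversion rates are made-up assumptions for illustration.
random.seed(7)

def ab_test(n_visitors=20_000, p_buy_old=0.040, p_buy_new=0.046):
    old_n = new_n = old_buys = new_buys = 0

    for _ in range(n_visitors):
        if random.random() < 0.5:   # randomly direct each visitor to a design
            new_n += 1
            new_buys += random.random() < p_buy_new
        else:
            old_n += 1
            old_buys += random.random() < p_buy_old

    rate_old, rate_new = old_buys / old_n, new_buys / new_n

    # Two-proportion z-test: is the difference bigger than chance would explain?
    p_pooled = (old_buys + new_buys) / (old_n + new_n)
    se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / old_n + 1 / new_n))
    z = (rate_new - rate_old) / se
    print(f"Old design: {rate_old:.2%}   New design: {rate_new:.2%}   z = {z:.2f}")

ab_test()
```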

 

Ready, Fire and Aim


You construct a perfect rifle, you model how the bullet will be affected by wind and gravity, and you do your best to get the strategy just right. Then you calibrate the rifle, pull the trigger and watch as the bullet sails towards the target. But this approach is flawed for two reasons:

  1. The real world contains far greater complexity than just wind and gravity. There are endless variables and interdependencies.
  2. By the time you pull the trigger, the target will have moved. So: Ready, FIRE and then AIM (see the sketch after this list).
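Here is a toy sketch of that “ready, fire, aim” loop in Python, under assumed numbers: instead of perfecting a plan up front, you fire, observe how far off you landed, and adjust your aim from the feedback while the target keeps drifting.

```python
import random

# Toy sketch of "ready, fire, aim": act first, then adjust from feedback.
# The target drifts each round, which is why a single up-front plan keeps
# missing. All numbers here are illustrative assumptions.
random.seed(1)

target = 100.0   # where we actually need to land
aim = 60.0       # our initial best guess

for shot in range(1, 9):
    target += random.uniform(-3, 3)   # the world keeps shifting under us
    miss = target - aim               # feedback: how far off did we land?
    print(f"shot {shot}: aimed {aim:6.1f}, target {target:6.1f}, missed by {miss:+6.1f}")
    aim += 0.7 * miss                 # adjust the aim using the observed error
```
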
Success Is Just the Tip of the Iceberg

Beneath the surface of success, outside our view and often outside our awareness, lies the mountain of necessary failures. So fail early, fail fast and succeed!

 

Kindle | Hardcover | Audiobook