You’re A Programmed Coincidence Machine, And You Can Do Better

By:  DM Kashmer MD MBA FACS (@DavidKashmer)

 

A Few What-If Scenarios

Take a minute to answer these questions.  I’m really interested in what you think.  Nothing tricky here, just a couple of scenarios we commonly experience.  Picture each situation in your mind and see where it goes…

 

(1) Lightning strikes a dried-out log during a storm.  What happens next?

 

(2) It rains for half an hour and the ground is soaked.  Fortunately, the sun comes out while it’s raining and stays out when the rain stops.  You look up at the sky on this sunny day just after the rain, and you expect to see what?

 

OK, so what about that first situation?  A flash of lightning violently strikes a log, and you watch expecting to see what?  A fire?  How about the rainstorm on a fairly sunny day?  You look up at the sky and expect to see what?  A rainbow?

 

Guess what…usually when lightning strikes a log, there’s no fire.  And in most situations where it rains while the sun is out, there’s no rainbow.  Why do you intuit things that aren’t really going to happen?  (I do it too.)  Why is our mental simulator WAY off?

 

The Mental Simulator We Have Is Way Off

Here’s why:  we’ve evolved as programmed coincidence machines.  Richard Dawkins, among others, has written about this tendency.  (Please note:  I do NOT offer Dawkins’ argument to agree with his conclusions about the Divine…I offer some of that work only to highlight how common the idea is that we seek order in randomness.)  I didn’t make up the catchphrase “programmed coincidence machine,” but it nicely captures the idea.  It is evolutionarily adaptive, so the line of reasoning goes, to notice “Hey, that makes a fire!” or “Wow, look at that unusual thing…”.  Noticing special cases is programmed into us.

 

Well, guess what…that tendency leads us to lousy decisions about everything from investing to what makes us happy.  (Check out the research on how our mental simulator fools us with respect to happiness.)  Strange, huh?  And counter-intuitive.  I file findings like this away alongside truisms like the Dunning-Kruger effect.

 

We Don’t Notice The True Message of the System

The bottom line is that we don’t notice the full richness of the situation, with all of its variation, central tendency, and beauty in the system.  We are easily distracted by special cases that don’t embody the full message of the system.  You see this all the time!

 

For example, what happens in the field of Surgery when a case goes wrong?  Well, it garners attention.  Sometimes we react to the spectacular case where the spotlight has shined, and we miss the message and robustness of the system.  We overcorrect or, worse yet, under-recognize.  Often, in classic Process Improvement systems in Healthcare, we don’t know whether the latest attention-grabbing case is a true issue, a routine outlier, or exactly where it falls.  So we react (because we care) and disrupt the system with inappropriate corrections that actually induce MORE variation in outcomes.
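
To make that last point concrete, here’s a minimal simulation sketch (in Python).  Everything here is made up for illustration…the target, the noise level, and the “correction” rule are assumptions, not data from any real system:

```python
import random

random.seed(42)

TARGET = 0.0   # the hypothetical "right" outcome (illustrative units)
NOISE = 1.0    # the process's inherent, irreducible variation
N = 10_000     # number of simulated cases

def hands_off():
    """Leave the stable process alone."""
    return [random.gauss(TARGET, NOISE) for _ in range(N)]

def tampered():
    """'Correct' the process by the full amount of each miss."""
    results, adjustment = [], 0.0
    for _ in range(N):
        outcome = random.gauss(TARGET, NOISE) + adjustment
        results.append(outcome)
        adjustment -= outcome - TARGET  # react to the latest case
    return results

def std_dev(xs):
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

print(f"variation, hands off:  {std_dev(hands_off()):.2f}")  # ~1.00
print(f"variation, tampering:  {std_dev(tampered()):.2f}")   # ~1.41
```

In this toy model, “correcting” a stable process after every single outcome roughly doubles its variance (the standard deviation grows by a factor of about √2)…exactly the kind of well-intentioned disruption described above.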

 

Advanced Quality Tools Offer Some Protection

That’s why I work with these tools, and why I like to describe them.  Understanding Type 1 and Type 2 errors, working with data that represent the complete picture of a system’s variation, and knowing rigorously whether our performance is improving, worsening, or staying the same are key to understanding whether and how to make course corrections.
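
For the curious, here is a bare-bones sketch of one such tool: an individuals (XmR) control chart, which separates a system’s routine variation from points that genuinely deserve investigation.  The monthly complication counts below are hypothetical, and a real analysis would add run rules and far more data:

```python
def control_limits(data):
    """Center line and 3-sigma natural limits from the average moving range."""
    mean = sum(data) / len(data)
    moving_ranges = [abs(a - b) for a, b in zip(data, data[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n=2
    return mean, mean - 3 * sigma, mean + 3 * sigma

def special_causes(data):
    """Indices of points beyond the limits (candidates worth investigating)."""
    mean, lcl, ucl = control_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# Hypothetical monthly complication counts (illustrative only)
monthly = [4, 5, 3, 6, 4, 5, 4, 13, 5, 4, 3, 5]

mean, lcl, ucl = control_limits(monthly)
print(f"center {mean:.1f}, natural limits [{lcl:.1f}, {ucl:.1f}]")
print("months to investigate:", special_causes(monthly))
```

A point beyond the natural limits is a candidate special cause worth investigating; reacting to points inside the limits is exactly the tampering simulated earlier.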

 

I recommend using some rigorous tools to understand your team’s true performance, or else you may fall victim to the spectacular…yet distracting…special case.

 

Disagree?  Have a story about being led astray by intuition?  Let me know below.