Why “Follow the Science” Fails

“The greatest mistake you can make is to be continually fearing that you’ll make one” – Elbert Hubbard (an American writer, artist, and philosopher)

We need to tackle a really important issue, one that has received a lot of recent coverage [1,2,3]: Why does “Follow the Science” fail to answer so many questions today?  While this question has come to the forefront primarily due to Covid-19 and the policies established to deal with it, it has been a skeleton in the closet for centuries.

The first issue is that the slogan itself is misleading, primarily because it suffers from typical slogan myopia – it’s nice as a short, catchy phrase, but it does not convey all that it should convey.  It is incomplete.  It is another victim of Fundamental Principle 6:

There will always be Missing Information.

A more accurate slogan is:

Follow the Science that we currently have and understand.

That tail end of the slogan is unfortunately omitted for two reasons.

First, to a scientist, it is implied and therefore taken for granted; we know that.  Our naïve error is presuming everyone else understands this too.

Second, for a politician and a media person (and others whose attention span drops after three words), it makes the slogan too long, not zippy enough, not a legitimate sound bite.  Their naïve error is that they don’t know that they don’t know what they don’t know that’s missing.

So we end up with a short, zippy, incomplete slogan that is easily misinterpreted and misapplied, and ever more readily so as it becomes more widespread and more heavily used.

The second issue follows from the first: one of the frustrating consequences (and complaints, as expressed by a broad range of people) is that Policy Changes as Reality Changes.  Witness social distancing advice, masking rules, vaccine recommendations, etc., all varying by state or locale, and often at variance with governmental “guidelines.”

In other words, it is not well understood or accepted that one steers the ship as new conditions and information develop.  This does not result in clear and stable guidelines, and this does not go down well with the masses.  Why?  Well, there are a number of reasons – but it’s complex and doesn’t lend itself to a simple narrative or explanation.

Let’s begin by considering some simple health reasons for “Following the Science,” all very noble.  People want to:

  • Protect themselves;
  • Protect their families;
  • Protect their communities; and
  • Protect the vulnerable.

Unfortunately, what we see in real-life behaviors is that these are not all of equal priority; the first two take priority while the last two generally get less attention.

A third issue is that one might think that following the advice of experts is a no-brainer.  After all, there is the phenomenon known as The Einstein Effect: this is where the general population accepts the statements of people it has come to recognize as experts, on subjects beyond its own expertise.

There is also the Celebrity Effect: accepting the statements of well-known personalities with celebrity status (“performing artists”), regardless of their expertise in the area of their statements. 

These two seem contradictory, yet they have great influence over our lives (weirdly, it seems the Celebrity Effect has greater impact).

Both of these appear to be based upon a “feeling” that “expert opinion” is a “unitary omniscient” force – that experts “know” the one, single correct answer for a given situation, assuming that the answer is already known.

This assumption is based upon our innate human tendency to reduce everything to a “binary” solution:

Either it’s This and they’re right, Or it’s That and they’re wrong!

In other words, we seek a simple explanation or narrative for a complex situation that is not recognized or accepted as such.

So here is the first unintended consequence of a short, zippy slogan: “Follow the Science” implies that there is already a single, absolute answer.  After all, we’ve learned that science has already helped in the rejection of myth, conjured opinion, and outright lies.

So, it is assumed that what science knows about “the reality that counts” (i.e., our perception of reality) looks like a single, settled “box” of established answers (a simplified concept).

The Einstein Effect, following the advice of experts, thus gets sabotaged by the conflicting second issue, Policy Changing as Reality Changes, as well as by reports that “experts” themselves often disagree.

This leads us to the next three less recognized issues: overcoming the assumption of a “unitary omniscient” answer must include recognizing that there is a lack of understanding about science and how it works, a lack of understanding about incomplete and dynamically changing information, and, finally, a lack of understanding about risk that is always present.

Let me propose an alternate “simple but more realistic” description of what science is “doing” in dear old Mother Nature:

“What Science Already Knows” represents the results of studies that we know are predictable, reproducible, trustable, and useable.  When people use the slogan “Follow the Science” they are assuming that what Science will tell them is already known and found in this “box” on the left.

“What Science is looking into” (the “box” in the middle, in lighter colored font to signify still-incomplete information and a real difference from the “box” on the left) represents studies taking place to answer questions (and hypotheses) that science doesn’t yet have an answer to.  “We’ve got questions; we don’t yet have reliable answers.”  This is the realm of new developments and a changing landscape – things like specific vaccines and better masks (tangible items), and better understanding of intangible things, like social distancing.  There is a certain amount of Risk involved in drawing conclusions (and establishing needed policies) from tentative results that arise in this “box” because, per Fundamental Principle 6 again, there is most certainly some information not yet discovered here, and therefore missing.

“What Science doesn’t know to look into yet” (the “box” on the right, in very light colored font because we have no information) represents stuff we haven’t decided to look into yet, or haven’t needed to because we haven’t been confronted with it.  For example, looking into Covid-19 during 2018.  There is a whole lotta’ RISK involved in speculating about this area.  Consider this area fertile ground for conspiracy theories (and bad policies).

Speaking of bad policies (or perhaps better, inconvenient policies, or perhaps unbelievable policies, or if you are like some, any policies at all), these arise in a complicated environment where 1) something must be done to protect more than just individuals, yet 2) we do not have all the information, so 3) we must make a risky decision for policies based upon what we do know and what we think is the best description of what we do not know.

This latter situation is uncomfortable for science.  Normally, we scientists want to be 110% certain of our data and conclusions from a controlled (experimental) environment before we risk putting our name on a publication that will live and be referenced and interpreted forever.

However, in business we learn that very often there are time-dependent situations in an uncontrollable environment where you must make a decision when you have only 50% of the information you need, or else face very undesirable consequences.  You estimate the risk, make the decision, but reserve the right to change or alter the decision when more relevant information comes in (and you are always looking for it).
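This business approach, deciding on partial information and revising as new evidence arrives, can be sketched as a simple Bayesian update.  Everything below (the 50/50 prior, the evidence likelihoods, the 70% action threshold) is hypothetical and for illustration only, not anything from the essay:

```python
# Illustrative sketch: deciding under incomplete information and revising
# the decision as new evidence arrives, via a simple Bayesian update.
# All numbers here are hypothetical.

def update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
    """Posterior probability that a hypothesis is true after one observation."""
    numerator = prior * p_obs_if_true
    return numerator / (numerator + (1 - prior) * p_obs_if_false)

# Start with "only 50% of the information you need": a 50/50 prior
# that a proposed policy will help.
belief = 0.50
threshold = 0.70  # act only once we are at least 70% confident

# New, imperfect evidence arrives over time (hypothetical likelihoods of
# seeing each observation if the policy helps vs. if it doesn't).
for p_true, p_false in [(0.8, 0.4), (0.7, 0.5), (0.9, 0.3)]:
    belief = update(belief, p_true, p_false)
    decision = "act" if belief >= threshold else "hold / keep gathering data"
    print(f"belief = {belief:.2f} -> {decision}")
```

The point is not the arithmetic but the posture: the decision is provisional, and each new observation either strengthens it or sends you back to data gathering.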

Policy Changes as Reality Changes

Policy issues with Covid-19 fall into the latter category of trying to best manage in a dynamically changing environment:

    • The virus is mutating often and spreading quickly (time sensitive);
    • The public wants decisive, fixed and absolute guidelines (they want “them” to make “it” go away), and they want minimal personal impact and don’t want to hear about Trade-Offs and their possible negative effects;
    • Politicians don’t want to establish a “bad” policy based upon just 50% of data for fear of unintended consequences, including looking like they don’t know what they’re doing, real Trade-Offs flaring up, and not getting reelected;
    • Scientists don’t want to commit with less than 110% certainty, for fear of reputation damage, of getting blamed for not being “perfectly right,” and of being the cause of bad policies and unavoidable Trade-Off flare-ups.

You see the dilemmas.

Part of the reason behind these dilemmas is our seventh issue: we were raised and educated with a Fear of Failure (leading to some sort of repercussions), and we have become very good at ducking responsibility for our own failures and very good at pointing out the failures of others.  It’s a national pastime (with no worries about opening day being postponed or cancelled).

Underlying all this is our eighth issue, that ever present yet fuzzy, indefinable, and poorly understood concept called Risk (also because we weren’t educated about it).

“We need to be better at quantifying risk, and not discussing it in a binary way” [1]

Risk is the concept that remains when you discover, or admit, that life is not Either/Or, the way human nature wants it to be.  And it’s not And/And (two or more choices together) either.  It’s somewhere in the broad in-between – and you can’t quite get a handle on it.  The outcomes depend upon Probability, which is a related concept most people can’t comfortably fathom (because we weren’t educated about that either).

In analyst Sherman Kent’s study, respondents were asked what probability (a number between 0% and 100%) they would assign to each of a set of common phrases.  The results are summarized in the graph below.

You can see the effects of poor education on the concept of “Probability” in the variations in the responses graphed.  Perceptions haven’t changed much since the study.  Other than “About Even” (50% probability), perceptions are generally abysmal (note there are estimations of “90%+ probable” for “Highly Unlikely” and “Almost No Chance”, and an estimate of “only 15% probable” for “Highly Likely”).
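The kind of spread described above can be made concrete with a small sketch.  The response numbers below are hypothetical stand-ins, not the study’s actual data; they merely mimic the pattern noted above, including the outliers:

```python
# Illustrative sketch: how widely people map probability phrases to numbers.
# The response values are hypothetical, chosen to mimic the spread (and the
# outliers) described in the text.

from statistics import mean

responses = {  # phrase -> sample answers to "what % would you assign?"
    "Almost Certain":  [90, 95, 97, 85, 99],
    "Highly Likely":   [85, 90, 80, 15, 95],   # note the 15% outlier
    "About Even":      [50, 50, 48, 52, 50],   # the one phrase people agree on
    "Highly Unlikely": [5, 10, 3, 92, 8],      # note the 92% outlier
}

for phrase, answers in responses.items():
    spread = max(answers) - min(answers)
    print(f"{phrase:<15} mean={mean(answers):5.1f}%  spread={spread}%")
```

Even in this toy sample, “About Even” is tightly clustered while the other phrases show spreads of dozens of percentage points, which is exactly why policy language built on such phrases gets misread.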

You can see why the skill of good Decision Making based on an understanding of risk is tentative at best.  It’s another area where education is weak.

At least the graph is colorful.

The world we live in is a complex thing.  In spite of that, our human nature leads us instinctively to seek a simple explanation or narrative that makes us comfortable and justifies our “convenient wisdom” (a label that can easily describe the “plants and landscaping” inside our individual Strongholds of belief and comfort).

Another way of viewing this “convenient wisdom” (not to be confused with “Conventional Wisdom,” which is more widespread but probably neither conventional nor wisdom) is to consider how Homo sapiens commonly reacts to challenging or changing new information:

“If I don’t know it, it doesn’t matter because it doesn’t exist;
If I don’t understand it, it’s wrong.”

You can see this “compartmentalization” amply demonstrated in the reactions to policies and to “Follow the Science.”

BTW, if that’s your reaction, then you are pretty comfortable where you are in your Bubble (or Compartment, or Stronghold, or Fortress), and permitting change to leak into it is not going to be acceptable or comfortable.  (This is not a new phenomenon.  Two thousand years ago when Stephen testified before the Sanhedrin, they covered their ears and loudly yelled so they didn’t have to listen to him (Acts 7:57).)

With our modern and connected society there is a plethora of conforming and biased sources (each of which suffers from the same malady above), so if those are the only ones we listen to, then we will conclude that the majority (of what we hear) is right.  This is selective Crowdsourcing, simply putting another brick in the wall.  It’s also known as Confirmation Bias.

The deeper reality is that there is missing or incomplete information and the topic is more complicated than it seems because of the multiple issues above. 

Resolving the situation means individually having the courage and motivation to seek out reliable sources outside of one’s routine.  This is more difficult than it seems because it means transitioning from an Either/Or mentality to an And/And one and becoming more comfortable with the trade-offs that come with Risk (i.e., not knowing exactly what an outcome will be, beforehand).  Neither of these choices is generally well understood.  These are not innate personal skills; they must be taught and practiced.  And they must be desired.

A final issue is that many people “pretend that science offers an unambiguous answer and it just happens to be what they favor,” without clarifying how they got to that “favorite” answer to begin with.  This is perhaps another unintended consequence of the “doesn’t matter; don’t understand” malady above.

It’s entirely possible that this resulting belief involved some kind of Either/Or (and not And/And) thinking (i.e., it was binary), it was convenient, involved no “risks,” originated from an alleged “expert opinion,” and/or maintains a comfortable status quo.  Or possibly all of them.

Invariably this gives rise to the 7 Last Words of Fortress dwellers, the perpetually un- or under-informed:

But I’ve always lived comfortably this way.

The proposal that “we need to be better at quantifying risk and not discussing it in a binary way” assumes* that people will listen and try to understand this more complicated approach.

(*assume: to jump to an absolute conclusion without verification and run with it.)

It also presumes that an ample reservoir of applied process education and critical thinking is readily available, and that people are actually willing to listen to and consider uncomfortably new and challenging information.  Given the widespread reactions to “Follow the Science” and various policies, the probability of this happening probably (ugh!) falls into the Little Chance category above.  Perhaps the good news is that somebody thinks Little Chance means 100% probable.

—–

Notes:

1: “Follow the Science?”, New York Times, Feb 11, 2022.

2: “‘Follow the Science’ Might Not Mean What You Think It Means”, EconLib, Jun 30, 2021.

3: “What ‘Follow the Science’ Obscures”, Slate, Feb 9, 2022.

About Jim Edmonds

I am a husband, father, mentor, who once was a chemist turned physicist turned marketer turned executive turned missionary turned professor. And survived it all.
This entry was posted in 06: Incomplete Information.