I just finished listening to a Great Courses series entitled Your Deceptive Mind: A Scientific Guide to Critical Thinking Skills by Professor Steven Novella.
This excellent course offers insights and reminders that are invaluable to any business leader trying to think critically and make excellent decisions in today’s world.
Below I share some ideas from the course. But, I do encourage you to buy the course, listen to it, and share it amongst your team. It will be worth your while.
Critical Thinking
- Professor Novella defines critical thinking as applying systematic logic and doubt to any claim or belief by thinking carefully and rigorously.
- We are not living in the age of information. Rather, we live in the age of misinformation, with ill-informed and misleading data and information surrounding us.
- As such, we need to think critically to separate good insights from the overload of good and bad data and information that is out there.
Bias
- Unfortunately, we have numerous biases that worsen our ability to think critically. To overcome these biases, first and foremost, it helps to be aware of them:
- Confirmation Bias: Our mind’s bias to support beliefs we already hold.
- We have the tendency to notice and accept information that confirms what we already believe.
- We have the tendency to rationalize away or ignore information that goes against what we believe.
- Fundamental Attribution Error: We tend to believe that the actions of others are the result of internal motives or personality traits rather than external situational factors.
- As an example, a friend passes by and does not stop and talk.
- In such a situation, we may jump to the conclusion that the friend no longer likes us or is purposely being rude.
- In point of fact, it may only be that the friend is late to a meeting.
- Anchoring Bias: We tend to focus on a prominent feature of an object, person or event and then make judgments based on that single feature alone.
- This occurs with politicians (and business leaders) all the time.
- We see a politician give an impressive speech and conclude that (since they can speak well) they must be a good politician able to get things done.
- Availability Heuristic: We assume that what is immediately accessible to us – what we can think of – must be important and influential. We assume that if we can think of it, it must be common or representative.
- Because we read of one shark attack, we assume that there are countless other shark attacks happening every day and that it is too dangerous to swim in the ocean.
- Anecdote Bias: Anecdotes or stories of experiences (often our own) have more influence over our decisions than they should based on overall probability.
- The story of one person winning a lottery encourages friends and neighbors to play the lottery more often even though the very remote odds of winning have not changed.
- Representative Heuristic: The assumption that causes must resemble effects. Significant events must have significant and notable causes.
- The JFK assassination was too important and too big of an event to be caused by Lee Harvey Oswald alone; there must have been a massive government conspiracy and cover-up.
- Effort Heuristic: We value items more if they require greater effort to obtain.
- Buyer’s remorse happens much less often when we have haggled for a while over a car or other item.
- Having spent the effort to negotiate and haggle to get the car, we then value it more.
- Familiarity Bias: Also called the exposure effect, this means that we tend to rate things more favorably the more familiar we are with them.
- As we see an actor or athlete more often in the media, our opinion of them improves.
- As we see a salesperson or fellow employee more often, our opinion of them improves.
- Choice Supportive Bias: Once we make a decision, we then assess that decision much more favorably.
- We make a 50-50 decision.
- By the next day, we are convinced that we made the right decision.
- Forer Effect: The tendency to take vague or general descriptions and interpret them as applying specifically to us.
- Think horoscopes, palm readings, or astrology.
Memory
- In addition to all of these and many other biases, our critical thinking is hampered because our memory is wildly unreliable.
- Study after study confirms that we have poor and inaccurate memories.
- We do not keep static memories in our brains. Rather, we confabulate our memories. That is, the brain invents details missing from our memory in order to construct a consistent narrative.
- Confabulation has led to many instances of implanted false memories, usually resulting from strong suggestion or imagining.
- The quote from Robert Evans, the 1970s Hollywood producer, has proven to be accurate:
- “There are three sides to every story: your side, my side and the truth. And no one is lying. Memories shared serve each differently.”
- Interestingly, numerous studies have shown a slight inverse correlation between the certainty of the memory and the accuracy of the memory.
- When we are certain that we remember exactly how something happened, our memories are actually slightly less accurate.
Patterns
- We are evolutionarily geared to find patterns (think shapes of clouds, etc.) even when patterns do not exist.
- This is seen in conspiracy theories and other cases where we find meaning where there is no meaning.
- Importantly, for business, this has serious repercussions for data mining.
- Is there a true pattern in the data, or did we just invent it? The short simulation below illustrates how easily pure noise can look like a pattern.
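To make the data-mining concern concrete, here is a minimal Python sketch of my own (not from the course); the number of metrics, the sample size, and the idea of scanning every pair are arbitrary choices for illustration. It generates purely random "metrics" and then reports the strongest correlation it can find between any two of them.

```python
import random

# Illustrative sketch (not from the course): purely random "metrics",
# scanned for the strongest apparent correlation between any pair.
random.seed(42)

def correlation(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# 20 made-up "business metrics", each with only 10 monthly observations,
# all drawn from pure noise with no real relationships at all.
metrics = [[random.gauss(0, 1) for _ in range(10)] for _ in range(20)]

# Compare every pair of metrics and keep the strongest apparent relationship.
best_r, best_i, best_j = max(
    (abs(correlation(a, b)), i, j)
    for i, a in enumerate(metrics)
    for j, b in enumerate(metrics[i + 1:], start=i + 1)
)
print(f"Strongest 'pattern' found: |r| = {best_r:.2f} "
      f"between metric {best_i} and metric {best_j}")
```

With 190 pairs of nothing but noise, a seemingly strong correlation will almost always turn up somewhere; the "pattern" exists only because we searched so hard for one.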
Probability
- Relatedly, we are exceedingly innumerate. This means that we do not have a good sense of probability and statistics.
- This leads to finding meaning in what may often be just random statistical variation in the data.
- If you flip a coin and come up with heads ten times in a row, this does not mean that the person flipping the coin has mastered flipping coins.
- No, either the person is cheating (say by using a two-headed coin) or else it is normal random statistical variation: a fair coin will occasionally come up heads ten times in a row (the odds for any particular run of ten flips are about 1 in 1,024).
- Likewise, we suffer from the clustering illusion: we do not understand the degree to which randomness clusters.
- In sports, this leads to believing in the “hot hand effect” in basketball, where we attribute successive three-point shots to a “hot hand” rather than randomness.
- As another example, most streaming music services modify their random algorithms to prevent a song from being played twice in a row when the device is in random mode. Under a pure random algorithm, this should happen occasionally (and it does not mean that the device or music service is broken). The coin-flip simulation below shows how much genuine randomness streaks and clusters.
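To see how much randomness streaks, here is a minimal Python sketch of my own (not from the course); the number of trials, the flips per trial, and the streak length of ten are arbitrary choices for illustration. It simulates long runs of fair coin flips and counts how often a streak of ten heads appears purely by chance.

```python
import random

# Illustrative sketch (not from the course): how often does a streak of
# ten heads appear, purely by chance, in a long run of fair coin flips?
random.seed(0)

def has_heads_streak(flips, streak=10):
    """Return True if the sequence contains `streak` consecutive heads."""
    current = 0
    for flip in flips:
        current = current + 1 if flip == "H" else 0
        if current >= streak:
            return True
    return False

trials = 2_000
flips_per_trial = 2_000  # an arbitrary, illustrative session of coin flipping

hits = sum(
    has_heads_streak([random.choice("HT") for _ in range(flips_per_trial)])
    for _ in range(trials)
)
print(f"Trials containing a ten-heads streak: {hits / trials:.1%}")
```

Any particular run of ten flips comes up all heads only about once in 1,024 tries, yet across 2,000 flips such a streak appears in more than half of the simulated trials. Genuine randomness clusters far more than our intuition expects.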
Conclusion
As business leaders…
- We need to transcend all these biases and limitations in order to make strong judgments and effective executive decisions.
- And we need to be humble about our memory; what we are certain we remember may not have happened exactly as we remember.
- In the end, it just might have been better to have it written down.