Francisco Contreras – Unabashedly Skeptical

"The essence of the independent mind lies not in what it thinks, but in how it thinks." –Christopher Hitchens


Having Opinions and Getting them Right

“I think chocolate ice cream is tastier than vanilla”

“In my opinion, raising the minimum wage will improve the economy”

Most people wouldn’t think twice about classifying these two statements as “opinions”. Yet they are fundamentally different. The first is a statement of subjective preference that cannot be proven one way or the other; the second is a quantifiable belief claim that can be tested empirically. Most people fail to make this distinction (or fail to realize its implications), which is particularly problematic once you notice that quantifiable belief claims can have real, life-altering effects. If I believe eating kiwis will cure my cancer, I will be significantly worse off than if I got treatment.


Therefore it seems that quantifiable belief claims must be held to a standard of empirical rigor proportional to the claim’s effect on its environment (the more important a claim is, the more rigor you must apply in verifying it). With that in mind, let me attempt a “Checklist for Having Important Opinions”. Let’s go.

Checklist for Having Important Opinions

1. Account for Personal Biases – It seems like I’ve talked about cognitive biases in nearly every post I’ve written, but accounting for them is one of the most important tools for making sure your belief claims are actually true. Make sure that the degree to which you believe a quantifiable belief claim matches the degree to which you have analysed the evidence for it, not the degree to which you want to believe it. According to this article, there are certain tested steps you can take to avoid some of the major biases:

  1. Consider the Opposite. “I am inclined to believe X. What are some reasons that X might be incorrect?” (I believe kiwis cure cancer. What are some reasons I can think of that show kiwis won’t cure my cancer?)
  2. Provide Reasons. Take a quick look through any major social media site and you’ll find people stating quantifiable belief claims as bare assertions. Instead, provide reasons for your belief: “I believe X. I believe X because…”
  3. Get Proper Training. If you’re going to make statements about economic policy, make sure you have sufficient training in economics to analyse the data accurately. Likewise with belief claims involving statistics, medical research, penal law, etc.
  4. Reference Class Forecasting. When you are faced with a difficult prediction, find a reference class of similar past cases and use their outcomes to inform your forecast (see the sketch below this list).
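To make that last item concrete, here is a minimal sketch of reference class forecasting in Python. All of the numbers (the past cost overruns and the initial estimate) are invented for illustration; the point is simply that the forecast is anchored on the distribution of outcomes in similar past cases rather than on an inside-view guess.

```python
import statistics

# Hypothetical reference class: cost overruns (final cost / initial estimate)
# observed in comparable past projects. These numbers are invented.
past_overruns = [0.9, 1.1, 1.2, 1.3, 1.4, 1.6, 1.8, 2.0]

initial_estimate = 100_000  # our optimistic, inside-view cost estimate

# Outside view: forecast from the reference class, not from our own estimate.
median_overrun = statistics.median(past_overruns)
p90_overrun = sorted(past_overruns)[int(0.9 * (len(past_overruns) - 1))]

print(f"Inside-view estimate:    {initial_estimate:>10,.0f}")
print(f"Reference-class median:  {initial_estimate * median_overrun:>10,.0f}")
print(f"Pessimistic (about p90): {initial_estimate * p90_overrun:>10,.0f}")
```

The same pattern applies to any prediction: how long projects like this one actually took, how often policies like this one actually worked, and so on.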


2. Look for Quality Evidence – If your opinion is the type of belief claim that can be supported by empirical studies, look for the studies themselves and analyse them. Do not rely on third parties to interpret the studies for you.

  • First, reading the primary source material matters because people are subject to errors of judgement and can interpret a study’s results in ways that confirm their presuppositions. The greater the degree of separation from the primary sources of evidence, the more likely the evidence is to be misinterpreted.
  • Second, find out whether there is expert consensus (usually achieved after an overwhelming number of repeatable, peer-reviewed studies).
  • Lastly, if you have any reason to doubt the veracity of a study (or if its results are particularly controversial), look for meta-analyses and analyse them objectively.

i-have-data_3

3. Update Your Beliefs – It is not easy to incorporate new evidence into old beliefs. To avoid making our arguments impervious to evidence, we must accurately update our beliefs in light of new evidence (provided we have done steps 1 and 2). We must also make sure that the size of the update corresponds to the strength of the evidence (see Conservatism Bias); a small worked example follows below.

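To make the updating step concrete, here is a toy Bayes’-rule calculation in Python. The hypothesis, the skeptical prior of 0.05, and the likelihoods are all invented for illustration; the point is that the size of each update is dictated by the strength of the evidence, which is exactly the discipline that counters conservatism bias.

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Hypothesis: "this treatment works". Start from a skeptical prior.
belief = 0.05

# Suppose each new study reports a result that is three times more likely
# if the treatment works (0.6) than if it does not (0.2).
for study in range(1, 5):
    belief = bayes_update(belief, p_evidence_if_true=0.6,
                          p_evidence_if_false=0.2)
    print(f"After study {study}: P(treatment works) = {belief:.2f}")
```

Notice that a single study barely moves the skeptical prior, while consistent replications move it substantially. Updating less than the arithmetic demands is conservatism bias; updating more is overreaction.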


Not Enough

I have found these steps personally useful. However, although they might be necessary conditions for true beliefs, they are not sufficient ones. Holding truthful quantifiable belief claims is an active process that no checklist will be able to perfect. According to the de-biasing article, going through the steps does not guarantee that you won’t have biases; it just decreases their likelihood.

Likewise, seeking primary sources of evidence and reading meta-analyses may not be enough. First, the studies themselves are not impervious to bias or shoddy experimentation (even after peer review). Second, groups of experts, each with their own set of studies, might fundamentally differ on core findings (on cognitive priming, for example). Lastly, there are times when different sets of meta-analyses point to opposite results. Scientists make mistakes as well.

Now, this is not to say that one should adopt a defeatist view of holding true beliefs (i.e. “I cannot be 100% sure that I’m right, so I shouldn’t try”). What I mean is that different types of opinions require different degrees of justification. These justifications are difficult to construct and, even when executed correctly, hardly perfect; so we should approach our beliefs fully aware of our ignorance and limitations, and with the honest intention of getting them right (regardless of our feelings).

Resources and Further Reading

-This article on “Reference Class Forecasting”, and this one, too.

-These two articles (one and two) by Scott Alexander on the fallibility of experts.

-These two great guides to Bayesian updating (one and two).

-This article on the “Twelve Virtues of Rationality”

Intuition: Friend or Foe?

Intuition:

n.  The ability to understand something immediately, without the need for conscious reasoning.

Intuition Is Your Enemy

Perhaps the greatest lesson I have learned from Daniel Kahneman’s work on decision-making is that we should be extremely skeptical of our intuitions. We all have a subconscious, automatic system of thinking that often produces systematic deviations in judgement, or cognitive biases. These cognitive biases can lead us to poor decisions (see my previous post) and produce misleading, contradictory, and even racist intuitions, all while we remain completely unaware of what is happening.

Prejudiced Intuitions

In 1998, the psychologist Anthony Greenwald created the Implicit Association Test to test the idea that implicit and explicit memory apply to social constructs and attitudes. It has been over a decade since the test was created, and the results have been shocking. A meta-analysis of 185 of these studies showed that 70% of participants held an implicit preference for lighter skin over darker skin. The interesting part is that only 20% self-reported having this preference. In other words, a majority of people have a bias against darker skin, but very few are actually aware of it. Furthermore, the analysis also showed that these implicit biases are predictive of social behavior.

What does this mean? Most people have heard of or experienced situations where an employer denied a position to an otherwise qualified applicant on the basis of an “expert intuition”. This research suggests that such “expert intuition” may well be an unconscious, implicit bias about race, gender, or age.

Moreover, evidence against expert intuition goes beyond implicit associations. Research shows, for instance, that political forecasters are no better than ordinary people at predicting events, that professional wine connoisseurs cannot tell white wine from red wine in a blind taste test, and that simple mathematical models have proven vastly superior to professional psychiatrists at diagnosing patients. The body of research is ever-growing and gives us compelling reasons to seriously distrust expert intuitions.

Intuition Is Your Friend

In Blink: The Power of Thinking Without Thinking, journalist Malcolm Gladwell argues that fast, unconscious decisions are a tool for improved decision-making. He says that our minds are excellent at unconsciously spotting patterns from limited windows of experience, and he sees this phenomenon as a useful skill that should be exploited for our benefit. He provides numerous examples of experts using their unconscious intuitions (or “rapid cognition”, as Gladwell calls it) with surprising speed and accuracy: chess grandmasters who come up with the perfect counter-move within a couple of seconds of their opponent’s action, a tennis coach who can spot that a player will double-fault moments before the racket even touches the ball, and art experts who can distinguish highly elaborate copies of artwork from originals after only an instant of close inspection.


In fact, the study of highly accurate expert intuition is part of a decision-making framework called Naturalistic Decision-Making (NDM). NDM researchers usually favor the snap decisions of highly skilled, highly experienced individuals over formal analysis and optimal-performance models, and their research shows that in certain environments these intuitive decisions actually outperform optimal decision models and other formal analyses.

Verdict 

Different Perspectives

What to make of this apparent contradiction? On one hand you’ve got Kahneman and his “Heuristics and Biases” camp advocating skepticism in the face of expert intuition and highlighting the many instances where skilled experts fail miserably.  On the other hand, there is Gladwell, Gary Klein and the NDM crowd showing how intuitive expert decisions are often excellent, sometimes even better than deliberate conscious analysis.

When, if ever, should I trust expert intuitions? When should I trust my own? Are Gladwell’s examples rare anomalies, or do they represent a consistent pattern?

The answer to these questions turns out to be simple and elegant. After a few hours of digging through paper after paper looking for a way to make sense of this intriguing paradox, I stumbled upon a paper aptly titled A Failure to Disagree, co-authored by none other than Daniel Kahneman and Gary Klein, the leading advocates of the two decision-making frameworks. In it, they explain that the frameworks they were operating within are two different perspectives on the same phenomenon, not opposing and contradictory views. The paper proposes various conditions for judging the validity of intuitive judgement, which I will summarize in three questions.

Diagnostic Questions to Judge the Value of Intuition

  1. Has this intuitive judgement been made in a predictable environment?
  2. Does this environment provide immediate and accurate feedback?
  3. Do you have solid grounds for saying that this individual has had enough opportunity to learn the regularities of the environment? (sufficient expertise)

If the answer to all three questions is yes, then you have good grounds to trust the intuitive judgement. If not, stay skeptical and be ever wary of cognitive biases in intuitive decision-making. For readers who like things operational, the three questions compress into a tiny check, sketched below.
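Here is a minimal sketch of that check in Python. The function name and the example judgements are mine, not Kahneman and Klein’s; it simply encodes the three questions above as a conjunction.

```python
def should_trust_intuition(regular_environment: bool,
                           rapid_accurate_feedback: bool,
                           sufficient_practice: bool) -> bool:
    """All three of Kahneman and Klein's conditions must hold."""
    return (regular_environment and rapid_accurate_feedback
            and sufficient_practice)

# A chess grandmaster's tactical hunch: regular game, immediate feedback,
# thousands of hours of practice.
print(should_trust_intuition(True, True, True))    # True -> trust it

# A pundit's long-range political forecast: irregular world, slow and
# ambiguous feedback.
print(should_trust_intuition(False, False, True))  # False -> stay skeptical
```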


Reflection

One of the aims of my Open Source Learning Project is to learn to think critically and make better decisions. My brief experience in the study of intuition taught me something much more profound than an updated model of decision-making: it taught me to look beyond the false dichotomy I instinctively created when confronted with seemingly opposing views. The truth turned out to be useful, enlightening, and even beautiful, but it required a lot of effort and deep thinking. It would have been much easier to choose the perspective I liked most and rationalize my decision accordingly. Choosing to pursue the truth, however, is what gave me a true understanding of the subject, and that is what matters most.

Further Reading & Resources

Sources of Power: How People Make Decisions by Gary Klein (Amazon)

Implicit Association Test (free to take)

Talks@Google: Daniel Kahneman (video)

Daniel Kahneman’s experience on working with Klein (video)