Francisco Contreras – Unabashedly Skeptical

"The essence of the independent mind lies not in what it thinks, but in how it thinks." –Christopher Hitchens


Having Opinions and Getting them Right

“I think chocolate ice cream is tastier than vanilla”

“In my opinion, raising the minimum wage will improve the economy”

Most people wouldn’t think twice about classifying these two statements as “opinions”. Yet they are fundamentally different: the first is a statement of subjective preference that cannot be proven one way or the other, while the second is a quantifiable belief claim that can be tested empirically. However, most people fail to make this distinction (or fail to realize its implications). This is particularly problematic once you realize that quantifiable belief claims can have real, life-altering effects. If I believe eating kiwis will cure my cancer, I will be significantly worse off than if I got treatment.


Therefore, it seems that quantifiable belief claims must be subjected to a standard of empirical rigor proportional to the claim’s effect on its environment (the more important a claim is, the more rigor you must apply in verifying it). With that in mind, let me attempt a “Checklist for Having Important Opinions”. Let’s go.

Checklist for Having Important Opinions

1. Account for Personal Biases – It seems like I’ve talked about cognitive biases in nearly every post I’ve written, but accounting for them is one of the most important tools for making sure your belief claims are objectively true. Make sure that the degree to which you believe your quantifiable belief claim is the degree to which you have analysed the evidence for it, and not the degree to which you want to believe it. According to this article, there are certain tested steps you can take to avoid some of the major biases:

  1. Consider the Opposite. “I am inclined to believe X. What are some reasons that X might be incorrect?” (I believe kiwis cure cancer. What are some reasons to think that kiwis won’t cure my cancer?).
  2. Provide Reasons. Take a quick look through any major social media site and you’ll find people stating quantifiable belief claims as bare assertions. Instead of doing this, provide reasons for your belief: “I believe X. I believe X because…”
  3. Get Proper Training. If you’re going to make statements about economic policy, make sure you have sufficient training in economics to analyse the data accurately. Likewise with belief claims involving statistics, medical research, penal law, etc.
  4. Reference Class Forecasting. When you are faced with a difficult prediction, find a reference class of similar past cases and let their outcomes inform your estimate (see the sketch right after this list).
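To make item 4 concrete, here is a minimal sketch in Python. All numbers are made up for illustration; the point is the outside-view move itself: instead of trusting a from-scratch estimate, anchor on the distribution of outcomes in a reference class of similar past cases.

```python
# Reference class forecasting in miniature (hypothetical numbers):
# compare an optimistic "inside view" estimate with the "outside view"
# given by the outcomes of similar past projects.
import statistics

# Completion times (in weeks) of similar past projects -- the reference class
reference_class = [10, 12, 15, 9, 14, 20, 11, 13]

inside_view_estimate = 8  # the "this time will be different" guess

outside_view_estimate = statistics.median(reference_class)
spread = statistics.stdev(reference_class)

print(f"Inside view:  {inside_view_estimate} weeks")
print(f"Outside view: {outside_view_estimate} weeks (spread ~ {spread:.1f} weeks)")
```

The arithmetic is trivial on purpose: if your estimate falls well outside the reference class, you need a very good reason why your case is different.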


2. Look for Quality Evidence – If your opinion is the type of belief claim that can be supported by empirical studies, look for the studies themselves and analyse them. Do not rely on third parties to interpret the studies for you.

  • Reading the primary source material is important because people are subject to errors of judgement and can interpret a study’s results in ways that confirm their presuppositions. The more degrees of separation from the primary evidence, the more likely that evidence is to be misinterpreted.
  • Second, find out if there is expert consensus (usually achieved after an overwhelming number of repeatable, peer-reviewed studies).
  • Lastly, if you have any reason to doubt the veracity of a study (or if its results are particularly controversial), look for meta-analyses and analyse them objectively (a toy example of how pooling works follows this list).
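To give a rough idea of what a meta-analysis does when it pools studies, here is a toy fixed-effect example in Python with entirely made-up numbers: each study’s effect estimate is weighted by the inverse of its variance, so more precise studies count for more.

```python
# Toy fixed-effect meta-analysis (made-up numbers): pool several studies'
# effect estimates, weighting each by the inverse of its variance.

# (effect_estimate, standard_error) for three hypothetical studies
studies = [(0.30, 0.10), (0.10, 0.05), (0.25, 0.15)]

weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect: {pooled:.3f} +/- {1.96 * pooled_se:.3f} (approx. 95% CI)")
```

Real meta-analyses also check for heterogeneity and publication bias, which is exactly why they need to be read critically rather than taken as the final word.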


3. Update your beliefs – It is not easy to incorporate new evidence into old beliefs. In order to avoid making arguments impervious to evidence, we must make sure to accurately update our beliefs in light of new evidence (provided we have done steps 1 and 2). We must also make sure that the size of our update corresponds to the strength of the evidence (see Conservatism Bias).
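To make the updating step concrete, here is a small worked example of Bayes’ rule in Python. The probabilities are purely illustrative; the point is that the size of the update should track the strength of the evidence.

```python
# Bayes' rule with illustrative numbers: how much should one piece of
# favourable evidence shift my belief that a treatment works?

prior = 0.20                # P(hypothesis) before seeing the evidence
p_evidence_if_true = 0.80   # P(evidence | hypothesis is true)
p_evidence_if_false = 0.30  # P(evidence | hypothesis is false)

# P(H | E) = P(E | H) * P(H) / P(E)
p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
posterior = p_evidence_if_true * prior / p_evidence

print(f"Prior: {prior:.2f} -> Posterior: {posterior:.2f}")
# Weaker evidence (say, p_evidence_if_false = 0.70) would barely move the belief.
```

This is the mechanical version of “update in proportion to the evidence”: with these numbers, strong evidence moves the 0.20 prior to 0.40, while nearly ambiguous evidence would leave it almost where it was.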



Not Enough

I have found these steps personally useful. However, although they might be necessary conditions for true beliefs, they are not sufficient ones. Holding truthful quantifiable belief claims is an active process that no checklist will be able to perfect. According to the de-biasing article, going through the steps does not guarantee that you won’t have biases; it just decreases their likelihood.

Likewise, seeking primary sources of evidence and reading meta-analyses may not be enough. First, the studies themselves are not impervious to bias or shoddy experimentation (even after peer review). Second, groups of experts, each with their own set of studies, might fundamentally differ on core findings (on cognitive priming, for example). Lastly, there are times when different sets of meta-analyses point to opposite results. Scientists make mistakes as well.

Now, this is not to say that one should adopt a defeatist view of holding true beliefs (i.e. “I cannot be 100% sure that I’m right, so I shouldn’t try”). What I mean to say is that holding certain types of opinions requires different degrees of justification. These justifications are difficult, and even when executed correctly they are hardly perfect, so we should approach our beliefs fully aware of our ignorance and limitations, and with the honest intention of getting them right (regardless of our feelings).

Resources and Further Reading

-This article on “Reference Class Forecasting” and this one, too.

-These two articles (one and two) by Scott Alexander on the fallibility of experts.

-These two great guides to Bayesian updating (one and two).

-This article on the “Twelve Virtues of Rationality”
