Francisco Contreras – Unabashedly Skeptical

"The essence of the independent mind lies not in what it thinks, but in how it thinks." –Christopher Hitchens


In Defense of Uncertainty (Lesson #1, Rationality Series)

It has been a while since I read the book Rationality: From AI to Zombies by Eliezer Yudkowsky (hereafter RAIZ). I have been meaning to write about it, but the book is 1,800+ pages long and covers a great many topics in varying depth, from evolutionary psychology to quantum mechanics. Instead of writing one long review, I will write a series of short posts, each expanding upon a specific idea from the book.

In Defense of Uncertainty.

Certainly one of the biggest takeaways from RAIZ is its attitude towards epistemic certainty: because we make systematic, unconscious errors in judgement, and because both the instrument (the brain) and the method (science) for obtaining truth are clumsy and fallible, we must adopt an attitude of humility when making judgements. This has been a recurring theme on this blog, so I don’t want to repeat myself too much. Instead, I want to show how this attitude towards certainty has worked in my life.

Prior to being exposed to the ideas of this book (and of “rationality”/mental-models material in general), I thought I was certain about many things, especially political things. When I realized that certainty was pretty much unattainable at an ideological level, my whole political thought process was radically transformed. To illustrate what I mean, and in homage to the quirky rhetoric in which RAIZ is written, I will do so by way of a dialogue. The following exchange is between a slightly caricatured version of my former self, Irrational Luis, and an idealized, rational me, Rational Francisco.


Irrational_Luis: “Reds are wrong. Their goal is to suppress freedom and discourage progress. We Greens must stand for the oppressed and fight against the forces that seek to destroy everything that we have accomplished.”

Rational_Francisco: “How can you believe something like that?… You mean to say that everyone who isn’t a Green must be freedom-hating and oppressive?”

Irrational_Luis: “Yup, pretty much, and I will prove it to you. Take a hard look at the history of the world. If you’re really looking, you’ll notice that the biggest oppressors, killers, and liars have been Red. Even when they claimed to be Green, their actions were undoubtedly Red. Likewise, the people who have advocated morality and brought prosperity to our country have been Green. Even when they claimed to be Red, their actions were undoubtedly Green.”

Rational_Francisco: “I see…”

Irrational_Luis: “You’re not convinced? Well I have plenty more evidence. I go to a group that gets together every Tuesday and Thursday and talks about all the ways in which Green ideas are correct. We also talk about all the silly Red ideas that we hear and we discuss how they are wrong. We talk for hours and even make bumper stickers of our favorite Green soundbites.”

Rational_Francisco: “You seem pretty certain of your position.”

Irrational_Luis: “Of course. How couldn’t I be?”

Rational_Francisco: “Well, have you ever considered that you might be wrong?”

Irrational_Luis: “Look, buddy, I’m no fool. I base my opinions on truth. Either the world is Red, or the world is Green. As simple as that. If the world all of a sudden changed to Red, then I would be Red. Got it? But the world is not Red.”

Rational_Francisco: “Would you agree that the world is a complex place?”

Irrational_Luis: “Why yes, of course. The world is very complex.”

Rational_Francisco: “Okay. Do you think that there are things we haven’t yet understood about the world?”

Irrational_Luis: “What are you getting at? Why so many questions?”

Rational_Francisco: “I’m just trying to understand.”

Irrational_Luis: “Are you even a Green?”

Rational_Francisco: “No.”

Irrational_Luis: “I KNEW IT. You’re one of those filthy…”

Rational_Francisco: “I’m not a Red either.”

Irrational_Luis: “Ah, you must be a foreigner then. My cousin Sandra once talked with one of those hippies and…”

Rational_Francisco: “I’m not. Um… please just answer my question. Do you think there are things out there that you don’t understand?”

Irrational_Luis: “Yeah… sure. But that doesn’t mean that the Reds know any better. You see, those people believe that…”

Rational_Francisco: “Let me stop you right there. You admitted that the world is a complex place and that there are things that you don’t understand, so why must reality fit your narrative?

The theories and ideas that you have about the world are, by definition, no more than abstractions that you construct in order to understand reality. You cannot expect to form accurate beliefs when you are dismissive of anything that contradicts your preconceptions. You selectively demand more evidence for anything that opposes your view and all too readily accept things that fit it. You hang around people who think the same way you do and make caricatures of those who don’t.

Breaking free from your self-imposed delusion requires that you discard your need for certainty and narrative coherence. Get used to the fact that you are going to be wrong about many things. Seek out ideas and people who disagree with you and always listen to what they have to say. Rephrase their ideas as accurately and as generously as you can and try to find value in their words.

Be humble and admit that there is no reason to assume you already understand the world. It’s good that you are passionate, but don’t worry so much about whether things are Red or Green; worry about whether things are true. Be a champion for truth; that should be your chief concern. Be suspicious whenever you feel certain about something, because that certainty could just be your passions diverting you from the truth.

Here, take this book. It should get you started. Never stop reading. Never stop questioning. Learn some math.”


Disclaimer: 

I originally wrote this dialogue for other purposes, but I think the idea is original enough to merit reproducing on this blog. It also seems likely to make me cringe when I look back on it some time hence, which is always fun, I guess.


Having Opinions and Getting Them Right

“I think chocolate ice-cream is tastier than vanilla”

“In my opinion, raising the minimum wage will improve the economy”

Most people wouldn’t think twice about classifying these two statements as “opinions”. Yet they are fundamentally different: the first is a statement of subjective preference that cannot be proven one way or the other, while the second is a quantifiable belief claim that can be tested empirically. Most people fail to make this distinction (or fail to realize its implications). This is particularly problematic once you realize that quantifiable belief claims can have real, life-altering effects. If I believe eating kiwis will cure my cancer, I will be significantly worse off than if I got treatment.


It seems, therefore, that quantifiable belief claims must be subjected to a standard of empirical rigor proportional to the claim’s effect on its environment: the more important a claim is, the more rigor you must put into verifying it. With that in mind, let me attempt a “Checklist for Having Important Opinions”. Let’s go.

Checklist for Having Important Opinions

1. Account for Personal Biases – It seems like I’ve talked about cognitive biases in nearly every post I’ve written, but accounting for them is one of the most important tools for making sure your belief claims are objectively true. Make sure that the degree to which you believe your quantifiable belief claim reflects the degree to which you have analysed the evidence for it, and not the degree to which you want to believe it. According to this article, there are tested steps you can take to avoid some of the major biases:

  1. Consider the Opposite. “I am inclined to believe X. What are some reasons that X might be incorrect?” (I believe kiwis cure cancer. What are some reasons I can think of that show that kiwis won’t cure my cancer?)
  2. Provide Reasons. Take a quick look through any major social media site and you’ll find people stating quantifiable belief claims in the form of assertions. Instead of doing this, provide reasons for your belief.  “I believe X. I believe X because… “
  3. Get Proper Training.  If you’re going to make statements about economic policy, make sure you have the sufficient training in economics to be able to analyse the data accurately. Likewise with belief claims involving statistics, medical research, penal law, etc.
  4. Reference Class Forecasting. When you are faced with a difficult problem, find a reference class of similar cases and consider them in your predictions (see the sketch below).
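
To make the last idea concrete, here is a minimal sketch of reference class forecasting in Python. The project-duration scenario and every number in it are hypothetical, invented purely for illustration; the point is only that the forecast is read off the distribution of similar past cases rather than off a gut feeling.

```python
# Hypothetical sketch of reference class forecasting (my illustration, not
# taken from the article). Instead of trusting an "inside view" estimate,
# look up how similar past projects actually turned out and read the
# forecast off that distribution.

from statistics import median, quantiles

# Made-up durations (in weeks) of past projects similar to the one at hand.
reference_class = [6, 8, 9, 11, 12, 14, 15, 18, 22, 30]

inside_view_guess = 5  # "It feels like a five-week job."

p50 = median(reference_class)              # the typical outcome
p80 = quantiles(reference_class, n=10)[7]  # 80th percentile, for a buffer

print(f"Inside view: {inside_view_guess} weeks")
print(f"Reference class: median {p50:.0f} weeks, 80th percentile ~{p80:.0f} weeks")
```

The gap between the gut estimate and the reference-class median is exactly the correction the technique is meant to supply.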

[Image: Dilbert comic strip on confirmation bias]

2. Look for Quality Evidence – If your opinion is the type of belief claim that can be supported by empirical studies, look for the studies themselves and analyse them. Do not rely on third parties to interpret the studies for you.

  • Reading the primary source material is important because people are subject to errors of judgement and can interpret the results of a study in ways that confirm their presuppositions. The higher the degree of separation from the primary sources of evidence, the more likely the evidence is to be misinterpreted.
  • Second, find out if there is expert consensus (usually achieved after an overwhelming number of repeatable, peer-reviewed studies).
  • Lastly, if you have any reason to doubt the veracity of a study (or if its results are particularly controversial), look for meta-analyses and analyse them objectively (a rough sketch of how meta-analyses pool results follows below).
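
As a rough illustration of what a meta-analysis does with individual studies, here is a sketch of fixed-effect pooling with entirely made-up numbers: each study’s effect estimate is weighted by the inverse of its variance, so large, precise studies pull the pooled estimate more than small, noisy ones. It is only meant to show the principle, not to replace reading an actual meta-analysis.

```python
# Illustrative fixed-effect meta-analysis with invented numbers.
# Each study reports an effect size and a standard error; studies with
# smaller standard errors (i.e. more precision) get proportionally more weight.

studies = [  # (effect_size, standard_error) -- all hypothetical
    (0.30, 0.15),
    (0.10, 0.05),
    (0.45, 0.25),
]

weights = [1 / se ** 2 for _, se in studies]  # inverse-variance weights
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect: {pooled:.3f} (95% CI half-width: {1.96 * pooled_se:.3f})")
```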


3. Update Your Beliefs – It is not easy to incorporate new evidence into old beliefs. To avoid making arguments impervious to evidence, we must make sure to update our beliefs accurately in light of new evidence (provided we have done steps 1 and 2). We must also make sure that the degree to which we update corresponds to the strength of the evidence (see Conservatism Bias).
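
As a toy illustration of what “updating in proportion to the strength of the evidence” means, here is a single application of Bayes’ rule to the kiwi example from earlier. Every probability below is invented for the sake of the example.

```python
# Toy Bayesian update for the hypothetical claim "kiwis cure cancer".
# All probabilities are made up for illustration.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) given P(H), P(E | H) and P(E | not-H)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.01  # start out very doubtful of the claim

# Evidence: one anecdote of someone recovering after eating kiwis.
# Such anecdotes are nearly as likely if the claim is false (people often
# recover anyway), so the evidence is weak and the update should be small.
posterior = bayes_update(prior, p_e_given_h=0.9, p_e_given_not_h=0.7)
print(f"Posterior: {posterior:.4f}")  # ~0.0128 -- the belief barely moves
```

A strong piece of evidence is one that is much more likely if the hypothesis is true than if it is false; feed that into the same formula and the belief moves much further, which is exactly the proportionality that conservatism bias describes us getting wrong.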



Not Enough

I have found these steps personally useful. However, although they might be necessary conditions for holding true beliefs, they are not sufficient ones. Holding truthful quantifiable belief claims is an active process that no checklist will perfect. According to the de-biasing article, going through the steps does not guarantee that you won’t have biases; it just decreases their likelihood.

Likewise, seeking primary sources of evidence and reading meta-analyses may not be enough. First, the studies themselves are not impervious to bias or shoddy experimentation (even after peer review). Second, groups of experts, each with their own set of studies, might fundamentally differ on core findings (on cognitive priming, for example). Lastly, there are times when different sets of meta-analyses point to opposite results. Scientists make mistakes as well.

Now, this is not to say that one should adopt a defeatist view of holding true beliefs (i.e. “I cannot be 100% sure that I’m right, so I shouldn’t try”). What I mean is that different types of opinions require different degrees of justification. These justifications are difficult, and even when executed correctly they are hardly perfect, so we should approach our beliefs fully aware of our ignorance and limitations, and with the honest intention of getting them right (regardless of our feelings).

Resources and Further Reading

-This article on “Reference Class Forecasting” and this one, too.

-These two articles (one and two) by Scott Alexander on the fallibility of experts.

-These two great guides to Bayesian updating (one and two).

-This article on the “Twelve Virtues of Rationality”