Francisco Contreras – Unabashedly Skeptical

"The essence of the independent mind lies not in what it thinks, but in how it thinks." –Christopher Hitchens

Category Archives: Heuristics and Biases

In Defense of Uncertainty (Lesson #1, Rationality Series)

It has been a while since I read the book Rationality: From AI to Zombies by Eliezer Yudkowsky (henceforth RAIZ). I have been meaning to write about it, but the book is 1800+ pages long and covers a great many topics in varying degrees of depth, from evolutionary psychology to quantum mechanics. Instead of writing one long review, I will write a series of short posts, each expanding upon a specific idea from the book.

In Defense of Uncertainty.

Certainly one of the biggest takeaways from RAIZ is its attitude towards epistemic certainty. Because of our systematic, unconscious errors in judgement, and because both the instrument (the brain) and the method (science) for obtaining truth are clumsy and fallible, we must adopt an attitude of humility when making judgments. This has been a recurring theme in this blog, so I don’t want to repeat myself too much. Instead, I want to show how this attitude towards certainty has worked in my life.

Prior to being exposed to the ideas of this book (and to “Rationality”/Mental Models material in general), I thought I was certain about many things, especially political things. When I realized that certainty was pretty much unattainable at an ideological level, my whole political thought process was radically transformed. To illustrate what I mean, and in homage to the quirky rhetoric in which RAIZ is written, I will do so by way of a dialogue. The following exchange is between a slightly caricatured version of my former self –Irrational Luis– and an idealized, rational me –Rational Francisco–.

Irrational_Luis: “Reds are wrong. Their goal is to suppress freedom and discourage progress. We Greens must stand for the oppressed and fight against the forces that seek to destroy everything that we have accomplished.”

Rational_Francisco: “How can you believe something like that?… You mean to say that everyone who isn’t a Green must be freedom-hating and oppressive?”

Irrational_Luis: “Yup, pretty much, and I will prove it to you. Take a hard look at the history of the world. If you’re really looking, you’ll notice that the biggest oppressors, killers, and liars have been Red. Even when they claimed to be Green, their actions were undoubtedly Red. Likewise, the people who have advocated morality and have brought prosperity to our country have been Green. Even when they claimed to be Red, their actions were undoubtedly Green.”

Rational_Francisco: “I see…”

Irrational_Luis: “You’re not convinced? Well, I have plenty more evidence. I go to a group that gets together every Tuesday and Thursday and talks about all the ways in which Green ideas are correct. We also talk about all the silly Red ideas that we hear and discuss how they are wrong. We talk for hours and even make bumper stickers of our favorite Green soundbites.”

Rational_Francisco: “You seem pretty certain of your position.”

Irrational_Luis: “Of course. How couldn’t I be?”

Rational_Francisco: “Well, have you ever considered that you might be wrong?”

Irrational_Luis: “Look, buddy, I’m no fool. I base my opinions on truth. Either the world is Red, or the world is Green. As simple as that. If the world all of a sudden changed to Red, then I would be Red. Got it? But the world is not Red.”

Rational_Francisco:“Would you agree that the world is a complex place?”

Irrational_Luis: “Why yes, of course. The world is very complex.”

Rational_Francisco: “Okay. Do you think that there are things we haven’t yet understood about the world?”

Irrational_Luis: “What are you getting at? Why so many questions?”

Rational_Francisco: “I’m just trying to understand.”

Irrational_Luis: “Are you even a Green?”

Rational_Francisco: “No.”

Irrational_Luis: “I KNEW IT. You’re one of those filthy…”

Rational_Francisco: “I’m not a Red either.”

Irrational_Luis: “Ah, you must be a foreigner then. My cousin Sandra once talked with one of those hippies and…”

Rational_Francisco: “I’m not. Um… please just answer my question. Do you think there are things out there that you don’t understand?”

Irrational_Luis: “Yeah… sure. But that doesn’t mean that the Reds know any better. You see, those people believe that…”

Rational_Francisco: “Let me stop you right there. You admitted that the world is a complex place and that there are things you don’t understand, so why must reality fit your narrative?

The theories and ideas that you have about the world are, by definition, no more than abstractions that you construct in order to understand reality. You cannot expect to form accurate beliefs when you are dismissive of anything that contradicts your preconceptions. You selectively require more evidence for anything that opposes your view and all too readily accept things that fit. You hang around people who think the same way as you and make caricatures of those who don’t.

Breaking free from your self-imposed delusion requires that you discard your need for certainty and narrative coherence. Get used to the fact that you are going to be wrong about many things. Seek out ideas and people who disagree with you and always listen to what they have to say. Rephrase their ideas as generously as you can and try to find value in their words.

Be humble and admit that there is no guarantee that you understand the world. It’s good that you are passionate, but don’t worry too much about whether things are Red or Green; worry about whether things are true. Be a champion for truth. That should be your chief concern: truth. Be suspicious whenever you feel certain about something, because that could just be your passions diverting you from the truth.

Here, take this book. It should get you started. Never stop reading. Never stop questioning. Learn some math.”


I originally wrote this dialogue for other purposes, but I think the idea is original enough to merit being reproduced on this blog. It also seems to have the capacity to make me cringe when I look back on it some time hence, which is always fun, I guess.


Having Opinions and Getting them Right

“I think chocolate ice cream is tastier than vanilla”

“In my opinion, raising the minimum wage will improve the economy”

Most people wouldn’t think twice about classifying these two statements as “opinions”. Yet they are fundamentally different: the first is a statement of subjective preference that cannot be proven one way or the other, while the second is a quantifiable belief claim that can be tested empirically. Most people fail to make this distinction (or fail to realize its implications). This is particularly problematic once you realize that quantifiable belief claims can have real, life-altering effects. If I believe eating kiwis will cure my cancer, I will be significantly worse off than if I got treatment.


It seems, therefore, that quantifiable belief claims must be subjected to a standard of empirical rigor proportional to the claim’s effect on its environment (the more important a claim is, the more rigor you must apply in verifying it). With that in mind, let me attempt a “Checklist for Having Important Opinions”. Let’s go.

Checklist for Having Important Opinions

1. Account for Personal Biases – It seems like I’ve talked about cognitive biases in nearly every post I’ve written, but accounting for them is one of the most important tools for making sure your belief claims are accurate. Make sure that the degree to which you believe a quantifiable belief claim matches the degree to which you have analysed the evidence for it, not the degree to which you want to believe it. According to this article, there are certain tested steps you can take to avoid some of the major biases:

  1. Consider the Opposite. “I am inclined to believe X. What are some reasons that X might be incorrect?” (I believe kiwis cure cancer. What are some reasons I can think of that show kiwis won’t cure my cancer?).
  2. Provide Reasons. Take a quick look through any major social media site and you’ll find people stating quantifiable belief claims in the form of assertions. Instead of doing this, provide reasons for your belief.  “I believe X. I believe X because… “
  3. Get Proper Training.  If you’re going to make statements about economic policy, make sure you have the sufficient training in economics to be able to analyse the data accurately. Likewise with belief claims involving statistics, medical research, penal law, etc.
  4. Reference Class Forecasting. When you are faced with a difficult prediction, find a reference class of similar cases and consider their outcomes in your prediction.
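The reference-class idea in step 4 can be sketched in a few lines of Python. This is only an illustration; the project durations and the blending weight below are hypothetical numbers I made up, not anything from the article:

```python
# Reference class forecasting: instead of trusting your "inside view"
# estimate alone, anchor on the outcomes of similar past cases.

# Hypothetical completion times (in weeks) of similar past projects.
reference_class = [10, 14, 9, 18, 12, 16, 11, 20]

inside_view_estimate = 8  # your optimistic gut estimate

# The outside view: the average outcome across the reference class.
outside_view = sum(reference_class) / len(reference_class)

# A simple blend of the two; the weight on the outside view is a
# judgment call about how comparable the reference class really is.
w = 0.7
forecast = w * outside_view + (1 - w) * inside_view_estimate

print(f"Outside view: {outside_view:.1f} weeks")
print(f"Blended forecast: {forecast:.1f} weeks")
```

Notice that the blended forecast lands well above the gut estimate; that pull toward the base rate is the whole point of the technique.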


2. Look for Quality Evidence – If your opinion is the type of belief claim that can be supported by empirical studies, look for the studies themselves and analyse them. Do not rely on third parties to interpret the studies for you.

  • First, reading the primary source material is important because people are subject to errors of judgement and can interpret the results of a study in ways that confirm their presuppositions. The higher the degree of separation from the primary sources of evidence, the more likely the evidence is to be misinterpreted.
  • Second, find out if there is expert consensus (usually achieved after an overwhelming number of repeatable, peer-reviewed studies).
  • Lastly, if you have any reason to doubt the veracity of a study (or if its results are particularly controversial), look for meta-analyses and analyse them objectively.


3. Update Your Beliefs – It is not easy to incorporate new evidence into old beliefs. In order to avoid making arguments impervious to evidence, we must make sure to accurately update our beliefs in light of new evidence (provided we have done steps 1 & 2). We must also make sure that the degree to which we update corresponds to the strength of the evidence (see Conservatism Bias).
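Bayes’ theorem makes “update in proportion to the strength of the evidence” precise. Here is a minimal sketch; the probabilities are invented purely for illustration:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Hypothetical hypothesis H = "this treatment works", prior belief 30%.
prior = 0.30

# Weak evidence: a positive result only slightly more likely under H
# than under not-H, so the posterior barely moves.
posterior_weak = bayes_update(prior, 0.6, 0.5)

# Strong evidence: a result that would be rare if H were false,
# so the posterior moves a lot.
posterior_strong = bayes_update(prior, 0.6, 0.05)

print(round(posterior_weak, 3))
print(round(posterior_strong, 3))
```

The asymmetry is the anti-Conservatism-Bias lesson: the same positive result should shift your belief a little or a lot depending on how surprising it would be if the hypothesis were false.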


Not Enough

I have found these steps personally useful. However, although they might be necessary conditions for true beliefs, they are not sufficient ones. Holding truthful quantifiable belief claims is an active process that no checklist will be able to perfect. According to the de-biasing article, going through the steps does not guarantee that you won’t have biases; it just decreases their likelihood.

Likewise, seeking primary sources of evidence and reading meta-analyses may not be enough. First, studies themselves are not impervious to bias or shoddy experimentation (even after peer review). Second, groups of experts, each with their own set of studies, might fundamentally differ on core findings (on cognitive priming, for example). Lastly, there are times when different sets of meta-analyses point to opposite results. Scientists make mistakes as well.

Now, this is not to say that one should adopt a defeatist view of holding true beliefs (i.e. “I cannot be 100% sure that I’m right, so I shouldn’t try”). What I mean is that different types of opinions require different degrees of justification. These justifications are difficult and, even when executed correctly, hardly perfect, so we should approach beliefs fully aware of our ignorance and limitations, and with the honest intention of getting them right (regardless of our feelings).

Resources and Further Reading

-This article on “Reference Class Forecasting”, and this one, too.

-These two articles (one and two) by Scott Alexander on the fallibility of experts.

-These two great guides to Bayesian updating (one and two).

-This article on the “Twelve Virtues of Rationality”

Intuition: Friend or Foe?


Intuition, n. The ability to understand something immediately, without the need for conscious reasoning.

Intuition Is Your Enemy

Perhaps the greatest lesson I have learned from Daniel Kahneman’s work on decision-making is that we should be extremely skeptical of our intuitions. We all have a subconscious, automatic system of thinking that often produces deviations in judgement, or cognitive biases. These cognitive biases can make us prone to poor decisions (see my previous post) and produce misleading, contradictory, and even racist intuitions, all while we remain completely unaware of what is happening.

Prejudiced Intuitions

In 1995, the psychologist Anthony Greenwald created the Implicit Association Test to test the idea that implicit and explicit memory applies to social constructs and attitudes. It has been over a decade since the test was created, and the results have been shocking. A meta-analysis of 185 such studies showed that 70% of participants exhibited an implicit social preference for light skin over darker skin. The interesting part is that only 20% self-reported having this preference. In other words, a majority of people have a bias against darker skin, but very few are actually aware of it. Furthermore, the analysis also showed that these implicit biases are predictive of social behavior.

What does this mean? Everyone has probably heard of or experienced situations where employers have denied a position to an otherwise qualified applicant on the basis of an “expert intuition”. This study suggests that the “expert intuition” may well just be an unconscious, implicit bias towards race, gender or age.

Moreover, the evidence against expert intuition goes beyond implicit associations. Research shows, for instance, that political forecasters are no better than ordinary people at predicting events, that professional wine connoisseurs cannot tell white wine from red in a blind taste test, and that simple mathematical models have proven vastly superior to professional psychiatrists at diagnosing patients. The body of research is ever-growing and gives us compelling reasons to seriously distrust expert intuition.

Intuition is Your Friend 

In Blink: The Power of Thinking Without Thinking, journalist Malcolm Gladwell argues that fast, unconscious decisions are a tool for improved decision-making. He says that our mind is excellent at unconsciously spotting patterns from limited windows of experience, and he sees this phenomenon as a useful skill that should be exploited for our benefit. In fact, he provides numerous examples of experts using their unconscious intuitions (or “rapid cognition”, as Gladwell calls it) with surprising speed and accuracy. He tells the stories of chess grandmasters able to come up with the perfect counter-move within a couple of seconds of their opponent’s action, of a tennis coach able to spot when a player will commit a double fault moments before the racket even touches the ball, and of art experts able to distinguish highly elaborate copies of artwork from originals after only an instant of close inspection.


In fact, the study of highly accurate expert intuition is part of a decision-making framework called Naturalistic Decision-Making (NDM). NDM researchers usually favor the snap decisions of highly skilled, highly experienced individuals over formal analysis and optimal performance models. NDM research shows that in certain environments these intuitive decisions actually outperform the output of optimal decision models and other formal analyses.


Different Perspectives

What to make of this apparent contradiction? On one hand you have Kahneman and his “Heuristics and Biases” camp advocating skepticism in the face of expert intuition and highlighting the many instances where skilled experts fail miserably. On the other hand, there are Gladwell, Gary Klein, and the NDM crowd showing that intuitive expert decisions are often excellent, sometimes even better than deliberate conscious analysis.

When, if ever, should I trust expert intuitions? When should I trust my intuitions? Are Gladwell’s examples rare anomalies, or do they represent a complex pattern?

The answer to these questions turns out to be simple and elegant. After a few hours of digging through paper after paper looking for a way to make sense of this intriguing paradox, I stumbled upon a paper aptly titled Conditions for Intuitive Expertise: A Failure to Disagree, co-authored by none other than Daniel Kahneman and Gary Klein, the leading advocates of the two decision-making frameworks. In the paper they explain that the frameworks in which they were operating are two different perspectives on the same phenomenon, not opposing and contradictory views. The paper proposes various conditions with which to judge the validity of intuitive judgement, but I will summarize them in three questions.

Diagnostic Questions to Judge the Value of Intuition

  1. Has this intuitive judgement been made in a predictable environment?
  2. Does this environment provide immediate and accurate feedback?
  3. Do you have solid grounds for saying that this individual has had enough opportunity to learn the regularities of the environment? (sufficient expertise)

If the answer to all three questions is yes, then you have enough information to trust the intuitive judgement. If not, you should remain skeptical and ever wary of cognitive biases in intuitive decision-making.
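The three questions amount to a simple conjunction: trust the intuition only if every condition holds. A toy encoding (the function and argument names are mine, not from the paper):

```python
def trust_intuition(predictable_environment: bool,
                    immediate_accurate_feedback: bool,
                    sufficient_expertise: bool) -> bool:
    """Kahneman and Klein's conditions as a single conjunction:
    an intuitive judgement is trustworthy only when all three hold."""
    return (predictable_environment
            and immediate_accurate_feedback
            and sufficient_expertise)

# A chess grandmaster: regular environment, fast feedback, years of practice.
print(trust_intuition(True, True, True))

# A political forecaster: noisy environment, slow and ambiguous feedback.
print(trust_intuition(False, False, True))
```

Failing even one condition is enough to flip the verdict back to skepticism, which matches the paper’s point that expertise alone (question 3) does not validate an intuition.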


One of the aims of my Open Source Learning Project is to learn to think critically and make better decisions. My brief excursion into the study of intuition taught me something much more profound than an updated model of decision-making; it taught me to look beyond the false dichotomy I instinctively created when confronted with seemingly opposing views. The truth turned out to be useful, enlightening, and even beautiful, but it did require a lot of effort and deep thinking. It would have been much easier to choose the perspective I liked most and rationalize my decision accordingly. However, choosing to pursue the truth is what gave me true understanding of the subject, and that is what matters most.

Further Reading & Resources

Sources of Power: How People Make Decisions by Gary Klein (Amazon)

Implicit Association Test (free to take)

Talks@Google: Daniel Kahneman (video)

Daniel Kahneman’s experience on working with Klein (video)

Learning How to Think: “What You See Is All There Is”

The Problem

There is a severe problem in today’s society: we are forced to pick sides. Whether in sports or politics, society leaves little room for nuance and depth. There is a tendency to sharply dichotomize the world into black and white, completely neglecting the wide spectrum of colors that makes up reality. This phenomenon can be explained with the letters “WYSIATI”.


WYSIATI is an acronym coined by psychologist and economic theorist Daniel Kahneman; it stands for “What You See Is All There Is”. Kahneman uses this acronym to describe a human bias that further illustrates the fact that decision-making is not entirely based on rational thought. In fact, according to Kahneman’s work, our decision-making processes are much less rational than we’d like them to be. For example, imagine I were describing a friend to you and said that this friend is “kind, nurturing, and patient”. Imagine I then asked you whether this friend would be a good parent. You immediately have an answer, don’t you? Why is that? You really don’t know much about the friend other than the meager information I decided to share with you. What if I added “manic depressive” to my list of adjectives? Would you have come to the same conclusion?


Kahneman explains that the human mind, marvelous as it is, is a machine for jumping to conclusions from insufficient, unreliable, or unverified information. This feature of our brain is predictable and reproducible. Moreover, we tend to be very confident about the conclusions that pop into our head from this sparse information. It is not difficult to see the evolutionary benefit of this feature: if a primitive homo sapiens saw the brush moving in unusual patterns, it benefited him to jump to the conclusion that a predator was hiding in the brush rather than to pause and seek more evidence.

However, this cognitive feature is obviously detrimental when thinking about multi-dimensional, complex subjects such as politics, education, and ethics. The fact is, most things do not fit neatly into coherent narratives, especially not ones constructed from objectively insufficient information. No matter how coherent your little story sounds, your churchgoing neighbor is probably not a racist, trigger-happy bigot. Knowing that your peer voted for a specific political party does not give you nearly enough evidence to construct a narrative about her. You have no reliable evidence from which to extrapolate further behavior, yet we do this all the time; we deepen divisions through miscategorizations and ad hominem arguments. Unfortunately, the degree to which we are certain about a specific conclusion depends not on the strength of the evidence on which it is based, but on the coherence of the narrative we have created for ourselves. Thus, we must look beyond the narrative that we automatically construct, especially on complex and important subjects.

This is clearly easier said than done. Like I said, this particular bias –WYSIATI– is ingrained in our cognitive psyche; it is a feature of our automatic, subconscious System 1. Asking you not to create a narrative when presented with information would be like asking you not to read a word that I spell out for you.

“I just wrote the letters ‘H-E-L-L-O’ on the board; however, you must not read the letters I just wrote. Think of these letters as nothing more than a random grouping of characters, not a formulation of a word.”

This would be impossible. What can we do, then?

My Solution


1. Recognition (Grammar)

Kahneman’s groundbreaking research suggests that there is not much to be done to reprogram our brains to avoid cognitive biases. However, we can acquire the tools to recognize them and hopefully minimize their effects.

Memorize cognitive biases and logical fallacies. Logical fallacies have been studied at least since the time of Aristotle and are constantly used by political leaders, advertisers, and con men. Yet perhaps only philosophy majors can list them and their uses. Watch any political speech and tally how many logical fallacies are employed. After you finish picking your jaw up off the floor, commit them to memory.

2. The Grind (Logic)

“It is the mark of an educated mind to be able to entertain a thought without accepting it.”  — Aristotle

Know the opposing view so well that you could recreate its argument and defend it if necessary. There is almost always more nuance to the opposing view than the people in your camp would have you believe.

Overconfidence is also a feature of our brain. For the most part we truly believe the narratives we construct, even to the point of ignoring and discrediting contradictory evidence (see also “Confirmation Bias”). Don’t trust your conclusions without rigorous testing. Welcome contradictory evidence. Come up with plausible conclusions that correspond to the evidence, not the other way around. Always realize that your conclusions may be incomplete or incorrect, and be ready to adjust them accordingly.

3. Debate (Rhetoric)

Whether it be through speech or the written word, seize every opportunity to reiterate and defend your arguments. Seek debates, seek criticism, be at the mercy of peer reviews.  I have found that the mere act of outlining or presenting my arguments forces me to analyse them again, especially if I am presenting them to a peer or a group of my peers.

This three-step process is the one that has worked for me. I have been following it for several months now and have already noticed drastic changes in how I process information and how I make decisions.

If you haven’t figured it out already, this “process” is modeled directly on the Trivium Method for critical and creative thinking. I will be writing more about it soon!

Further Reading

Thinking, Fast and Slow by Daniel Kahneman

Wikipedia: Cognitive Bias

Overview of the Trivium Method