There is a severe problem in today’s society that forces you to pick sides. Whether it be sports or politics, society leaves little room for nuance and depth. There seems to be a tendency to sharply dichotomize the world into black or white, completely neglecting the wide spectrum of colors that makes up reality. This phenomenon can be explained, in part, by the acronym “WYSIATI”.
WYSIATI is an acronym coined by psychologist and economic theorist Daniel Kahneman, and it stands for “What You See Is All There Is”. Kahneman uses the term to describe a human bias that further illustrates the fact that decision making is not entirely based on rational thought. In fact, according to Kahneman’s work, our decision-making processes are much less rational than we’d like them to be. For example, imagine I were describing a friend to you and I said that this friend is “kind, nurturing, and patient”. Imagine I then asked you if this friend would be a good parent. You immediately have an answer for that, don’t you? Why is that? You really don’t know much about the friend I was describing other than the meager information I decided to share with you. What if I added “manic depressive” to my list of adjectives? Would you have come to the same conclusion?
Kahneman explains that the human mind, marvelous as it is, is a machine for jumping to conclusions using insufficient, unreliable, or unverified information. This feature of our brain is predictable and reproducible. Moreover, we seem to be very confident about the conclusion that pops into our head from this sparse information. It is not very difficult to see the evolutionary benefit of this feature. If early Homo sapiens saw the brush moving in unusual patterns, it benefited him more to jump to the conclusion that a predator was hiding in the brush than to pause and seek more evidence.
However, this cognitive feature is obviously detrimental if one is to think about multi-dimensional, complex subjects such as politics, education, and ethics. The fact is, most things do not neatly fit into coherent narratives, especially not ones constructed from objectively insufficient information. No matter how coherent your little story sounds, your church-going neighbor is probably not a racist, trigger-happy bigot. Knowing your peer voted for a specific political party does not give you nearly enough evidence to construct a narrative about her. You have no reliable evidence on which to extrapolate further behavior, yet we do that all the time; we deepen divisions through miscategorizations and ad hominem arguments. Unfortunately, the degree to which we are certain about a specific conclusion depends not on the strength of the evidence on which it is based, but on the coherence of the narrative we have created for ourselves. Thus, we must look beyond the narrative that we automatically construct, especially on complex and important subjects.
This is clearly easier said than done. Like I said, this particular bias in decision making, WYSIATI, is ingrained into our cognitive psyche; it is a feature of our automatic, subconscious System 1. Asking you to not create a narrative when presented with information would be like asking you not to read a certain word if I spell it out for you.
“I just wrote the letters ‘H-E-L-L-O’ on the board; however, you must not read the letters I just wrote. Think of these letters as nothing more than a random grouping of characters and not a formulation of a word.”
This would be impossible. What can we do, then?
1. Recognition (Grammar)
Kahneman’s groundbreaking research seems to suggest that there is not much to be done to reprogram our brains in order to avoid cognitive biases. However, we can acquire the tools to be able to recognize them and hopefully minimize their effects.
Memorize cognitive biases and logical fallacies. Logical fallacies have been studied at least since the time of Aristotle and are constantly used by political leaders, advertisers, and con-men. Yet, perhaps only philosophy majors can list them and their uses. Watch any political speech and tally how many logical fallacies are employed. After you finish picking your jaw off the floor, commit them to memory.
2. The Grind (Logic)
“It is the mark of an educated mind to be able to entertain a thought without accepting it.” — Aristotle
Know the opposing view so well that you could recreate its argument and defend it if necessary. There is almost always more nuance to the opposing view than the people in your camp would have you believe.
Overconfidence is also a feature of our brain. For the most part we truly believe the narratives we construct even to the point of ignoring and discrediting contradictory evidence (see also “Confirmation Bias”). Don’t trust your conclusions without rigorous testing. Welcome contradictory evidence. Come up with plausible conclusions that correspond to the evidence and not the other way around. Always realize that there is a possibility that your conclusions are incomplete and incorrect and be ready to adjust them accordingly.
3. Debate (Rhetoric)
Whether it be through speech or the written word, seize every opportunity to reiterate and defend your arguments. Seek debates, seek criticism, be at the mercy of peer review. I have found that the mere act of outlining or presenting my arguments forces me to analyze them again, especially if I am presenting them to a peer or a group of my peers.
This three-step process is the one that has worked for me. I have been following this method for several months now and have already noticed drastic changes in how I process information and how I make decisions.
If you haven’t figured it out already, this “process” is modeled directly on the Trivium Method for critical and creative thinking. I will be writing more about it soon!
–Thinking, Fast and Slow by Daniel Kahneman
–Wikipedia: Cognitive Bias
–Overview of the Trivium Method