This month’s article is excerpted from a talk I gave on this topic; a portion of the talk drew on the following book.
Kida, Thomas. (2006). Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. New York: Prometheus Books.
The book describes six basic mistakes that we make in thinking, and it comes from the Skeptics Society tradition of examining what prevents us from thinking critically. Let’s take a brief look at these mistakes.
- The first is that we tend to prefer stories and anecdotes to statistics or formal proof. The example of this I recall from graduate school is the person who is picking a car. While they may have data telling them a certain car is very reliable, they are more likely to be swayed by a neighbour who tells them about an isolated bad experience with that car. This is part of the “thinking quick” portion of this talk’s title: we like to get the broad sense of something and are much less inclined to evaluate the factual basis of the stories. Weapons of mass destruction, anyone?
- The second is that we tend to seek to confirm, rather than question, our ideas. This is part of the “locking in” that we do in making judgements. We are prone to looking at, and better remembering, information that confirms our beliefs, and we may not even see information that disconfirms them. The final impression we may be left with is that the evidence for our belief is simply overwhelming. The alternative, pondering whether we have missed the point and need to re-tool, is not easy. After the invasion of Iraq, it was found that the evidence that had been carefully selected to support going to war was mostly incorrect. This is a situation where a confirming strategy led to dire consequences. We can also see it in those who believe in psychics or unusual health interventions: they tend to remember the hits and forget the misses. In fact, we are generally prone to remember the times we are successful and forget the times we fail. Failures that are remembered may then be reinterpreted in a way that supports our belief. This is what the Harvard psychologist Ellen Langer calls the “heads I win, tails it’s chance” phenomenon.
- The third mistake is that we rarely appreciate the role of chance and coincidence in shaping events. We are prone to looking for meaning and causality in everything, which is generally good, but sometimes things just happen. What I like to say to my clients, and what is discussed in this book, is that if you flip a coin several thousand times, a run of 20 heads, should one occur, is still just chance, not a pattern. You can see this kind of misplaced sense of causality in athletes wearing “lucky” clothes, and in people putting their life savings into slot machines believing that they have warmed them up. I had a client who had worked in casinos for many years who described a common pattern: people actually win, but are then unable to step away from the tables, and then, as one would expect given the role of chance, end up losing. Gaming feeds off human nature.
- The fourth mistake is that we sometimes misperceive the world around us. It has been said that the saying should not be “I’ll believe it when I see it,” but rather, “I see it because I believe it.” People who believe in ghosts or aliens are more likely to see them. An example in the book was a radio personality stating that he had seen V-shaped vessels in the sky, which resulted in many calls reporting sightings. He had actually made it up. We can see this in the social realm when, for example, we expect bad behaviour from someone and chastise them even when they turn out to behave quite well.
- The fifth mistake is that we tend to oversimplify our thinking. We are inundated with information all the time, and our minds take care of us by paring things down. Sometimes we do that in our decision-making too, as when we assume that, because someone shares some attributes with others we know, they will behave in the same way. This is seen in stereotypes, and in other situations where we judge solely on the basis of information that readily comes to mind.
- Finally, the sixth mistake discussed in this book is that we tend to have faulty memories. Many wrongly assume that memory should work perfectly. One other little nugget I have used with some clients is that about 7 billion people in the world think that they have a memory problem. Some do, of course; many do not. Not only do we selectively store information away, we can also be selective in retrieving it. Remembering is not actually going to a shelf to get the very same book that we put away, but rather a process of reconstructing the memory, of actually rewriting the book to a degree. With time, there is more and more creative writing going on. So, if I form a quick impression of you and lock that in, I may not notice, store away, or recall information that goes against that impression.
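The coin-flip point in the third mistake above can be checked with a quick simulation: in several thousand fair flips, surprisingly long runs of heads turn up routinely by chance alone, with no cause behind them. This is an illustrative sketch only; the 5,000-flip count, 200 repetitions, and fixed seed are my arbitrary choices, not figures from the book.

```python
import random

def longest_head_run(n_flips, rng):
    """Return the length of the longest run of heads in n_flips fair flips."""
    longest = current = 0
    for _ in range(n_flips):
        if rng.random() < 0.5:  # call this outcome "heads"
            current += 1
            longest = max(longest, current)
        else:
            current = 0
    return longest

rng = random.Random(42)  # fixed seed so the sketch is reproducible
runs = [longest_head_run(5000, rng) for _ in range(200)]
print(min(runs), max(runs))  # every trial contains a long run of heads
```

In practice, virtually every 5,000-flip sequence contains a run of ten or more heads somewhere, which is exactly the kind of streak people are tempted to read meaning into.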
In essence, critical thinking does not come naturally to us: we are uncomfortable with the uncertainty required in a search for truth, and we tend to be quick to believe things on the basis of incomplete or inappropriate evidence. What we need to do is think about reasons why our particular judgements could be wrong. Considering the alternatives is one of the most effective methods we have to counter many of our problematic judgement biases, but this takes practice.
* * *