Book Review: The Invisible Gorilla: How Our Intuitions Deceive Us

I did the audiobook version, with great narration from Dan Woren.

The human mind, I like to say, is the greatest renewable resource we have at our disposal. Thanks to our evolved minds, I can write this book review about human minds on a laptop invented by human minds and disseminate it across an internet conceived of by human minds. And yet, our minds are also easily deceived (not even nefariously so!), swimming in an illusory reality, often out of necessity, and that deception has real-world consequences. In their 2010 book, The Invisible Gorilla: How Our Intuitions Deceive Us, two psychologists, Christopher Chabris and Daniel Simons, explain the myriad ways these mental fallibilities create the world we live in, or at least the world we perceive we live in.

Consider all of the ways we use our brains: to access memories, to pay attention to the world around us, to express our knowledge, to ascertain our own potential, and to untangle the complexities of the world. We are far worse at all of them than we can possibly imagine, even as our brains deceive us into thinking otherwise. Humility is the ultimate theme of Chabris and Simons's book. We are overconfident (and confidence itself is a manifest illusion!) about every one of these facets of our mind.

In a world-famous 1999 study, the book's namesake, Chabris and Simons showed that our minds can be easily duped by an attention task. They asked subjects to watch a video of people in two different colored shirts passing basketballs, directing the subjects to count the passes between people in white shirts. No matter where the test subjects are from, about half miss someone in a gorilla suit walking into the shot, precisely because their attention is narrowed on the task at hand (counting the passes). We assume we can always see what is right in front of us, but we don't. One of my favorite examples of this: as writers editing our own work, we miss simple typos and grammatical errors all the time because our brains just fill in the gaps, gliding past such errors. Those who missed the gorilla, though, are flabbergasted that they could have done so. Importantly, this isn't a test of intelligence; those who missed the gorilla aren't dumber than those who saw it. It's possible, for example, that those who missed it weren't focusing on the task as strongly as they thought they were. This is the first of the six illusions the book covers: the illusion of attention.

  1. Inattentional blindness, also called the illusion of attention. We think we can always see what's right in front of us, but nobody can truly multitask. The brain focuses on one thing or the other, not both. Expertise can help somewhat (imagine asking an NBA player to watch the gorilla video and see if they notice the gorilla), but even then, the issue isn't so much expertise as the human mind's inability to anticipate the unexpected. In other words, what allows us to move through day-to-day life is predictability, which I've blogged about before. Predictability allows us to function, plan for the future, and largely avoid disaster even when we aren't fully paying attention to what's in front of us. That's why the unexpected, definitionally rare, event can be tragic: a ball bounces into the street with a child chasing it, and we either hit the child or swerve at the last minute. We aren't expecting such an event when driving through a neighborhood, and we can't always maintain ironclad vigilance. Our minds wander! Some psychologists, including Chabris and Simons, wondered if the gorilla test was an artifact of video. What would happen in a real-world situation? Such a study was done, too: subjects giving directions to a stranger on the street didn't notice when the stranger was swapped for a different person, even one of a different gender. This is called change blindness. We need focused attention.
  2. The illusion of memory. This one fascinates me because, as Chabris and Simons note, we seem to understand the fallibility of our memory in short-term, inconsequential day-to-day anecdotes, like forgetting why we walked into a room, or where we left our keys, or, in the example Simons and Chabris use, our limited ability to recall 15 words the authors just told us. At best, we can maybe recall seven or eight. Most intriguing, because the words all relate to sleep without the word itself appearing, some people will swear it was among the 15 words presented! Yet, when it comes to often consequential, long-term memories, we become overconfident in our mind's ability to access those memories. And I say "access" for lack of a better word, as that itself is a misnomer, as if our memories were merely sitting in a vault waiting for us to grab them when the moment arises. It makes no sense for our brains to catalog all the things … ever! The most consequential way in which overconfidence in memory can have deleterious effects on others is in our criminal justice system, with eyewitness testimony that often relies on fallible (and malleable!) memory. The authors use the example of a rape victim who was convinced that the person she identified was her rapist; he was exonerated only after serving years in prison. A less consequential, albeit still fascinating, example Chabris and Simons explore is film continuity. Script supervisors are supposed to catch continuity errors, but even in famous, award-winning films, they abound. They largely go unnoticed because most people watching the film simply won't notice them! There is a cottage industry around pointing them out, but that's not the common moviegoing experience. This question of memory did make me wonder, however, where the line is between faulty memory and lying. Are you lying if you truly believe the contents of your faulty memory? Lying has to be intentional, I would think.
  3. The illusion of confidence. Imposter-syndrome types notwithstanding (and honestly, even them, depending on the context), people overestimate their cognitive abilities. We think we must be underrated in some capacity at something; we're biased and self-interested in this estimation, after all. Overconfident witnesses relying on fallible memory can put people in prison, as the previous illusion demonstrated. We also tend to think someone showing confidence must be right, or that the most confident job applicant is necessarily the best applicant. In group dynamics, it's been shown that the first solution voiced is often the one the group goes with. Confidence tricks us! Confidence Man is the title of Maggie Haberman's book about Donald Trump, and it perfectly encapsulates how his braggadocious confidence allowed him to repeatedly fail upward, such as by receiving loans he was not actually good for.
  4. The illusion of knowledge. This is one of my favorite illusions because it made me think of one of my favorite essays, Leonard Read's "I, Pencil." We don't know how common objects, like a pencil or a zipper, work, and yet we are overconfident in our ability to explain them. The world is complex! Even explaining how a pencil is made and how it works to someone who has never encountered one would be beyond our capacity. But at first blush, we probably think we can do it! Chabris and Simons's example is having subjects draw a bicycle and then explain all the intricacies of how it works. Doing so exposes the discrepancy between the common drawing of a bike and the actual complexity of all that goes into one.
  5. The illusion of cause. Another way of saying this is the common refrain you may have heard before: correlation is not causation. We can put two random things together to show a seeming correlation, but that doesn't mean Thing A is causing Thing B. The most pernicious, unfortunate example of this is autism and vaccines. The medical field became better at diagnosing autism, and thus reported cases of autism increased; those under the illusion of cause put that fact next to the increase in childhood vaccination to argue that vaccines, like the one for measles, were causing autism, when there is absolutely zero scientific or medical basis for such a claim. Indeed, vaccines are one of the greatest inventions in human history and keep us safe from dangerous diseases like measles. They do not cause autism. The belief took off, though, because our brains are wired to find patterns and to suffer under the illusion of cause, trying to suss out an easy cause for complex phenomena. Interestingly, the reason doctors are so effective as experts is that their experience makes them good at weeding out the unlikely, irrelevant explanations for a medical malady. But that also means, because of the prior illusions, they can miss the unexpected (rare) disease. If they're not expecting to see something unexpected on the X-ray of your lungs, for example, they might miss it, though not maliciously or intentionally. Chabris and Simons warn us to beware of belief becoming "because," and of inferring cause from coincidence (coincidence is a big catalyst for conspiracy-theory thinking!). No one cause, even a new high-minded CEO at a company or a leader in politics, can explain all the things. Business, politics, and life are too complex! Unfortunately, we prefer an anecdote, like Jenny McCarthy claiming her child was given autism by a vaccine, over science.
Anecdotes give us a narrative, and we love a narrative because it conforms to our need for pattern and understanding, all of which aids our ability to remember it!
  6. The illusion of potential. No, you are not using only 10 percent of your brain; that's another one of those myths that won't die. Evolutionarily speaking, Chabris and Simons point out, if we were wasting so much of our brain's potential, natural selection would have reduced the size of our brains to better help us get through the birth canal. But this illusion and the resulting belief complement our overconfidence and our desire to think that if only we could tap into the vast reservoirs of our brain, we could be geniuses or accomplish more. Hypnosis is also born of this idea, and it goes hand-in-hand with the illusion of memory. Another myth of potential that won't die is the Mozart effect: play Mozart's music for your baby (or even for the fetus while still pregnant), and they will become a genius, or at least be smarter. The industry around this myth exploded in the mid-1990s and is still with us today. There is no basis for it! In fact, Chabris and Simons explain that the original 1990s study the myth is based on concerned college-aged adults, not babies! Even then, the study doesn't hold up to replication, i.e., it's bunk. But there is a kernel of intrigue in this myth. Our brains do appear to operate better when we are listening to something we like, such as pop music (as one study showed), or Mozart if we like Mozart. Silence can steer our brains off task and in different directions. Your mood also just improves when you hear what you like, naturally! All of this is bound up with the hope that we can train our brains to be better and smarter. Cognitive training, such as the idea that doing puzzles will keep you sharp and stave off cognitive decline and the erosion of our gray matter (something I'd always assumed was true), is also largely bunk. What does help our cognitive functioning and stave off the depletion of our gray matter is … aerobic exercise. Just walk a few times a week.
It doesn’t even have to be strenuous. Walking gets the heart pumping and thus, blood flowing to the brain. Whereas sitting down doing a puzzle … is sitting down doing a puzzle.

Chabris and Simons are not arguing against intuition wholesale — more so against the common myths I've already touched on, plus the belief in subliminal messaging, our ability to perceive someone staring at us, and quick fixes for becoming smarter — but rather arguing that intuition is not as reliable as thoughtful analysis. What makes us smart is knowing when to use intuition and when to use thoughtful analysis.

They also offer three approaches to lessen the damaging impact of these illusions:

  1. Learning how the illusions work by, shocker, reading their book! No, it's a fair point. Awareness of the illusions diminishes their impact, although, of course, staying cognizant of them all the time would be impractical, if not impossible.
  2. Enhancing our cognitive abilities, whether through a concerted effort to get better at a specific task (rather than the brain overall) or through walking.
  3. Utilizing technology to help us see through these illusions.

If there were a one-sentence thesis for the book, I think it's fair to say Chabris and Simons believe intuition is poorly adapted to solving problems in the modern world, and that we must rely on thoughtful analysis to make better decisions and live better lives in a better world. Their go-to example is that our brains did not evolve to travel 60 mph in a vehicle and attend to everything around us, especially something coming at us at that speed. Similarly, our species evolved through word-of-mouth oral traditions, a.k.a. anecdotes and stories, not scientific rigor. Heck, until the modern world, we didn't even have the aggregate statistics that demonstrate, for example, how unbelievably safe vaccines are; even now, our brains still follow the anecdotes and stories, not the statistics.

I don’t know if I can help you with missing the gorilla or not, but one thing I hope you don’t miss is this book. I learned a lot and, of course, challenged my own preconceived notions.
