‘Mind traps’ that can trick you (and those who judge your actions)

The jury didn’t believe the Boston cop. During a foot pursuit in which multiple officers chased multiple suspects in a shooting, he’d run right past a spot where fellow LEOs were mercilessly beating a black man, but he swore he hadn’t seen a thing, didn’t even know the other officers or their victim were there.

The jury convicted him of perjury and obstruction of justice, sensing a blatant example of the “police code of silence” that protects wrongdoers with badges. The officer was sentenced to 34 months, and fired. Later an appeals court overturned the verdict, but on a legal technicality that had nothing to do with what he’d seen or not seen.

So, did he get away with “testilying”? After all, how could a cop — a “trained observer” — not have noticed major illegal action that was plainly in his field of vision?

Presented in greater detail, this case opens the first chapter of a fascinating new book about tricks of the human mind. Written for a lay audience by two of the nation’s leading cognitive psychologists, Dr. Christopher Chabris and Dr. Daniel Simons, The Invisible Gorilla is “a lively tour of the brain’s blind spots” that has profound implications for law enforcement and the people who investigate and judge uses of force and other police behavior.

A world-renowned expert on memory, Dr. Elizabeth Loftus, author of the standard-setting book Eyewitness Testimony: Civil and Criminal, calls The Invisible Gorilla a “must-read” for anyone “curious about how your mind really works.”

Across nearly 300 pages, Chabris and Simons examine a series of commonly believed “truths” about human cognition that are, in fact, illusions. Some of these have been touched on in previous issues of Force Science News and are explored in depth during the 5-day certification course in Force Science Analysis.

“We collectively assume...that we pay attention to more than we do, that our memories are more detailed and robust than they are, that confident people are competent people, that we know more than we really do, that coincidences and correlations demonstrate causation, and that our brains have vast reserves of power that are easy to unlock,” the authors state. “But in all these cases, our intuitions are wrong.”

Indeed, the authors claim, “virtually no realm of human behavior is untouched by everyday illusions.”

“Unless that is recognized at all levels in the criminal justice system,” says Dr. Bill Lewinski, executive director of the Force Science Institute, “grave consequences can result. Officers involved in controversial uses of force may be unjustly accused and convicted of wrongdoing, and trainers may send officers onto the street unnecessarily under-prepared to defend their lives.

“The little-understood facts that Simons and Chabris report should be required reading from the academy to the courtroom.”

Buttressed in the book by detailed documentation, here are just a few of the subjects the researchers illuminate:

Inattentional Blindness
Most people believe that we see — and thus should be able to remember — everything that occurs within our visual scope, recording and storing it much as a video camera would. In truth, our brain tends to screen out distractions — even those we are looking directly at — when we are intently concentrating on another object or activity.

One of the most startling and famous tests of this reality was created and is still used by Chabris and Simons and other researchers (including Lewinski in FSI’s certification course).

Test subjects are told to watch a 1-minute film in which several young adults bounce and pass a basketball, and to count the number of times certain of these players toss the ball. About midway through the action, a very intrusive event occurs center-screen and lasts for about 9 seconds.

In debriefings afterward, about half the test subjects consistently and firmly deny having seen the intrusion, even though eye-trackers confirm that they looked right at it, on average, for a full second. They were “concentrating so hard” on their counting assignment that the disruption never consciously registered.

Chabris and Simons believe this “inattentional blindness” may be what the Boston officer experienced. When he passed by the beating scene, he was intently focused on apprehending the suspect he was chasing, who was starting to scale a fence and potentially escape.

“He could have been right next to the beating, and even focused his eyes on it, without ever actually seeing it,” the authors assert. “When people devote their attention to a particular area or aspect of their visual world, they tend not to notice unexpected objects, even when those unexpected objects are salient, potentially important, and appear right where they are looking.

“Looking at something does not guarantee that you will notice it. In fact, we are aware of only a small portion of our visual world at any moment.” And this “wiring” of our visual reception “is almost entirely insulated from our conscious control,” regardless of personal intelligence, abilities, or “capacity for attention.” To eliminate inattentional blindness, “we effectively would have to eliminate focused attention.”

Morphing Memory
People assume that memories remain “consistent and stable over time,” Chabris and Simons write. But the truth is that our recollections can become stunningly distorted.

This is as true for “flashbulb memories” — those associated with “surprising and emotionally significant events” (an OIS, for example) — as for those of run-of-the-mill occurrences. Ironically, it is these most vivid memories that we are convinced are most accurate.

Consider these (among other) memory morphs that Chabris and Simons document:

• Trying to interpret or “make sense” of a scene or event, your mind may color or even dictate what you remember about it. “Each time we recall a memory, we integrate whatever details we do remember with our expectations for what we should remember.”
• Details in our memories “sometimes shift from one time to another or from one event to another.”
• “Rich details you remember are quite often wrong,” but you vividly “recall” them and believe them because “they feel right.” We trust details that seem to “fit in,” what “plausibly might have happened rather than what did happen.”
• Unbeknown to us, “our memory systems are constantly striving” to make “a more compelling story,” and we may unwittingly add details that “improve” the narrative.
• If we change our beliefs about something, we may also change the memories associated with it. Yet the odds are 3 to 1 that we will not realize we’ve made the alterations.
• Researchers have even documented what’s called a “failure of source” memory; that is, you internalize someone else’s memory, lose track of the true source, and falsely but sincerely believe you are retrieving “a record of something that happened to you rather than someone else.”
• When memories are easy to recall (“fluent” is the research term), we mistakenly think that means they are accurate, complete, and permanent. “We don’t experience all the distortions that happened to them after they were first stored.”
• A memory can be so strong that even documentary evidence that it never happened doesn’t change what we remember.

Because these disconnects are beyond our conscious awareness, “we mistakenly believe that our memories are accurate and precise,” yet in reality “what we retrieve often is based on gist, inference, and other influences,” the authors say.

Unfortunately, the fallibility of memory is so poorly understood among the population at large that people often “impugn the intentions and motivations of those who are innocently misremembering,” Chabris and Simons conclude.

Lewinski points out that this misperception can have dire consequences in OIS investigations, where officers may appear to be deliberately deceptive because their recollections are inconsistent with those of other witnesses or with physical evidence.

Confidence vs. Competence
Of special interest to trainers will be Chabris and Simons’s discussion of confidence vs. competence. We tend to mistake the former for the latter, in ourselves and others, when in fact there is commonly a dangerous link between confidence and incompetence, the authors claim.

Collectively, we consider ourselves superior when we’re not. More than 60 percent of Americans and 70 percent of Canadians, for example, believe they are “above average” in intelligence — a statistical impossibility. Men, especially, are given to this narcissistic exaggeration. We also believe — falsely — that we have great unused brain resources that are just waiting to be tapped to enhance our intelligence even more.
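
To see the arithmetic behind that impossibility: when scores are distributed symmetrically around their mean, as IQ tests are deliberately scaled to be, only about half of a population can land above the average. Here is a minimal Python sketch of that point (an illustration for this article, not something from the book):

```python
# Minimal sketch (not from the book): with symmetric, IQ-like scores,
# only about half of a population can be "above average."
import random

random.seed(1)
scores = [random.gauss(100, 15) for _ in range(100_000)]  # simulated IQ-style scores

mean = sum(scores) / len(scores)
share_above = sum(s > mean for s in scores) / len(scores)

print(f"mean = {mean:.1f}, share above the mean = {share_above:.3f}")
# Prints a share near 0.5 -- so when 60-70 percent of people rate
# themselves "above average," some of them must be overestimating.
```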

“We tend to think that our good performances reflect our superior abilities, while our mistakes are ‘accidental,’ ‘inadvertent,’ or a result of circumstances beyond our control, and we do our best to ignore evidence that contradicts these conclusions,” the authors write.

Thus we “overestimate our own qualities, especially our abilities relative to other people,” and we “interpret the confidence — or lack thereof — that other people express as a valid signal of their own abilities, of the extent of their knowledge, and of the accuracy of their memories.” Yet “confidence and ability can diverge so far that relying on the former becomes a gigantic mental trap, with potentially disastrous consequences.”

People who are least skilled — such as those who are new at a given task — are the most likely to think better of themselves than they should. Researchers call this the “unskilled-and-unaware effect.” Because these subjects don’t realize or acknowledge their deficiency, “they are unlikely to take steps to improve their ability.” Plus, because we tend to believe and trust confident people, their unwarranted self-assurance may fool others into overestimating their ability.

Sustained training can be a remedy for confidence/competence imbalance. Researchers have found that teaching people to perform a task better significantly reduces their overconfidence and makes them “better judges of their competence,” Chabris and Simons report. “As we study and practice a task, we get better at both performing the task and knowing how well we perform it.”

True competence “helps to dispel the illusion of confidence. The key, though, is having definitive evidence of your own skills — you have to become good enough at what you do to recognize your own limitations.” And you should always harbor a “not sure” component that will motivate you to keep learning.

For investigators and prosecutors, Chabris and Simons offer some sobering data regarding overconfidence among eyewitnesses. Sample: “Mistaken eyewitness identifications, and their confident presentation to the jury, are the main cause of over 75 percent of wrongful convictions that are later overturned by DNA evidence.”

Causation
Police critics who make or accept assertions such as “TASERs kill” should be force-fed the Chabris and Simons chapter titled “Jumping to Conclusions.” In cogent terms, it addresses “the illusion of cause” — how non-experts confuse coincidences, correlations, and mere chronology with causal relationships.

The human mind, preferring to “perceive meaning rather than randomness” in what it encounters, has a “hyperactive tendency to spot patterns,” the authors explain. “These extraordinary pattern-detection abilities often serve us well, enabling us to draw conclusions in seconds (or milliseconds) that would take minutes or hours if we had to rely on laborious logical calculations.

“Unfortunately, they can also lead us astray,” allowing us to “perceive patterns where none exist” and “to infer cause rather than coincidence.” Two common misleading inferences in pattern perception are that “earlier events cause later ones” and that when 2 events happen together, “one must have caused the other.”
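
A standard statistics illustration of that second inference, not an example drawn from the book, is a hidden common cause: two measures that rise and fall together although neither causes the other. A minimal Python sketch with made-up variables:

```python
# Hedged sketch (not from the book): a hidden common cause makes two
# unrelated measures correlate, inviting "one must have caused the other."
import random

random.seed(1)
N = 100_000
heat = [random.random() for _ in range(N)]            # hidden factor: how hot the day is
ice_cream = [h + random.gauss(0, 0.2) for h in heat]  # sales rise with heat
drownings = [h + random.gauss(0, 0.2) for h in heat]  # swimming (and drownings) rise with heat

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

print(f"correlation(ice_cream, drownings) = {corr(ice_cream, drownings):.2f}")
# Strongly correlated, yet neither causes the other; the heat drives both.
```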

Unconsciously, we are primed to see patterns that fit our beliefs and “well-established expectations.” When that happens, we may be so confident that we have found a causal link that we fail “to notice more plausible alternative explanations,” the authors say.

Moreover, they point out, anecdotes — stories people hear — tend inherently to be more memorable and persuasive than dry statistical data. “Our brains evolved under conditions in which the only evidence available to us was what we experienced ourselves and what we heard from trusted others. Our ancestors lacked access to huge data sets, statistics, and experimental methods. By necessity, we learned from specific examples,” so examples “lodge in our minds, but statistics and averages do not.”

It can be “difficult to overcome a belief that is formed from compelling anecdotes,” the authors concede, just as it is difficult to convince people that a correlation (an association) is not necessarily a cause. And it is difficult to counter the influence of news reporting, which “often gets the causation wrong in an attempt to make the claim more interesting or the narrative more convincing.”

Yet challenging as the task may be, Chabris and Simons emphasize that “the only way — let us repeat, the only way — to definitively test” for causal relationships is to conduct scientific experiments.
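
To make concrete why experiments can settle what correlations cannot, here is a hedged Python sketch (an illustration, not the authors’ own example). A hidden trait drives both who enrolls in a fictitious program and how well they do, so observational data show a large “effect” of a program that does nothing; random assignment reveals the truth:

```python
# Hedged sketch (not the authors' example): randomization severs the link
# between hidden traits and who gets the "treatment," so outcome differences
# can be read causally. Here the program truly does nothing.
import random

random.seed(1)
N = 100_000
motivation = [random.random() for _ in range(N)]          # hidden trait driving outcomes
outcome = [m + random.gauss(0, 0.1) for m in motivation]  # outcome depends only on the trait

def mean_diff(outcomes, in_group):
    """Average outcome for the in-group minus the out-group."""
    n_in = sum(in_group)
    avg_in = sum(o for o, g in zip(outcomes, in_group) if g) / n_in
    avg_out = sum(o for o, g in zip(outcomes, in_group) if not g) / (len(in_group) - n_in)
    return avg_in - avg_out

# Observational data: motivated people self-select into the program.
took_program = [m > 0.5 for m in motivation]
print(f"observational 'effect': {mean_diff(outcome, took_program):.3f}")  # large

# Randomized experiment: a coin flip decides who gets the (inert) program.
assigned = [random.random() < 0.5 for _ in range(N)]
print(f"randomized effect:      {mean_diff(outcome, assigned):.3f}")      # near zero
```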

“People do not necessarily accept the results of scientific studies, even when the data are overwhelming,” the authors note sadly. But, in Lewinski’s words, “that should never deter the relentless pursuit of the truth.”

Although The Invisible Gorilla is by no means tailored exclusively for law enforcement, Chabris and Simons do make note of the urgent “need for reform” in the legal system’s understanding of how the human mind works.

“The police, the witnesses, the lawyers, the judges, and the jurors are all too susceptible to the illusions” the book explores, the authors write. “Because they are human, they believe that we pay attention to much more than we do, that our memories are more complete and faithful than they are, and that confidence is a reliable gauge of accuracy.

“The common law of criminal procedure was established over centuries in England and the United States, and its assumptions are based precisely on mistaken intuitions like these.”

Force Science advances expert decision-making, performance, and honest accountability in public safety. Its team of physicians, attorneys, policing experts, psychologists, and human performance researchers focuses on understanding and optimizing how civilians and law enforcement make decisions and perform in high-stress situations.