Presentation On Cognitive Errors (Biases)
COGNITIVE AND PERCEPTUAL ERRORS IN THINKING
On the evening of October 30, 1938, a radio play based on H. G. Wells’s novel The War of the Worlds, about a Martian invasion, was broadcast to the nation. Many of the people who listened to the show believed that the invasion was real. Some people even “smelled” the poisonous Martian gas and “felt” the heat rays being described on the radio. Others claimed to have seen the giant machines landing in New Jersey and the flames from the battle. One panicked person told police he had heard the president’s voice on the radio ordering an evacuation.
Our perceptions of the world around us are easily skewed by social influences. Most people underestimate the critical role that cognitive and social factors play in our perception and interpretation of sense data. Although emotion has traditionally been regarded as the culprit when reason goes astray, studies suggest that many of the errors in our thinking are neurological in nature.20 In this section, we’ll be looking at some of these cognitive and perceptual errors.
As a consumer, how can you avoid being taken in by cognitive and perceptual errors used by marketers? See Chapter 10, p. 306.
Our minds are not like blank sheets of paper or recording devices, such as cameras or video recorders, as the empiricists claimed. Instead, our brains construct a picture of the world much as an artist does. Our brains filter our perceptions and fill in missing information based in part on our expectations, as occurred in the broadcast of War of the Worlds.
When the radio show based on the novel The War of the Worlds was broadcast, many of the listeners believed that the invasion was real.
Some skeptics believe that UFO sightings are based on perceptual errors, including optical illusions (see “Analyzing Images: The St. Louis Arch”). In 1969, an Air National Guard pilot spotted what he thought was a squadron of UFOs within several hundred feet of his plane. He later described the UFOs as the color of “burnished aluminum” and “shaped like a hydroplane.” As it turned out, the “squadron of UFOs” was most likely a meteor fireball that had broken up in the vicinity of the plane.21 However, while being able to provide alternative explanations for most UFO sightings makes their existence less probable, we cannot conclude with certainty that all sightings are a result of perceptual errors. We’ll be looking at the issue of the existence of UFOs in the “Critical Thinking Issue: Perspectives on Evaluating Evidence for the Existence of Unidentified Flying Objects” section at the end of this chapter.
The St. Louis Arch
Designed by architect Eero Saarinen, the St. Louis Arch in St. Louis, Missouri, was completed in 1965 on a site overlooking the Mississippi River. Although the height of the arch and its width at the base are both 630 feet, the graceful catenary creates the illusion that the arch is taller than it is wide. Even if we are told that its height and width are the same, we still have great difficulty making the cognitive adjustment to correct what is known as the vertical/horizontal illusion. Because of this optical illusion we also tend to overestimate the height of trees and tall buildings.
- What was your first reaction when you were told that the height and width of the arch were the same? Did they look the same after you were told the dimensions of the arch? Share with the class other optical illusions that you have encountered in architecture or elsewhere.
- Working in small groups, discuss why we might experience optical illusions such as the vertical/horizontal illusion. Discuss what resources you could use in developing your hypothesis (a hypothesis is an educated guess based on evidence and experimentation). Share your hypothesis with the class for analysis.
In an inkblot test such as the one above, a psychologist asks a person to describe what he or she sees. The psychologist uses the descriptions to learn more about a person’s motivations and unconscious drives.
- What do you see when you look at the above inkblot? Why do you think you saw what you did?
- Discuss how the inkblot test illustrates our tendency to impose order on random data. Think of a time when you fell for this error in your everyday life. Come up with two or three critical-thinking strategies you might use to make yourself less prone to being taken in by our tendency to impose meaning on random data.
What tools and strategies do scientists use to minimize perceptual errors? See Chapter 12, p. 369.
Our minds may also distort objects we perceive. A straight stick, when inserted in water, appears to bend. A full moon appears to be much larger when it is near the horizon, a phenomenon that NASA refers to as the “moon illusion.”
Radar photo of Hurricane Katrina (2005) in which some viewers saw what looked like a “fetus facing to the left in the womb,” leading some anti-abortion advocates to conclude that the hurricane was punishment for the presence of abortion clinics in the city.
Misperception of Random Data
Because our brains loathe absence of meaning, we may “see” order or meaningful patterns where there are none. For example, when we look at clouds or see unexplained lights in the sky, our brains impose meaning on the random shapes we see. When we look at the moon, we see a “face,” popularly known as the man in the moon.
One of the most famous examples of this type of error is the “Martian Canals,” first reported as channels by Italian astronomer Giovanni Schiaparelli in 1877. Many astronomers continued to believe in the existence of these canals up until 1965, when the spacecraft Mariner 4 flew close to Mars and took photos of the planet’s surface. No canals showed up in the photos. It turned out that the “canals” were a combination of an optical illusion, the expectation that there were canals, and the brain’s tendency to impose order on random data. Because of our brain’s inclination to impose meaning on random data, we should maintain a stance of skepticism about what we see.
The combination of the error of misperceiving random data with confirmation bias—interpreting data in a way that confirms our cherished views—is illustrated by the next example. After the devastation of New Orleans by Hurricane Katrina in 2005, a group known as the Columbia Christians for Life announced that God’s purpose in sending the hurricane was to destroy the five abortion clinics in the city. Their proof was a radar photograph taken of the hurricane in which they claimed to have seen what looked like “a fetus facing to the left (west) in the womb, in the early weeks of gestation.”22
Stress as well as preconceptions about the world can affect our perception. How many of us, walking alone at night, have seen a person or dog standing in a shadow, only to discover it was a bush or other object?
The memorable-events error involves our ability to vividly remember outstanding events. Scientists have discovered channels in our brains that actually hinder most long-term memories by screening out the mundane incidents in our everyday life.23 However, these memory-impairing channels appear to close down during outstanding events. For example, most American adults recall exactly where they were and what they were doing on the morning of September 11, 2001. However, if you ask people what they were doing on an ordinary weekday two months ago, most would be unable to remember, or would remember only if they could think of something special that happened on that day.
Critical THiNKing in Action
FOOD FOR THOUGHT: PERCEPTION AND SUPERSIZED FOOD PORTIONS
Obesity is becoming an epidemic in the U.S. More than one-third of American adults are obese—more than double the rate in 1980, according to the U.S. Centers for Disease Control and Prevention. Supersized portions of junk food, such as potato chips, hamburgers, and sodas, have been blamed, in part, for this trend.* Do supersized portions lead to supersized people, or is this just all hype so that we can place the blame for our weight problems on Lay’s potato chips and McDonald’s burgers? In fact, studies show that downsizing our food portions does work to keep our weight down because it takes advantage of a perceptual error. Appetite is not a matter of just the physiological state of hunger but also a matter of perception—what we see in front of us. Most of us eat more when the table or our plates are loaded with food.
Humans are not the only species who make this error. When a researcher places a pile of 100 grams of wheat in front of a hen, she will eat 50 grams and leave 50. However, if we put 200 grams of wheat in front of a hen in a similar hunger condition, she will eat far more—83 to 108 grams of wheat or, once again, about half of what is in front of her.** Furthermore, if the food is presented as whole grains of rice rather than cracked rice, in which the grains are one-quarter the size of whole grains, the hen will eat two to three times as much as she would otherwise.
In other words, if you cut down your portion sizes and cut your food into smaller pieces, your brain will think you’re full on less food.
- Many students put on weight in their first year of college, a phenomenon known as the “freshman 15.” Critically evaluate your college environment and ways in which it promotes or hinders good eating habits. Make a list of suggestions for improving the environment so students are not so vulnerable to perceptual errors and overeat as a result. Carry out one of the suggestions or pass it on to someone who is in a position to make the change.
- Examine your own eating habits. Evaluate ways in which being more aware of your thinking process, including inbuilt perceptual errors, can help you to maintain healthier eating habits.
To use another example, airplane crashes and fatalities are reported in the national media, whereas automobile fatalities generally are not. However, per mile traveled, airplane travel is far safer. We’re sixteen times more likely to be killed in an automobile accident than in an airplane accident. In fact, traffic accidents are one of the leading causes of death and disability of people between the ages of 15 and 44.24 However, the memorable-events error exerts such control over our thinking that even after being informed of these statistics, many of us still continue to be more nervous about flying than about driving.
Why do news stories lend themselves to memorable-events errors? See Chapter 11, p. 338.
What methods and techniques do scientists use to minimize personal and social bias? See Chapter 12, p. 369.
The memorable-events error is sometimes tied in with confirmation bias, in which we tend to remember events that confirm our beliefs and forget those that are contrary to our beliefs. A popular belief in the United States is that “death takes a holiday” and that terminally ill patients can postpone their death until after an important holiday or birthday. In fact, this belief is based purely on wishful thinking and anecdotal evidence. In an analysis of the death certificates of more than a million people who died from cancer, biostatisticians Donn Young and Erinn Hade found no evidence that there is a reduction in death rates prior to a holiday or important event.25 Personal and social beliefs are remarkably strong even in the face of empirical evidence that logically should be devastating. When their results were published, Young and Hade received several angry e-mails criticizing them for taking away people’s hope.
Statistically, there is a far greater chance per mile traveled of being killed in a car accident than in an airplane crash, yet most people have a greater fear of flying.
What is the probability that two people in your class have a birthday on the same month and day? Most people guess that the probability is pretty low. In fact, in a class of 23, the probability is about 50 percent. In larger classes, the probability is even higher. When we misestimate the probability of an event by a huge margin, we are committing probability error.
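The 50 percent figure for a class of 23 can be checked directly. Here is a short Python sketch of the standard calculation, assuming 365 equally likely birthdays and ignoring leap years (the function name is our own, for illustration):

```python
from math import prod

def same_birthday_probability(n: int) -> float:
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely birthdays and ignoring leap years."""
    # First find the probability that all n birthdays are distinct:
    # 365/365 * 364/365 * 363/365 * ... for n people in turn.
    p_all_distinct = prod((365 - k) / 365 for k in range(n))
    # The complement is the probability of at least one shared birthday.
    return 1 - p_all_distinct

# A class of 23 is right around the 50 percent mark, as the text notes,
# and larger classes quickly push the probability toward certainty.
print(round(same_birthday_probability(23), 3))
print(round(same_birthday_probability(60), 3))
```

Most people's intuition compares 23 people against 365 days; the calculation instead counts the 253 possible pairs among 23 people, which is why the probability is so much higher than we expect.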
Humans are notoriously poor at determining probability. We are inclined to believe that coincidences must have paranormal causes when actually they are consistent with probability. For example, you are thinking of a friend whom you haven’t seen for a year when the phone rings and it’s your friend on the other end of the line. Are you psychic? Or is it just a coincidence? You’ve probably thought of your friend hundreds or even thousands of times over the course of the past year without receiving any phone calls, but we tend to forget such times because nothing memorable occurred.
According to the Association for Psychological Science, 1.2 percent of the adult population are pathological gamblers and at least another 2.8 percent are problem gamblers.
One of the most insidious forms of probability error is gambler’s error—the erroneous belief that previous events affect the probability in a random event. Research suggests that gambling addiction is based on gambler’s error. In a study, participants were invited to think aloud while gambling. Of the verbalized perceptions, 70 percent were based on erroneous thinking such as “The machine is due; I need to continue,” “Here is my lucky dealer,” “Today I feel great; it is my lucky day,” and “It’s my turn to win.” These statements reveal a failure to understand the random nature of probability.
When questioned about their verbalizations, nonproblem gamblers realized that their beliefs were wrong. They were able to use accumulated evidence to critically evaluate and modify their perceptions. Problem gamblers, in contrast, processed the evidence much differently. They believed what they had said and interpreted their occasional random wins as confirming their belief that the outcome of a game can be predicted and controlled. The solution? Work to improve problem gamblers’ critical-thinking skills. By making gamblers aware of their erroneous perceptions and the reasons why they continue to cling to these beliefs, clinicians work to help gamblers overcome their addiction.26
Gambler’s error and an addiction to gambling are based on a misunderstanding of the random nature of probability.
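The claim that "the machine is due" is easy to test empirically. The following minimal simulation (our own illustration, using a fair coin as a stand-in for any random game) compares the overall win rate with the win rate immediately after a losing streak; if gambler's error were correct, the second number would be higher:

```python
import random

random.seed(1)  # fixed seed so the simulation is reproducible

# Simulate 200,000 plays of a fair game (win probability exactly 0.5).
flips = [random.random() < 0.5 for _ in range(200_000)]

# Collect the outcomes that immediately follow three losses in a row.
after_streak = []
losing_streak = 0
for won in flips:
    if losing_streak >= 3:       # the previous three plays were all losses
        after_streak.append(won)
    losing_streak = 0 if won else losing_streak + 1

overall = sum(flips) / len(flips)
post = sum(after_streak) / len(after_streak)
print(f"win rate overall: {overall:.3f}, after 3 straight losses: {post:.3f}")
```

Both rates come out close to 0.5: a random game has no memory, so a losing streak tells us nothing about the next play.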
There are several types of self-serving biases or errors that impede our thinking and pursuit of truth, including:
- The misperception that we are in control
- The tendency to overestimate ourselves in comparison to others
- The tendency to exaggerate our strengths and minimize our weaknesses
We are predisposed to believe that we are in control of events that are outside our control. “I knew it would rain today,” you groan. “I didn’t bring my umbrella.” Recently, the Powerball lottery jackpot reached over $100 million. I was standing in line at a mini-mart when I overheard the following conversation between the people in front of me, who were waiting to buy lottery tickets.
Person 1: “What are you going to do? Are you going to pick your own numbers or let the computer do it for you?”
Person 2: “Pick my own. It gives me a better chance of winning.”
People who are poor critical thinkers may fall prey to more than one error in thinking in the same situation. In this case the control error was compounded by the probability error, which we discussed earlier. Although logically we know that lottery numbers are selected randomly, many of us also believe that choosing our own numbers—especially using our “lucky” numbers—increases our chances of winning. In fact, 80 percent of winning lottery tickets have numbers randomly generated by the computer, not so-called lucky numbers picked by the winners.27
Following the disastrous April 2010 oil rig explosion in the Gulf of Mexico, British Petroleum (BP) engaged in self-serving bias by grossly underestimating the amount of crude oil that flowed from the disabled well into the Gulf. BP also overestimated its control of the situation and its ability to stop the oil flow and clean up the oil spill without outside help.
The misperception that we are in control of random events also plays out in superstitious behavior such as wearing our lucky shirt during a big game or bringing a good-luck charm to an exam. Before a game, most college and professional athletes engage in ritualistic superstitious behavior such as using a particular color shoelace or tape. Some baseball players sleep with their bats to break out of a hitting slump or to keep up their batting average. To some extent, the belief that we are in control can boost our confidence in achieving our goals. In fact, ritualistic behaviors have been found to have a calming effect on athletes before a game.
However, if we carry the belief that we are in control too far, it can distort our thinking and lead to poor decisions in our lives. The self-serving bias, and misjudgment about our ability to handle a challenge, can work against our rational self-interests. For example, in the case of the British Petroleum oil leak, BP lost billions of dollars, as well as the public’s confidence, because of its initial erroneous belief that it was in control of the situation and didn’t need outside help. Thousands of people have died in wildfires and hurricanes, despite repeated warnings to evacuate, because they thought they were in control of the situation and could ride out the danger.
This error is also expressed in the often-heard cliché “You can do anything you want if you put your mind to it,” the implication being that if only we wanted to enough, we would have perfect control. Self-help gurus have become wealthy catering to this self-serving error. In her book The Secret (2006), Rhonda Byrne claims to have found the secret to happiness in what she calls “the law of attraction.” According to Byrne, each of us has complete control over what happens to us in our lives. If we think positive thoughts, then like a magnet, we will attract whatever we want—whether it be a parking spot, a million dollars, a sexy figure, or a cure for cancer. The downside is that if we are not successful in getting what we want, then we have only ourselves and our negative thinking to blame.
The belief that we are in control of situations where we actually have little or no control can contribute to irrational guilt or posttraumatic stress syndrome.28 A survivor of a traumatic event may believe that he or she should have been able to predict and do something to prevent an event such as sexual abuse, domestic violence, or the death of a loved one, especially an accidental or suicidal death.
Although genetic, physical, and environmental factors play a role in the onset of depression, the belief that we should be in control of our lives can also contribute to depression (see “Critical Thinking in Action: Irrational Beliefs and Depression”). People who are depressed may cling to the irrational belief that the only alternative to not having perfect control is having no control. Because they feel they lack any control over their lives, they tend to attribute their misfortune or sadness to other people’s actions. A side effect of this negative behavior is that their behavior often alienates other people, thereby confirming a second irrational belief common to depressed people that they are worthless and unlikable. Thus, their distorted expectations lead to a self-fulfilling prophecy, a cognitive error we’ll be studying in the next section.
Critical THiNKing in Action
IRRATIONAL BELIEFS AND DEPRESSION
Albert Ellis (1913–2007), founder of rational emotive behavior therapy, maintains that irrational ideas are the primary source of depression, rage, feelings of inadequacy, and self-hatred. Some of these irrational beliefs are:
- “I must be outstandingly competent, or I am worthless.”
- “Others must treat me considerately, or they are absolutely rotten.”
- “The world should always give me happiness, or I will die.”
- “I must have perfect control over things, or I won’t be able to enjoy life.”
- “Because something once strongly affected my life, it will indefinitely affect my life.”
According to Ellis, a depressed person feels sad because he (or she) erroneously thinks he is inadequate and abandoned, even though depressed people have the capacity to perform as well as nondepressed people. The purpose of therapy is to dispute these irrational beliefs and replace them with positive rational beliefs. To achieve this, the therapist asks questions such as:
- Is there evidence for this belief?
- What is the evidence against this belief?
- What is the worst that can happen if you give up this belief?
- And what is the best that can happen?
To assist the clients in changing their irrational beliefs, the therapist also uses other techniques such as empathy training, assertiveness training, and encouraging the development of self-management strategies.
- Discuss how cognitive errors contribute to irrational beliefs. Make a list of other irrational beliefs people hold that are based on cognitive errors.
- Do you have any irrational beliefs that interfere with your achieving your life goals? If so, what are they? Discuss how you might use your critical-thinking skills to work toward overcoming these beliefs. Be specific.
See Albert Ellis, The Essence of Rational Emotive Behavior Therapy. Ph.D. dissertation, revised, May 1994.
A second self-serving bias is the tendency to overestimate ourselves in comparison to others. Most people rate themselves as above average when it comes to getting along with other people. Although it obviously can’t be true that the majority of people are above average—except, perhaps, in the fictional town of Lake Wobegon in Garrison Keillor’s Prairie Home Companion on Minnesota Public Radio—this self-serving bias can bolster our self-esteem and confidence. However, if we are unaware of the bias, it can become a problem and cause us not to take responsibility for our shortcomings. A Pew Research Center survey found that while 70 percent of Americans are overweight, and nine in ten agree that most of their fellow Americans are overweight, only 39 percent of Americans consider themselves to be overweight.29 Clearly there is a disconnect between being overweight and people’s estimation of their own weight.
Another example of the self-serving bias is that most people take personal credit for their successes and blame outside forces for their failures. College students often attribute their “A” grades to something about themselves—their intelligence, quick grasp of the material, or good study skills. In contrast, they usually attribute their poor grades to something outside their control such as having an unfair teacher or having a touch of the flu on the day of the exam.30 Similarly, when it comes to being overweight, many people blame a slow metabolism as the main reason why they can’t lose weight, rather than their lifestyle or factors under their control. However, when overweight people do lose weight, they rarely attribute their success to a peppy metabolism but instead credit their willpower and good choices.
What is cognitive dissonance and when are people most likely to engage in it? See Chapter 1, p. 27.
This type of self-serving bias can be found in the workplace. When office employees were asked in a survey “if they ever experienced backstabbing, rudeness, or incivility in the workplace,” 89 percent said “yes.” However, in the same survey 99 percent said that “they were never rude or the cause of the conflict.”31 In other words, most of us are quick to complain about others’ irritating behaviors but give little thought to how our behavior might be the cause of workplace conflict.
According to Carol Tavris and Elliot Aronson, social psychologists and authors of Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts, being made aware of the gap between our self-image and our actual behavior creates cognitive dissonance and discomfort. To minimize this discomfort and maintain our good opinion of ourselves, we instinctively minimize the discrepancy through denial or by blaming someone else for our shortcomings. This sort of rationalization can prevent us from realizing that we’re clinging to a mistaken belief.32 As critical thinkers, we need to deal constructively with the discomfort that comes from cognitive dissonance and to work toward overcoming our mistaken beliefs about ourselves.
A third related self-serving bias is our inclination to exaggerate or place a greater value on our strengths and underestimate or downplay our weaknesses. In a study of intellectually gifted boys who thought they hadn’t done well in class, the boys downplayed the importance of academics and instead emphasized the importance of other pursuits such as sports.33 Seeing ourselves as having those traits and abilities that are important in life increases our sense of worth and helps us to achieve our life goals. This tendency, however, can also contribute to overconfidence and failure to seek or acknowledge other people’s skills.
As we noted in the introduction to this chapter, physician overconfidence and jumping to conclusions have been identified as key factors in diagnostic errors. Unless we are willing, as critical thinkers, to make an honest evaluation of ourselves, it is unlikely that we are going to take steps toward overcoming our shortcomings.
According to the Patient Safety Movement Foundation medical errors are responsible for an estimated 200,000 deaths a year in the United States.
A self-fulfilling prophecy occurs when our exaggerated or distorted expectations reinforce actions that actually bring about the expected event. Expectations can have a profound influence on our behavior. Rumors of impending bank failures during the Great Depression in the early 1930s led to mass panic in which people rushed to take their money out of banks before the banks crashed. Since banks invest some of their deposits rather than keeping all the money in the vault, the frenzy caused many banks to collapse—the very thing the customers feared.
To use another example of a self-fulfilling prophecy, let’s say a literature professor has a star football player in her class. On the basis of her (mistaken) expectations about college athletes, she assumes that he is not academically inclined but is taking the course only because it is reputed to be easy. Because of this she calls on him less and doesn’t encourage him or make an effort to include him in class discussions. She justifies this behavior on her part as not wanting to embarrass him.
To preserve our expectations, we may interpret ambiguous data in ways that meet our expectations. For example, our football star may be particularly quiet and introspective during one class. The professor assumes that he is preoccupied with thinking about the upcoming game, when instead he is deep in thought about the poem that is being discussed in class. Our football player, who initially was very interested in the class and in literature and had even written several poems for his high school newspaper, soon begins to lose interest in the class and ends up getting only a mediocre grade. Thus, we have a self-fulfilling prophecy in which the professor’s distorted expectation comes true. Clearly, preserving our expectations can come at a cost to others.
Panicked citizens gather to withdraw their money from a federal bank during the Great Depression. This type of thinking also contributed to a plunge in the stock market in 2008, when people pulled their money from the stock market because of fears it would crash.
Humans are prone to several inborn cognitive and perceptual errors, including optical illusions, misperception of random data, memorable-events errors, probability errors, self-serving biases, and self-fulfilling prophecies. Because these errors are part of the way our brain interprets the world, we may fail to notice the influence they exert over our thinking. Developing our critical-thinking skills can help us be more aware of these tendencies and adjust for them when appropriate.
STOP AND ASSESS YOURSELF
- Come up with an issue—such as same-sex marriage, abortion, gun control, or legalizing marijuana—that is important to you. Discuss the extent to which cognitive errors bias the way you go about collecting and interpreting evidence regarding this issue. Discuss steps you might take to compensate for this bias.
- Think of a “lucky” charm or ritual that you use, or have used in the past, to increase your chances of doing well on something that really matters. This can include anything from wearing your “lucky shoes” during a baseball game to rubbing your mother’s ring before an important exam. Given your realization that this behavior is based on a cognitive error, why might you continue to do it? Support your answer using what you know of probability error.
- If you have ever bought a lottery ticket or know of someone who did, why did you (or the other person) buy it? When the ticket was bought, what did you, or the other person, think the probability of winning was? Discuss the extent to which this reasoning involved a probability error.
- Given that humans are prone to cognitive errors, should we use computers rather than physicians for medical diagnoses? Support your answer.
- Think of a time when you studied hard for a test but ended up with a low grade. To what did you attribute your poor performance? Now think of a time when you studied hard and did very well on a test. To what did you attribute your good performance? Discuss how self-serving biases may have contributed to the difference in your reaction in each of the two situations.
- Do you tend to overestimate the amount of control you have over your life? Give specific examples. Discuss how a distorted sense of control has had an impact on your ability to achieve your life goals. Come up with at least two critical-thinking strategies you could use to correct for this cognitive error.
- Which cognitive error are you most likely to commit? Give a specific example of your using this error. If you are willing, share your strategies for overcoming this error with the class.
SOCIAL ERRORS AND BIASES
Humans are highly social animals. Because of this trait, social norms and cultural expectations exert a strong influence on how we perceive the world—so much so that we tend to perceive the world differently in groups from the way we do in isolation. Groups can systematically distort both the gathering and the interpretation of evidence.34
As we noted in Chapter 1, ethnocentrism—the unjustified belief that our group or culture is superior to that of others—can also bias our thinking and act as a barrier to critical thinking.
Rioting in Ferguson, Missouri following the 2014 police shooting of an unarmed African-American teenager.
“One of Us/One of Them” Error
Our brains seem to be programmed to classify people as either “one of us” or “one of them.” We tend to treat people who are similar to us with respect and those who are different from us—whether in regard to race, sex, religion, political party, age, or nationality—with suspicion or worse. Although most of us claim to believe in human equality, in our culture the use of qualifiers such as gay judge, female doctor, Hispanic senator, and Down syndrome child betrays our tacit belief that any deviation from the norm needs to be specified. We rarely hear terms such as straight judge, male doctor, European American senator, or able-bodied child!
Prejudices may operate without any conscious realization on our part. In a Harvard study, subjects were asked to quickly associate positive or negative words with black or white faces. Seven out of ten white people, despite their claims that they had no racial prejudice, showed “an automatic preference for white over black.”35
It is all too easy for people to fall into the “us versus them” mind-set, especially when they feel threatened. In 2014, riots broke out in Ferguson, Missouri, following the police shooting of Michael Brown, an unarmed African-American teenager. The police responded to protests following the shooting by donning military battle gear and face masks and confronting the protesters. This further exacerbated the “us versus them” mentality and the belief that the police were out to get black people. Police departments have since reexamined their response to protests.
This error also contributes to our tendency to polarize issues into two camps. “They,” whether it be “right-wing conservatives” or “far-left liberals,” are irrational; there is no point in even arguing with them. Our group, on the other hand, holds a monopoly on Truth. There is no middle ground. During presidential elections, Americans are quick to divide the country into two opposing camps, the red states (Republicans) and the blue states (Democrats), and to classify people in their own group as “good” and “right” and those in the other group as “bad” and “mistaken.”
If we are to overcome this social error, we need to be aware of it in our thinking and build in protective measures.36 As critical thinkers, we can minimize this error by first critically evaluating the situation and then consciously reprogramming our brains to come up with new, more reasonable definitions of whom we view as “us.” We can do this by seeking a more immediate and inclusive basis for connection: we all attend the same college, we are all Americans, we are all human beings. We also need to make a conscious effort to remain open to multiple perspectives, even those we are initially convinced must be mistaken.
The Salem witch hunts, which took place in Massachusetts in the late 17th century, targeted those mistakenly believed to be responsible for society’s ills.
When you serve as a juror, how can cognitive and social errors distort your analysis of the evidence? See Chapter 13, pp. 417–418.
The late nineteenth and early twentieth centuries were times of extraordinary technological advancement, setting in motion the expectation of ever new and revolutionary technology. On December 13, 1909, six years after the Wright brothers’ epic flight, the Boston Herald reported the invention of an airship by a local businessman, Wallace Tillinghast.37 Over the next several weeks, hundreds of witnesses from the New England–New York area, including police officers, judges, and businesspeople, reported sightings of the airship sailing through the skies.38 The reported sightings led to a massive search for the airship by reporters. The search was finally called off when it was revealed that the story was a hoax perpetrated by Tillinghast.
Social expectations can be so powerful that they may lead to collective delusions. Sometimes these social errors may even become institutionalized.39 Acting on social expectations without subjecting them to critical analysis can have dire consequences. The Salem witch trials in colonial Massachusetts, in which over 200 people, predominantly young women, were accused of witchcraft, were rooted in the social expectations of the seventeenth century. Those of us living in the twenty-first century may regard the witch-hunters as crazed fanatics. However, they were simply behaving in a manner that was consistent with the prevailing worldview and social expectations of their time, in which certain unfortunate circumstances, such as crop failures, disease, and untimely deaths, were interpreted as being brought about by the Devil and his worldly agents—witches.
The social expectations of the police who interrogated Peter Reilly, a teenager who was accused in 1973 of killing his mother, also played a role in their use of leading questions to get a “confession” out of him. Reilly’s mother had been an emotionally abusive woman. In our society we expect victims of parental abuse to be violent and vengeful, even though studies suggest that it is children who witness domestic violence, rather than those who are direct victims of it, who are at highest risk, since they come to accept violence as normal.40 In addition, it is often a family member who commits this type of violent murder. Therefore, the police jumped to the conclusion, on the basis of their expectations, that Reilly must have committed the murder.
Stereotyping is another type of social bias based on socially generated group labels. In the study mentioned in Chapter 1, page 28, researchers showed students a picture of a black man on a subway standing next to a white man who was holding an open razor. When the students were later asked to recall what they had seen, half of them reported that the black man was holding the razor.
Group Pressure and Conformity
Group pressure can influence individual members to take positions that they would never support by themselves, as happened in the Stanford prison experiment described in Chapter 1. Some religious cults exploit this tendency by separating their members from the dissenting views of family and friends. In many cults, people live together, eat together, and may even be assigned a buddy.
Group pressure is so powerful in shaping how we see the world that it can lead people to deny contrary evidence that is right before their eyes. In the 1950s, social psychologist Solomon Asch carried out a series of experiments in which he showed study subjects a screen containing a standard line on the left and three comparison lines on the right (see “Analyzing Images: Asch Experiment”). One of the comparison lines was the same length as the standard line, and the other two were of significantly different lengths.41 In each case, an unsuspecting study subject was introduced into a group with six confederates, who had been told by the experimenter to give the wrong answer. The group was then shown the lines. The experimenter asked one of the confederates which of the three lines on the right they thought was the same length as the standard line. The confederate, without hesitation, gave a wrong answer. The next few confederates gave the same answer. By now, the naïve subject was showing puzzlement and even dismay. How can six people be wrong?
In Asch’s experiment the six confederate subjects all gave the same wrong answer when asked which line in Exhibit 2 matched the line in Exhibit 1. After hearing their answers, 75 percent of the naïve subjects also gave the same wrong answer.
- What do you think the naïve subject in the experiment was thinking?
- Think back to a time when you were in a similar situation where you thought you were correct, but everyone else with you thought something else. How did you respond to the discrepancy between your belief and theirs?
After hearing six “wrong” answers, 75 percent of the naïve study subjects, rather than trust the evidence of their senses, succumbed to group pressure and gave the same wrong answer. Even more surprising is the fact that when questioned afterward, some of these study subjects had actually come to believe the wrong answer was correct.
The desire for agreement is normal. However, this desire, when combined with our innate tendency to divide the world into “one of us” and “one of them,” can lead to the exclusion of those who disagree with the majority, since people tend to prefer being around people who agree with them. In the corporate world, disagreement is often tacitly discouraged. “Outliers” or nonconformists who do not agree with group members may be excluded by committee chairs from further discussions or even fired.42
Because of our inborn tendency to conform to what others think, we cannot assume that agreement leads to truth without knowing the manner and conditions under which that agreement was reached. Indeed, the current emphasis on seeking group consensus in decision making may be unreliable. In consensus seeking, the majority in a group is often able to sway the whole group to its view.
As with other errors in our thinking, we need to develop strategies to recognize and compensate for our human inclination to conform to groupthink, the tendency of members of a group to yield to the consensus of the group. When a group comes to a decision, we need to mentally step back from the group and carefully evaluate the evidence for a particular position rather than assume that the majority must be correct. In competitive ice skating and diving, because of the danger of a judge’s scoring being contaminated by what other judges say, scoring is done individually rather than as a group decision.
SOCIAL ERRORS AND BIASES
“One of us/one of them” errors: Our brain seems programmed to classify people as either “one of us” or “one of them.” We tend to treat people who are similar to us with respect and those who are different from us with suspicion.
Social expectations: The influence of social expectations is so powerful that it can lead to collective delusions in which people attempt to fit evidence into their cultural worldview.
Group pressure and conformity: Group pressure can influence individual members to behave in ways or take positions that they would never do by themselves.
Diffusion of responsibility: A phenomenon that occurs in groups of people above a critical size, in which, if responsibility is not explicitly assigned to us, we tend to regard a problem as not ours but as belonging to someone else.
► APPLICATION: Identify an example in the text of each of the social errors and biases.
Diffusion of Responsibility
Diffusion of responsibility is a social phenomenon that occurs in groups of people above a critical size. If responsibility is not explicitly assigned to us, we tend to regard it as not our problem but as belonging to someone else. We are much more likely to come to someone’s aid if we are alone than if we are in a crowd.
The phenomenon of “diffusion of responsibility” was regrettably illustrated when no one came to the aid of a seriously injured man lying in a busy street in Hartford, Connecticut, after being struck by a hit-and-run driver in May 2008. The victim, Angel Torres, later died from the injuries he sustained.
This phenomenon is also known as bystander apathy or the Kitty Genovese syndrome. In 1964, 28-year-old Kitty Genovese was murdered outside her New York City apartment building. In the half hour that elapsed during the attack, none of Genovese’s many neighbors, who had heard her repeated cries for help, called the police. More recently, in May 2008, an elderly man was struck by a hit-and-run driver on a busy street in Hartford, Connecticut. The man lay in the street paralyzed and bleeding from his head while bystanders gawked at or ignored him. Motorists drove around his body without stopping. No one offered any assistance until an ambulance finally turned up. Diffusion of responsibility can also occur in group hazing at fraternities, where no one comes to the rescue of a pledge who is clearly in distress.
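The group-size effect described above can be sketched as a toy probability model. This is illustrative only: the numbers and the “each bystander feels 1/n of the responsibility” rule are hypothetical assumptions for the sake of the sketch, not figures from the studies cited in this chapter.

```python
# Toy model of diffusion of responsibility. Illustrative only: the
# parameters and the 1/n "felt responsibility" rule are hypothetical
# assumptions, not data from the cases described in the text.

def prob_anyone_helps(p_solo: float, n: int) -> float:
    """Chance that at least one of n bystanders intervenes, assuming
    each feels only 1/n of the responsibility and so helps with
    probability p_solo / n, independently of the others."""
    p_each = p_solo / n
    return 1 - (1 - p_each) ** n

# A lone bystander disposed to help 70% of the time:
print(round(prob_anyone_helps(0.7, 1), 3))   # prints 0.7
# The same disposition diffused across a crowd of 20 bystanders:
print(round(prob_anyone_helps(0.7, 20), 3))  # roughly 0.51
```

The point of the sketch is not the particular numbers but the direction of the effect: under the diffused-responsibility assumption, adding bystanders lowers the chance that anyone intervenes rather than raising it.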
We are much more likely to come to someone’s aid if we are alone than if we are in a crowd.
As social beings, we are vulnerable to the “one of us/one of them” error, social expectations, and group conformity. When in groups, we also tend to regard something as not our problem unless responsibility is assigned to us. Although these traits may promote group cohesiveness, they can interfere with effective critical thinking. As good critical thinkers we need to be aware of these tendencies, and to cultivate the ability to think independently while still taking into consideration others’ perspectives. Errors in our thinking also make us more vulnerable to falling for or using fallacies in arguments. We’ll be studying some of these fallacies in the following chapter.
STOP AND ASSESS YOURSELF
- Whom do you define as “us” and whom do you put in the category of “them”? Discuss how you might go about widening the “us” category to include more people who are now in your “them” category.
- Humans seem to have inborn biases toward particular types of people. According to a University of Florida study, when it comes to hiring, employers have a more favorable view of tall people. When it comes to earnings, every extra inch of height above the norm is worth almost $1,000 a year.43 In fact, nine of ten top executives are taller than the typical employee. Given this cognitive error and its impact on hiring practices, discuss whether or not affirmative action policies should apply to very short people. Relate your answer to the discussion in the text of the effect of this cognitive error on our thinking.
- Think of a time when your social expectations led you to misjudge a person or a situation. Discuss strategies for improving your critical-thinking skills so that this is less likely to happen.
- Think of a time when the public got caught up in a “witch hunt.” Identify the worldviews and social expectations that supported this “witch hunt.” Which critical-thinking skills would make you less likely to go along with a “witch hunt”? Discuss what actions you could take to develop or strengthen these skills.
- Polls before elections can influence how people vote by swaying undecided voters to vote for the candidate who is in the lead. Analyze whether election polls should be forbidden prior to the election itself.
- The democratic process depends on social consensus. Given people’s tendency to conform to social expectations and what others think, is democracy the best form of government? If so, what policies might be put in place to lessen the effect of social biases? Be specific.
- Think of a time when you failed to speak out against an injustice or failed to come to someone’s aid simply because you were in a large group and felt it wasn’t your responsibility. Discuss ways in which improving your critical-thinking skills may make you less susceptible to the diffusion of social responsibility error.
- Computers (AI) programmed with an inductive logic program can, after sufficient experience working with the ups and downs of the financial market, predict the market with greater accuracy than most experienced financial planners. Given that these computers are not as prone to cognitive errors as are humans, critically evaluate whether we should rely more on AI to make decisions about such issues as college admissions, medical diagnoses, matchmaking, and piloting an airplane.
- What are some of the sources of knowledge?
- Sources of knowledge include both reason and experience. Experience encompasses direct and indirect experience, expert testimony, and research resources such as printed material and the Internet.
- In what ways might experience be misleading?
- Experience can be distorted through false memories, confirmation bias, and reliance on hearsay and anecdotal evidence, as well as perceptual, cognitive, and social errors in our thinking.
- What are some of the types of cognitive and social errors in our thinking?
- Cognitive and social errors are in part the way our brain interprets the world. They include misperception of random data, memorable-events errors, probability errors, self-serving biases, self-fulfilling prophecies, one of us/one of them error, social expectations, group pressure and conformity, and diffusion of responsibility.
Critical THiNKing Issue
Perspectives on Evaluating Evidence for the Existence of Unidentified Flying Objects (UFOs)
Sightings of unexplained phenomena in the sky have been reported since ancient times. However, it was not until the late 1940s, following the famous “flying saucer crash” incident in Roswell, New Mexico, that UFO reports began to proliferate. There is little doubt that sensationalist media coverage stimulated reports of more UFO sightings, just as the 1909 story in the Boston Herald of the invention of an airship was followed by hundreds of sightings of the bogus ship.
In 1948, the U.S. Air Force began to keep a file of UFO sightings as part of Project Blue Book. By 1969, the project had recorded 12,618 UFO sightings. Ninety percent of these sightings were identified with astronomical and weather phenomena, aircraft, balloons, searchlights, hot gases, and other natural events. Ten percent remain unexplained. In 1968, the U.S. Air Force commissioned a study under the direction of University of Colorado professor Edward Condon.44 The study concluded that there was no evidence for UFOs and that scientific study of the phenomenon should be discontinued. As a result of the study, Project Blue Book was suspended.
Despite official consensus that UFOs do not exist, 56 percent of Americans believe that UFOs exist.45 In addition, a National Geographic Survey reports that 10 percent claim to have actually seen a UFO. The survey also found that 79 percent of Americans think that the government is hiding information from them about the existence of UFOs and alien life forms.
Following are readings from the U.S. Air Force’s Project Blue Book and an article by Royston Paynter. Many if not most scientists believe that UFOs do not exist. These scientists argue that there are natural explanations for UFO phenomena, including meteorites, balloons, hallucinations, and perceptual and social errors in our thinking. While the Project Blue Book report is the more dismissive of UFOs, both readings leave open the possibility that UFOs may be real.
Project Blue Book: Analysis of Reports of Unidentified Aerial Objects
UNITED STATES AIR FORCE
Project Blue Book summarizes a series of studies of unidentified flying objects (UFOs) conducted by the U.S. Air Force beginning in 1952. The following selection is from the summary and conclusion of the report. To read the entire report, go to http://www.ufocasebook.com/pdf/specialreport14.pdf.
It is not possible to derive a verified model of a “flying saucer” from the data that have been gathered to date. This point is important enough to emphasize. Out of about 4,000 people who said they saw a “flying saucer,” sufficiently detailed descriptions were given in only 12 cases. Having culled the cream of the crop, it is still impossible to develop a picture of what a “flying saucer” is. . . .
On the basis of this evidence, therefore, there is a low probability that any of the UNKNOWNS represent observations of a class of “flying saucers.” It may be that some reports represent observations of not one but several classes of objects that might have been “flying saucers”; however, the lack of evidence to confirm even one class would seem to make this possibility remote. It is pointed out that some of the cases of KNOWNS, before identification, appeared fully as bizarre as any of the 12 cases of good UNKNOWNS, and, in fact, would have been placed in the class of good UNKNOWNS had it not been possible to establish their identity.
This is, of course, contrary to the bulk of the publicity that has been given to this problem. . . . It is unfortunate that practically all of the articles, books, and news stories dealing with the phenomenon of the “flying saucer” were written by men . . . had read only a few selected reports. This is accentuated by the fact that, as a rule, only the more lurid-sounding reports are cited in these publications. Were it not for this common psychological tendency to be captivated by the mysterious, it is possible that no problem of this nature would exist.
The reaction, mentioned above, that after reading a few reports, the reader is convinced that “flying saucers” are real and are some form of sinister contrivance, is very misleading. As more and more of the reports are read, the feeling that “saucers” are real fades, and is replaced by a feeling of skepticism regarding their existence. The reader eventually reaches a point of saturation, after which the reports contain no new information at all and are no longer of any interest. This feeling of surfeit was universal among the personnel who worked on this project, and continually necessitated a conscious effort on their part to remain objective.
It can never be absolutely proven that “flying saucers” do not exist. This would be true if the data obtained were to include complete scientific measurements of the attributes of each sighting, as well as complete and detailed descriptions of the objects sighted. It might be possible to demonstrate the existence of “flying saucers” with data of this type, IF they were to exist.
Although the reports considered in this study usually did not contain scientific measurements of the attributes of each sighting, it was possible to establish certain valid conclusions by the application of statistical methods in the treatment of the data. Scientifically evaluated and arranged, the data as such did not show any marked patterns or trends. The inaccuracies inherent in this type of data, in addition to the incompleteness of a large proportion of the reports, may have obscured any patterns or trends that otherwise would have been evident. This absence of indicative relationships necessitated an exhaustive study of selected facets of the data in order to draw any valid conclusions.
A critical examination of the distributions of the important characteristics of sightings, plus an intensive study of the sightings evaluated as UNKNOWN, led to the conclusion that a combination of factors, principally the reported maneuvers of the objects and the unavailability of supplemental data such as aircraft flight plans or balloon-launching records, resulted in the failure to identify as KNOWNS most of the reports of objects classified as UNKNOWNS.
An intensive study, aimed at finding a verified example of a “flying saucer” or at deriving a verified model or models of “flying saucers” (defined on Page 1 as “any aerial phenomenon or sighting that remains unexplained to the viewer”), led to the conclusion that neither goal could be attained using the present data.
It is emphasized that there was a complete lack of any valid evidence consisting of physical matter in any case of a reported unidentified aerial object. Thus, the probability that any of the UNKNOWNS considered in this study are “flying saucers” is concluded to be extremely small, since the most complete and reliable reports from the present data, when isolated and studied, conclusively failed to reveal even a rough model, and since the data as a whole failed to reveal any marked patterns or trends. Therefore, on the basis of this evaluation of the information, it is considered to be highly improbable that any of the reports of unidentified aerial objects examined in this study represent observations of technological developments outside the range of present-day scientific knowledge.
- How does Project Blue Book distinguish between KNOWNS and UNKNOWNS in assessing reports of UFO sightings?
- How do the authors account for the fact that so many people believe in UFOs?
- What conclusion do the authors of Project Blue Book draw regarding the existence of UFOs and why?
Physical Evidence and Unidentified Flying Objects
Royston Paynter has a Ph.D. in materials science from the University of Surrey in the United Kingdom and is currently a professor at the Institut National de la Recherche Scientifique in Quebec, Canada. In this article, Dr. Paynter writes that claims about the existence of UFOs and alien abductions should be conducted “according to the highest standards of scientific inquiry.”46 Without any physical evidence, he argues, we should remain skeptical about these claims.
Skeptics are sometimes criticized for demanding physical evidence of alien visitations. It is an unreasonable demand, believers say, because aliens are intelligent and cunning, and one cannot expect them to leave physical evidence of their presence on Earth.
Well, such an argument may make sense to somebody who is prepared to believe in alien visitations as an act of faith, in the same way that some people believe in angels. But the undeniable fact of the matter is that there is no probative physical evidence that compels us to conclude that aliens are visiting the Earth.
There simply is no alien space ship on display in a museum somewhere; in fact, there is no object in existence on Earth of which we can say “this must have been made by aliens.” Of course it is possible to believe in alien visitations nonetheless, as an act of faith, but the great majority of scientists do not believe it, because it has not been proven in a rigorous scientific manner.
Those believers that reject the more extreme claims of popular UFOlogy, such as cattle mutilations, crop circles and even perhaps alien abductions, tend to fall back upon government and military reports obtained under the Freedom of Information Act. A well-known example is the US Air Force’s own Project Sign “Estimate of the Situation,” issued in 1948, that concluded that flying saucers were real and that they came from outer space.
To what extent is such a report authoritative? A scientifically trained individual looking at such a statement would ask “is this conclusion justified by the data presented?” That is to say, is such a conclusion forced upon us as the most economical way to explain that data, or is it the result of sloppy analysis and/or wishful thinking? In the case of the Project Sign “estimate,” General Hoyt S. Vandenberg did not believe that the report’s evidence was sufficient to support its conclusions, and he rejected it.
For those among us that are not prepared to believe in alien visitations simply as an act of faith, physical evidence is the key to everything. We will believe, if some artifact can be found on Earth that is demonstrably alien. Let us note here that “unidentified” and “demonstrably alien” are not synonymous. Just because a given UFO sighting cannot be explained it does not follow that it has been proved to be an alien space ship.
Short of a flying saucer landing on the White House lawn, where lie the best chances to obtain a demonstrably alien artifact? If we are to believe the stories told (or “remembered” under hypnosis) by those claiming to have been abducted by aliens, it seems that we should direct our attention first to those “alien implants” recovered from these people.
The stakes here are extremely high. If these “implants” can be shown to have been manufactured by aliens, then people really are being abducted by aliens. If, on the other hand, it cannot be shown that the “implants” are alien, then we must ask serious questions of the “researchers” who have elicited the testimony from the “abductees.”
With the stakes so high, it is essential, in our opinion, that these analyses be conducted in accordance with the highest standards of scientific inquiry. Most importantly, we must demand that the UFOlogists prove what they claim. They are claiming that the “implants” have an alien origin. It is therefore not enough to show that they are “100% pure” or that they have an “unusual composition” or that they contain chemical elements also found in radio transmitters. They have to show that aliens made them.
One simple test would be enough to prove such a claim to the satisfaction of most scientists—an isotopic analysis of the material from which the implant is composed. We can reasonably expect that a device made by aliens from materials obtained in another solar system will exhibit isotope ratios different than those found on Earth. Such a test goes straight to the heart of the claim being made for the “implant” and would avoid all the obfuscation and hyperbole about “100% purity” and the like.
We urge the UFOlogical community to adopt properly scientific standards of investigation and proof in their work. They have to support their conclusions with probative evidence and rigorous reasoning and to confront the skeptics with the evidence they so dearly seek—a demonstrably alien artifact.
Source: Royston Paynter, “Physical Evidence and UFOs,” 1996. Reprinted with the permission of Royston Paynter
- Why do some believers maintain that the demand for physical evidence of alien visitations is unreasonable? How does Paynter respond to their objection?
- What type of evidence does a scientist such as Paynter argue is necessary to establish the claim that UFOs exist?
- What type of evidence does Paynter argue is necessary to prove the claim that people have been abducted by aliens?
THiNK AND DISCUSS
PERSPECTIVES ON THE EXISTENCE OF UNIDENTIFIED FLYING OBJECTS
- What conclusion do both readings draw regarding the existence of UFOs? Compare and contrast the arguments used by the authors of Project Blue Book and by Paynter to support their conclusion(s). Evaluate the evidence each uses. Which reading presents the best argument? Explain.
- Discuss the role of cognitive and perceptual errors, as well as social errors, in the debate over the existence of UFOs. Be specific.
- Both the authors of Project Blue Book and Paynter concede that neither the lack of actual physical evidence of UFOs nor the ability to explain UFO “sightings” as sightings of familiar objects is sufficient to prove that UFOs do not exist (see fallacy of Appeal to Ignorance on page 148). Discuss what proof or evidence, if any, would be sufficient to convince a logical person that UFOs existed.
- Do you believe in the existence of UFOs? Write down the evidence and premises you use to support your conclusion. Working in small groups, critically analyze each other’s arguments.