Readings - Jackson and Jamieson


 * **SUMMARY: Jackson and Jamieson, Chapter 4: "UFO Cults and Us: Why We Get Spun," in //UnSpun: Finding Facts in a World of Disinformation//, pp. 65-81**

In //UFO Cults and Us: Why We Get Spun//, Jackson and Jamieson argue that people are not wired to think rationally, a claim supported by brain scans and many experiments. Even when presented with hard evidence, humans act against the facts and rely on their emotions, biases, and faulty assumptions. Psychologists are fascinated by this phenomenon and continue to study people's irrational reactions to concrete proof. The authors agree that the human mind operates in ways that defy logic, and they describe specific "traps" the mind uses to push people into thinking and acting senselessly. The chapter suggests that knowing and understanding the tricks the human psyche uses to drive people into absurdity will allow them to control their irrationality.

Jackson and Jamieson begin with the first mind trap, which they call "pictures in our heads": people easily accept misinformation that reinforces their existing beliefs. To illustrate, in his 2004 campaign against John Kerry, the Republican George W. Bush falsely accused his Democratic opponent, eleven times, of proposing an extreme gasoline tax increase while fuel prices were soaring. Many Republican voters believed this absurdity without giving Bush's statement any thought. Had they done so, they would have learned that Kerry voted for a single 4.3 percent gasoline tax increase and only toyed with the idea of a 50-cent-a-gallon fuel tax, an idea he rejected shortly afterward. Democratic candidates have exploited this psychological trap as well: Kerry falsely accused Bush of planning to cut Social Security benefits by 30 to 45 percent, and only 37 percent of people recognized the statement as false. Consequently, supporters of the Democrats accepted the claim. The reason is simple: people generally think of Social Security as a Democratic program and think of Republicans as opposing it. The stereotyped belief that Republicans are out to destroy Social Security is why people agreed with Kerry's false claim without giving it any thought. Jackson and Jamieson encourage people to think critically about such claims in order to avoid being manipulated.

Another mind trick, closely related to the one above, is what the authors call the "root for my side" trap. Psychologists provide evidence that people's beliefs not only shape the way they think but also affect how they see the world around them. Jackson and Jamieson cite a 1954 study by Albert Hastorf and Hadley Cantril of how "Princeton and Dartmouth football fans saw a penalty-ridden game in which the Princeton quarterback was taken off the field with a broken nose and a mild concussion, and a Dartmouth player later suffered a broken leg" (73-74). Princeton fans thought Dartmouth's players started the fight, while Dartmouth fans believed the opposite. Beyond this study, brain scans support the idea that people see things differently depending on whom they support. A related effect is the hostile media phenomenon: a person who strongly supports a specific side will find that side to be right in virtually all situations. Jackson and Jamieson give many more examples on the matter but conclude, once again, with advice to apply the scientific method to political claims, marketing messages, and the like.

A third mind trick that people fall into is the infamous "I know I'm right" trap: the more misinformed people are, the more strongly they insist that they are correct. An interesting example Jackson and Jamieson discuss is a 2000 study in which James H. Kuklinski and his colleagues found that 1,160 Illinois residents knew very little about the welfare system. The fewer-than-half of respondents who correctly said that about 7 percent of Americans were receiving welfare were merely confident in their answers; the people who badly overestimated the figure, however, were "very" or "highly" confident. This means that those who are most confident in their beliefs are the ones who most need to revise their ideas, because the odds are that they are incorrect.

Lastly, Jackson and Jamieson discuss the "close call" trap: people who face tough decisions and close calls exaggerate the differences between their options. Jack Brehm demonstrated this in a famous experiment published in 1956. Two groups of women were asked to rate eight products, such as toasters and coffeemakers, and the researcher then allowed each woman to keep one. One group's decisions were close calls, while the other group was given easy choices. The results showed that the women asked to make a tough choice between products became more positive about the product they chose and less positive about the product they rejected, even though the two items were practically identical. "Psychologists call this the 'spreading of alternatives' effect, a natural human tendency to make ourselves feel better about the choices we have made, even at the expense of accuracy or consistency" (79). The reason is that people function on autopilot, taking mental shortcuts instead of thinking things through. The women in the close-call group never questioned the fact that the products were basically the same and that it did not matter which one they chose. Jackson and Jamieson give further examples of these mental shortcuts, including the Xerox machine and campus bake-sale experiments, in which people accepted bogus excuses such as "May I use the Xerox machine, because I need to make copies?" or "Would you like to buy a good cookie? It's for a good cause." Because most people run on autopilot in many everyday situations, no one in the experiments assessed the validity of the excuses, and more than 90 percent of people either let a person cut in line for the Xerox machine or bought a cookie after hearing the excuse. This supports the authors' point that people do, indeed, fall into this psychological trap.

Jackson and Jamieson write that "it's better to know psychology, to know that [the human] brain 'lights up' to reinforce…existing beliefs when [people] hear [their] favorite candidates or positions challenged. To avoid being deceived [people] have to make sure the pictures in [their] heads come as close to reflecting the world outside as they reasonably can" (81). In other words, people must ask questions such as "Are there facts that I do not know? What information is missing from this claim?" The authors warn readers that people who fail to ask such questions risk becoming like Mrs. Keech's UFO cultists, preaching that guardians from the planet Clarion are coming down to Earth to save the believers and destroy the infidels.


 * **Discussion**

In the fourth chapter of Jackson and Jamieson's book //UnSpun//, the authors describe the UFO cult whose members couldn't stop believing in their leader even after her prediction "from the aliens" had been proven false. They continued to follow her (all except one) and even began converting new members to their cult. The authors ask: why? Why didn't the members see the obvious evidence that their belief system was flawed? My theory, along with their embarrassment at being proven wrong, is that it was simply easier to keep believing. The book cites "what Festinger calls cognitive dissonance" (67), but I think another psychologist could also have been considered. The child psychologist Piaget developed a theory explaining how new information is integrated into one's mind. Taking in new information that forces you to alter your current ideas or beliefs puts you into a state of "disequilibrium," which isn't a comfortable state to be in. By avoiding changing their belief systems, the cultists kept themselves in a state of equilibrium. So ultimately they were just being lazy. The same goes for several other examples given in the text, such as the truthout incident. -Sarah Phillips

-The first mind trap they talk about, the "pictures in our heads" trap, seems to me to describe the way we learn news in the first place. When we look at news, I believe we link new information to perceptions we already hold about a certain issue. When reading something we feel we have background information on, whether true or false, we instantly make stronger connections with those facts because they make sense to us and give us the feeling that we've learned something.

-Jackson and Jamieson emphasize many different "traps" that humans tend to fall into when absorbing information, and they present facts explaining how our emotions can trick us into believing what isn't necessarily true. I found the "root for my side" trap the most intriguing section. It was interesting to see psychologists' actual proof, through brain scans, that humans will literally let their emotions control what they believe to be true. I definitely agree with that idea, and also with the authors' statement that "material that counters our biases stands out in our minds and makes us look for a reason to reject it." If our beliefs and values are rooted strongly enough in an idea, or even tied to a specific person (as in politics), bias will almost always come into play, directing our focus toward the good of our "side" and the bad of the other, regardless of what facts are stacked against us. Emotions can be a serious factor in holding us back from seeing the truth.

The article was very factual from a psychologist's standpoint. I am enrolled in a psychology class, and we have learned about these phenomena occurring in the human brain. When you think you know something, you will stand behind it a hundred percent, no matter what the situation may be. The other so-called "trick," about how people follow what everyone else is doing, has been studied countless times. Among a group of people who are all doing the same thing or following a certain way of acting, no one will step out of their comfort zone and stand up for themselves, even if they know the behavior goes against their own beliefs. On the other hand, if the size of the group is sharply reduced, people become far more likely to step outside the norm.


 * **SUMMARY: Chapter 5: "The Great Crow Fallacy: Finding the Best Evidence," in //UnSpun: Finding Facts in a World of Disinformation//, pp. 103-125**

In //The Great Crow Fallacy: Finding the Best Evidence//, Jackson and Jamieson discuss five criteria for identifying the best evidence: "do not confuse anecdote with data," "do not tread in utter ignorance," "not all 'studies' are equal," "saying it does not make it so," and "extraordinary claims need extraordinary evidence." The authors use many detailed examples to show how these criteria lead to the best evidence. Major instances include the crow fallacy, John Godfrey Saxe's poem about the six blind men, the great fertilizer scare, Mitch Snyder's "meaningless" numbers, the estate-tax argument, and the case of supposedly rising abortion rates. The authors then discuss Cold-Eeze medication and how its marketing departs from the five criteria listed above. After examining what makes evidence valid, the writers finish by listing things that do not count as evidence at all. Through numerous examples and analyses, Jackson and Jamieson conclude that people must never draw hasty conclusions from limited evidence.

First, Jackson and Jamieson make it clear that "one or two interesting stories do not prove anything" (105). The authors describe in detail the story of crows appearing to use human technology to crack nuts. It was later shown that "crows merely are using the hard road surface to facilitate opening of walnuts, and their interactions with cars are incidental" (105). It is understandable that, on seeing a crow drop a walnut in front of a car, one might suspect the behavior is deliberate. This, however, is anecdotal evidence: a conclusion drawn from an insufficient amount of data. Jackson and Jamieson conclude that in order to avoid spin, one must consider all the available evidence, not just one striking instance. Although it is natural for people to trust what they see, their own experiences can mislead them.

Second, Jackson and Jamieson discuss the human tendency to overgeneralize from immediate experience. John Godfrey Saxe wrote a poem about six blind men who each touched a different part of an elephant and argued about what they thought they had touched. Each man, reasoning in ignorance from his own limited contact, confidently named a different object he believed the elephant to be. The lesson is that what people think they see does not reveal the full picture. Furthermore, Jackson and Jamieson provide an important example, the great fertilizer scare of 1964. It began when Bob Waters, a member of the San Francisco 49ers football team, was diagnosed with Lou Gehrig's disease. Soon, news reports suggested that a common lawn fertilizer, which the groundskeeper at the team's practice field recalled using, might be the cause of the illness. This frightened the public and drove the fertilizer's manufacturer toward bankruptcy. Later research, however, found that the fertilizer did not cause Lou Gehrig's disease.

Third, Jackson and Jamieson explain that "not all 'studies' are equal." A credible study is conducted by a qualified individual or group, can be repeated with results consistent with the original findings, is based on scientific methods with concrete data and no bias, and, most importantly, involves little or no guesswork. Jackson and Jamieson discuss an unscientific study of the number of homeless people in the United States conducted by Mitch Snyder, an ex-convict and former advertising man. Snyder and the members of the Community for Creative Non-Violence asked local clergymen, city officials, and other volunteers who aided the homeless in twenty-five cities for estimates of the number of people helped. An estimate, however, is not a true count, and is therefore an unreliable basis for research. Some people believed his study and supported his crusade to save the homeless. But after rigorous research showed the homeless population to be a small fraction of Snyder's figure, he lost much of his credibility (if he had any to begin with). He then declared, "These numbers are in fact meaningless. We have tried to satisfy your gnawing curiosity for a number because we are Americans with western little minds that have to quantify everything in sight, whether we can or not" (111).

Fourth, the authors argue that "saying it does not make it so." If a certain radio advertisement from 2005 were to be believed, farmers and other small-business owners were at risk of losing everything: "When you die, the IRS can bury your family in crippling tax bills. It can cost them everything" (114). In reality, a report issued in July 2005 assured the public that the large majority of estates, including those of farmers and small businesses, had enough liquid assets to pay the estate taxes they owed. The authors do not take sides on the issue of estate taxes, but they note that data from tax returns, processed by an objective body known for its reliable research methods, showed the advertisement's claim to be false.

Fifth, the authors argue that "extraordinary claims need extraordinary evidence." In other words, when people come across claims of massive proportions, confirmation is always needed. A famous example, the supposedly rising abortion rate, was faulty because the researcher projected a national trend from a nonrandom sample. He did not collect data from all of the states, and he failed to consider that the way abortions were recorded had changed. What the researcher "thought was an increase in the number of abortions really reflected an improvement in procedures for counting them" (118).

Jackson and Jamieson then discuss the case of Cold-Eeze medication, which claims to be "proven to cut colds by nearly half" due to its high dosage of zinc (118). The company relied on a single research experiment, which showed positive results. It is wrong, however, to conclude that zinc does anything on the basis of one study. The company did not look for other evidence to determine whether zinc really cures the common cold; instead, it "tread[ed] in utter ignorance," to borrow Saxe's phrase. Moreover, while the study itself seemed solid, when it was repeated the results were the opposite of the original positive findings. This implies that the study was not up to standard, because it failed one important rule of a successful experiment: others must be able to repeat the research and get approximately the same results. Obviously, saying that zinc cures the common cold did not make it so, and a claim of this scale lacked the extraordinary evidence needed to show that zinc helps fight a cold in any way.

Having covered evidence up to this point, Jackson and Jamieson now turn to what does not count as evidence. First, claims made by famous, powerful, or influential people are not necessarily correct; the authors want readers to question such figures and their credibility. Second, appeals to popularity must be avoided. Just because a hospital is "preferred two to one" does not make it good; perhaps it is the most visited simply because it is the only hospital in the region. Third, the authors warn against faulty logic. One must always remember that when two events happen in sequence, the first does not necessarily cause the second; assuming otherwise is the post hoc fallacy, which can be very dangerous to rational thinking. On a final note, Jackson and Jamieson remind the reader to always "be careful about jumping to conclusions. Always ask, 'Are these facts really connected?' And-always-keep asking, 'What's the evidence?'" (125)