Why We Can’t Trust Our Brains
By: Leila Keihani






I. Introduction

With 100 billion nerve cells present at birth, it is no wonder that the human brain is considered the “most sophisticated and complex structure in the known universe” (Schwartz & Begley, 2002). These nerve cells go on to make trillions of connections that are “wired by experience with sights, sounds, feelings, and thoughts” (Schwartz & Begley, 2002). With all these complexities in mind, scientists have yet to understand all the processes that occur in the brain. What scientists do know, however, is that the brain is not perfect: humans constantly make errors in processes such as decision making, problem solving, and reasoning. The brain can hold only so much information, and what is stored is filtered through the individual’s belief system, combined with social influences, to produce output of untrustworthy substance. Consequently, there are many reasons why we can’t trust our brains.
The purpose of this research is to investigate how the complexities of our minds trick us in everyday cognitive processes, including judgments, decision making, and overall behavior. Foundational to these mistakes are the following tendencies: we want to be right, we are simplistic, and we are often influenced by our environment. Knowingly or unknowingly, we then use various mental tools in service of those tendencies. By exploring faulty thinking and judgment, I hope to make the reader aware of these errors and encourage better decision making and behavior.

II. We Want to be Right

When we perceive reality, it is processed through our theories, so perception is never independent of the observer (Shermer, 2002). This leaves the whole concept of reality subjective and open to interpretation by the perceiver. Generally speaking, we build perceptions on personal biases rather than by seeking factual information, and underlying these errors in perception is a fundamental confidence that our beliefs and observations are correct. For instance, people tend to seek confirmatory evidence when taking in new information and solving problems. When presented with information consistent with our beliefs, we accept it uncritically; when it is inconsistent with our beliefs, we carefully analyze it and criticize it. Even when we can’t come up with the right solution, we still form hypotheses about relationships based on coincidences. As a result, our perceptions bias our decision making, judgments, and overall opinions about the environment.

III. Confirmation Bias & Pareidolia

As best described by author Michael Shermer, people do not rationally engage in decision making by weighing options on equal terms; rather, they come to conclusions based on previous experiences, biases, social influence, and a list of other factors (Shermer, 2002). People rationalize failure and have a difficult time accepting it. One of the best explanations for this phenomenon is the confirmation bias: people have a tendency to look for and attend to confirming information while ignoring information that disconfirms their beliefs. The confirmation bias helps explain some of the mistakes people make in cognitive thinking. For example, people have a tendency to see something in seemingly nothing, a phenomenon known as pareidolia. In 1978, a woman named Maria Rubio famously spotted Jesus’ face on a tortilla, attracting thousands of curious onlookers who took the formation as a divine sign. Her claim made the newspapers and stirred controversy, bringing pareidolia into the public eye. Nevertheless, this only marked the beginning of people projecting ambiguous images onto everyday experiences. Referring back to the confirmation bias, it is likely that many of these ambiguous images were shaped by preexisting beliefs. It is thus highly likely that Rubio’s belief in Jesus as a real, existent entity contributed to her perception that she had received a divine sign: she saw what she wanted to see.

IV. Peer Judgments

A. Halo Effects

Consequently, our expectations and desires highly influence our perceptions. They also affect how we judge others (Kida, 2006). As far as expectancies go, people tend to project their own biases onto certain events and people. For instance, if a person is described as “warm”, peers tend to rate the person as more “considerate of others, more informal, more sociable, more popular, better natured, more humorous and more humane” (Kelley, 1950) than if he or she were described as “cold”. Thus, people have a tendency to form prejudgments of others consistent with what they already believe. This is known as the halo effect. Thorndike (1920) was one of the first to explore this matter. In a comprehensive review of the halo effect, Thorndike illustrated how ratings of personal and physical qualities can contaminate the perception of personality traits. Personal qualities included dependability, loyalty, and general value to the service, while physical qualities included physique, bearing, and neatness. He found a significant correlation between physique and ratings of intelligence, leadership, and character, suggesting that perceptions of personal and physical qualities are not made independently of one another. Through the halo effect, Thorndike convinces us “that even a very capable foreman, employer, teacher, or department head is unable to treat an individual as a compound of separate qualities and to assign a magnitude to each of these in independence of the others” (Thorndike, 1920). More recently, Srivastava et al. (2010) examined the origins and causes of this occurrence. In particular, they trace the origins of the halo effect to stereotyping that “guides initial impressions of others” (Srivastava et al., 2010). There is, consequently, a consistency in preexisting perceptions among individuals, shedding light on the human tendency to stereotype. This consistency in individual perceptions, however, does not last as interactions increase over time.
For instance, one may hold consistent beliefs about others across groups, yet as one begins to interact with those target individuals, those perceptions merge into group-specific impressions. Differences in social cognitive processes, as well as in social experience, can each create biases about others. Thus, the halo effect is not “fully formed” (Srivastava et al., 2010), and fortunately, target individuals are not doomed by initial perceptions. If we feel uneasy about being prejudged, we can take comfort in the fact that as people get to know us better, their perceptions of us can change.

B. Mere Exposure Effects

Even better, what if you were told that you can come to like something simply through the number of times you are exposed to it? The mere exposure effect demonstrates just this. More specifically, it states that through exposure we develop a liking for people or things that are familiar. “Mere” in this sense means simply making the stimulus accessible to the perceiver (Zajonc, 1968). As described earlier, halo effects were consistent across individuals until exposure to the target individual increased; however, attitudes toward that individual did not necessarily improve, they simply changed. In the mere exposure effect, by contrast, liking or preference for an object or stimulus grows with increased exposure. Psychologist Robert Zajonc (1968) was the first to demonstrate how mere exposure to a stimulus enhances attitudes toward it. Zajonc showed subjects a list of nonsense syllables at varying frequencies and then measured the effect of exposure on personal preference. After the exposure phase, subjects rated each syllable on a 7-point good–bad scale, judging whether it seemed to mean something good or bad and to what extent. The ratings for high-frequency syllables were significantly more favorable than those for low-frequency syllables; in other words, individuals judged familiar syllables to be better than unfamiliar ones. Additionally, D. W. Rajecki (1974) explored the mere exposure effect in nonhumans prenatally. In one of many examples, chicken eggs were exposed to two different tones at different frequencies, and the hatchlings subsequently showed reduced stress calling in response to the more frequently presented tone (Rajecki, 1974).
Nevertheless, this effect is so influential that even when subjects were unaware they had been exposed to a stimulus, they still showed a preference for it after repeated exposure (Zajonc, 2001). Thus, people do not even need to consciously experience a stimulus in order to form a liking for it.

V. Underestimating Coincidence and Role of Chance

An alternative way of confirming one’s own beliefs is seeking order in the environment. In this way, we attempt to eliminate the chaos that surrounds us and transform it into something that fits more comfortably in our minds. People want the world to be systematic and ordered even when it is random and chaotic, so individuals create systems that are easier to process, particularly when information appears too complex. As a result, people are confirmation-seeking beings who like to see order in the environment, particularly when events happen by coincidence. A coincidence occurs when two events coincide “without apparent design” (Shermer, 2002). However, people tend to underestimate the power of coincidence and to infer causation from correlation. As a result, people come to believe that a higher-order, mystical force is at work. One of the mistakes here lies in the perspective taken on probability. When someone believes a certain event was destined to happen because of its low odds, the individual is viewing it from a personal perspective rather than from the perspective of the whole. For instance, if a man travels to a foreign country and runs into someone he knows, he is most likely thinking: what are the odds of that? However, the odds are quite good when viewed as a whole. The probability of seeing someone he knows on that particular trip seems remote, yet the probability of seeing someone he knows in some foreign country at some point in his lifetime is far higher. Thus, in order to understand the likelihood of events, it is best to look at the grand scale of things rather than at one’s own individual perspective.
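The gap between the single-trip perspective and the “grand scale” perspective can be made concrete with a short calculation. The per-trip probability and the number of trips below are hypothetical numbers chosen for illustration, not figures from any study:

```python
# Sketch of the traveler example; p and n are made-up illustrative values.
def at_least_once(p, n):
    """Probability that an event with per-trial probability p
    occurs at least once in n independent trials."""
    return 1 - (1 - p) ** n

# From the traveler's own perspective, one trip looks like a miracle:
single = at_least_once(0.005, 1)      # 0.5% chance on any single trip
# Viewed "as a whole" -- many opportunities over a lifetime -- it is expected:
lifetime = at_least_once(0.005, 300)  # roughly 78% over 300 opportunities

print(f"one trip: {single:.3f}, over a lifetime: {lifetime:.3f}")
```

Even a rare per-event probability becomes likely once enough opportunities accumulate, which is exactly why striking coincidences are not evidence of design.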

VI. Superstitions

A consequence of coincidence, then, is that it can develop into superstition, illuminating our perception of higher order. If one event follows another in a generally unlikely association, individuals create a contingency pattern, leading to what is called a superstition. Coincidence is thus the framework of superstition, because superstitious beliefs arise from unlikely situations. British philosopher Bertrand Russell (1950) once wrote, “Fear is the main source of superstition and one of the main sources of cruelty. To conquer fear is the beginning of wisdom.” Superstitions generally arise in times of fear and uncertainty, which is why they are so common in sports. There is a high level of uncertainty in sports, particularly in areas the players cannot control. Players want to feel that they can steer the outcome in their favor, and superstition gives them the faulty belief that they can somehow control the outcome entirely. Superstitions can take bizarre and nonsensical forms, such as the rituals carried out by Wade Boggs, a member of the 1996 World Series champion team. Known as the “chicken man”, Boggs ate chicken before every game, walked onto the field at precisely 5:17 pm, and started sprints at 7:17 pm. In the batter’s box, Boggs drew “chai”, a Hebrew word meaning “life”, before each at-bat. Year after year, Boggs continued these rituals throughout his career in the belief that they contributed to his success. Russell’s remedy for fear is wisdom, but the term wisdom itself can be misleading here. Boggs understood that his rituals were bizarre, so he would have needed to be shown that these tactics were not the source of his success. Yet since Boggs’s career improved season after season, the pattern only strengthened his commitment to his superstitions.
Like many other superstitions, these form through strengthened reinforcement, which plays a critical role in our interactions with the world. As we know, the variable-ratio schedule of reinforcement produces the strongest commitment to a behavior, and superstitions like Boggs’s are sustained by exactly this kind of intermittent payoff.

VII. "Hot Hand" & "Gambler's Fallacy"


Example of where Gambler's Fallacy is at its most frequent occurrence via Wikimedia Commons


These forms of mistaken contingency also result in what are known as the “hot hand” and the “gambler’s fallacy”. In particular, the “hot hand” occurs when someone on a winning streak believes that future events will be as positive as the recent past. This phenomenon is commonly apparent in sports. In basketball, for instance, a coach will tell his players to feed the ball to a player on a shooting streak, perceiving that he has a better chance of making the next shot. Conversely, the “gambler’s fallacy” holds that an outcome will reverse a prior winning or losing streak. For instance, when a roulette ball lands on a certain number several times in a row, gamblers erroneously believe that it is unlikely to land on that number again. Thomas Kida describes both as cases of individuals who “disregard regression to the mean” (Kida, 2006). The error is the same for both fallacies: predicting future events from recent history assumes a causal connection that is not there. Rather, outcomes tend to average out toward the mean. This follows from a common misperception of random events. When the average person thinks of randomness, they imagine outcomes alternating one after another. In reality, the same outcome often occurs several times in a row before a change occurs. When flipping a coin, you are more likely to see runs of several heads than a steady alternation of heads and tails.
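The coin-flip point is easy to check by simulation. The sketch below (an illustration, not data from any cited study) counts the longest run of identical outcomes in a sequence of fair flips; such runs are typically much longer than intuition expects:

```python
import random

# Sketch: find the longest run of identical outcomes in a flip sequence.
def longest_run(flips):
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(0)  # fixed seed so the demo is repeatable
seq = [random.choice("HT") for _ in range(100)]
print("longest run in 100 fair flips:", longest_run(seq))
# In 100 fair flips the longest run is usually around 6-8 in a row --
# streaks of this length arise from pure chance, no "hot hand" required.
```

A person asked to write down a “random-looking” sequence rarely includes runs this long, which is why genuine randomness looks suspiciously streaky to us.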
In order to understand how chance works more clearly, it is best to look at the role of chance and the distribution around the mean. There is a mean for all sets of outcomes, and its stability depends on the number of trials: a larger sample gives a clearer picture of the mean than a small one. With 100 trials of shooting baskets, we will see a more reliable ratio of makes to misses than with only 10 trials. So if we see a basketball shooter making multiple shots in a row, it is more likely that he will regress to the mean than continue a relentless shooting spree. Thus, in order to judge an outcome accurately, we need to look at the distribution of shots in relation to the bell curve. Just like superstitions, these fallacies demonstrate how persistently people see patterns that are not actually there.
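Regression to the mean can also be simulated directly. In the hypothetical sketch below (the 50% skill level and streak length are assumptions for illustration), a shooter who has just made five in a row is no better than his long-run average on the very next shot:

```python
import random

# Illustrative sketch: a 50% shooter who just hit 5 shots in a row
# still shoots ~50% on the next attempt -- the streak predicts nothing.
random.seed(42)
SKILL = 0.5
after_streak = []
for _ in range(10_000):
    shots = [random.random() < SKILL for _ in range(20)]
    for i in range(len(shots) - 5):
        if all(shots[i:i + 5]):              # found a 5-make hot streak
            after_streak.append(shots[i + 5])  # record the following shot
            break

rate = sum(after_streak) / len(after_streak)
print(f"hit rate on the shot after a 5-make streak: {rate:.2f}")
# The rate hovers near 0.50: performance regresses to the shooter's mean.
```

The same logic explains why a gambler’s “due” number is no more or less likely after a streak: each trial is independent, and only the long-run distribution is stable.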
As you can see, our biases are extremely influential on our perceptions and beliefs and on how we judge and behave towards others. Unfortunately, we are often unaware that we are making these judgments. However, we can improve faulty thinking before coming to conclusions simply by becoming aware of how these concepts play out in our everyday lives.

VIII. We are Simplistic

Even though these errors in perception have been brought to our understanding, it is not possible to avoid inaccuracies in cognitive thinking completely. The environment provides an incalculable amount of information that cannot all be held accurately in the brain, so some information is inevitably left out. In response to this limitation, organisms categorize information as efficiently and simply as possible in order to store it within the brain’s capacity. Otherwise, we would spend excessive amounts of time studying our environment, particularly when we encounter novel stimuli.

A. Heuristics and Schemas

The foundation of simplifying strategies begins with instinctive tools such as heuristics and schemas. Schemas provide us with a way to organize and interpret information. Heuristics, likewise, provide mental shortcuts for decision making and problem solving. Both concepts describe how our minds file and store information efficiently, to our advantage. For instance, we create schemas about the world around us so that when faced with a novel situation or thing, we can react accordingly without putting extra effort or thought into processing the information. Creating mental shortcuts, however, also has its shortcomings: when seemingly important information is excluded, biases and stereotypes are generated. When an object is assigned to a category based on how well it represents that class, the representativeness heuristic is at work. For example, if a man is described as shy, inquisitive, and fond of reading, we would most likely characterize him as a librarian rather than a teacher or lawyer. It is through our experience and preexisting categories that these representations are formed. Similarly, when we base our judgments on information that is easier to access or recall, we are using the availability heuristic. The most common example is the misperception that death by airplane accident is more common than death by car accident. Because airplane accidents are severe and heavily covered in the media, people judge them to be more common, even though car accidents are far more frequent. Additionally, the anchoring and adjustment heuristic occurs when an initial value influences an estimate. For example, if someone asked whether you thought a particular car costs more or less than $10,000 and then asked you for an exact estimate, you would most likely choose a number near $10,000 because it was set as an initial anchor.
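The librarian example hides a base-rate problem that a short calculation makes visible. All the numbers below (population sizes and trait frequencies) are hypothetical assumptions chosen only to illustrate the arithmetic:

```python
# Hedged sketch of base-rate neglect; every number here is hypothetical.
# Even if shy, book-loving people are far more typical of librarians,
# there are many more teachers, so a shy reader is still likelier a teacher.
librarians, teachers = 200_000, 4_000_000   # assumed population sizes
p_shy_given_librarian = 0.60                # assumed trait frequencies
p_shy_given_teacher = 0.15

shy_librarians = librarians * p_shy_given_librarian   # 120,000 people
shy_teachers = teachers * p_shy_given_teacher         # 600,000 people

# Bayes' rule, written as a simple population count:
p_librarian = shy_librarians / (shy_librarians + shy_teachers)
print(f"P(librarian | shy, likes reading) = {p_librarian:.2f}")  # 0.17
```

Under these assumptions, the shy reader is about five times more likely to be a teacher, yet the representativeness heuristic pulls our intuition toward “librarian” because the description matches the stereotype.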
Thus, simplifying strategies are both beneficial and troublesome. In order to make quick judgments we must sacrifice some information in favor of other information, because when we simplify we do not take into account everything that is relevant.

B. An Example of Information Overload: Savant Syndrome

Though it may seem favorable to be able to store as much information in our minds as possible, doing so can be very costly. An excellent example is the rare case of individuals with savant syndrome. Historically known as “idiot savant”, this syndrome produces expertise in one or more areas at the cost of others. The expertise of savants is no ordinary skill but an extraordinary, seemingly superhuman one. Examples include learning a new language in under two weeks, reading a page of a book in a matter of seconds, and learning to play a piece of music after a single hearing. Two of the best-known savants are “megasavant” Laurence Kim Peek and the high-functioning Daniel Paul Tammet. Peek, the inspiration for the movie Rain Man, had a photographic memory and could read a page of a book in about 8 to 10 seconds. More astonishing still, when Peek read a book he read two pages at once, taking in the left page with his left eye and the facing page with his right eye simultaneously. He could memorize facts and recall the contents of at least 12,000 books from memory. Unfortunately, all did not fare so well for Peek. He could not perform ordinary tasks such as buttoning his own shirt and lacked motor skills common to most people, and his below-average IQ of 87 showed that there was much else he lacked. On the other side of the spectrum, Tammet has the high-functioning type of savant syndrome. Tammet is known for performing complex mathematical problems and for reciting pi from memory to 22,514 digits in a little over five hours. He also demonstrated his superhuman abilities by learning Icelandic, one of the most difficult languages, in about one week. Nonetheless, this capability coexists with his Asperger syndrome and other severe cognitive impairments. As these cases demonstrate, such individuals can take in and retain enormous amounts of information, but this overload appears to displace other capabilities that most humans take for granted. So if we start to wish for these superhuman capabilities, we might think twice about the consequence of losing normal functioning abilities in exchange for these higher processes.


So the obvious downside to simplifying strategies is that we neglect some information in favor of other information, and in normally functioning humans the information we keep tends to be the information we favor. In the selective attention process, we literally select what we will attend to. For example, if someone holds the belief that brunettes are smart, that person is more likely to notice brunettes who are smart and to ignore those who aren’t, as well as smart people with other hair colors. Thus, individuals selectively attend to information for two reasons: to simplify, and to support personal beliefs. Attending to confirming evidence while ignoring disconfirming evidence provides the feeling of being correct, and that feeling is precisely why we do it.

C. In-group/Out-group as a Result of Stereotyping and Categorization

To organize and react to novel stimuli without effort, we are equipped with the ability to categorize, which saves both effort and time. Categorization is thus a very important tool for adapting to an environment in which we constantly meet novel stimuli. It “alleviates the need to attend to each individual member of the category as an individual” (Hamilton, 2005). It also allows us to determine how a novel stimulus relates to others in its category. As a result, we learn, first, that members of a category are similar to one another and, second, that members of one category differ from members of other categories (Hamilton, 2005). The process of categorization does not necessarily constitute stereotyping; rather, stereotypes form when items are organized in an exaggerated form of categorization (Hamilton, 2005). Stereotypes, then, result from expectancies that bias the categorization of novel stimuli. For this reason, stereotyping is notorious for creating discrimination and prejudice. The two terms are often used interchangeably; however, where prejudice consists of negative thoughts about members of certain groups, discrimination is harmful behavior towards those members, both resulting from stereotyping. In other words, we are quick to judge through our stereotypic perceptions. Stereotyping is so common that we do it daily, especially when making judgments about others without much thought. It draws on simplifying techniques such as selectively attending to certain information and forming categories and concepts from it. As we categorize, we form concepts such as in-groups and out-groups. The in-group is the perceived group that includes ourselves and others who share similar characteristics; the out-group is composed of people who do not share characteristics similar to ours, otherwise known as “outsiders”.
Thus, group membership has many implications for behavior, judgment, and decision making. A dramatic demonstration of in-group and out-group behavior, the infamous study conducted by Philip Zimbardo in 1971, shows just how far group membership can affect behavior. Zimbardo set out to investigate the psychological effects of imprisonment by having a group of subjects enact the roles of prisoners and prison guards. The study was very realistic: it was conducted in the basement of the Stanford psychology building, which was converted into the mock “Stanford County Prison”, and subjects were assigned the role of either prison guard or prisoner and asked to act accordingly. Prisoners were even handcuffed, fingerprinted, and transported to the prison. During the experiment, the prisoners had to follow the standard procedures an actual prisoner endures, such as obeying all orders given by the guards and asking permission to do anything. The only behavioral requirement Zimbardo gave the guards was to play their roles and keep order. To Zimbardo’s surprise, the subjects did just that and more. The guards became verbally abusive and ordered the prisoners to perform daunting tasks such as push-ups while the guards stood on their backs. The prisoners, in turn, reacted badly to their treatment: many became depressed, and only a few days into the study several prisoner subjects had to be released from the experiment. Finally, Zimbardo terminated the study because of the dramatic effects these roles had on the groups’ behavior towards one another. Simply being perceived as belonging to one group and interacting with perceived out-group members resulted in negative, even aggressive, behavior between the two groups. It is all the more striking that these roles were entirely fabricated. Much of stereotyping does just this: we create mental representations of what these roles mean and often perceive outsiders as the enemy.

In conclusion, because the world is so complex, people have a tendency to look for reasons why things happen the way they do, so that they can understand the environment more clearly and efficiently. Establishing connections and creating contingencies may well be part of human nature, helping humankind thrive. It is critical to understand these underlying processes, and for this reason we can conclude that simplifying strategies are both useful and, to some extent, a hindrance.

IX. Influenced by Environment

Since people live in an environment saturated with information, individuals fall prey to misinformation given by trusted people or authority figures. From birth, we are susceptible to others’ opinions, and on certain occasions we must rely on others. Thus, it is important to note the vulnerability we have towards others.

A. Appeal to Authority

Elliot P's demonstration of authority figures via Wikimedia Commons

Many, if not all, people rely a great deal on authorities. An authority could be anyone from a parental figure or a teacher to a president or religious leader. While such people are generally seen as reliable sources of information, many of these figures have made mistakes, some of them intentional. Even so, individuals are still willing to listen to and obey someone they consider an authority. For example, the Nazi official Adolf Eichmann, who organized and facilitated the deportation of Jews to concentration camps during World War II, was obeyed by the many subordinates who carried out those deportations, in large part simply because he was a figure of authority whom they felt they should submit to and trust. Stanley Milgram tested this phenomenon: to what extent do people obey authority figures in spite of their moral beliefs? In his study, Milgram assigned one individual the role of learner and another the role of teacher. The teacher was to ask the learner questions and, whenever the learner answered incorrectly, administer an electric shock, or so he believed. In reality, the learner received no shock, but the teacher was led to believe otherwise. The teacher was given a set of questions to ask, and with each wrong answer he was told to deliver a shock of increasing voltage. As the voltage increased, the learner would vocalize the pain he was supposedly receiving, leaving the teacher in stress and anguish on the learner’s behalf. The experimenter stood close by, insisting that the teacher continue to shock the learner at each miss. To Milgram’s surprise, sixty-five percent of participants went on shocking the learner up to the maximum voltage despite the learner’s cries of extreme pain. Even though the teachers were under a great deal of stress and knew they would suffer no material loss or punishment if they stopped, they apparently felt compelled to continue. Milgram had underestimated both the degree of obedience the participants would show and the amount of tension the sessions would generate. These results shed light on the massive obedience to corrupt authority during the Second World War, and on many other influences of authority. Milgram offered a key point: “Obedience occurs not as an end in itself, but as an instrumental element in a situation that the subject construes as significant, and meaningful” (Milgram, 1963). Those who obey may not see the full significance themselves, but they trust that the authorities do.

B. Rumors and Anecdotes

Illustration of the spreading of rumors, author unknown, via Wikimedia Commons

Influences from the environment are not always as manipulative and detrimental as they may appear. Misinformation can simply circulate through our surroundings, leaving us vulnerable to faulty information. Rumors and anecdotes can easily become influential, especially when seemingly credible sources are passing them along. As one definition puts it, “Rumors are unconfirmed information circulating among persons endeavoring to make sense of a situation that is ambiguous or one that is potentially threatening. Like news, a rumor is of current or topical interest and is generally considered important or significant; unlike news, it is never verified” (“Rumors”, 2008). Often used interchangeably with rumors, anecdotes are claims made without any supporting sources. Rumors generally tend to start from some kernel of truth and then transform into an irrational account, while anecdotes are typically told by “fallible human storytellers” (Shermer, 2002). Rumors and anecdotes are everywhere, and everyone is bound to hear a few. They have been compelling enough to drive people to major, even dangerous, decisions. One impactful rumor that led to mass hysteria was the Year 2000 (Y2K) bug, when it was claimed that computer clocks would fail and cause computers to shut down; the computer industry spent over 300 billion dollars preparing for a catastrophe that never materialized. A more dangerous example was the 1998 rumor that the MMR vaccine causes autism. Although no scientific evidence supported the claim, the scare led many believers to withhold immunizations from their children, placing those children in great jeopardy.
So, in order to avoid falling into the deception that rumors and anecdotes create, it is important to first consider that “bold statements do not make claims true” (Shermer, 2002). Just because a statement is made to seem extraordinary and powerful does not mean it denotes truth. For example, when an ad makes the astounding claim that its brand is the best, we should not believe it simply because it says so; unfortunately, many marketing companies use exactly these bold statements to convince buyers to pay for their products. Secondly, “heresy does not equal correctness” (Shermer, 2002). Heresy is one of the many tools pseudoscientists use to rehabilitate rejected claims. They point out how past scientists became successful despite ridicule, then use this as justification for their own ridicule. Many pseudoscientists use this tactic to gain attention or a second glance by convincing others that they are being victimized. A pseudoscientist may say, “Well, they laughed at Copernicus just as you are laughing at me,” falsely grouping himself with successful individuals in order to gain recognition. Thirdly, evidence requires a “burden of proof” (Shermer, 2002). In other words, the one making a claim against evidence-supported data has the responsibility to prove that claim. For instance, since evolution has the upper hand in substantial supporting data and evidence, creationists currently hold the burden of proof to find sizable data against evolution. For the sake of sound argumentation, focused attention should be directed toward seeking proof from those who argue against the weight of the evidence.

C. Argumentation and Logical Fallacies

Furthermore, beyond rumors and anecdotes, appeals and logical fallacies are commonly resorted to when all else fails. These are flawed patterns of argumentation used to defend poor claims. Although there are many fallacies, I will cover only a few. The first is the argument from authority. This fallacy occurs when someone believes a claim from an authority simply because of that person’s position. In Milgram’s obedience study (Milgram, 1963), the man in the white lab coat stood behind the participant, influencing him to continue giving shocks; the argument from authority could very well have played a factor in the participant’s persistence. Next is ad ignorantiam, or the “appeal to ignorance” (Shermer, 2002), which holds that since an argument has not been proven false, it must be true. For instance, many ufologists hastily label sightings as UFOs even when they cannot explain what they saw; their argument is that since they cannot accurately identify the object, it must be a UFO. Another common fallacy is ad hominem and tu quoque, or “to the man” and “you also” (Shermer, 2002). This fallacy redirects the argument toward the person making the claim in an attempt to discredit that person, in the hope that the argument itself will then be doubted. Those who commit this fallacy commonly attach negative labels to an individual, such as criminal, murderer, or abuser, in order to obscure the validity of that person’s statements. The fallacy of negation, or the false dilemma, uses an “either-or” argument by dichotomizing the world (Shermer, 2002). The statement “if he is an atheist, he must not be a nice person” suggests that a person is either nice or not nice based solely on whether he is an atheist; this fallacy neglects all the in-between possibilities. The fallacy of redundancy, begging the question, or tautology uses circular reasoning by beginning and ending with the same argument.
For example: “The paranormal exists because I have had experiences that can only be described as paranormal.” Here there is no actual reasoning to support that the paranormal exists; the claim both begins and ends the argument without any evidence. Finally, reductio ad absurdum and the slippery slope hastily push an argument to an extreme conclusion. Reductio ad absurdum “is the refutation of an argument by carrying the argument to its logical end and so reducing it to an absurd conclusion” (Shermer, 2002). However, just because an argument’s consequences can be made to sound absurd does not necessarily mean the argument is unreasonable. Similarly, the slippery slope constructs a chain of scenarios whose links are not necessarily connected. Consider: eating pizza makes you put on weight; putting on weight will cause you to become overweight; being overweight causes diabetes, which leads to death. The chain begins with a statement that is not consistently supported by evidence and ends with a hasty conclusion. As mentioned earlier, these faults in argumentation generally lead listeners into faulty conformity in thinking and judgment.

D. Solomon Asch demonstration of Social Conformity

As an illustration of the powerful influence conformity has on thinking and judgment, Solomon Asch conducted a series of studies investigating the influence others have on decision making and judgment. More specifically, Asch demonstrated how a student can conform to an answer simply because of the influence of others. In the study, Asch showed a group of seven students a figure with four lines: line a, line 1, line 2, and line 3, where lines a and 3 were the same length. He then asked which lines were equal. However, Asch was interested in only one student’s answers, because the other six students were actually part of the study and their answers were scripted. Interestingly, when the six confederates gave a unanimous wrong answer, the experimental subject conformed to their answer even though it was clearly wrong. To Asch’s surprise, these results revealed how vulnerable we can be in the face of group decisions (Asch, 1951, 1955, 1956). Authority, then, is not the only driver of conformity; we also have a tendency to conform to groups.

E. Groupthink

Similarly, the need to feel accepted and valued can also affect accurate decision making, particularly in groups. Irving Janis’ notion of groupthink demonstrates that when members of a group value concurrence over anything else, they tend to reach “quick and painless unanimity on issues that the group has to confront” (Hart, 1991), leading to inaccurate judgment. Groupthink has made a dramatic impact on social psychology because of its pervasiveness and because its victims are often unaware of it. Janis formulates the logic behind this as follows: “The more cohesive the group, the greater the members’ satisfaction with it and the greater their willingness to remain part of it, hence the greater their incentives to think and act as the group does” (Hart, 1991). At the same time, the contribution of alternative ideas and quality group brainstorming have also shown success in group problem solving. So when should decision makers be wary of falling prey to conformity? To paint a clearer picture, Janis created a model illustrating the signs and consequences to look for in group decision making. It starts with the composition of a cohesive group, combined with structural faults in the organization and the situational context. These conditions lead to three types of observable consequences: overestimation of the group, closed-mindedness, and pressures toward uniformity (Hart, 1991). Examples of groupthink are apparent in real-world events such as the space shuttle Challenger accident in 1986. Seventy-three seconds after launch, the Challenger exploded, killing all seven crew members and leaving “focused attention on the process leading to the decision to launch” (Esser & Lindoerfer, 1989). Prior to the launch, the shuttle’s management team ran several safety checks on the conditions of the shuttle and found two major concerns: the cold temperature and the ice buildup on the launch pad.
Interestingly, despite these obvious warning conditions, the team carried on with the flight, leaving everyone questioning why. Esser and Lindoerfer (1989) used statements from the Report of the Presidential Commission on the Space Shuttle Challenger Accident to locate examples and frequencies of groupthink. They identified thirty-nine statements as indications of groupthink, with the most common categories being failure to examine risks of the preferred choice, self-appointed mindguards, and selective bias in processing the information at hand. Thus, the fatal launch of the Challenger was recognized as a product of groupthink. The damaging effect of groupthink is also demonstrated in decisions made during World War II and in the Bay of Pigs fiasco. In sum, group decision making has powerful shortcomings that can cause a great deal of harm if not handled correctly.

"Symptoms of Groupthink" (Lunenberg, 2010)
Stereotyping
Group members develop stereotyped views of opposition leaders as too evil to warrant genuine attempts to negotiate or as too weak and stupid to counter whatever risky attempts are made to defeat their purposes.
Pressure
Group members apply direct pressure on any member who expresses strong arguments against any of the group’s stereotypes, illusions, or commitments, making clear that this type of dissent is contrary to what is expected of all loyal members.
Self-Censorship
Group members censor themselves from any deviations from the apparent group consensus, reflecting each member’s inclination to minimize the importance of his or her doubts and counterarguments.
Unanimity
Group members perceive a shared illusion of unanimity concerning judgments conforming to the majority view (partly resulting from self-censorship of deviations, augmented by the false assumption that silence means consent).
Mindguards
Some group members appoint themselves to protect the group from adverse information that might shatter their shared complacency about the effectiveness and morality of their decision.
The likelihood that groupthink will emerge is greatest when (a) the group is cohesive, (b) the group becomes insulated from qualified outsiders, and (c) the leader promotes his own favored solution (Janis, 1982). In suggesting ways of avoiding groupthink, Janis hopes to reduce cohesiveness and open up decision activity in various ways. One way is to select ad hoc groups to solve problems, so that members do not already belong to a cohesive group. Another approach is to have higher-level administrators set the parameters of the decision. Still another method is to assign different groups to work on the same problem. Finally, different group decision-making techniques can be used to limit the effects of groupthink and other problems inherent in shared decision making.

F. False Memory or Misinformation Effect

Taking a turn, another phenomenon worth mentioning is the false memory or misinformation effect. First described by psychologist Elizabeth Loftus in the 1970s, this concept demonstrates just how susceptible the mind is to deception. Loftus’ studies discovered “that when people who witness an event are later exposed to new and misleading information about it, their recollections often become distorted” (Loftus, 1997). In one early public case, a mother sought psychiatric therapy to help her cope with past traumatic events involving her daughter. Throughout therapy, however, the psychiatrist used hypnosis and other forms of manipulation that convinced her she had experienced far more than she originally believed. Through suggestive techniques, she came to believe false memories of childhood sexual and physical abuse, among other things. Weeks later, she realized she had been manipulated, and the case helped establish the false memory effect. Today, false memory is prominently evident in mistakes made during eyewitness testimony. With the support of DNA testing, it has been discovered that hundreds of people have been wrongly convicted due to false memories in eyewitness testimony. Since these discoveries, scientists have attempted to understand these manipulative effects, particularly in the brain. The hippocampus, which plays a large role in the creation of memories, undergoes much transformation, particularly through infancy; consequently, children are highly susceptible to forming false memories because of the immaturity of the developing brain. Further, Loftus discovered that a delay between the actual event and exposure to misinformation increased false memories, presumably because memories weaken over time. Socially, misinformation from sources perceived as credible poses a greater threat than misinformation from sources that lack credibility.
Credible sources who feed misinformation are more likely to persuade their audience than sources that lack credibility or appear intent on misleading. Lastly, Zaragoza and colleagues found that “the magnitude of the misinformation effect is also influenced by more subtle social cues, such as the perceived power and social attractiveness conveyed by the accent of the person providing the misinformation” (Zaragoza et al., 1993). Essentially, the formation of false memories is due not solely to manipulation but to our own mistaken perception as well.

The common denominator among these catastrophes is overreliance on authority and our vulnerability to others. The issue, then, lies in individuals’ overconfidence in the information they have been given, without confirming it or seeking substantial evidence.
To resist yielding to inadequate information, be wary of whom you trust as sources and remember that “extraordinary claims require extraordinary evidence,” a maxim often attributed to David Hume.

X. Conclusion

We all want to be right, to create a perception of the world that is simpler and more understandable, and to lean on authorities to help guide the way. With this mindset, we are subject to being tricked by our own brains when attempting to perceive reality. What we lack, then, is the ability to think critically and skeptically about our environment. Unfortunately, we do not come out of the womb able to perform such thinking, yet we can learn it over time through “training, experience, and effort” (Shermer, 2002). Even so, we must constantly guard ourselves against the traps our minds set so often in times of vulnerability.

References

Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgment. In H. Guetzkow (ed.) Groups, leadership and men. Pittsburgh, PA: Carnegie Press.
Asch, S. E. (1955). Opinions and social pressure. Scientific American, 193, 31-35.
Asch, S. E. (1956). Studies of independence and conformity: A minority of one against a unanimous majority. Psychological Monographs, 70(9), 1-70.
Russell, B. (1950). An outline of intellectual rubbish. In Unpopular essays. New York, NY: Simon and Schuster.
Carroll, R. (2010). Confirmation Bias. The skeptics dictionary. Retrieved November 15, 2010, from http://www.skepdic.com/confirmbias.html
Charness, G., Rigotti, L., & Rustichini, A. (2007). Individual Behavior and Group Membership. American Economic Review, 97(4), 1340-1352.
Einstein, A. (1905). A heuristic point of view concerning the production and transformation of light. Retrieved from http://www.scribd.com/doc/10571708/Albert-Einstein-On-a-Heuristic-Point-of-View-Concerning-the-Production-and-Transformation-of-Light
Esser, J., & Lindoerfer, J. (1989). Groupthink and the space shuttle Challenger accident: Toward a quantitative case analysis. Journal of Behavioral Decision Making, 2(3), 167-177.
Grohol, J. (2009, January 10). Fundamental attribution error. Retrieved from http://psychcentral.com/encyclopedia/
Hamilton, D. (2005). Social cognition. New York, NY: Psychology Press.
Hart, P. (1991). Irving L. Janis' Victims of groupthink. Political Psychology, 12(2), 247-278.
Kelley, H. (1950). The warm-cold variable in first impressions of persons. Journal of Personality, 18(4), 431.
Kida, T. (2006). Don't believe everything you think. Amherst, NY: Prometheus Books.
Loftus, E. (1997, September). Creating false memories. Retrieved from http://faculty.washington.edu/eloftus/Articles/sciam.htm
Lunenberg, F. (2010). Group decision making: The potential for groupthink. International Journal of Management, Business, and Administration, 13(1).
Milgram, S. (1963). Behavioral Study of obedience. The Journal of Abnormal and Social Psychology, 67(4), 371-378.
Rajecki, D. (1974). Effects of prenatal exposure to auditory or visual stimulation on postnatal distress vocalizations in chicks. Behavioral Biology, 11(4), 525-536.
Schwartz, J., & Begley, S. (2002). The mind & the brain. New York City, NY: HarperCollins.
Shermer, M. (2002, September). Smart people believe weird things. Scientific American, Retrieved from http://www.michaelshermer.com/2002/09/smart-people-believe-weird-things/
Shermer, M. (2002). Why people believe weird things. New York, NY: St. Martin's Press.
Srivastava,S.,Guglielmo,S.,& Beer,J. (2010). Perceiving Others' Personalities: Examining the Dimensionality, Assumed Similarity to the Self, and Stability of Perceiver Effects. Journal of Personality & Social Psychology, 98(3), 520-534. Retrieved from Academic Search Alumni Edition database.
Thorndike, E. (1920). A constant error in psychological ratings. Journal of Applied Psychology, 4(1), 25-29.
Zaragoza, M. S., Belli, R. F., & Payment, K. E. (n.d.). Misinformation effects and the suggestibility of eyewitness testimony. Retrieved from http://www.personal.kent.edu/~mzaragoz/publications/Zaragoza%20chapter%204%20Garry%20Hayne.pdf
Zajonc, R. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology, 9(2, Pt. 2), 1-27.
Zajonc, R. (2001). Mere Exposure: A Gateway to the Subliminal. Current Directions in Psychological Science (Wiley-Blackwell), 10(6), 224. Retrieved from Academic Search Alumni Edition database.