Errors in Thinking
Cognitive Errors, Wishful Thinking and Sacred Truths

By Vexen Crabtree 2008 Oct 26

Cognitive errors are the types of systematic mistakes that our brains make when presenting ideas and correlations to our conscious selves. They result from applying evolutionarily developed rules of thumb to the complexities of life without taking due heed of the need for critical and cautious evaluation of our own thought processes. Such errors in general thinking can lead individuals, or whole communities, to explain types of events and experiences in fantastical ways. Before we can comprehensively guard against such cumulative errors, we need to learn the ways in which our brains can misguide us. These errors divide into three areas: the internal errors of thinking, errors with our actual perceptions, and social errors that result from the way we communicate ideas and the effects of traditions and dogmas.


1. Cognitive and Internal Errors

Many errors in thinking do not come from the fact that we don't know enough. Sometimes, we do. But most of the time our brains fill in the gaps, and unfortunately the way our brains put together partial evidence results in potentially false conclusions. These systematic problems with the way we think are called cognitive errors, and they occur below the level of consciousness. Much research has been done in this area, and continues. "The research is painting a broad new understanding of how the mind works. Contrary to the conventional notion that people absorb information in a deliberate manner, the studies show that the brain uses subconscious 'rules of thumb' that can bias it into thinking that false information is true"1. These cognitive errors and rules of thumb are discussed on this page. Thomas Gilovich published a book concentrating on the subtle mistakes made in everyday thinking, and introduces the subject like this:

People do not hold questionable beliefs simply because they have not been exposed to the relevant evidence [...] nor do people hold questionable beliefs simply because they are stupid or gullible. [... They] derive primarily from the misapplication or overutilization of generally valid and effective strategies for knowing. [... Many] have purely cognitive origins, and can be traced to imperfections in our capacities to process information and draw conclusions. [...] They are the products, not of irrationality, but of flawed rationality.

"How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life"
Thomas Gilovich (1991)2

1.1. Pareidolia: Seeing Patterns in Random, Complex or Ambiguous Data

The cognitive process of seeing patterns in, and drawing conclusions from, random and ambiguous data is called pareidolia. It is highly common. People often misperceive random events in a way that supports their own already-existing beliefs or hunches3. Some patterns seem and feel so natural and real that it goes against common sense to deny them. But against all expectations and against our judgements, many patterns turn out to be false and misleading. Science writer Jonah Lehrer says "The world is more random than we can imagine. That's what our emotions can't understand"4. The tendency for people to see more order in nature than there is was noted as long ago as the early seventeenth century by Francis Bacon - he called such errors due to human nature the 'idols of the tribe'5. To study pareidolia, psychologists have presented subjects with genuinely random data and analysed their responses:

Psychologists have discovered that people have faulty intuitions about what chance sequences look like. People expect sequences of coin flips, for example, to alternate between heads and tails more than they actually do. [...] Streaks of 4, 5, or 6 heads in a row clash with our expectations about the behaviour of a fair coin, although in a series of 20 tosses there is a 50-50 chance of getting 4 heads in a row [and] a 10 percent chance of a streak of six. [...] People who work in maternity wards witness streaks of boy births followed by streaks of girl births that they attribute to a variety of mysterious forces like the phases of the moon. Here too, the random sequences of births to which they are exposed simply do not look random.

"How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life"
Thomas Gilovich (1991)6
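
The streak figures Gilovich cites are easy to check for yourself. Below is a minimal Monte Carlo sketch in Python (my own illustration, not from Gilovich; the function names are invented for this example) that estimates the probability of a run of heads of a given length within 20 fair tosses. It converges on roughly 0.48 for runs of 4 or more and 0.12 for runs of 6 or more, agreeing with the quoted figures.

```python
import random

def longest_run(flips):
    """Return the length of the longest run of heads (True) in a sequence of flips."""
    best = current = 0
    for is_heads in flips:
        current = current + 1 if is_heads else 0
        best = max(best, current)
    return best

def streak_probability(n_flips=20, streak=4, trials=100_000):
    """Estimate P(at least one run of `streak` heads in `n_flips` fair tosses)."""
    hits = sum(
        longest_run(random.random() < 0.5 for _ in range(n_flips)) >= streak
        for _ in range(trials)
    )
    return hits / trials

if __name__ == "__main__":
    print(f"P(run of 4+ heads in 20 tosses) ~ {streak_probability(streak=4):.2f}")  # ~0.48
    print(f"P(run of 6+ heads in 20 tosses) ~ {streak_probability(streak=6):.2f}")  # ~0.12
```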

This type of cognitive illusion-making results from our subconscious evaluation of evidence and correlations. Prof. Edward de Bono7 goes to great lengths to explain that the complexity of the real world is the greatest "enemy of thinking", and argues that our conscious attempts at understanding ideas cannot succeed through raw, unstructured thinking. To make progress, he argues, we need to structure and organize our thoughts, and the best way to do this is through training. I argue throughout this page that another effective way is simply to know about the types of errors that our brains make, thereby allowing us to spot them and habitually work around them.

1.2. We Miss Disconfirmations

It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives.

Francis Bacon (1620)8

Our brains have evolved to rapidly spot patterns, because doing so saved our animal necks and allowed us to procreate. But in a complex world where ideas and beliefs are much more serious affairs, our early pattern-recognition circuits now do us a disservice. Our consciousness gets excited about occurrences and events that appear to be in sync, as we saw with pareidolic errors above. If the bus is late twice in a row on Tuesdays whilst we are waiting, we try to work out the cause. We would do much better to accurately note how many other days it is late, and how many Tuesdays it is not late. Only by actively and methodically gathering data can we ascertain whether a trend exists. But we hardly ever do this in our daily lives - most of us don't have time!

If we believe a correlation exists, we notice and remember confirming instances. If we believe that premonitions correlate with events, we notice and remember the joint occurrence of the premonition and the event's later occurrence. We seldom notice or remember all the times unusual events do not coincide. If, after we think about a friend, the friend calls us, we notice and remember the coincidence. We don't notice all the times we think of a friend without any ensuing call, or receive a call from a friend about whom we've not been thinking.

People see not only what they expect, but correlations they want to see. This intense human desire to find order, even in random events, leads us to seek reasons for unusual happenings or mystifying mood fluctuations. By attributing events to a cause, we order our worlds and make things seem more predictable and controllable. Again, this tendency is normally adaptive but occasionally leads us astray.

"Social Psychology" by David Myers (1999)9

The same conclusions are drawn by science writer Jonah Lehrer in "The Decisive Moment: How the Brain Makes Up Its Mind" (2009). This lack of comprehensiveness in our data-gathering leads us to a mistaken confidence in our ability to ascertain trends. Gilovich states that people "are poor at drawing conclusions from incomplete and unrepresentative data". Applying automatic skepticism to ideas born from little evidence, or from the subjective evidence of a single person (or two or three!), is a massive leap in the avoidance of error. Only if we have actively attempted to measure the disconfirming events should we entertain an idea.
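
The premonition-and-phone-call example above can be made concrete with a short simulation. The sketch below (my own illustration with invented event probabilities; nothing in it comes from Myers or Gilovich) generates 'thought of a friend' and 'friend called' as fully independent daily events. The memorable coincidences still accumulate, but only the full table of confirmations and disconfirmations reveals that no correlation exists - and the last three rows are precisely the data we fail to gather in daily life.

```python
import random

def tally_contingency(n_days=1000, p_think=0.1, p_call=0.1, seed=42):
    """Tally the full think/call contingency table over n_days.

    The two events are generated independently, so any apparent link
    between them is pure coincidence."""
    rng = random.Random(seed)
    table = {(think, call): 0 for think in (True, False) for call in (True, False)}
    for _ in range(n_days):
        think = rng.random() < p_think
        call = rng.random() < p_call
        table[(think, call)] += 1
    return table

table = tally_contingency()
# A confirmation-biased memory registers only this cell, which is never empty:
print("Days we thought of the friend AND they called:", table[(True, True)])
# The disconfirming cells, which we rarely count, show the events are independent:
for (think, call), days in sorted(table.items(), reverse=True):
    print(f"thought={think!s:5} called={call!s:5} days={days}")
```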

1.3. Blinded by the Small Picture

A psychologist [Dr Hecke] argues that much of what we label "stupidity" can be better explained as blind spots. She draws on research in creativity, cognitive psychology, critical thinking, child development, education, and philosophy to show how we (all of us, including the smartest people) create the very blind spots that become our worst liabilities. She devotes a chapter to each of ten blind spots, among them: not stopping to think, my-side bias, jumping to conclusions, fuzzy evidence, and missing the big picture.

Skeptical Inquirer (2007)10

Much of what Hecke argues concerns the way we examine the data surrounding events. We tend to concentrate on the coincidences and specifics of the case in question, and miss out all the surrounding facts and figures. Without taking disconfirmations (as seen above) into account, the coincidences that we notice can become misleading.

1.4. We Dislike Changing Our Minds: The Status Quo Bias

People tend to prefer arguments and evidence that back up what they currently believe, and are poor at objectively interpreting opposing arguments and counter-evidence. Social psychologist David Myers summarizes much of the research that has been done on this topic. Self-created justifications for existing beliefs are often imaginative and are constructed after the beliefs have solidified, and such "belief perseverance" persists even when it would be more productive to consider a belief's worth rationally and calmly. The result is a "cognitive inertia" that opposes changes in belief - especially when such changes could be noticed by other people. When we do change our minds, we often deny that any change has occurred - and often genuinely believe that it hasn't!11

One of the greatest challenges for scientists and educators is persuading people to give up beliefs they hold dear when the evidence clearly indicates that they should. Why aren't most people grateful for the data? [...] The motivational mechanism underlying the reluctance to be wrong, change our minds, admit serious mistakes, and accept unwelcome findings is cognitive dissonance. The theory of cognitive dissonance was invented fifty years ago by Leon Festinger, who defined dissonance as a state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent. [...] One of the most effective ways the mind maintains constant beliefs is through the confirmation bias - the fact that we tend to notice and remember information that confirms what we believe and ignore or forget information that disconfirms it.

Tavris & Aronson (2007)12

In short, we suffer mental distress when perceived evidence, our own actions, or counter-arguments threaten beliefs and opinions we already hold. This cognitive dissonance and its effects have gone by many names, and have been documented in extensive studies. It amounts to a status quo bias towards what we already believe. We exhibit belief perseverance13, which is especially strong if a person has previously constructed rational arguments in favour of their belief, or has emotional attachments to it14. When changes of mind do occur, subjects then refuse to admit that their beliefs have changed over time. As we invest mental time in a belief, we become less likely to notice evidence against it, and less likely to change our mind even when we do see such evidence. In total, our beliefs sometimes work against us, preventing us from accepting new evidence or counter-arguments. This is why peer review is a fundamental part of the scientific process.

The recognition of this inertia is not a new discovery to the academic world; Aristotle pondered over it almost 2,400 years ago and concluded that people invest much time and effort in their opinions and beliefs, and are therefore unlikely to abandon them easily:

All people value most what has cost them much labour in the production; for instance, people who have themselves made their money are fonder of it than those who have inherited it: and receiving kindness is, it seems, unlaborious, but doing it is laborious. And this is the reason why the female parents are most fond of their offspring; for their part in producing them is attended with most labour, and they know more certainly that they are theirs. This feeling would seem also to belong to benefactors.

"Ethics" by Aristotle (350BCE)15

1.5. Physiological Causes of Strange Experiences

If our biochemical brains and physiology sometimes go haywire but we are not aware of a dysfunction, we are apt to misinterpret the events. So our expectations, culture and abstract thinking all conspire to present to us a false experience. Some medical conditions can cause these to be recurring and frequent occurrences. Through studying these, science has come to understand experiences such as out-of-body experiences, possession and night-time terrors, and, through simpler biological knowledge, has explained the Incubus and Succubus and many other superstitious beliefs. Physical problems with the eyes can cause wild effects such as seeing blood run up walls, dust storms, clouds, and haloes that are not really there16. Isolated fits and seizures can cause bizarre effects such as odd lights, feelings, smells, auras and déjà vu that can be mild and subtle17.

Sleep Paralysis

Sleep paralysis, or more properly, sleep paralysis with hypnagogic and hypnopompic hallucinations have been singled out as a particularly likely source of beliefs concerning not only alien abductions, but all manner of beliefs in alternative realities and otherworldly creatures. Sleep paralysis is a condition in which someone, most often lying in a supine position, about to drop off to sleep, or just upon waking from sleep realizes that s/he is unable to move, or speak, or cry out. This may last a few seconds or several moments, occasionally longer. People frequently report feeling a "presence" that is often described as malevolent, threatening, or evil. An intense sense of dread and terror is very common. The presence is likely to be vaguely felt or sensed just out of sight but thought to be watching or monitoring, often with intense interest, sometimes standing by, or sitting on, the bed. On some occasions the presence may attack, strangling and exerting crushing pressure on the chest. People also report auditory, visual, proprioceptive, and tactile hallucinations, as well as floating sensations and out-of-body experiences (Hufford, 1982). These various sensory experiences have been referred to collectively as hypnagogic and hypnopompic experiences (HHEs). People frequently try, unsuccessfully, to cry out. After seconds or minutes one feels suddenly released from the paralysis, but may be left with a lingering anxiety. Extreme effort to move may even produce phantom movements in which there is proprioceptive feedback of movement that conflicts with visual disconfirmation of any movement of the limb. People may also report severe pain in the limbs when trying to move them. [...] A few people may have very elaborate experiences almost nightly (or many times in a night) for years. Aside from many of the very disturbing features of the experience itself (described in succeeding sections) the phenomenon is quite benign.

"Sleep Paralysis and associated hypnagogic and hypnopompic experiences"

There are many other physiological causes of strange experiences, of course, and recent experiments and investigations have produced exciting results that shed light on the inner workings of the human brain. In particular, a series of experiments has resulted in a deeper understanding of why people experience the idea of god; these I have put on a separate page: "Experiences of God are Illusions, Derived from Malfunctioning Psychological Processes" by Vexen Crabtree (2002).

1.6. Emotions: The Secret and Honest Bane of Rationalism

Where we have strong emotions, we're liable to fool ourselves.

Carl Sagan

On the one hand, we all acknowledge the role of emotions in underlying our choices, beliefs and behaviours. We allow whole areas of our lives to be governed by emotions. But on the other hand is everyone's hidden secret: emotions also underlie many of the opinions that we pretend to hold for rational, intellectualized reasons. That's right - the psychologists are on to everyone. Human beings are just not cut out for careful rational thought. It takes dedication, (self-)training and pigheadedness to approach things purely rationally. Even when we think we have our emotions under lock and key, or think we are in an area of thought that is emotionless, there is often a worrying doubt: am I deluding myself for some subconscious emotional reason? We will never be like Data, the android from Star Trek, who struggles to show emotion. We will always struggle against emotion, hiding it and suspecting its influence in all our decisions.

But that isn't the whole story, and it is not as simple as trying to suppress and override emotion. The theme of "The Decisive Moment: How the Brain Makes Up Its Mind" by Jonah Lehrer (2009) is that our emotions have evolved to aid rapid decision-making, taking account of all kinds of subtle stimuli; as a result, when it comes to making quick decisions, it is often wise to trust our emotions - especially when gauging other people, their behaviour and their motives. The key is balance: knowing when to trust emotions, and when to trust the intellect.

You should not trust your emotions or instincts when it comes to statistics, trends, raw data, or the analysis of anything that does not form part of our daily lives. Our emotions evolved for social interpretation and rapid decision-making, and that is where they work best. Science, philosophy and rationalism ought instead to be done in a cool, methodical and systematic manner.

2. Mirrors That We Thought Were Windows

2.1. Perception is Actively Interpreted. The Role of Expectation

The very things we see and hear, clear as daylight and right in front of our eyes, can be falsehoods caused not by physiological dysfunctions but by the brain's expectations. This "top-down" effect on perception became a hot topic in cognitive research from the 1970s, according to two cognitive psychologists, Eysenck and Keane:

Towards the end of the 1970s, theorists argued that virtually all cognitive activity consists of interactive bottom-up and top-down processes occurring at the same time. Perception and remembering might seem to be exceptions, because perception obviously depends heavily on the precise stimuli presented (and thus on bottom-up processing), and remembering depends crucially on stored information (and thus on top-down processing). However perception is also much affected by the perceiver's expectations about to-be-presented stimuli, and remembering depends far more on the exact nature of the environmental cues provided to facilitate recollection than was thought at one time.

"Cognitive Psychology" by Michael Eysenck and Mark Keane (1995)18

It is not just with random events, confusing correlations or data processing that our expectations cause us to have certain experiences in place of others. Our mental expectations can influence the very objects that we see before us, overriding what is real or merging reality with fantasy. Donald D. Hoffman, cognitive psychologist, affirms that "vision is not merely a matter of passive perception, it is an intelligent process of active construction"19. After citing further experiments and examples, Eysenck and Keane report on a particularly colourful demonstration of this:

Another illustration [...] comes in a classic study by Bruner, Postman, and Rodrigues (1951). Their subjects expected to see conventional playing cards, but some of the cards used were incongruous (e.g. black hearts). When these incongruous cards were presented briefly, subjects sometimes reported seeing brown or purple hearts. Here we have an almost literal blending of stimulus information (bottom-up processing) and stored information (top-down processing).

"Cognitive Psychology" by Michael Eysenck and Mark Keane (1995)20

Visual illusions betray the fact that what we think we perceive in the world is actually a result of subjective construction. We are given a representation of the world that looks to us to be coherent, but is in reality a hotch-potch of bits of evidence and guesswork. The real problem is that we trust what we think we see, without realizing that most of it is assumption intermingled with reality.

The brain seems to make a 'best guess' of a plausible filler and normally we notice nothing unusual. [...] There are numerous other illusions which suggest that what we are conscious of can be a distortion of what is actually there. [...] Some important parts of our conscious experience may be underpinned by the brain 'jumping to conclusions' or applying rules of thumb.

Saroj Datta (2004) in "From Cells to Consciousness" by Toates, Romero and Datta (2004)21

These types of effects are well known, not just amongst academics but in industry. Marketing practices routinely abuse our subconscious thinking errors. Take the example of wines and fizzy drinks, as reported in "The Decisive Moment: How the Brain Makes Up Its Mind" by Jonah Lehrer (2009)22. In blind trials, when people did not know which one they were drinking, the cheapest wines came out as the favourites for smell, taste and texture. When packaging was added to cola, wine or energy drinks, the brands with the best reputations, and the more expensive ones, were the immediate favourites. Put a Coca Cola sticker on cheap cola, and everyone rates that cola much better on taste. Our experiences are deeply tied to our rational brain; our sensory inputs are not driven purely by the outside world - rather, our conscious awareness is largely constructed out of expectation. Lehrer notes that we can often make better decisions (more realistic ones!) when we have less information, because our rational brain otherwise tricks us with a distorted awareness of the world.

2.2. What Every Lover Knows: Memory is Actively Interpreted

After extensive studies into memory recall, psychologists have repeatedly concluded that our memories of events are active interpretations of the past. Our recall of the past is affected by our present expectations, knowledge and state of mind.

Bartlett concluded that interpretation plays a large and largely unrecognized role in the remembering of stories and past events. [...] Rather than human memory being computer-like, with the output matching the input, Bartlett and Hunter believe that we process information in an active attempt to understand it. Memory is an 'imaginative reconstruction' of experience (Bartlett, 1932).

"Psychology: The Science of Mind and Behaviour" by Richard Gross (1996)23

Studies by Elizabeth Loftus on leading questions have shown that our memories of even quite recent events are skewed by the way we think about them or are asked about them. Such studies have proved influential for police interview methods, and in particular for cases of child abuse, where forceful questions can 'lead' a child into 'recalling' events in the way that the adult is implying they occurred. We have all experienced the guilty feeling that, after arguing with a friend, we have perhaps reconstructed past events into how we wanted them to have occurred, rather than how our friend saw them.

3. Social Errors

3.1. Story Telling and Second-Hand Information24

Book CoverIn "How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life" by Thomas Gilovich (1991) chapter six is devoted to the problems with second-hand information and the way that story-telling is the cause of much error; I do not know if this is good or bad, but, much of the exaggeration and error is subconscious and comes from the way that we tell stories rather than from any intention to lie. I say I don't know if this is good or bad because: it is good that we know it is not intentional because we would otherwise call most people liars, but, it is bad because as a subconscious side-effect of our social instincts, it is very hard to prevent. Anyway, Gilovich gives a brilliant examination of the subject and it is not worth trying to replicate his efforts here.

3.2. Roger Bacon (1214-1294): Four General Causes of Ignorance

Roger Bacon (1214-1294) was concerned with science and the gathering of knowledge, but he suffered many restrictions and punishments from his Church, and the world had to wait another few hundred years before Francis Bacon could exert a greater influence in the same direction. In Opus Majus, Roger Bacon says there are four general causes of human ignorance. All of them are social in nature.

First, the example of frail and unsuited authority. [...] Second, the influence of custom. Third, the opinion of the unlearned crowd. [...] Fourth, the concealment of one's ignorance in a display of apparent wisdom. From these plagues, of which the fourth is the worst, spring all human evils.

"History of Western Philosophy" by Bertrand Russell (1946)25

Although I doubt that all human evils stem from ignorance (let alone from just one of its causes), it is certainly worth thinking carefully about these four sources of misinformation. As social animals, our beliefs are often functions of our communities, and our belief-forming and belief-espousing mechanisms are deeply tied to the instinctual politics of social position and consensus, rather than to evidence-based analysis.

3.3. Traditions and Customs

As a species, we experienced the world as flat. So sure was everyone that those who thought otherwise were declared insane! Yet the common-sense view was wrong, and the common consensus was eventually changed. The consensus was harming the search for truth: proof by consensus is no proof at all. We do not have any consistent or absolute way of approaching what we consider true; however, we are capable of telling with certainty when something is false. The Earth is not flat; the Northern Lights are not spiritual in basis, and neither are rainbows, the stars, the sun or the moon. Earthquakes are not gods moving underneath the Earth, nor can volcanoes or the weather (probably) be influenced by spirit or totem worship. We are not sure if science has "disproved" souls or God, but it certainly seems less likely than it used to that they exist.

We often rely on traditional explanations for events because we simply do not have time to investigate everything ourselves, and, most of the time, the pronouncements and findings of experts do not filter down to the general populace. The result is a mixture of messy mass beliefs battling against the top-down conclusions of proper investigations. Status-quo bias and other social factors frequently cause resistance to the conclusions of scientists. For this reason, even such major theories as evolution were sometimes discovered (in this case, around the 4th century BCE), then rejected by society and forgotten, until re-discovered in a world more ready to accept them.

Skeptical and rational thinking is pitted, socially, against the subconscious, sub-literate and sub-intellectual world of mass belief.

3.4. Expectations and Culture

Anthropologists find that the particular elements experienced by people change over time, and that from culture to culture the elements are experienced differently. With UFOlogy, different decades and countries have produced sightings of completely different UFOs. For a period in the UK, UFO reports generally featured furry or fuzzy aliens. Modern UFO reports feature "grays": thin, child-like bodies with enlarged eyes and heads. Different countries report different types of aliens, according to culture. As culture changes, so does the phenomenon. What does that say about the phenomenon?

Such large-scale change over time casts doubt on the basis of the experience, and has been presented (especially in the case of UFOs and demons) as evidence that the phenomenon is self-generated, not real: a function of the aspirations and expectations of the experiencer.

3.5. Sacred Truths and Dogmas (Religion)

It seems that the chances of us discovering the truth escape most rapidly from us when it comes to the religious truths (dogmas) put forward by organized religious bodies. Here, authority, tradition and psychology all combine to present a formidable wall against the acceptance of new evidence and truths.

Tell a devout Christian that his wife is cheating on him, or that frozen yogurt can make a man invisible, and he is likely to require as much evidence as anyone else, and will be persuaded only to the extent that you give it. Tell him that the book he keeps by his bed was written by an invisible deity who will punish him with fire for eternity if he fails to accept every incredible claim about the universe, and he seems to require no evidence whatsoever.

"The End of Faith: Religion, Terror and the Future of Reason" by Sam Harris (2006)28

The issue isn't evidence per se; it's the lack of application of critical thinking to religious received truths, because society has placed religious theories in a sacred zone where they cannot be criticized. You can rationally apply physics to debate the pros and cons of string theory and be praised for your diligence in the search for truth, but if you rationally apply the truths of biology against religion then many feel you are being not only impolite, but immoral.

That such religion-inspired truths are due more to social factors than to reality can be seen in the highly culture-dependent interpretation of religious experiences.

Religious experiences tend to conform closely to cultural and religious expectations. Village girls in Portugal have visions of the Virgin Mary - not of the Indian goddess Kali. Native Americans on their vision quest see visions of North American animals not of African ones. Thus it would seem that religious experiences, no matter how intense and all-consuming, are subject to constraint by the cultural and religious norms of the person to whom they occur. Another way of looking at this is to say that there can be no such thing as a pure experience. An experience always happens to a person, and that person already has an interpretive framework through which he or she views the world. Thus, experience and interpretation always combine and interpenetrate.

"The Phenomenon Of Religion: A Thematic Approach" by Moojan Momen (1999) [Book Review]29

4. The False and Conflicting Experiences of Mankind

People have all kinds of weird beliefs. Some of these come about because of experiences the person has had. A person who experiences God and believes in demons has an explanation for the occurrence of night terrors, when people feel a 'force' holding them down terrifyingly on to the bed, making it hard for them to breathe. But those who believe in UFOs have a completely different explanation, and say that the 'God' of the theists is merely the effect of an advanced civilisation (aliens) on early Humanity. Both explanations are foiled, however, by neuroscientists, who understand that the cause of these experiences is biochemical in nature. Weird beliefs and experiences can be caused by neurological and physiological problems, like tiny seizures in the brain causing visions. Cultural expectations play a large part in the interpretation of personal events. We all find rational arguments to explain away those who experience things that contradict our own interpretation of reality. The Christian Pentecostals have a saying, "the man with an experience is never at the mercy of the man with a doctrine", meaning that rationality subverts itself to experience. In total, we cannot entirely trust our experiences, nor those of others.

"The False and Conflicting Experiences of Mankind: How Other Peoples' Experience Contradict Our Own Beliefs" by Vexen Crabtree (2002)

5. Francis Bacon (1561-1626): Four Idols of the Mind

The four idols of the mind make up one of the most influential parts of Francis Bacon's work26. In the words of Bertrand Russell, these idols are the "bad habits of mind that cause people to fall into error". E. O. Wilson describes them as the causes of fallacy "into which undisciplined thinkers most easily fall"27. To view the world as it truthfully is, said Bacon, you must avoid these errors. Although they are categorized somewhat differently nowadays, they have been affirmed time and time again by modern cognitive psychologists as causes of human error:

  1. Idols of the tribe: these "are those that are inherent in human nature"26, so this includes much of what is discussed on this page as the cognitive sources of errors in human thinking. Bacon mentions in particular the habit of expecting more order in natural phenomena than is actually to be found. This is pareidolia, discussed above. I suspect Bacon calls this error a 'tribal' one because of the preponderance of magical rituals and beliefs that are based around controlling nature through specific actions, when in reality there is no cause and effect between them. Hence pareidolia - the seeing of patterns in random, complex or ambiguous data where there are none - provides the misperceived correlations that form the basis of superstitious, magical and religious behaviour.

  2. Idols of the imprisoning cave: "personal prejudices, characteristic of the particular investigator"26, "the idiosyncrasies of individual belief and passion"27.

  3. Idols of the marketplace: "the power of mere words to induce belief in nonexistent things"27. Charisma, confidence and enthusiasm are three things that lead us to believe what someone is telling us, even though word of mouth is the poorest method to propagate fact.

  4. Idols of the theatre: "unquestioning acceptance of philosophical beliefs and misleading demonstrations"27. The lack of doubt we have in our own opinions; something I broadly address in "Why Question Beliefs?" by Vexen Crabtree (2009) and "Satan Represents Doubt: Satanic Epistemology" by Vexen Crabtree (2002).

Francis Bacon is cited by many as one of the important founders of the general idea of the scientific method, which is largely concerned with overcoming errors in human thought and perception.

6. The Prevention of Errors

Unless we recognize these sources of systematic distortion and make sufficient adjustments for them, we will surely end up believing some things that just aren't so.

Thomas Gilovich (1991)30

6.1. Skeptical Thinking

The awesome scientist and public educator Carl Sagan defends the general American public by explaining that they are lacking not in intelligence but in the correct ways of applying it. Although general thinking errors lead to false conclusions about reality, the fact that people construct viable-seeming internal theories to explain events is itself a sign of intelligence. What is missing are the methods for telling good theories from bad ones. Sagan says:

Both Barnum and H.L. Mencken are said to have made the depressing observation that no one ever lost money by underestimating the intelligence of the American public. The remark has worldwide application. But the lack is not intelligence, which is in plentiful supply; rather, the scarce commodity is systematic training in critical thinking.

Carl Sagan (1979)31

Prof. de Bono7, whom we saw earlier commenting on the negative effects of complexity on accurate thought, produced a book that endeavours to offer some training in how to think in a structured and methodical way. He named the critical aspect black hat thinking. He found that even with a little training, people's abilities in brainstorming and analysis increase manifold. Likewise, in "Thinking, Fast and Slow" by Daniel Kahneman (2011)32, the author explains how slow, deliberate and organized thinking on a subject is one of the greatest ways to overcome the errors introduced by instinctual fast thinking - thinking which has given us great evolutionary advantage, but which of course distorts how we perceive and rationalize things. Finally, science writer Jonah Lehrer, in The Decisive Moment: How the Brain Makes Up Its Mind, teaches that we must encourage ourselves to face "inner dissonance" and think about the information we don't want to think about, and "pay attention to the data that disturbs our entrenched beliefs"33.

6.2. Scientific Method as Cure for Human Error

Unfortunately, we do not have time to subject every one of our beliefs to careful skeptical analysis; it is practically impossible. Our instinct is to analyse only new information critically, and re-examining beliefs we already hold is very hard. Also, because knowledge is highly specialized, we can simply never know enough to skeptically analyze everything that perhaps we should. An article in Skeptical Inquirer bemoans these facts:

We all have limitations and built-in biases that hinder our ability to apply the methods of skepticism objectively and consistently. Nonskeptics and professed skeptics alike are equally vulnerable to developing beliefs that have not been subjected to rigorous skeptical inquiry. [...] This is because (a) we do not have time to evaluate every claim that becomes part of our belief system and may rely upon what is commonly believed or what we would like to be true; (b) we are more likely to perform a skeptical evaluation for claims that are inconsistent with our current belief systems [...] while simply accepting claims consistent with our beliefs [...]; (c) many beliefs are already formed and reinforced prior to learning how to think skeptically; (d) some beliefs are formed based primarily upon an emotional evaluation; and (e) skeptics have limited areas of expertise (e.g., a biologist may know little about economics), which restricts our ability to skeptically evaluate all potential claims because knowledge is extremely specialized.

Todd C. Riniolo & Lee Nisbet (2007)34

To overcome these personal difficulties, we need to reach rational conclusions in groups. So, scientific institutions that specialize in specific subjects are trusted in their pronouncements on those subjects, and authority is granted to subject-matter experts who have proven qualifications in those fields, so long as their assertions remain subject to peer review. This creates a requirement for a whole arena of checks, publications, verifications and the technical management of papers. This careful and cautious process by which we, in groups, formulate theories about the world and put them to the test, in order to minimize human error, is called science; as formulated technically, it is the scientific method, about which I have a separate text:

Thankfully, the scientific method is all about eliminating human errors. Most of us like to learn from the things that other people tell us (anecdotes), but this isn't good enough for science, "even if the anecdotes number in the millions and even if the storytellers are Nobel Prize-winning anointed saints"35. Ideas have to be tested statistically and in a way where systemic human error can be controlled for and eliminated. The procedures of peer review and independent verification both ensure that mistakes in theory, application or analysis of data are spotted when other scientists examine and repeat the experiment. Competing theories will have proponents who actively do not want new information threatening their own theory, and so will actively seek out weak spots in new experiments. Many scientific publications report on the failures of data and experiments, and in the face of this, science self-regulates in a very efficient and detailed manner. "The essence of science is that it is self-correcting", says the eminent scientist Carl Sagan; "new experimental results and novel ideas are continually resolving old mysteries" (1995)36.

Sometimes, then, individual experiments are found to be faulty. Sometimes, the conclusions of the scientist are questioned even if the data is good. And sometimes, rarest of all, entire scientific paradigms are questioned such as when Newtonian physics gave way to Einstein. New evidence can cause entire theoretical frameworks to be undermined, resulting in a scientific revolution in understanding.

"Science and The Scientific Method: Its Character and History: 6. The Scientific Method Counteracts Human Error" by Vexen Crabtree (2014)

6.3. Finding Good Sources of Information and Understanding Statistics

Although we have arrived at a sensible mechanism for overcoming individual cognitive bias - science - a new problem thus emerges. Multiple institutions and organisations all clamour for space in trying to sell their truths. It is not just that the specialisation of knowledge requires the existence of professional organisations to pronounce on technical matters; all non-specialists still need to know which specialists to trust on which subjects. Given that scientific English can be hard to comprehend, many rely on news services and mass-media products for their information. We end up with a situation where the main obstacle to the discovery of truth is the politics and sociology of the passage of accurate information, in a globalised world full to the brim with broadcasters and public conversations.

Researchers who have examined which sciences and disciplines best prepare people for the interpretation of everyday evidence have found that teaching and working in the social sciences best fosters reasonable thinking.

It appears that the probabilistic sciences of psychology and medicine teach their students to apply statistical and methodological rules to both scientific and everyday-life problems, whereas the nonprobabilistic science of chemistry and the nonscientific discipline of the law do not affect their students in these respects. [Lehman, Lempert & Nisbett]

It seems, then, that social scientists may have a special opportunity to impart some wisdom about how to properly evaluate the evidence of everyday experience. The authors of the study just described argue that there are certain formal properties of the subject matter of social science (e.g., considerable irregularity and uncertainty, the relative lack of necessary and sufficient causal relations) that make it particularly effective for teaching some important principles of sound reasoning.

Social scientists are generally more familiar than those in other fields with how easy it is to be misled by the evidence of everyday experience, and they are more aware of the methodological controls that are necessary before one can draw firm conclusions from a set of data.

"How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life"
Thomas Gilovich (1991)37

7. Conclusions

We all suffer from systematic cognitive dysfunctions; they infuse the very way we notice and analyse data, and distort our forming of conclusions. Emotional and societal factors influence our thinking much more than we like to admit. Our expectations and recent experiences change the way we recall memories. Even our very perceptions are affected by pre-conscious cognitive factors; what we see, feel, taste and hear are all subject to interpretation before we are even aware of them. Our brains were never meant to be the cool, rational, mathematical-logical computers that we sometimes like to pretend they are.

We can take preventative steps. Learning to think skeptically and carefully, and recognizing that our very experiences and perceptions can be coloured by societal and subconscious factors, should help us to maintain our cool. Beliefs should not be taken lightly, and evidence should be cross-checked. This especially applies to "common-sense" facts that we learn from others by word of mouth, and to traditional knowledge. Above all, however, our most important tool is knowing what types of cognitive errors we, as a species, are prone to making.


By Vexen Crabtree 2008 Oct 26
Last Updated: 2012 Dec 10
http://www.humantruth.info/thinking_errors.html

References:

Skeptical Inquirer. Pro-science magazine published bimonthly by the Committee for Skeptical Inquiry, New York, USA.

Philosophy Now. Philosophy Now, 41a Jerningham Road, Telegraph Hill, London SE14 5NQ, UK. Published by Anja Publications Ltd. www.philosophynow.org

Anderson & Kellam
(1992). In Riniolo & Nisbet Skeptical Inquirer (2007 May/Jun p49-53). Todd C. Riniolo is Associate Professor of Psychology at Medaille College (Buffalo, New York). Lee Nisbet is Professor of Philosophy at Medaille College and a founding member and Fellow of the Committee for Skeptical Inquiry.

Aristotle. (384-322BCE)
Ethics (350BCE). Amazon's Kindle digital edition. Originally published around 340BCE. Public Domain.

Bear, Connors and Paradiso
Neuroscience (1996). Published by Williams & Wilkins, Baltimore, Maryland, USA. The Amazon link is to a newer version. Mark F. Bear Ph.D. and Barry W Connors Ph.D. are both Professors of Neuroscience at Brown University, Rhode Island, USA, and Michael A. Paradiso Ph.D., associate professor.

Bono, Prof. Edward de
Six Thinking Hats (1985). Quotes taken from Penguin Books 2000 edition based on revised First Back Bay 1999 edition. Penguin Books Ltd, The Strand, London, UK.

Carroll, Robert. Taught philosophy at Sacramento City College from 1977 until retirement in 2007. Created The Skeptic's Dictionary in 1994.
Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed! (2011). Kindle edition. Published by the James Randi Educational Foundation.

Coolican, Hugh
Research Methods and Statistics in Psychology (2004). Fourth edition. Published by Hodder Headline, London, UK.

Crabtree, Vexen
"The False and Conflicting Experiences of Mankind: How Other Peoples' Experience Contradict Our Own Beliefs" (2002). Accessed 2014 Mar 12.
"Experiences of God are Illusions, Derived from Malfunctioning Psychological Processes" (2002). Accessed 2014 Mar 12.
"General Neophobia in Everyday Life: Humankind's Fear of Progress and Change" (2009). Accessed 2014 Mar 12.
"What Causes Religion and Superstitions?" (2013). Accessed 2014 Mar 12.
"Science and The Scientific Method: Its Character and History" (2014). Accessed 2014 Mar 12.

D. R. Lehman, R. O. Lempert, & R. E. Nisbett
(1988) The effects of graduate training on reasoning: Formal discipline and thinking about everyday-life events. American Psychologist, 43, 431-42. In Gilovich (1991) p191-192.

Edwards, K.
(1990) The interplay of affect and cognition in attitude formation and change. Journal of Personality and Social Psychology 59: 202-216. In Riniolo & Nisbet Skeptical Inquirer (2007 May/Jun p49-53). Todd C. Riniolo is Associate Professor of Psychology at Medaille College (Buffalo, New York). Lee Nisbet is Professor of Philosophy at Medaille College and a founding member and Fellow of the Committee for Skeptical Inquiry.

Eysenck, Michael and Keane, Mark
Cognitive Psychology (1995). 3rd edition. Published by Psychology Press, Hove, UK.

Gilovich, Thomas
How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life (1991). 1993 paperback edition published by The Free Press, NY, USA.

Gross, Richard
Psychology: The Science of Mind and Behaviour (1996). 3rd edition. Published by Hodder & Stoughton, London UK.

Harris, Sam
The End of Faith: Religion, Terror and the Future of Reason (2006). 2006 edition. Published in UK by The Great Free Press, 2005.

Kahneman, Daniel
Thinking, Fast and Slow (2011). Published by Farrar, Strauss and Giroux; New York, USA.

Lehrer, Jonah
The Decisive Moment: How the Brain Makes Up Its Mind (2009). Hardback. Published by Canongate Books, Edinburgh.

Momen, Moojan
The Phenomenon Of Religion: A Thematic Approach (1999). Published by Oneworld Publications, Oxford, UK. [Book Review]

Myers, David
Social Psychology (1999). 6th 'international' edition. First edition 1983. Published by McGraw Hill.

Russell, Bertrand. (1872-1970)
History of Western Philosophy (1946). Quotes from 2000 edition published by Routledge, London, UK.

Sagan, Carl
Cosmos (1995). Originally published 1981 by McDonald & Co. This edition published by Abacus.

Toates, Romero & Datta
From Cells to Consciousness (2004). A neurology textbook by Frederick Toates, Ignacio Romero and Saroj Datta. Published by The Open University, Milton Keynes, UK.

Weatherley, Lionel A.
The Supernatural? (1891). Hardback. Published by J. W. Arrowsmith, Bristol. This is a hard to find book. Bath Library has a copy, accessed 2012 Dec 14.

Wilson, E. O.
Consilience: The Unity of Knowledge (1998). Hardback. Published by Little, Brown and Company, London, UK. Professor Wilson is one of the foremost sociobiologists.

Footnotes

  1. Shankar Vedantam. Skeptical Inquirer (2008 Jan/Feb) Vol.32 No.1 p6 article "Difficulty in Debunking Myths Rooted in the Way the Mind Works".
  2. Gilovich (1991) p2-4.
  3. "Crocker, 1981; Jennings & others, 1982; Trolier & Hamilton, 1986". All in David Myers (1999) p114.
  4. Lehrer (2009) p71-2, 79-81. Added to this page on 2013 May 03.
  5. Russell (1946) p528-529.
  6. Gilovich (1991) p15, 19.
  7. Bono (1985). That complexity requires us to divide thinking into areas is the main theme of the book. The author concentrates on brainstorming sessions and the analysis of ideas. The specific quote that "the biggest enemy of thinking is complexity, for that leads to confusion" is found on p176.
  8. Francis Bacon (1620) The New Organon and Related Writings (1960). New York: Liberal Arts Press. Via Gilovich (1991) p32.
  9. Myers (1999) p114.
  10. Skeptical Inquirer (2007 Sep/Oct) edition book review.
  11. Myers (1999) p97-101.
  12. Social psychologists Carol Tavris & Elliot Aronson in Skeptical Inquirer (2007 Nov/Dec) "'Why Won't They Admit They're Wrong?' and Other Skeptic Mysteries", p12-4. They are authors of "Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts", Harcourt (2007), which the SI article refers to.
  13. Anderson & Kellam (1992).
  14. Edwards (1990).
  15. Aristotle (350BCE) Book IX chapter IV. Added to this page on 2012 Dec 10.
  16. Charles Catania. Skeptical Inquirer (2007 Jan/Feb) Vol.31 No.1 p49. Charles Catania is a professor in the Department of Psychology at the University of Maryland, Baltimore County. I have posted further notes on this article to vexen.insanejournal.com.
  17. Bear et al. (1996) p464. Additional commentary can be found in "What Causes Religion and Superstitions?" by Vexen Crabtree (2013).
  18. Eysenck & Keane (1995) p2.
  19. Hoffman, Donald D. in Visual Intelligence: How We Create What We See. According to Hans Heimer, from Cheshire, in a letter in Philosophy Now (2006 Sep/Oct). Published by Anja Publications Ltd.
  20. Eysenck & Keane (1995) p75.
  21. Saroj Datta (2004) in Toates, Romero & Datta, p171.
  22. Lehrer (2009) p145-6. Added to this page on 2010 Jun 13.
  23. Gross (1996) p303.
  24. Added to this page on 2013 May 03.
  25. Russell (1946) p455.
  26. Russell (1946) p528-529.
  27. Wilson (1998) p27.
  28. Harris (2006) p19.
  29. Momen (1999) p114.
  30. Gilovich (1991) p48.
  31. Carl Sagan (1979) Broca's Brain: Reflections on the Romance of Science. New York: Ballantine Publishing Group: 58. In Skeptical Inquirer (2007 Jul/Aug) Vol.31 No.4 p34.
  32. Kahneman (2011). Added to this page on 2013 May 03.
  33. Lehrer (2009) p207-208. Added to this page on 2013 May 03.
  34. Todd C. Riniolo & Lee Nisbet. Skeptical Inquirer (2007 May/Jun) p49-53.
  35. Carroll (2011) p131. Added to this page on 2014 Jan 11.
  36. Sagan (1995) p16.
  37. Gilovich (1991) p191-193.

© 2014 Vexen Crabtree. All rights reserved.
