The Human Truth Foundation

Errors in Thinking
Cognitive Errors, Wishful Thinking and Sacred Truths

By Vexen Crabtree 2008


#beliefs #pseudoscience #psychology #thinking_errors

We all suffer from systematic thinking errors1,2 which fall into three main types: (1) internal cognitive errors; (2) errors of emotion3, perception and memory; and (3) social errors that result from the way we communicate ideas and the effects of traditions and dogmas. Some of the most common errors are the misperception of random events as evidence that backs up our beliefs, the habitual overlooking of contradictory data, our expectations and current beliefs actively changing our memories and perceptions, and the use of assumptions to fill in unknown information. These occur naturally and subconsciously even when we are trying to be truthful and honest. Many of these errors arise because our brains are highly efficient (rather than accurate) and we are applying evolutionarily developed cognitive rules of thumb to the complexities of life4,5. We will fly into defensive and heated arguments at the mere suggestion that our memory is faulty, and yet memory is infamously unreliable and prone to subconscious inventions. They say "few things are more dangerous to critical thinking than to take perception and memory at face value"6. We were never meant to be the cool, rational and logical computers that we pretend to be. Unfortunately, and we find it hard to admit this to ourselves, many of our beliefs are held because they're comforting or simple7. In an overwhelming world, simplicity lets us get a grip. Human thinking errors can lead individuals, or whole communities, to explain whole classes of events and experiences in fantastical ways. Before we can guard comprehensively against such cumulative errors, we need to learn the ways in which our brains can misguide us - lack of knowledge of these sources of confusion leads many astray8.

Learning to think skeptically and carefully and to recognize that our very experiences and perceptions can be coloured by societal and subconscious factors should help us to maintain impartiality. Beliefs should not be taken lightly, and evidence should be cross-checked. This especially applies to "common-sense" facts that we learn from others by word of mouth and to traditional knowledge. Above all, however, our most important tool is knowing what types of cognitive errors we, as a species, are prone to making.


1. Cognitive and Internal Errors

Many errors in thinking do not come from the fact that we don't know enough. Sometimes, we do. But most of the time our brains fill in the gaps, and unfortunately the way our brains put together partial evidence results in potentially false conclusions. These systematic problems with the way we think are called cognitive errors, and they occur below the level of consciousness. Much research has been done in this area, and continues. "The research is painting a broad new understanding of how the mind works. Contrary to the conventional notion that people absorb information in a deliberate manner, the studies show that the brain uses subconscious "rules of thumb" that can bias it into thinking that false information is true"9. These cognitive errors and rules of thumb are discussed on this page. Thomas Gilovich published a book concentrating on the subtle mistakes made in everyday thinking, and introduces the subject like this:

People do not hold questionable beliefs simply because they have not been exposed to the relevant evidence [...] nor do people hold questionable beliefs simply because they are stupid or gullible. [... They] derive primarily from the misapplication or overutilization of generally valid and effective strategies for knowing. [... Many] have purely cognitive origins, and can be traced to imperfections in our capacities to process information and draw conclusions. [...] They are the products, not of irrationality, but of flawed rationality.

"How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life"
Thomas Gilovich (1991)10

Research psychologists have for a long time explored the mind's impressive capacity for processing information. We have an enormous capacity for automatic, efficient, intuitive thinking. Our cognitive efficiency, though generally adaptive, comes at the price of occasional error. Since we are generally unaware of these errors entering our thinking, it can pay us to identify ways in which we form and sustain false beliefs.

"Social Psychology" by David Myers (1999)5

1.1. Pareidolia: Seeing Patterns That Aren't There

#thinking_errors

The cognitive process of seeing patterns and drawing conclusions from random patterns and ambiguous data is called pareidolia. It is a highly common 'thinking error'11. A part of our brain, the fusiform face area, actively looks for any shapes, lines or features that might possibly be a human face. It does so devoid of context, and reports with urgency and confidence when it thinks it has results. That's why most pareidolia is involved with the perception of human forms in messy visual data. It "explains why some people see a face on Mars or the man in the moon [... or] the image of Mother Teresa in a cinnamon bun, or the Virgin Mary in the bark of a tree"12. Auditory pareidolia is responsible for the way that all of us sometimes mistake a whistling breeze for a whispering human voice. When people listen with the expectation of hearing a voice they will hear spoken words in pure noise13. We often misperceive random events in a way that supports our own already-existing beliefs or hunches14. Some patterns seem and feel so natural and real that it goes against common sense to deny them. But against all expectations and against our judgements, many perceived objects turn out to be illusions. Psychologist Jonah Lehrer says "the world is more random than we can imagine. That's what our emotions can't understand"15. The tendency for people to see more order in nature than there is was noted as long ago as the thirteenth century by Roger Bacon - he called such errors due to human nature the 'idols of the tribe'16. To study pareidolia, sociologists have presented true sets of random results and analysed subjects' responses to them. Coin flips, dice throws and card deals have all revealed that we are naturally prone to believing that non-random trends exist when in fact they don't17. Pareidolia results in superstitions, magical thinking, ghost and alien sightings, 'Bible codes', pseudo-science and beliefs in all kinds of religious, nonsensical and magical things17,18,19,20.

"Pareidolia - Seeing False Patterns in Random Data: Faces in Toast, Bible Codes and Conspiracy Theories" by Vexen Crabtree (2016)

1.2. Selection Bias and Confirmation Bias

It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than negatives.

Francis Bacon (1620)21

Selection Bias is the result of poor sampling techniques22 whereby we use partial and skewed data in order to back up our beliefs. It includes confirmation bias, which is the unfortunate way in which we humans tend to notice evidence that supports our ideas and ignore evidence that contradicts it23,24, actively seeking out flaws in evidence that contradicts our opinion while automatically accepting evidence that supports us25,26. We are all pretty hard-working when it comes to finding arguments that support us, and when we find opposing views, we work hard to discredit them. But we spend no such effort scrutinizing those who agree with us. In other words, we work at undermining sources of information that sit uncomfortably with what we believe, and we idly accept those who agree with us.

All of this is predictable enough for an individual, but another form of Selection Bias occurs in mass media publications too. Cheap tabloid newspapers publish every report that shows foreigners in a bad light, or shows that crime is bad, or that something-or-other is eroding proper morality. And they mostly avoid any reports that say the opposite, because such reassuring results do not sell as well. Newspapers as a whole simply never report anything boring - therefore they constantly support world-views that are divorced from everyday reality. Sometimes commercial interests skew the evidence that we are exposed to - drug companies conduct many pseudo-scientific trials of their products but their choice of what to publish is manipulative - a study of trials from 2001 and 2002 showed that "those with positive outcomes were nearly five times as likely to be published as those that were negative"27. Interested companies just have to keep paying for studies to be done, wait for a misleadingly positive result, and then publish it and make it the basis of an advertising campaign. The public have few resources to overcome this kind of orchestrated Selection Bias.

RationalWiki mentions that "a website devoted to preventing harassment of women unsurprisingly concluded that nearly all women were victims of harassment at some point"28. Another example is a statistic most famous for being wrong: that one in ten males is homosexual. This was based on a poll of the community surrounding a gay-friendly publication - the sample of respondents was skewed away from the true average, hence the misleading result.

The only solution to these problems is proper and balanced statistical analysis of data that has been actively and methodically gathered, alongside raised awareness of Selection Bias, Confirmation Bias and other thinking errors. This level of critical thinking is quite a rare endeavour in anyone's personal life - we don't have the time, the inclination, the confidence, or the skill, to properly evaluate subjective data. Unfortunately fact-checking is also rare in mass media publications. Personal opinions and news outlets ought to be given little trust.

The above text is an extract from "Selection Bias and Confirmation Bias" by Vexen Crabtree (2014).
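
The mechanics of publication bias can also be sketched in a few lines of code. In this hypothetical Python simulation (my own illustration - the trial size, the number of trials and the 'publishable' cut-off are invented, not drawn from the study quoted above), a treatment with no real effect at all still looks beneficial once only the flattering trials are published:

    import random
    import statistics

    random.seed(1)  # fixed seed so the illustration is reproducible

    def run_trial(n=30):
        """One small trial of a treatment with NO true effect:
        both groups are drawn from the same population."""
        treated = [random.gauss(0, 1) for _ in range(n)]
        control = [random.gauss(0, 1) for _ in range(n)]
        return statistics.mean(treated) - statistics.mean(control)

    results = [run_trial() for _ in range(200)]
    published = [r for r in results if r > 0.3]  # only 'impressive' outcomes get published

    print(f"mean effect, all 200 trials: {statistics.mean(results):+.3f}")
    print(f"mean effect, published only: {statistics.mean(published):+.3f} "
          f"({len(published)} trials)")

Across all 200 simulated trials the average effect is approximately zero (the truth), but the published subset reports a solidly positive effect: the selection step alone manufactures the finding.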

1.3. Blinded by the Small Picture

A psychologist [Dr Hecke] argues that much of what we label "stupidity" can be better explained as blind spots. She draws on research in creativity, cognitive psychology, critical thinking, child development, education, and philosophy to show how we (all of us, including the smartest people) create the very blind spots that become our worst liabilities. She devotes a chapter to each of ten blind spots, among them: not stopping to think, my-side bias, jumping to conclusions, fuzzy evidence, and missing the big picture.

Skeptical Inquirer (2007)29

Much of what Hecke argues concerns the way we examine the data surrounding events. We tend to concentrate on the coincidences and specifics of the case in question, and miss out all the surrounding facts and figures. Without taking disconfirmation (as seen above) into account, the coincidences that we notice can become misleading.

1.4. The Forer (or Barnum) Effect30

#astrology #barnum_effect #forer_effect #horoscopes #pseudoscience #psychology #thinking_errors

The Forer effect is the seeing of a personality statement as "valid even though it could apply to anyone", and is named after the psychologist who famously demonstrated it31. In 1949, Bertram Forer conducted a personality test, and then gave all of his students exactly the same personality profile, which he had constructed from random horoscopes. The students rated the accuracy of their profile at over 80% on average32! It has also occasionally been known as the Barnum effect, after the showman P. T. Barnum. Extensive studies have found that this effect applies well to horoscopes, other astrological readings, messages given from the dead by spiritualists, various cold reading tricks and other profiling endeavours33. It is often mistaken for the product of magical or supernatural means.

"The Forer (or Barnum) Effect" by Vexen Crabtree (2016)

1.5. We Dislike Changing Our Minds: The Status Quo Bias

People tend to prefer arguments and evidence that back up what they currently believe and are poor at objectively interpreting opposing arguments and counter-evidence. Social psychologist David Myers summarizes much of the research that has been done on this topic. "Self-created" justifications for beliefs are often imaginative and arise after the beliefs have solidified, and such "belief perseverance" feels urgent even when it would be more productive to consider a belief's worth rationally and calmly. The result is a "cognitive inertia" that opposes changes in belief - especially when such changes can be noticed by other people. We often deny that any change has occurred - and often genuinely believe that we haven't changed our minds!34

One of the greatest challenges for scientists and educators is persuading people to give up beliefs they hold dear when the evidence clearly indicates that they should. Why aren't most people grateful for the data? [...] The motivational mechanism underlying the reluctance to be wrong, change our minds, admit serious mistakes, and accept unwelcome findings is cognitive dissonance. The theory of cognitive dissonance was invented fifty years ago by Leon Festinger, who defined dissonance as a state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent. [...] One of the most effective ways the mind maintains constant beliefs is through the confirmation bias - the fact that we tend to notice and remember information that confirms what we believe and ignore or forget information that disconfirms it.

Tavris & Aronson (2007)35

In short, we suffer mental distress when perceived evidence, our own actions, or counter-arguments threaten beliefs and opinions we already hold. This cognitive dissonance and its effects have gone by many names, and been documented in extensive studies. It is a status quo bias towards what we already believe. We have belief perseverance36, which is especially strong if a person has previously constructed rational arguments in favour of their belief or if they have emotional attachments to it37. When changes of mind do occur, subjects then refuse to admit that their beliefs have changed over time. As we invest mental time in a belief, we become less likely to notice evidence against it, and less likely to change our mind even when we do see such evidence. In total, our beliefs sometimes work against us, preventing us from accepting new evidence or counter-arguments. This is why peer review is a fundamental part of the scientific process.

The recognition of this inertia is not a new discovery to the academic world: Aristotle pondered over it almost 2400 years ago and concluded that people invest much time and effort in their opinions and beliefs, and are therefore unlikely to easily abandon them:

All people value most what has cost them much labour in the production; for instance, people who have themselves made their money are fonder of it than those who have inherited it: and receiving kindness is, it seems, unlaborious, but doing it is laborious. And this is the reason why the female parents are most fond of their offspring; for their part in producing them is attended with most labour, and they know more certainly that they are theirs. This feeling would seem also to belong to benefactors.

"Ethics" by Aristotle (350BCE)38

1.6. Physiological Causes of Strange Experiences

If our biochemical brains and physiology sometimes go haywire but we are not aware of the dysfunction, we are apt to misinterpret the events. So, our expectations, culture and abstract thinking all conspire to present to us a false experience. Some medical conditions can cause these to be recurring and frequent occurrences. Through studying these, science has come to understand experiences such as out-of-body experiences, possession and night-time terrors, and through simpler biological knowledge has explained the Incubus and Succubus, and many other superstitious beliefs. Physical problems with the eyes can cause wild effects such as seeing blood run up walls, dust storms, clouds, and haloes that are not really there39. Isolated fits and seizures can cause bizarre effects such as odd lights, feelings, smells, auras and déjà vu that can be mild and subtle40.

Sleep Paralysis

Sleep paralysis, or more properly, sleep paralysis with hypnagogic and hypnopompic hallucinations have been singled out as a particularly likely source of beliefs concerning not only alien abductions, but all manner of beliefs in alternative realities and otherworldly creatures. Sleep paralysis is a condition in which someone, most often lying in a supine position, about to drop off to sleep, or just upon waking from sleep realizes that s/he is unable to move, or speak, or cry out. This may last a few seconds or several moments, occasionally longer. People frequently report feeling a "presence" that is often described as malevolent, threatening, or evil. An intense sense of dread and terror is very common. The presence is likely to be vaguely felt or sensed just out of sight but thought to be watching or monitoring, often with intense interest, sometimes standing by, or sitting on, the bed. On some occasions the presence may attack, strangling and exerting crushing pressure on the chest. People also report auditory, visual, proprioceptive, and tactile hallucinations, as well as floating sensations and out-of-body experiences (Hufford, 1982). These various sensory experiences have been referred to collectively as hypnagogic and hypnopompic experiences (HHEs). People frequently try, unsuccessfully, to cry out. After seconds or minutes one feels suddenly released from the paralysis, but may be left with a lingering anxiety. Extreme effort to move may even produce phantom movements in which there is proprioceptive feedback of movement that conflicts with visual disconfirmation of any movement of the limb. People may also report severe pain in the limbs when trying to move them. [...] A few people may have very elaborate experiences almost nightly (or many times in a night) for years. Aside from many of the very disturbing features of the experience itself (described in succeeding sections) the phenomenon is quite benign.

"Sleep Paralysis and associated hypnagogic and hypnopompic experiences"

There are many other physiological causes of strange experiences of course, and recent experiments and investigations have allowed exciting results to shed light on the inner workings of the Human brain. In particular, a series of experiments has resulted in a deeper understanding of why people experience the idea of god, and these I have put on a separate page: "Experiences of God are Illusions, Derived from Malfunctioning Psychological Processes" by Vexen Crabtree (2002).

1.7. Emotions: The Secret and Honest Bane of Rationalism

#emotions #psychology #thinking_errors

We all know that emotions can get in the way of sensible thinking: our decisions and beliefs are influenced unduly by emotional reactions that often make little sense3. We jump to instinctive conclusions that are not warranted by a rational appraisal of the evidence, and we only seek disconfirming evidence for opinions that we don't like. We are compelled to dismiss doubts and evidence that go against our train of thought if our emotional intuitions continue to drive us41. With academic understatement, one psychologist states "we are not cool computing machines, we are emotional creatures [and] our moods infuse our judgments"42 and another states "some beliefs are formed based primarily upon an emotional evaluation"43. In public we heartily deny that our emotions influence our formal decisions, especially in non-trivial or non-social matters. We do the same with our private thoughts; when debating with ourselves we often falsely "construe our feelings intellectually"44. There's got to be a better way! Many argue that logical thinking requires the suppression of emotion45. Should we strive to be like the Vulcans from Star Trek? Psychologists are saying that this isn't for the best. Our emotions are experts at interpreting social situations and predicting other people. Behind the scenes, our brains take many things into account that we don't consciously notice. So in many situations (especially those involving living beings) our emotional reactions are better thought-out than intellectual alternatives5. We have to learn to be wise: to know when our feelings are working to our advantage, and when they are merely getting in the way.

"Rationalism Versus Irrationality: Are Emotions Useful?" by Vexen Crabtree (2017)

2. Mirrors That We Thought Were Windows

2.1. Perception is Influenced By Expectation: You Can't Always Trust Your Senses

#epistemology #expectation #experiences #psychology #thinking_errors

The very things we see and hear, clear as daylight and right in front of our eyes, can be falsehoods caused not by physiological dysfunctions but by the brain's own expectations46. We trust our senses far too much; we will fly into defensive and heated arguments at the mere suggestion that we saw something wrongly, and yet our Human faculties are unreliable and very prone to such invention. They say "few things are more dangerous to critical thinking than to take perception and memory at face value"47. One fascinating aspect of this is our enjoyment of drinks. For example, studies have consistently shown that when people don't know which wine they are drinking they tend to enjoy the cheapest ones most, as judged by their taste, texture and smell. But if you add packaging then they rate the most expensive wines as the best48. The effects of expectation are not always trivial, however, and psychologists such as Elizabeth Loftus have spent their careers educating police investigators on the power of expectation over our senses. You simply cannot blindly trust what you see!

"Perception is Influenced By Expectation: You Can't Always Trust Your Senses" by Vexen Crabtree (2016)

2.2. What Every Lover Knows: Memory is Actively Interpreted

#thinking_errors

Our memories change over time and frequently they are simply wrong49,50. But most of us tend to think of memories as accurate accounts50. Extensive studies have shown that our memories of events are active interpretations of the past rather than picture-perfect records of it51,52. The best way to avoid errors is to write things down as early as possible50. Our recall of the past is affected by our present expectations and by our current knowledge and state of mind53. We tend to suppress and alter memories that damage our self-esteem50. Psychologists such as Elizabeth Loftus have shown through repeated experiments that simply by asking people questions about what they think they saw or heard previously, you set their subconscious whirring into a creative drive in order to answer the question, even if that means making up details and allowing assumptions and feelings to silently trick the mind into inventing elements of memories. Memories that are full of details give us a great sense of confidence, but such memories are just as likely to contain accidental fabrications, many errors, and a great number of "filled-in" details which we simply subconsciously invented.

"Why are Our Memories Unreliable?" by Vexen Crabtree (2016)

2.3. Our Attempts to be Logical54

We often try hard to be coldly rational and logical. It is difficult and requires concentration and time. The actual use of formal logic is a product of philosophy, and requires quite a period of institutional training to accomplish. By writing down assumptions, relationships and factoids and working them out step-by-step, you can learn which conclusions are warranted - or, more frequently, which ones are not. But this is a laborious and specialist process. Even the basic steps, such as defining the terms we want to use, are often too difficult. Failure to perform that first step is a "want [failure] of method" and results in many "absurd conclusions", to use the words of Thomas Hobbes in 165155.

It is simply the case that most of our opinions are not based on logical analysis even when we are trying our best to make them so. "Many believe certain [untrue] things because they follow faulty rules of reasoning" writes the skeptic Robert Todd Carroll (2011)2.

3. Social Errors

We naturally and instinctively gravitate toward beliefs that make us comfortable or fit in with the beliefs of our families and friends.

"Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!" by Robert Todd Carroll (2011)2

3.1. Story Telling and Second-Hand Information

#beliefs #buddhism #causes_of_religion #myths #religion #stories #subjectivism #thinking_errors

We humans have a set of instinctive behaviours when it comes to telling stories; we naturally embellish, exaggerate, cover up doubts about the story and add dramatic and cliff-edge moments. We do this because we are genetically programmed to try to tell good stories. These subconscious side-effects of our social instincts have a downside: we are poor at accurately transmitting stories even when we want to56. All the major world religions went through periods of oral transmission of their founding stories, and the longer this state persisted the more the stories varied57. Hundreds of years of oral tradition in Buddhism led to communities in different regions thinking that the Buddha gained enlightenment in the 5th century BCE, the 8th, the 9th or even in the 11th century BCE - and each community thinks it has received the correct information through its oral transmission58. A scientific study of Balkan bards who had memorized traditional epics rivaling the Iliad in length showed that they "do not in fact retain and repeat the same material verbatim but rather create a new version each time they perform it" based around a limited number of memorized elements59. A sign of untrustworthiness is that as stories spread they often become more marvellous, more sure of themselves, more fantastic and more detailed rather than less60.

"Human Story Telling: The Poor Accuracy of Oral Transmission" by Vexen Crabtree (2016)

3.2. Roger Bacon (1214-1294): Four General Causes of Ignorance

Roger Bacon (1214-1294) was concerned with science and the gathering of knowledge but he suffered many restrictions and punishments from his Church, and the world had to wait another few hundred years before Francis Bacon could exert a greater influence in the same direction. In Opus Majus Roger Bacon says there are four general causes of human ignorance. All of them are social in nature.

First, the example of frail and unsuited authority. [...] Second, the influence of custom. Third, the opinion of the unlearned crowd. [...] Fourth, the concealment of one's ignorance in a display of apparent wisdom. From these plagues, of which the fourth is the worst, spring all human evils.

"History of Western Philosophy" by Bertrand Russell (1946)61

Although I doubt that all human evils stem from ignorance (let alone just one of its causes), it is certainly worth thinking carefully about these four sources of misinformation. As social animals, our beliefs are often functions of our communities, and our belief-forming and belief-espousing mechanisms are deeply tied to the instinctual politics of social positions and consensus rather than evidence-based analysis.

3.3. Traditions and Customs

As a species, we experienced the world as flat. So sure was everyone that those who thought otherwise were declared insane! Yet the common-sense view was wrong, and the common consensus was eventually changed. The consensus was harming the search for truth; proof by consensus is no proof at all. We do not have any consistent or absolute way of approaching what we consider true; however, we are capable of telling with certainty when something is false. The Earth is not flat, the Northern Lights are not spiritual in basis, and neither are rainbows, the stars or the sun and the moon. Earthquakes are not gods moving underneath the Earth, nor can volcanoes or the weather (probably) be influenced by spirit or totem worship. We are not sure if science has "disproved" souls or God, but it certainly seems less likely that they exist than it once did.

We often rely on traditional explanations for events because we simply do not have time to investigate everything ourselves, and, most of the time, the pronouncements and findings of experts do not filter down to the general populace. The result is a mixture of messy mass beliefs battling against the top-down conclusions of proper investigations. Status-quo bias and other social factors frequently cause resistance against the conclusions of scientists. For this reason, even such major theories as evolution were sometimes discovered (in this case, in around the 4th century BCE), then rejected by society and forgotten, until they were re-discovered in a world that was more ready to accept them.

Many believe palpably untrue things because they were brought up to believe them and the beliefs have been reinforced by people they admire and respect.

"Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!" by Robert Todd Carroll (2011)62

Skeptical and rational thinking is pitted, socially, against the subconscious, sub-literate and sub-intellectual world of mass belief.

3.4. Expectations and Culture

#UK

Anthropologists find that the particular elements people experience change over time and differ from culture to culture. With UFOlogy, different decades and countries have sighted completely different UFOs. For a period in the UK, UFO reports generally featured furry, fuzzy aliens. Modern UFO reports describe "grays": thin, child-like bodies with enlarged eyes and heads. Different countries report different types of aliens, due to culture. As culture changes, so does the phenomenon. What does that say about the phenomenon?

Such large-scale change over time casts doubt on the basis of the experience, and has been presented (especially in the case of UFOs and demons) as evidence that the phenomenon is self-generated rather than real - a function of the aspirations and expectations of the experiencer.

3.5. Sacred Truths and Dogmas (Religion)

It seems that the truth escapes us most rapidly when it comes to the religious truths (dogmas) put forward by organized religious bodies. Here, authority, tradition and psychology all combine to present a formidable wall against the acceptance of new evidence and truths.

Tell a devout Christian that his wife is cheating on him, or that frozen yogurt can make a man invisible, and he is likely to require as much evidence as anyone else, and will be persuaded only to the extent that you give it. Tell him that the book he keeps by his bed was written by an invisible deity who will punish him with fire for eternity if he fails to accept every incredible claim about the universe, and he seems to require no evidence whatsoever.

"The End of Faith: Religion, Terror and the Future of Reason" by Sam Harris (2006)63

The issue isn't evidence per se; it's the lack of application of critical thinking to religious received truths, because society has placed religious theories into a sacred zone where they cannot be criticized. You can rationally apply physics to debate the pros and cons of string theory and be praised for your diligence in the search for truth, but if you rationally apply the truths of biology against religion then many feel you are being not only impolite, but immoral.

That such religion-inspired truths are due more to social factors than to reality can be seen by the highly culture-dependent interpretation of religious experiences.

Religious experiences tend to conform closely to cultural and religious expectations. Village girls in Portugal have visions of the Virgin Mary - not of the Indian goddess Kali. Native Americans on their vision quest see visions of North American animals not of African ones. Thus it would seem that religious experiences, no matter how intense and all-consuming, are subject to constraint by the cultural and religious norms of the person to whom they occur. Another way of looking at this is to say that there can be no such thing as a pure experience. An experience always happens to a person, and that person already has an interpretive framework through which he or she views the world. Thus, experience and interpretation always combine and interpenetrate.

"The Phenomenon Of Religion: A Thematic Approach" by Moojan Momen (1999) [Book Review]64

4. The False and Conflicting Experiences of Mankind

#beliefs #epistemology #experiences #god #illusion #religion #subjectivism

There are many neurological and physiological causes of odd experiences in our lives. These range from ordinary tricks of the eye65 through to repeated minor epileptic fits that cause nothing more than visual hallucinations combined with emotional cues. Strange and unusual experiences often give rise to strange and unusual beliefs66, especially for those people who are not inclined towards finding natural explanations for events. Experiments have found that a person's "psychological tendencies may also be used to predict exact types of 'unexplained' phenomena in which they are likely to believe" and that those who display signs of dissociation are likely to see "a given stimulus item as a paranormal creature, whether Bigfoot or an alien"67.

For example, during night terrors our brain is still suppressing bodily movement and it feels as if we're being forced down onto the bed, complete with much anxiety and fear. Some people have such attacks multiple times. The belief that this is a demonic attack seems natural to many, and even after the physiological and natural causes of night terrors are explained, many continue to believe in a supernatural source. Of course, this is ridiculed by those who know that such attacks are really alien investigations of the Human body. Both explanations are foiled, however, by neuroscientists who understand that the cause of these experiences is biochemical in nature.

Cultural expectations play a large part in the interpretation of personal events, meaning that the resultant beliefs vary according to geographical region. We all find rational arguments to explain away those who experience things that contradict our own interpretation of reality - in secret, when we find people who have experienced things that we don't believe in, we all think we can explain others' experiences better than they themselves can. Investigations into the physiological causes of strange beliefs come from an ancient line of skepticism. In "The Supernatural?" by Lionel A. Weatherley (1891) the author intelligently outlines many such investigations - and that was well before modern neurology started making its amazing strides towards understanding the sources of experiences. Despite this knowledge, the masses remain generally unconvinced, and stick to ages-old cultural and subjective beliefs. The Christian Pentecostals have a saying, "the man with an experience is never at the mercy of the man with a doctrine"68, meaning that rationality defers to experience. In total, we cannot entirely trust our experiences nor those of others, and more careful investigation is needed by all.

"The False and Conflicting Experiences of Mankind: How Other Peoples' Experience Contradict Our Own Beliefs: 1. Introduction" by Vexen Crabtree (2008)

5. Francis Bacon (1561-1626): Four Idols of the Mind

Francis Bacon's four idols of the mind are one of the most influential parts of his work16. In the words of Bertrand Russell, these idols are the "bad habits of mind that cause people to fall into error". E O Wilson describes them as the causes of fallacy "into which undisciplined thinkers most easily fall"69. To view the world as it truthfully is, said Bacon, you must avoid these errors. Although they are categorized somewhat differently nowadays, they have been affirmed time and time again by modern cognitive psychologists as the causes of human errors:

  1. Idols of the tribe: "are those that are inherent in human nature"16, so this includes much of what is discussed on this page as the cognitive sources of errors in human thinking. Bacon mentions in particular the habit of expecting more order in natural phenomena than is actually to be found. This is called pareidolia, discussed above. I suspect Bacon calls this error a 'tribal' one because of the preponderance of magical rituals and beliefs that are based around controlling nature through specific actions, when in reality there is no cause and effect between them. Hence, pareidolia: the seeing of patterns in random, complex or ambiguous data where there are none - misperceived correlations that form the basis of superstitious, magical and religious behaviour.

  2. Idols of the imprisoning cave: "personal prejudices, characteristic of the particular investigator"16, "the idiosyncrasies of individual belief and passion"69.

  3. Idols of the marketplace: "the power of mere words to induce belief in nonexistent things"69. Charisma, confidence and enthusiasm are three things that lead us to believe what someone is telling us, even though word of mouth is the poorest method to propagate fact.

  4. Idols of the theatre: "unquestioning acceptance of philosophical beliefs and misleading demonstrations"69. The lack of doubt we have in our own opinions; something I broadly address in "Why Question Beliefs? Dangers of Placing Ideas Beyond Doubt, and Advantages of Freethought" by Vexen Crabtree (2009) and "Satan Represents Doubt: Satanic Epistemology" by Vexen Crabtree (2002).

Francis Bacon is cited by many as one of the important founders of the general idea of the scientific method, which is largely concerned with overcoming errors in human thought and perception.

6. The Prevention of Errors

Unless we recognize these sources of systematic distortion and make sufficient adjustments for them, we will surely end up believing some things that just aren't so.

Thomas Gilovich (1991)70

6.1. Skeptical Thinking

The awesome scientist and public educator, Carl Sagan, defends the general American public by explaining that they are not lacking in intelligence but in the correct ways of applying it. Although general thinking errors lead to false conclusions about reality, the fact that people construct viable-seeming internal theories to explain events is itself a sign of intelligence. What is missing are the methods of telling good theories from bad ones. Sagan says:

Both Barnum and H.L. Mencken are said to have made the depressing observation that no one ever lost money by underestimating the intelligence of the American public. The remark has worldwide application. But the lack is not intelligence, which is in plentiful supply; rather, the scarce commodity is systematic training in critical thinking.

Carl Sagan (1979)71

Prof. Bono72, whom we saw earlier commenting on the negative effects of complexity on accurate thought, produced a book that endeavoured to offer some training in how to think in a structured and methodical way. He named the critical aspect black hat thinking. He found that even with a little training, people's abilities in brainstorming and analysis increase manifold. Likewise in "Thinking, Fast and Slow" by Daniel Kahneman (2011)73 the author explains how slow, deliberate and organized thinking on a subject is one of the greatest ways to overcome the errors introduced by the instinctual fast thinking which has given us great evolutionary advantage but which, of course, introduces many errors to how we perceive and rationalize things. Finally, psychologist Jonah Lehrer in The Decisive Moment: How the Brain Makes Up Its Mind teaches that we must encourage ourselves to face "inner dissonance" and think about the information we don't want to think about, and "pay attention to the data that disturbs our entrenched beliefs"74.

6.2. Scientific Method as Cure for Human Error

#knowledge #science

Unfortunately, we do not have time to analyse every one of our beliefs with a careful skeptical analysis. It is practically impossible. We have instincts to analyse only new information critically, and re-training our thoughts on existing data is very hard. Also, because knowledge is highly specialized, we can simply never know enough to skeptically analyze everything that perhaps we should. An article in Skeptical Inquirer bemoans these facts:

We all have limitations and built-in biases that hinder our ability to apply the methods of skepticism objectively and consistently. Nonskeptics and professed skeptics alike are equally vulnerable to developing beliefs that have not been subjected to rigorous skeptical inquiry. [...] This is because (a) we do not have time to evaluate every claim that becomes part of our belief system and may rely upon what is commonly believed or what we would like to be true; (b) we are more likely to perform a skeptical evaluation for claims that are inconsistent with our current belief systems [...] while simply accepting claims consistent with our beliefs [...]; (c) many beliefs are already formed and reinforced prior to learning how to think skeptically; (d) some beliefs are formed based primarily upon an emotional evaluation; and (e) skeptics have limited areas of expertise (e.g., a biologist may know little about economics), which restricts our ability to skeptically evaluate all potential claims because knowledge is extremely specialized.

Todd C. Riniolo & Lee Nisbet (2007) in Skeptical Inquirer43

To overcome these personal difficulties, we need to reach rational conclusions in groups. So, scientific institutions that specialize in specific subjects are trusted in their pronouncements on those subjects, and authority is granted to subject-matter experts who have proven qualifications in those fields, as long as their assertions remain subject to peer review. You can see that this creates a requirement for a whole arena of checks, publications, verifications and technical management of papers. This careful and cautious process by which we, in groups, formulate theories about the world and put them to the test, in order to minimize human error, is called science and, as formulated technically, the scientific method, about which I have a separate text:

The nature of the scientific method lends itself to continual improvement in knowledge, based on a continual stream of new data. Scientists frequently admit when their theories have been superseded or corrected by their peers75,76. The procedures of peer review and independent verification both ensure that mistakes in theory, application or analysis of data are spotted when other scientists examine and repeat the experiment. "The essence of science is that it is self-correcting" says the eminent scientist Carl Sagan (1995)77. Yet it is still rare that new evidence completely destroys a theory. Philosopher Bertrand Russell states that "theories, if they are important, can generally be revived in a new form"78. Hence, theories undergo continual improvement.

Still perhaps it may appear better, nay to be our duty where the safety of the truth is concerned, to upset if need be even our own theories, specially as we are lovers of wisdom: for since both are dear to us, we are bound to prefer the truth.

"Ethics" by Aristotle (350BCE)79

Sometimes, individual theories do have to be abandoned. Rarely, entire scientific paradigms are questioned, such as when Newtonian physics gave way to Einstein's. New evidence can cause entire theoretical frameworks to be undermined, resulting in a scientific revolution. "Science involves an endless succession of long, peaceful periods [... and then periods of] scientific revolution" (Kuhn 1962). To overthrow a theory requires strong evidence and a full cycle of the scientific method. To overthrow an established theory, which is often supported by many other theories and mountains of testing and evidence, requires extraordinary evidence.

"Scientific Theories Must Make Way for New Evidence"
Vexen Crabtree
(2017)

6.3. Finding Good Sources of Information and Understanding Statistics

Although we have arrived at a sensible mechanism for overcoming individual cognitive bias - science - a new problem thus emerges. Multiple institutions and organisations all clamour for space in trying to sell their truths. It is not just that the specialisation of knowledge requires the existence of professional organisations to pronounce on technical matters; all non-specialists still need to know which specialists to trust on which subjects. Given that scientific English can be hard to comprehend too, many rely on news services and mass media products for their information. We end up with a situation where the main obstacle to the discovery of truth is the politics and sociology of the passage of accurate information, in a globalised world full to the brim with broadcasters and public conversations.

Researchers who have examined which sciences and disciplines best prepare people for the interpretation of everyday evidence have found that teaching and working in the social sciences best fosters reasonable thinking.

It appears that the probabilistic sciences of psychology and medicine teach their students to apply statistical and methodological rules to both scientific and everyday-life problems, whereas the nonprobabilistic science of chemistry and the nonscientific discipline of the law do not affect their students in these respects. [Lehman, Lempert & Nisbett]

It seems, then, that social scientists may have a special opportunity to impart some wisdom about how to properly evaluate the evidence of everyday experience. The authors of the study just described argue that there are certain formal properties of the subject matter of social science (e.g., considerable irregularity and uncertainty, the relative lack of necessary and sufficient causal relations) that make it particularly effective for teaching some important principles of sound reasoning.

Social scientists are generally more familiar than those in other fields with how easy it is to be misled by the evidence of everyday experience, and they are more aware of the methodological controls that are necessary before one can draw firm conclusions from a set of data.

"How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life"
Thomas Gilovich (1991)80

Current edition: 2008 Oct 26
Last Modified: 2017 Feb 05
http://www.humantruth.info/thinking_errors.html
Parent page: The Human Truth Foundation

All #tags used on this page:

#astrology #barnum_effect #beliefs #buddhism #causes_of_religion #emotions #epistemology #expectation #experiences #forer_effect #god #horoscopes #illusion #knowledge #myths #pseudoscience #psychology #religion #science #stories #subjectivism #thinking_errors #UK


References:


Skeptical Inquirer. Magazine. Published by Committee for Skeptical Inquiry, NY, USA. Pro-science magazine published bimonthly.

The Economist. Published by The Economist Group, Ltd. A weekly newspaper in magazine format, famed for its accuracy, wide scope and intelligent content. See vexen.co.uk/references.html#Economist for some commentary on this source.

Anderson & Kellam
(1992). In Riniolo & Nisbet Skeptical Inquirer (2007 May/Jun p49-53). Todd C. Riniolo is Associate Professor of Psychology at Medaille College (Buffalo, New York). Lee Nisbet is Professor of Philosophy at Medaille College and a founding member and Fellow of the Committee for Skeptical Inquiry.

Aristotle. (384-322BCE)
(350BCE) Ethics. E-book. Amazon Kindle digital edition. Originally published 340BCE (approx).

Boyer, Pascal
(2001) Religion Explained. Hardback book. Published by William Heinemann, Random House Group Ltd, London, UK.

Carroll, Robert Todd. (1945-2016). Taught philosophy at Sacramento City College from 1977 until retirement in 2007. Created The Skeptic's Dictionary in 1994.
(2003) The Skeptic's Dictionary. Published by John Wiley & Sons, New Jersey, USA.
(2011) Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!. E-book. Amazon Kindle digital edition. Published by the James Randi Educational Foundation.

Coolican, Hugh
(2004) Research Methods and Statistics in Psychology. 4th edition. Published by Hodder Headline, London, UK.

Crabtree, Vexen
(2002) "Experiences of God are Illusions, Derived from Malfunctioning Psychological Processes" (2002). Accessed 2017 Apr 13.
(2008) "The False and Conflicting Experiences of Mankind: How Other Peoples' Experience Contradict Our Own Beliefs" (2008). Accessed 2017 Apr 13.
(2009) "General Neophobia in Everyday Life: Humankind's Fear of Progress and Change" (2009). Accessed 2017 Apr 13.
(2013) "What Causes Religion and Superstitions?" (2013). Accessed 2017 Apr 13.
(2014) "What is Science and the Scientific Method?" (2014). Accessed 2017 Apr 13.

D. R. Lehman, R. O. Lempert, & R. E. Nisbett
(1988) The effects of graduate training on reasoning: Formal discipline and thinking about everyday-life events. American Psychologist, 43, 431-42. In Gilovich (1991) p191-192.

Dawkins, Prof. Richard
(2004) A Devil's Chaplain. Paperback book. Originally published 2003 by Weidenfeld & Nicolson. Current version published by Phoenix of Orion Books Ltd, London UK.

Edwards, K.
(1990) The interplay of affect and cognition in attitude formation and change. Journal of Personality and Social Psychology 59: 202-216. In Riniolo & Nisbet Skeptical Inquirer (2007 May/Jun p49-53). Todd C. Riniolo is Associate Professor of Psychology at Medaille College (Buffalo, New York). Lee Nisbet is Professor of Philosophy at Medaille College and a founding member and Fellow of the Committee for Skeptical Inquiry.

Eysenck, Michael and Keane, Mark
(1995) Cognitive Psychology. 3rd edition. Published by Psychology Press, Hove, UK.

Gilovich, Thomas
(1991) How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. Paperback book. 1993 edition. Published by The Free Press, NY, USA.

Goldacre, Ben. MD.
(2008) Bad Science. Published by Fourth Estate, an imprint of HarperCollins Publishers, London, UK.

Gross, Richard
(1996) Psychology: The Science of Mind and Behaviour. Paperback book. 3rd edition. Published by Hodder & Stoughton, London UK.

Harris, Sam
(2006) The End of Faith: Religion, Terror and the Future of Reason. Paperback book. 2006 edition. Published in UK by The Great Free Press, 2005.

Harrison, Guy P.
(2008) 50 Reasons People Give for Believing in a God. E-book. Amazon Kindle digital edition. Published by Prometheus Books, New York, USA.

Hobbes, Thomas
(1651) Leviathan. E-book. Amazon Kindle digital edition produced by Edward White, British Columbia, Canada from the Pelican Classics edition.

James, William. (1842-1910)
(1902) The Varieties of Religious Experience. Paperback book. Subtitled: "A Study in Human Nature". 5th edition (1971). Originally published 1960. From the Gifford Lectures delivered at Edinburgh 1901-1902. Quotes also obtained from Amazon digital Kindle 2015 Xist Publishing edition. Book Review.

Kahneman, Daniel
(2011) Thinking, Fast and Slow. Paperback book. Published by Farrar, Strauss & Giroux, New York, USA.

Lehrer, Jonah
(2009) The Decisive Moment: How the Brain Makes Up Its Mind. Hardback book. Published by Canongate Books, Edinburgh.

Mackintosh, Bundy
(2004) Emotion. This essay is chapter 2 of "Emotions and Mind" by Toates, Mackintosh and Nettle (2004).

Momen, Moojan
(1999) The Phenomenon Of Religion: A Thematic Approach. Paperback book. Published by Oneworld Publications, Oxford, UK. Book Review.

Myers, David
(1999) Social Psychology. Paperback book. 6th ('international') edition. Originally published 1983. Current version published by McGraw Hill.

Park, Robert L.
(2008) Superstition: Belief in the Age of Science. E-book. Amazon Kindle digital edition. Published by Princeton University Press, New Jersey, USA.

Price, Robert M.
(2003) Incredible Shrinking Son of Man: How Reliable Is the Gospel Tradition?. Published by Prometheus Books, NY, USA.

Russell, Bertrand. (1872-1970)
(1946) History of Western Philosophy. Paperback book. 2000 edition. Published by Routledge, London, UK.

Sagan, Carl
(1995) Cosmos. Paperback book. Originally published 1981 by McDonald & Co. Current version published by Abacus.

Sharps, Matthew J.
(2012) "Eyewitness to the Paranormal: The Experimental Psychology of the 'Unexplained'" in Skeptical Inquirer (2012 Jul/Aug) p39.

Toates, Mackintosh & Nettle
(2004) Emotions and Mind. Published by The Open University, Milton Keynes, UK. A neurology textbook by Frederick Toates, Bundy Mackintosh and Daniel Nettle.

Weatherley, Lionel A.
(1891) The Supernatural?. Hardback book. Published by J. W. Arrowsmith, Bristol, UK. This is a hard to find book. Bath Library has a copy, accessed 2012 Dec 14.

Footnotes

  1. Gilovich (1991).^
  2. Carroll (2011) chapter 9 "Are We Doomed to Die with Our Biases On?" p186-189. Added to this page on 2017 Feb 05.^^^
  3. Carroll (2011) chapter 1 "Believing in the Palpably Not True" p7.^^
  4. Kahneman (2011).^
  5. Myers (1999) p119.^^^
  6. Carroll (2011) chapter 2 "How to Lose Friends and Alienate Your Neighbors" p28. Added to this page on 2016 Nov 21.^
  7. Carroll (2011) chapter 9 "Are We Doomed to Die with Our Biases On?" p189. Added to this page on 2017 Feb 05.^
  8. Carroll (2011) chapter 9 "Are We Doomed to Die with Our Biases On?" p188. Author states 'Many believe palpably untrue things because they are ignorant of such things as fundamental biological or physical processes'. Added to this page on 2017 Feb 05.^
  9. Shankar Vedantum. Skeptical Inquirer (2008 Jan/Feb) Vol.32 No.1 p6 article "Difficulty in Debunking Myths Rooted in the Way the Mind Works".^
  10. Gilovich (1991) p2-4.^
  11. Goldacre (2008) chapter 13 "Why Clever People Believe Stupid Things" digital location 3408.^
  12. Carroll (2003) p275.^
  13. Park (2008) digital location 1830.^
  14. Myers (1999) p114 cites "Crocker, 1981; Jennings & others, 1982; Trolier & Hamilton, 1986".^
  15. Lehrer (2009) p71-2,79-81.^
  16. Russell (1946) p528-529.^^
  17. Gilovich (1991) p15,19.^
  18. Coolican (2004) p318.^
  19. Harrison (2008) chapter 12 "A sacred book proves my god is real".^
  20. BBC News article "Pareidolia: Why we see faces in hills, the Moon and toasties" (2013 May 31).^
  21. Gilovich (1991) p32, cites Francis Bacon (1620) The new organon and related writings (1960). Published by Liberal Arts Press, New York, USA.^
  22. Coolican (2004) p51.^
  23. Myers (1999) p114.^
  24. Lehrer (2009).^
  25. Gilovich (1991) p50-51,82.^
  26. Goldacre (2008) chapter 13 "Why Clever People Believe Stupid Things" digital location 3491.^
  27. The Economist (2008 Nov 29) article "Pharmaceuticals: Absence of evidence" discusses this (p94). It states: "A study published this week in PLoS Medicine, an online journal, confirms [that] drugs companies try to spin the results of clinical trials. [... There is] troubling evidence of suppression and manipulation of data in studies published in (or often withheld from) peer-reviewed medical journals". And the full text on the 2001 and 2002 study states: "The results are distressing. Only three-quarters of the original trials were ever published, and it turned out that those with positive outcomes were nearly five times as likely to be published as those that were negative. Earlier investigations have shown that the explanation is not editorial bias; well-designed studies in which drugs fail have as good a chance of being published in leading journals as those in which drugs succeed". Results published of 155 sets of data, 17 "appeared in publications without having first been discussed in regulatory filings. Fifteen of these 17 made the drugs in question look better".^
  28. RationalWiki.org article "Selection bias" accessed 2014 Nov 10.^
  29. Skeptical Inquirer (2007 Sep/Oct) edition book review.^
  30. Added to this page on 2016 Jun 30.^
  31. Gilovich (1991) p60.^
  32. Skeptical Inquirer (2012 Jan/Feb) article "Laughing Ghosts and Scowling Sheep: Humour in Paranormal Discourse" by Jonathan C. Smith.^
  33. Carroll (2011) chapter 3 "Believing is Seeing: Trust No One, Not Even Yourself" p49,210.^
  34. Myers (1999) p97-101.^
  35. Social psychologists Carol Tavris & Elliot Aronson in Skeptical Inquirer (2007 Nov/Dec) "'Why Won't They Admit They're Wrong?' and Other Skeptic Mysteries", p12-4. They are authors of "Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts", Harcourt (2007), which the SI article refers to.^
  36. Anderson & Kellam (1992).^
  37. Edwards (1990).^
  38. Aristotle (350BCE) Book IX chapter IV. Added to this page on 2012 Dec 10.^
  39. Charles Catania. Skeptical Inquirer (2007 Jan/Feb) Vol.31 No.1 p49. Charles Catania is a professor in the Department of Psychology at the University of Maryland, Baltimore County. I have posted further notes on this article to vexen.insanejournal.com.^
  40. Bear et al (1996) p464. Additional commentary can be found in "What Causes Religion and Superstitions?" by Vexen Crabtree (2013).^
  41. James (1902) digital location 1023.^
  42. Myers (1999) p117.^
  43. Skeptical Inquirer (2007 May/Jun) article "The Myth of Consistent Skepticism: The Cautionary Case of Albert Einstein" by Todd C. Riniolo & Lee Nisbet. Date last accessed 2017 Jan 13.^^
  44. James (1902) digital location 5743.^
  45. Mackintosh (2004) p40.^
  46. Eysenck & Keane (1995) p2.^
  47. Carroll (2011) chapter 2 "How to Lose Friends and Alienate Your Neighbors" p28.^
  48. Lehrer (2009) p145-6.^
  49. Carroll (2011) chapter 2 "How to Lose Friends and Alienate Your Neighbors" p28,198-199.^
  50. Eisold (2012).^
  51. Multiple sources cited in Sharps 2012:
    1. Bartlett, F.C. 1932 Remembering: A Study in Experimental and Social Psychology. Published by Cambridge University Press, Cambridge, UK.
    2. Loftus, E.F. 1975. Leading questions and the eyewitness report. Cognitive Psychology 7(3):560-72.
    3. Ahlberg, S.W., and M.J.Sharps. 2002. Bartlett revisited: Reconfiguration of long-term memory in young and older adults. Journal of Genetic Psychology 163(2):211-18.
    ^
  52. Gross (1996) p303.^
  53. Eysenck & Keane (1995) p262.^
  54. Added to this page on 2017 Feb 05.^
  55. Hobbes (1651) chapter 5 "Of Reason, and Science" p16. Added to this page on 2017 Feb 05.^
  56. Gilovich (1991) chapter 6.^
  57. Momen (1999) p332-335.^
  58. "Criticisms of Buddhism: Its History, Doctrine and Common Practices" by Vexen Crabtree (2011)^
  59. Price (2003) digital location 333.^
  60. Weatherley (1891) p99.^
  61. Russell (1946) p455.^
  62. Carroll (2011) chapter 9 "Are We Doomed to Die with Our Biases On?" p188. Added to this page on 2017 Feb 05.^
  63. Harris (2006) p19.^
  64. Momen (1999) p114.^
  65. BBC News (2000 Sep 11) accessed 2000 Sep 14.^
  66. Boyer (2001) chapter 1 "What is the Origin?" p6-7. Pascal states that one of the appeals of religion, according to many, is that 'Religion explains puzzling experience: dreams, prescience, etc'.^
  67. Two studies reported in Skeptical Inquirer (2012 Jul/Aug): Matthew J. Sharps article "Eyewitness to the Paranormal: The Experimental Psychology of the 'Unexplained'", p39.
    • Sharps, M.J., J. Matthews, and J. Asten (2006). "Cognition, affect, and beliefs in paranormal phenomena: Gestalt/feature intensive processing theory and tendencies towards ADHD, depression, and dissociation" in the Journal of Psychology 140(6):579-90.
    • Sharps M.J., E. Newborg, S. Van Arsdall, et al. (2010). "Paranormal encounters as eyewitness phenomena: Psychological determinants of atypical perception interpretations" in Current Psychology 29(4):320-27.
    ^
  68. The Economist (2006 Dec 23) p85.^
  69. Wilson (1998) p27.^
  70. Gilovich (1991) p48.^
  71. Carl Sagan (1979) Broca's Brain: Reflections on the Romance of Science. New York: Ballantine Publishing Group: 58. In Skeptical Inquirer (2007 Jul/Aug) Vol.31 No.4 p34.^
  72. Bono (1985). That complexity requires us to divide thinking into areas is the main theme of the book. The author concentrates on brainstorming sessions and the analysis of ideas. The specific quote that "the biggest enemy of thinking is complexity, for that leads to confusion" is found on p176.^
  73. Kahneman (2011). Added to this page on 2013 May 03.^
  74. Lehrer (2009) p207-208. Added to this page on 2013 May 03.^
  75. Dawkins (2004) p210.^
  76. Skeptical Inquirer (2009 May/Jun) article "Playing by the Rules" p42-44 by Harriet Hall, MD.^
  77. Sagan (1995) p16.^
  78. Russell (1946) p69.^
  79. Aristotle (350BCE) digital location 412-413.^
  80. Gilovich (1991) p191-193.^

©2017. All rights reserved.