The Human Truth Foundation

We Dislike Changing Our Minds
Status Quo Bias and Cognitive Dissonance

By Vexen Crabtree 2018



#beliefs #psychology #thinking_errors

Once we have put time and effort into a belief, it becomes difficult for us to fairly evaluate evidence on that topic. We have an inertia, a reticence to change our beliefs1, and an instinctive reaction against arguments that are new or odd compared to our existing beliefs2; we are inclined not to give new ideas serious consideration. We often stick with the beliefs we first encounter - for example, very few people change their religion3,4. This is sometimes called a Status Quo Bias, or belief perseverance1,5.

A similar mental state, called Cognitive Dissonance, arises when we face a contradictory state of affairs6: we see strong evidence against a point of view that we have invested time and effort into, or we discover that some of our valued beliefs actually contradict one another. We can go to extreme lengths to avoid giving up a belief even when we know (or strongly suspect) we are making a mistake7,8, and we often defend beliefs even when there is evidence against them7,8.

These effects are especially strong once we have constructed rational arguments in favour of a belief and have put time and effort into defending them, or if we have emotional attachments to a belief9. This form of bias was noted as long ago as Aristotle, 2400 years ago10. Some academics warn that intelligence can sometimes work against us, because we find it easy to dismiss evidence and contradictions through the use of clever arguments and manipulations11. When we do change our minds, we often deny it (even to ourselves1). It is very difficult to be unbiased and truly open in our approach to truth, which is why formalized schemes such as The Scientific Method are so important12.


1. Status Quo Bias

#thinking_errors

People tend to prefer arguments and evidence that back up what they currently believe, and we are poor at objectively interpreting opposing arguments and counter-evidence. In analysing the factors that affect human behaviour, the economist Joseph Schumpeter says that "we resent a call to thinking and hate unfamiliar argument that does not tally with what we already believe or would like to believe"2.

Social psychologist David Myers summarizes much of the research that has been done on this topic. "Self-created" justifications for our own beliefs are often imaginative and arise after the beliefs have been solidified, and such "belief perseverance" feels urgent even when it would be more productive to consider a belief's worth rationally and calmly. The result is a "cognitive inertia" that opposes changes in belief - especially when such changes can be noticed by other people. We often deny that any change has occurred, and we often believe that we haven't changed our minds at all!1

The recognition of this inertia is not a new discovery to the academic world: Aristotle pondered it almost 2400 years ago and concluded that people invest much time and effort in their opinions and beliefs, and are therefore unlikely to abandon them easily:

All people value most what has cost them much labour in the production; for instance, people who have themselves made their money are fonder of it than those who have inherited it: and receiving kindness is, it seems, unlaborious, but doing it is laborious. And this is the reason why the female parents are most fond of their offspring; for their part in producing them is attended with most labour, and they know more certainly that they are theirs. This feeling would seem also to belong to benefactors.

"Ethics" by Aristotle (350BCE)10

2. Cognitive Dissonance

#beliefs #religion #science #thinking_errors

Over millennia, we (as a species) have come to hold a massive array of contradictory and daft beliefs, many of which are "obviously wrong" to everyone except their believers. In the modern world, we are increasingly finding that the facts of the world do not mesh well with many of our beliefs. The skeptic Robert Todd Carroll worries that some (many?) religionists, "when confronted with overwhelming evidence that what their religion teaches is wrong, devote their lives to defending its errors"7.

One of the greatest challenges for scientists and educators is persuading people to give up beliefs they hold dear when the evidence clearly indicates that they should. Why aren't most people grateful for the data? [...] The motivational mechanism underlying the reluctance to be wrong, change our minds, admit serious mistakes, and accept unwelcome findings is cognitive dissonance. The theory of cognitive dissonance was invented fifty years ago by Leon Festinger, who defined dissonance as a state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent. [...] One of the most effective ways the mind maintains constant beliefs is through the confirmation bias - the fact that we tend to notice and remember information that confirms what we believe and ignore or forget information that disconfirms it.

"'Why Won't They Admit They're Wrong?' and Other Skeptics' Mysteries"
Tavris & Aronson (2007 Nov/Dec)8

In short, we suffer mental distress when perceived evidence, our own actions, or counter-arguments threaten beliefs and opinions we already hold. This cognitive dissonance and its effects have gone by many names and have been documented in extensive studies. It is a status quo bias towards what we already believe. We exhibit belief perseverance5, which is especially strong if a person has previously constructed rational arguments in favour of their belief, or if they have emotional attachments to it9. When changes of mind do occur, we often refuse to admit that our beliefs have changed over time. The more mental time we invest in a belief, the less likely we are to notice evidence against it, and the less likely we are to change our minds even when we do see such evidence. In sum, our beliefs sometimes work against us, preventing us from accepting new evidence or counter-arguments. This is why peer review is a fundamental part of the scientific process.

3. Other Thinking Errors

Current edition: 2018 Nov 03
http://www.humantruth.info/status_quo_and_cognitive_dissonance.html


References:


Skeptical Inquirer magazine. Published by Committee for Skeptical Inquiry, NY, USA. Pro-science magazine published bimonthly.

Anderson & Kellam
(1992). In Riniolo & Nisbet Skeptical Inquirer (2007 May/Jun p49-53). Todd C. Riniolo is Associate Professor of Psychology at Medaille College (Buffalo, New York). Lee Nisbet is Professor of Philosophy at Medaille College and a founding member and Fellow of the Committee for Skeptical Inquiry.

Aristotle. (384-322BCE)
(350BCE) Ethics. Amazon Kindle digital edition. Originally published 340BCE (approx). An e-book.

Carroll, Robert Todd. (1945-2016). Taught philosophy at Sacramento City College from 1977 until retirement in 2007. Created The Skeptic's Dictionary in 1994.
(2011) Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!. Amazon Kindle digital edition. Published by the James Randi Educational Foundation. An e-book.

Edwards, K.
(1990) The interplay of affect and cognition in attitude formation and change. Journal of Personality and Social Psychology 59: 202-216. In Riniolo & Nisbet Skeptical Inquirer (2007 May/Jun p49-53).

Harrison, Guy P.
(2008) 50 Reasons People Give for Believing in a God. Amazon Kindle digital edition. Published by Prometheus Books, New York, USA. An e-book.

Myers, David
(1999) Social Psychology. 6th ('international') edition. Originally published 1983. Current version published by McGraw Hill. A paperback book.

Park, Robert L.
(2008) Superstition: Belief in the Age of Science. Amazon Kindle digital edition. Published by Princeton University Press, New Jersey, USA. An e-book.

Schumpeter, Joseph A.
(1942) Capitalism, Socialism, and Democracy. 2nd. Published by Sublime Books, USA. This edition reproduces the book of 1942 without any change whatever except that a new chapter has been added. An e-book.

Tavris & Aronson
(2007 Nov/Dec) 'Why Won't They Admit They're Wrong?' and Other Skeptics' Mysteries. By social psychologists Carol Tavris & Elliot Aronson. An article in the magazine Skeptical Inquirer.

Footnotes

  1. Myers (1999). p97-101.
  2. Schumpeter (1942). p10.
  3. Park (2008). Chapter 7 "The New Age - In which anything goes" digital location 1782,283.
  4. Harrison (2008). Chapters 2, 32 and 44.
  5. Anderson & Kellam (1992).
  6. Carroll (2011). Chapter 9 "Are We Doomed to Die with Our Biases On?".
  7. Carroll (2011). Chapter 2 "How to Lose Friends and Alienate Your Neighbors" p25-27.
  8. Tavris & Aronson (2007 Nov/Dec). p12-14.
  9. Edwards (1990).
  10. Aristotle (350BCE). Book IX chapter IV.
  11. Carroll (2011). p196.
  12. "What is Science and the Scientific Method?" by Vexen Crabtree (2014)

©2018 Vexen Crabtree all rights reserved.
This site uses the HTF Disclaimer.