The Human Truth Foundation

We Dislike Changing Our Minds
Status Quo Bias and Cognitive Dissonance

https://www.humantruth.info/status_quo_and_cognitive_dissonance.html

By Vexen Crabtree 2018

#beliefs #cognitive_dissonance #psychology #status_quo_bias #thinking_errors

We don't like changing our minds1. Especially once we have put conscious time and effort into a belief, or it becomes emotional, it's difficult to fairly evaluate contrary evidence against it2,3,4. We often stick with the beliefs we first encounter: very few change their religion5,6. This is called belief perseverance1,7 and status quo bias, and was noted as long ago as the 4th century BCE by Aristotle8.

When we do recognize that new evidence contradicts us, or even that our own valued beliefs contradict one another, we experience cognitive dissonance9. Even then, we can go to extreme lengths to avoid giving up a belief even when we suspect we've made a mistake10,11, and we often defend beliefs in the face of evidence against them10,11,3.

Some academics warn that intelligence can sometimes work against us because it makes it easy to dismiss evidence and contradictions through clever arguments and manipulations12. We don't have the humility to admit that we're wrong13, and when we do change our minds, we often deny it, both to ourselves1 and to others. It is very difficult to be unbiased and open in our approach to truth, which is why formalized schemes such as The Scientific Method are so important14.


1. Status Quo Bias

#status_quo_bias #thinking_errors

People tend to prefer arguments and evidence that back up what they currently believe, and we are poor at objectively interpreting opposing arguments and counter-evidence. In analysing the factors that affect human behaviour, the economist Joseph Schumpeter says that "we resent a call to thinking and hate unfamiliar argument that does not tally with what we already believe or would like to believe"4.

Social psychologist David Myers summarizes much of the research that has been done on this topic. "Self-created" justifications for our own beliefs are often imaginative and occur after beliefs have been solidified, and such "belief perseverance" feels urgent even when it would be more productive to consider a belief's worth rationally and calmly. The result is a "cognitive inertia" that opposes changes in belief - especially when such changes can be noticed by other people. We often deny that any change has occurred, and we often genuinely believe that we haven't changed our minds!1

The recognition of this inertia is not a new discovery to the academic world; Aristotle pondered it almost 2400 years ago and concluded that people invest much time and effort in their opinions and beliefs, and are therefore unlikely to abandon them easily:

All people value most what has cost them much labour in the production; for instance, people who have themselves made their money are fonder of it than those who have inherited it: and receiving kindness is, it seems, unlaborious, but doing it is laborious. And this is the reason why the female parents are most fond of their offspring; for their part in producing them is attended with most labour, and they know more certainly that they are theirs. This feeling would seem also to belong to benefactors.

"Ethics" by Aristotle (350B)8

2. Cognitive Dissonance

#beliefs #cognitive_dissonance #democracy #democracy_challenges #politics #religion #science #thinking_errors

Over millennia, we (as a species) have come to hold a massive array of contradictory and daft beliefs, many of which are "obviously wrong" to everyone except their believers. In the modern world, we are increasingly finding that the facts of the world do not mesh well with many of our beliefs. The skeptic Robert Todd Carroll worries that some (many?) religionists, "when confronted with overwhelming evidence that what their religion teaches is wrong, devote their lives to defending its errors"10.

One of the greatest challenges for scientists and educators is persuading people to give up beliefs they hold dear when the evidence clearly indicates that they should. Why aren't most people grateful for the data? [...] The motivational mechanism underlying the reluctance to be wrong, change our minds, admit serious mistakes, and accept unwelcome findings is cognitive dissonance. The theory of cognitive dissonance was invented fifty years ago by Leon Festinger, who defined dissonance as a state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent. [...] One of the most effective ways the mind maintains constant beliefs is through the confirmation bias - the fact that we tend to notice and remember information that confirms what we believe and ignore or forget information that disconfirms it.

"'Why Won't They Admit They're Wrong?' and Other Skeptic' Mysteries"
Tavris & Aronson (2007)11

If a belief has emotional ties, it's even harder to change2,3,4 because emotions defy rationality, undermining it in ways that we can't discern and often actively deny. Especially in social situations, sticking to our beliefs feels like the correct thing to do, regardless of the facts.

Critical thinking in the strong sense requires [...] intellectual humility: a willingness to admit error, change beliefs when warranted [and] a willingness to go wherever the evidence leads.

"Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed!" by Robert Todd Carroll (2011)13

Beliefs about oneself or the world can persist despite information that would suggest [...] the need to update beliefs. Such persistence can take place through many mechanisms of self-deception or dissonance reduction. The propensity to rationalize away evidence that clashes with beliefs has been documented to be higher in some instances for more analytically sophisticated and better educated individuals, so one cannot assume that the importance of motivated cognition will decrease as levels of education increase.

United Nations Human Development Report (2022)3

In short, we suffer mental distress when perceived evidence, our own actions, or counter-arguments threaten beliefs and opinions we already hold. This cognitive dissonance and its effects have gone by many names and have been documented in extensive studies. It is a status quo bias towards what we already believe. We have belief perseverance7, which is especially strong if a person has previously constructed rational arguments in favour of their belief or has emotional attachments to it2,3. When changes of mind do occur, people then refuse to admit that their beliefs have changed over time. As we invest mental time in a belief, we become less likely to notice evidence against it, and less likely to change our mind even when we do see such evidence. In total, our beliefs sometimes work against us, preventing us from accepting new evidence or counter-arguments. This is why peer review is a fundamental part of the scientific process.

Politicians are Notably Stubborn in Resisting Contradictory Evidence:

Although they bear more responsibility to carefully check that their decisions are fact-based, politicians are less likely than other people to change their beliefs when faced with evidence that ought to change them; the United Nations' 2022 report on issues facing human development calls this "motivated reasoning"3, which is a form of cognitive dissonance15. This can have a disastrous effect when policies are not working; politicians become trapped by their own personas and feel that they will do better for themselves by defending a position rather than by changing it, even if the latter is what the public needs.

"The Internal Challenges Facing Democracy: 3.5. Government Suppression of the Press (the Press Freedom Index)" by Vexen Crabtree (2017)

3. Skepticism: Approach Truth Carefully and Change Your Mind as Evidence Requires16

#critical_thinking #epistemology #philosophy #skepticism #thinking_errors

Skepticism is an approach to understanding the world where evidence, balance and rationality come first: anecdotal stories and personal experience are not automatically accepted as reliable indicators of truth. There are a great many ways in which we can come to faulty conclusions - often based on faulty perceptions - given our imperfect knowledge of the world around us17. With practice, we can all become better thinkers by developing our critical-thinking and skeptical skills18. Facts must be checked, especially if they contradict established and stable theories that do have evidence behind them14,19. Thinking carefully, slowly and methodically is a proven method of coming to more sensible conclusions20,21,22. It is difficult to change our minds, and to do so openly1,3,4,13, but doing so honours the truth. Skepticism is 'realist' in that it starts with the basic premise that there is a single absolute reality23, which we can best try to understand collectively through the scientific method. Skepticism demands that truth is taken seriously, approached carefully, checked repeatedly, and investigated thoroughly.

4. Other Thinking Errors