The Human Truth Foundation

We Dislike Changing Our Minds
Status Quo Bias and Cognitive Dissonance

http://www.humantruth.info/status_quo_and_cognitive_dissonance.html

By Vexen Crabtree 2018

#beliefs #cognitive_dissonance #psychology #thinking_errors

Once we have put time and effort into a belief, it becomes difficult for us to evaluate evidence on that topic fairly. We have an inertia, a reluctance, to change our beliefs1; we react instinctively against arguments that are new or odd compared to our existing beliefs2, and we are inclined not to give new ideas serious consideration. We often stick with the beliefs we first encounter - for example, very few people change their religion3,4. This is sometimes called a Status Quo Bias and belief perseverance1,5.

A similar mental state, called Cognitive Dissonance, occurs when we face a contradictory state of affairs6: we see strong evidence against a point of view that we have invested time and effort in, or we discover that some of our valued beliefs actually contradict one another. We can go to extreme lengths to avoid giving up a belief even when we know (or strongly suspect) we are making a mistake7,8, and we often defend beliefs even when there is evidence against them7,8,9.

These effects are especially strong once we have constructed rational arguments in favour of a belief and have put time and effort into defending them, or if we have emotional attachments to it10,9. This form of bias was noted by Aristotle some 2,400 years ago11. Some academics warn that intelligence can sometimes work against us, because it makes it easy to dismiss evidence and contradictions through clever arguments and manipulations12. When we do change our minds, we often deny it (even to ourselves1). It is very difficult to be unbiased and truly open in our approach to truth, which is why formalized schemes such as The Scientific Method are so important13.


1. Status Quo Bias

#thinking_errors

We tend to prefer arguments and evidence that back up what we currently believe, and we are poor at objectively interpreting opposing arguments and counter-evidence. In analysing the factors that affect human behaviour, the economist Joseph Schumpeter says that "we resent a call to thinking and hate unfamiliar argument that does not tally with what we already believe or would like to believe"2.

Social psychologist David Myers summarizes much of the research that has been done on this topic. "Self-created" justifications for our own beliefs are often imaginative and occur after beliefs have been solidified, and such "belief perseverance" feels urgent even when it would be more productive to consider a belief's worth rationally and calmly. The result is a "cognitive inertia" that opposes changes in belief - especially when such changes would be noticed by other people. We often deny that any change has occurred, and often we genuinely believe that we haven't changed our minds!1

The recognition of this inertia is not a new discovery to the academic world: Aristotle pondered it almost 2,400 years ago and concluded that people invest much time and effort in their opinions and beliefs, and are therefore unlikely to abandon them easily:

All people value most what has cost them much labour in the production; for instance, people who have themselves made their money are fonder of it than those who have inherited it: and receiving kindness is, it seems, unlaborious, but doing it is laborious. And this is the reason why the female parents are most fond of their offspring; for their part in producing them is attended with most labour, and they know more certainly that they are theirs. This feeling would seem also to belong to benefactors.

"Ethics" by Aristotle (350B)11

2. Cognitive Dissonance

#beliefs #cognitive_dissonance #politics #religion #science #thinking_errors

Over millennia, we (as a species) have come to hold a massive array of contradictory and daft beliefs, many of which are "obviously wrong" to all except believers. In the modern world, we are increasingly finding that the facts of the world do not mesh well with many of our beliefs. The skeptic Robert Todd Carroll worries that some (many?) religionists "when confronted with overwhelming evidence that what their religion teaches is wrong, devote their lives to defending its errors"7.

One of the greatest challenges for scientists and educators is persuading people to give up beliefs they hold dear when the evidence clearly indicates that they should. Why aren't most people grateful for the data? [...] The motivational mechanism underlying the reluctance to be wrong, change our minds, admit serious mistakes, and accept unwelcome findings is cognitive dissonance. The theory of cognitive dissonance was invented fifty years ago by Leon Festinger, who defined dissonance as a state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent. [...] One of the most effective ways the mind maintains constant beliefs is through the confirmation bias - the fact that we tend to notice and remember information that confirms what we believe and ignore or forget information that disconfirms it.

"'Why Won't They Admit They're Wrong?' and Other Skeptic' Mysteries"
Tavris & Aronson (2007)8

Beliefs about oneself or the world can persist despite information that would suggest [...] the need to update beliefs. Such persistence can take place through many mechanisms of self-deception or dissonance reduction. The propensity to rationalize away evidence that clashes with beliefs has been documented to be higher in some instances for more analytically sophisticated and better educated individuals, so one cannot assume that the importance of motivated cognition will decrease as levels of education increase.

United Nations Human Development Report (2022)9

In short, we suffer mental distress when perceived evidence, our own actions, or counter-arguments threaten beliefs and opinions we already hold. This cognitive dissonance and its effects have gone by many names and have been documented in extensive studies. It is a status quo bias towards what we already believe. We have belief perseverance5, which is especially strong if a person has previously constructed rational arguments in favour of their belief or if they have emotional attachments to it10,9. When changes of mind do occur, we then refuse to admit that our beliefs have changed over time. The more mental time we invest in a belief, the less likely we are to notice evidence against it, and the less likely we are to change our minds even when we do see such evidence. In total, our beliefs sometimes work against us, preventing us from accepting new evidence or counter-arguments. This is why peer review is a fundamental part of the scientific process.

Politicians are Notably Stubborn in Resisting Contradictory Evidence:

Although they bear more responsibility than most to carefully check that their decisions are fact-based, politicians are less likely than other people to change their beliefs when faced with evidence that ought to change them; the United Nations' 2022 report on issues facing human development calls this "motivated reasoning"9, which is a form of cognitive dissonance14. This can have a disastrous effect when policies are not working: politicians become trapped by their own personas and feel that they will do better for themselves by defending a position rather than by changing it, even if the latter is what the public needs.

"The Internal Challenges Facing Democracy" by Vexen Crabtree (2017)

3. Other Thinking Errors