Why We Double Down On Our Arguments Even After Realizing We’re Wrong
A variety of reasons — ranging from cognitive biases to emotional attachments to social dynamics — stop us from accepting our mistakes.
Do you remember an instance when you were engaged in a heated debate, passionately defending your argument with unwavering conviction, when suddenly a ray of enlightenment broke through the clouds, hitting you like a ton of cognitive bricks and making you realize you were wrong all along? One would expect the debate to draw to a close at this point. But that might not be what actually happened: instead, it’s possible that despite the revelation, you chose to indulge in mental gymnastics that would put even Cirque du Soleil performers to shame, just to avoid admitting defeat. It’s quite a common phenomenon, in fact. Most of us, at one point or another, have found ourselves stubbornly clinging to our arguments even when evidence to the contrary should have forced us to reconsider.
The human tendency to swear by our stances — even as they begin to crumble like a poorly baked soufflé — is driven by a variety of factors, ranging from cognitive biases to emotional attachments to social dynamics, that make it difficult for us to accept we were mistaken. As a result, we often double down on our mistakes. Perhaps the most common of these factors is ego.
When we develop strong emotional ties to our beliefs, our sense of self and identity become intertwined with them — leading us to protect our self-concept by standing firm in our arguments, despite acknowledging their flaws. “Admitting we are wrong is unpleasant, it is bruising for any ego… [But s]ome people have such a fragile ego, such brittle self-esteem, such a weak ‘psychological constitution,’ that admitting they made a mistake, or that they were wrong, is fundamentally too threatening for their egos to tolerate,” writes psychologist Guy Winch. “Accepting they were wrong, absorbing that reality, would be so psychologically shattering, their defense mechanisms do something remarkable to avoid doing so — they literally distort their perception of reality to make it less threatening. Their defense mechanisms protect their fragile ego by changing the very facts in their mind, so they are no longer wrong or culpable.”
Moreover, because we “want our views about the world to be validated by other people… we become friends with people who share our beliefs, and we tend not to surround ourselves with people who hold different opinions than we do,” explains Daryl Van Tongeren, an associate professor of psychology at Hope College. This validation can induce greater resistance to changing our minds, besides creating an echo chamber effect that limits the range of perspectives we’re exposed to — hindering our ability to consider alternative viewpoints. In the process, our identities can also become linked to the groups we belong to, while our positions become deeply entwined with the group identity. This sense of belonging can, eventually, outweigh our desire to evaluate evidence objectively — such that even when we’re presented with compelling counterarguments, we refuse to budge rather than threaten our group cohesion.
While each of these factors influences our decision to stay the course, what boosts our tenacity in defending our positions is confirmation bias: the human tendency to interpret information in a way that confirms our pre-existing beliefs while disregarding contradictory evidence.
So, when we’re confronted with evidence that challenges our viewpoints, we may unconsciously dismiss it or reinterpret it in a manner that aligns with our initial position. “[This] explains why two people with opposing views on a topic can see the same evidence and come away feeling validated by it. This cognitive bias is most pronounced in the case of ingrained, ideological, or emotionally charged views,” notes an article on how confirmation bias can propel people to disconfirm cold, hard numerical evidence, too. “We ignore contradictory evidence because it is so unpalatable for our brains… Constantly evaluating our worldview is exhausting, so we prefer to strengthen it instead. Plus holding different ideas in our head is hard work. It’s much easier to just focus on one.”
In doubling down on our mistakes, though, we are often overwhelmed by a sense of anxiety; psychologists believe this to be an experience of cognitive dissonance. “Cognitive dissonance is what we feel when the self-concept — I’m smart, I’m kind, I’m convinced this belief is true — is threatened by evidence that we did something that wasn’t smart, that we did something that hurt another person, that the belief isn’t true,” notes psychologist Carol Tavris, explaining how cognitive dissonance threatens our very sense of self.
“Dissonance is uncomfortable and we are motivated to reduce it… [But] to reduce dissonance, we have to modify the self-concept or accept the evidence. Guess which route people prefer?” Tavris points out, adding, “We cling to old ways of doing things, even when new ways are better and healthier and smarter. We cling to self-defeating beliefs long past their shelf life. And we make our partners, co-workers, parents and kids really, really mad at us.” In other words, the tendency to alleviate cognitive dissonance — by fumbling for ways to justify our initial stance and maintain consistency — further reinforces our commitment to our arguments, even when we recognize their fallibility. This lies at the heart of not owning up to our mistakes.
But by resisting cognitive dissonance this way, we don’t just inconvenience those around us; we harm ourselves, too. Cognitive dissonance can even keep us trapped in abusive relationships — by making us resistant to the idea that the person we love might not be right for us. Even when we recognize harmful patterns in the behavior of someone we believe is “an amazing person,” we find ways to rationalize the behavior, as psychotherapist Zohra Master told The Swaddle in 2021.
While not wanting to be wrong is only human, refusing to entertain the idea that we can be wrong deters both intellectual growth and constructive discourse.
“If it is clear to everybody that you made a mistake, digging your heels in actually shows people your weakness of character rather than strength,” Tyler Okimoto, a management professor at the University of Queensland, reminds us. So, when we realize we’re as wrong as a toddler arguing that broccoli is a dessert, it is, perhaps, advisable to abandon our initial position — rather than risk looking closed-minded, instead of like a regular human being who happened to be mistaken about something. As Winch notes, psychological rigidity isn’t a sign of strength, after all.
Devrupa Rakshit is an Associate Editor at The Swaddle. She is a lawyer by education, a poet by accident, a painter by shaukh, and autistic by birth. You can find her on Instagram @devruparakshit.