Why Is It That Even Proven Facts Can’t Change Some People’s Minds?

In a world where information is always at our fingertips, it can be baffling that misinformation continues to run rampant. With scientific facts so easily accessible, it should be simple to set the record straight.

However, scientific research has confirmed: Facts alone aren’t always enough to correct misinformation and change people’s minds.

Unfortunately, the way our brains store information isn’t always conducive to correction. On top of that, we aren’t as rational as we like to think we are, which tangles our understanding of the world in a sticky web of personal identity and emotion.

Information Retention

Our brains have a remarkable ability to retain information, albeit with some quirks.

Rather than replacing incorrect information, our minds tend to create new memories alongside existing ones, says Lisa Fazio, a psychology professor at Vanderbilt University.

Consequently, when we try to recall a corrected piece of information, we may end up with competing ideas – one based on the original information and another incorporating the corrected details.

Depending on which memory was encoded more strongly or recently, incorrect information may dominate our recollection.

This phenomenon, known as the “continued influence effect,” helps explain why corrections, even when we readily accept them, sometimes fail to override the original misinformation.


Mental Shortcuts

Despite their incredible cognitive abilities, our brains can only process so much information at a time. Faced with complicated information, we tend to rely on mental shortcuts rather than engage in purely logical thinking.

One such shortcut involves repetition. Research has shown that information we hear multiple times tends to stick in our minds. Even children as young as five are more likely to believe something is true if they’ve heard it more than once.

According to Fazio, these mental shortcuts generally serve us well because most of what we encounter repeatedly turns out to be true. “There’s an infinite universe of false things,” she explains, “and a more narrow universe of true things, so most of the stuff that we hear multiple times is true.”

While this shortcut is useful in many cases, it quickly becomes dangerous when we have to sift frequently repeated misinformation from accurate information.

This tendency to perceive something as true due to repetition, even if it lacks factual basis, is known as the illusory truth effect.

The illusory truth effect becomes particularly concerning in the context of social media, where misinformation can spread rapidly. Due to this effect, false claims can gain traction and become widespread, causing confusion.


The Influence of Belief Systems

Our existing beliefs also play a big role in how we perceive and process information.

New information that challenges our deeply held beliefs can cause discomfort. So, to avoid that discomfort, our minds resist change.

“It’s whether or not you view the information as an attack on yourself and your core beliefs,” explains Fazio. “I think we’re a lot quicker to dismiss any sort of correction that goes against our identities or worldviews – to the point where we don’t even consider [the correction].”

Our tendency to identify closely with our values, combined with our brains’ love of shortcuts, can make it easy to discount reputable sources.

“It’s way easier to think, ‘that’s from a source that I don’t trust, I don’t have to pay attention to it,’” says Fazio, “rather than to actually engage with information that contradicts what you believe about yourself and the world.”

Why Facts Don’t Change Our Minds

So, if facts aren’t enough to change minds, what does work?

Researchers continue to debate the best ways to correct misinformation, and outdated ideas on setting the record straight still persist.

What’s generally agreed is this: It’s not just about the facts themselves – it’s about how they’re presented and the context in which they’re received.

While it’s generally best to avoid the repetition of untruths due to the illusory truth effect, some evidence suggests that using misinformation as a scaffolding for a correction can make it more effective.

“You can imagine repeating a myth could be really bad because people think it’s more true later on. But there’s actually some evidence saying it helps people link the correction to that existing belief. And that makes it more impactful.”


Understanding Emotions and Values

Another way to correct misinformation may lie in exactly what makes misinformation work in the first place: emotions and values.

By understanding the values that underpin someone’s beliefs, we can frame corrections in a way that appeals to those values, increasing the chances of creating meaningful dialogue and potential change.

“The most useful thing is when you can get information from sources that they do trust,” says Fazio. “When you don’t have that, one thing that can help is starting by talking about shared values.”

Facts woven into stories, personal anecdotes and appeals to empathy can evoke emotions that resonate with us on a deeper level, making us more open to considering alternative viewpoints.

Correcting misinformation and changing minds rarely happens through a single interaction or a barrage of angry internet comments. Patience, empathy and respectful communication are key for productive discussions that have the potential to challenge and shape beliefs over time.

