Think Your Opinion Must Be Right? Science Reveals Why You May Be Wrong


Key Takeaways

  • People tend to think they’re right even if they don’t have all the info they need

  • People given half the story took a firm stand that agreed with the info they were provided

  • However, many were willing to change their minds if given the whole picture

THURSDAY, Oct. 10, 2024 (HealthDay News) -- Attention, all "know-it-alls."

Folks who are sure they're right often believe they've got enough information to make up their minds, even if in reality they only have part of the picture, a new study finds.

It’s a concept called the “illusion of information adequacy,” and it helps explain how people can have such strong and cemented opinions even though they get their news from limited and biased sources, researchers said.

“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision,” said researcher Angus Fletcher, a professor of English at Ohio State University.

“If you give people a few pieces of information that seem to line up, most will say ‘that sounds about right’ and go with that,” Fletcher said in an Ohio State news release.

For the study, researchers recruited nearly 1,300 Americans for an online experiment.

All participants read an article about a fictional school that lacked an adequate water supply.

But one group was given an article containing only reasons why the school should merge with another that had an adequate supply. A second group’s article gave only reasons for staying separate and hoping for other solutions.

Only a third group, which served as the control, read an article presenting the arguments both for merging and for staying separate.

The two groups that read only half the story still believed they had enough information to make a solid decision, and said they would follow the recommendations in the article provided to them, results show.

“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” Fletcher said. “They were quite sure that their decision was the right one, even though they didn’t have all the information.”

Folks with half the information also thought most other people would reach the same decision they did, researchers added.

However, the experiment offered one ray of hope.

People who had read only one side of the story were later provided arguments for the other side, and many were willing to change their minds once they had all the facts, researchers said.

The new study was published Oct. 9 in the journal PLOS ONE.

Offering the other side of the story might not work every time, particularly on issues tied to entrenched ideological positions, Fletcher said.

In those cases, people might reject the new information as untrustworthy or try to reframe it to fit in with their preexisting views.

“But most interpersonal conflicts aren’t about ideology,” Fletcher said. “They are just misunderstandings in the course of daily life.”

People need to make sure they have the full story about a situation before they take a stand or make a decision, rather than leading with their chin, Fletcher said.

“As we found in this study, there’s this default mode in which people think they know all the relevant facts, even if they don’t,” Fletcher said.

“Your first move when you disagree with someone should be to think, ‘Is there something that I’m missing that would help me see their perspective and understand their position better?’” Fletcher continued. “That’s the way to fight this illusion of information adequacy.”

More information

Princeton University has more about the psychology of belief.

SOURCE: Ohio State University, news release, Oct. 9, 2024

What This Means For You

If you’re taking a stand or making an important decision, make sure you’ve got all of the information you need.
