Should we worry that half of Americans trust their gut to tell them what's true?
Intuition is just one of many factors that shape what you believe.
Have you ever thought to yourself, “I’ll bet that’s true,” before you had all the facts? Most people probably have at some point.
Where people differ is in how often they do so. A 2016 survey that my colleague Brian Weeks and I conducted found that 50.3 percent of Americans agreed with the statement “I trust my gut to tell me what’s true and what’s not.” Some of those polled felt quite strongly about it: About one in seven (14.6 percent) strongly agreed, while one in 10 (10.2 percent) strongly disagreed.
In other words, there’s a lot of variation in how Americans decide what to believe.
In a recent paper, we were able to use the findings from this survey and two others to dig into the different approaches people take when deciding what’s true.
We found some surprising differences between how people think about intuition and how they think about evidence. It turns out that how often someone trusts their intuition and how important they think it is to have evidence are two separate things. Both make a big difference in what we believe.
What we learned offers some hope for people’s ability to tell truth from fiction, despite the fact that so many trust their gut.
How beliefs are formed
Many incorrect beliefs have political foundations. They promote a policy, an ideology or one candidate over another.
People are susceptible to political misinformation because they tend to believe things that favor their side – even when those claims aren’t grounded in data or science. There are numerous factors at play, from the influence of nonconscious emotions to the need to defend a group that the individual identifies with.
For these reasons, millions of Americans believe things that aren’t true.
People reject the conclusions of scientists when they deny humans’ role in driving climate change, question the safety of genetically modified foods or refuse to have their children vaccinated.
They reject the assessments of fact checkers, incorrectly believing that President Obama was born outside the U.S. or that Russia successfully tampered with vote tallies in the 2016 presidential election. And certain conspiracy theories – like the belief that President Kennedy’s assassination was orchestrated by a powerful secret organization – are remarkably persistent.
With all the talk about political bias, it’s easy to lose track of the fact that politics aren’t the only thing shaping people’s beliefs. Other factors play a role too.
For example, people are more likely to believe something the more often they’ve heard it said – commonly known as the illusory truth effect. And adding a picture can change how believable a message is, sometimes making it more convincing, while at other times increasing skepticism.
Valuing intuition versus valuing evidence
Our study focuses on something else that shapes beliefs: We looked at what matters the most to people when they’re deciding what’s true.
We found that having faith in your intuition about the facts does make you more likely to endorse conspiracy theories. However, it doesn’t really influence your beliefs about science, such as vaccine safety or climate change.
In contrast, someone who says beliefs must be supported with data is more likely both to reject conspiracy theories and to answer questions about mainstream science and political issues more accurately.
The risk of relying on one’s intuition may be self-evident, but its role in belief formation is more nuanced.
Although our study shows that trusting gut feelings is associated with belief in conspiracy theories, this doesn’t mean that intuition is always wrong. (Occasionally a conspiracy does turn out to be real.)
Furthermore, intuition isn’t all bad. There’s lots of evidence that a person who is unable to use feelings in forming a judgment tends to make very poor decisions.
In the end, knowing how much someone trusts his or her intuition actually tells you very little about how much proof that person will need before he or she will believe a claim. Our research shows that using intuition is not the opposite of checking the evidence: Some people trust their instincts while at the same time valuing evidence; others deny the importance of both; and so forth.
The key is that some people – even if they usually trust their gut – will check their hunches to make sure they’re right. Their willingness to do some follow-up work may explain why their beliefs tend to be more accurate.
It’s valuing evidence that predicts accuracy on a wider range of issues. Intuition matters less.
It’s all about the evidence
These findings might seem obvious. But researchers studying misperceptions often find that “obvious” predictors don’t work the way we hope they would.
For example, one study sorted people based on how accurate they are when solving problems for which the obvious answer is incorrect: If a bat and a ball cost US$1.10 in total, and the bat costs $1.00 more than the ball, how much does the ball cost? (It’s not $.10.) Results show that individuals who got questions similar to this one right tended to be more biased in their beliefs about climate change.
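(For readers who want to check the arithmetic: if the ball costs x, the bat costs x plus $1.00, so together they cost 2x plus $1.00 – which equals $1.10 only when x is 5 cents. The intuitive answer of 10 cents would make the pair cost $1.20.)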
Another study found that people with the strongest reasoning skills and the highest science literacy also tend to be more biased in their interpretation of new information. Even asking people to “think carefully” can lead to more biased answers.
In this context, our results are surprising. There are many individual qualities that seem like they should promote accuracy, but don’t.
Valuing evidence, however, appears to be an exception. The bigger the role evidence plays in shaping a person’s beliefs, the more accurate that person tends to be.
We aren’t the only ones who have observed a pattern like this. Another recent study shows that people who exhibit higher scientific curiosity also tend to adopt more accurate beliefs about politically charged science topics, such as fracking and global warming.
There’s more we need to understand. It isn’t yet clear why curiosity and attention to the evidence lead to better outcomes, while being knowledgeable and thinking carefully promote bias. Until we sort this out, it’s hard to know exactly what kinds of media literacy skills will help the most.
But in today’s media environment – where news consumers are subjected to a barrage of opinions, data and misinformation – gut feelings, and whether people demand evidence to back those hunches up, can play a big role. They might determine whether you fall for a satirical story from The Onion, help spread Russian disinformation or believe that the British spy agency MI6 was responsible for Princess Diana’s death.
For now, though, when it comes to fighting the scourge of misinformation, there’s a simple strategy that everyone can use. If you are someone who consistently checks your intuition about what is true against the evidence, you are less likely to be misled. It may seem like common sense, but learning to dig into the story behind that shocking headline can help you avoid spreading falsehoods.
So if someone shares something with you that you know is false – especially if that person is someone you know well – don’t be afraid to disagree.
There’s no need for name-calling; studies have shown that just providing evidence can make a difference, if not for the person who shared the falsehood, then at least for others who were exposed to it.
In a world where the very idea of “truth” often appears under attack, this is an easy way that individuals can make a difference.
R. Kelly Garrett receives funding from the National Science Foundation.