3 reasons why people fall for politicians' lies about statistics

Psychological phenomena like confirmation bias and the Dunning-Kruger effect make it easy for people to fall for deliberate or inadvertent lies in the news.

Author: Mack Clayton Shelley on Feb 28, 2019
 
Source: The Conversation
They said it, but is it true? EQRoy/shutterstock.com

Why do people make such poor decisions about politics? Why are they so often distracted by lies, irrelevant alternatives and specious arguments?

Politicians use and abuse statistics, and fabricate them outright when it suits their purposes. Contemporary examples of deliberate or inadvertent misuse of data are easy to find across the political spectrum, from the Trump administration’s claim that U.S. border officials detained “nearly 4,000 known or suspected terrorists” last year at the Mexican border to U.S. Rep. Alexandria Ocasio-Cortez’s December tweet asserting that “66 percent of Medicare for All could have been funded already” with the money lost in the Pentagon’s accounting errors.

The notion of politically motivated lying with numbers has been around for a long time, dating back at least to Mark Twain, who in 1906 attributed the phrase “lies, damned lies and statistics” to British Prime Minister Benjamin Disraeli. Many others claim parentage of the phrase or are given credit for coining it.

I have spent 40 years teaching and publishing in political science and statistics, focused on helping students become critical thinkers. I believe that politicians can get away with lies so easily because the public is not trained to critically consume statistical information or to defend against other (dis)information that is deliberately designed to mislead.

1. Lack of statistical skills

It’s difficult to be a critical consumer of statistical information, because that requires the ability to process numeric data in context.

Many Americans are not good at processing numeric information and consequently may make poor decisions. People who are more numerate are less susceptible to being led to a false conclusion, are less affected by their mood, and are more aware of the levels of risk associated with actions and decisions.

For example, if you flip four coins in a row, what’s the probability of getting exactly two heads? Most people guess 50 percent. Figuring out that the answer is actually 37.5 percent takes some work and is not intuitive. Neither is recognizing that a run of nine consecutive tails does not make the tenth coin flip any more likely to come up heads.
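The 37.5 percent figure comes from counting outcomes: four fair flips produce 2^4 = 16 equally likely sequences, and 6 of them contain exactly two heads, so the probability is 6/16. For readers who want to check the arithmetic themselves, here is a minimal sketch in Python that counts the outcomes exactly and double-checks the answer with a quick simulation:

```python
import random
from math import comb

n_flips = 4
target_heads = 2

# Exact answer: C(4, 2) = 6 of the 2**4 = 16 equally likely
# sequences contain exactly two heads, so 6/16 = 0.375.
exact = comb(n_flips, target_heads) / 2**n_flips
print(f"Exact probability: {exact:.3f}")  # 0.375

# Quick simulation as a sanity check.
trials = 100_000
hits = sum(
    sum(random.random() < 0.5 for _ in range(n_flips)) == target_heads
    for _ in range(trials)
)
print(f"Simulated estimate over {trials:,} trials: {hits / trials:.3f}")
```

The intuitive guess of 50 percent confuses “two heads out of four flips” with the 50-50 odds of a single flip.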

In the same way, it’s easy for people to believe the tweet from President Donald Trump, based on outdated information from the Texas secretary of state, that “58,000 non-citizens voted in Texas, with 95,000 non-citizens registered to vote. These numbers are just the tip of the iceberg. All over the country, especially in California, voter fraud is rampant. Must be stopped. Strong voter ID! @foxandfriends.”

In reality, proven cases of voter fraud are rare, and voter lists are often inaccurate about current citizenship status. A scary-sounding statement that “58,000 non-citizens voted” should trigger immediate head-scratching and fact-checking; as it turned out, most of the allegedly illegal votes were cast by people who had since become citizens and were eligible to vote.

2. Letting emotions get the better of you

It’s easy for politicians to take advantage of what Nobel laureate Herbert Simon called “bounded rationality”: the tendency for decisions to be shaped by emotions, preconceived notions and things people think they know but really don’t.

What’s more, political figures can get away with saying things that don’t square with the facts, because it would take too much effort for the average person to fact-check everything for accuracy.

Coupled with this is the psychological process of “confirmation bias.” When you hear, read or are told something that contradicts what you already believe, you tend to block it out, filtering away ideas, facts or data that don’t jibe with your current beliefs.

Confirmation bias can apply to a wide array of issues, including gun control, sexual double standards and more.

Emotions can sway people to believe untrue statements. Worawee Meepian/shutterstock.com

3. Overestimating your own knowledge

This brings us to the Dunning–Kruger effect.

People with lesser abilities tend to overstate their level of knowledge and understanding. If I see a bad call by a football referee, my first reaction might be to say that I could have gotten that call right, even though I’m not trained as a referee and wouldn’t have a clue what call to make on most plays.

This perception of illusory superiority comes from people not being equipped to realize that they don’t know what they don’t know. That in turn makes it all the more difficult to separate “fake news” from reality. In a 2017 study, researchers Chris Vargo at the University of Colorado and Lei Guo and Michelle Amazeen at Boston University showed that false reports are instrumental in setting the news agenda for partisan media, despite fact-checkers’ efforts. Other research shows that most Americans who see fake news believe it.

Combined with a general lack of knowledge about how politics works, these mental habits make it tough for anyone to understand the facts about major issues. Elected public officials are hired by the electorate precisely because they are good at saying things voters like to hear. They are rewarded for what they say, rather than for doing the right thing.

Mack Clayton Shelley, II does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
