
Facts about fake news’s influence on U.S. elections and the fight against misinformation


With another presidential election around the corner, a question from the last still lingers: Did fake news help Donald Trump beat Hillary Clinton in the race for the White House?

It depends on whom you ask, and how you define “fake news.” Trump, for example, tends to lob the term at mainstream media outlets when he doesn’t like what they report. And in everyday speech, it has become a catchall term for conspiracy theories and unfounded speculation.



Researchers who have been studying the 2016 election consider fake news to be any piece of misinformation intended to sway and confuse the public. It often spreads via websites designed to help misinformation circulate as widely as possible.

So far, researchers have found that fake news is not as influential as they had feared. To the extent it is mistaken for actual news, the people who are hoodwinked tend to lean conservative.

Here’s a look at some of the real facts about fake news.

Are people any good at spotting fake news?

Yes. Both Democrats and Republicans were able to distinguish between actual news and spin from hyper-partisan sources. That’s according to a report last month in the Proceedings of the National Academy of Sciences that examined Americans’ ability to recognize misinformation.


Researchers asked 1,980 laypeople from around the country to rank the trustworthiness of 60 websites. Some were mainstream news sites, like Fox News and CNN; others, like Occupy Democrats and Breitbart News Network, published mostly hyper-partisan stories that sometimes veered into misinformation. The survey-takers’ responses were compared with the ratings of eight professional fact-checkers.

When it came to trusting various news sources, Republicans were more suspicious of mainstream outlets. The researchers suggested that Trump’s criticism of mainstream news could be partly responsible for that skepticism.


David Lazer, who studies the internet’s influence on citizens and their elected officials, said it’s all but impossible to know how exposure to all kinds of misinformation — made-up headlines, photos that have been doctored or taken out of context, memes based on moments that never really happened — affects people when they see it in their Facebook or Twitter feed but don’t click on or otherwise engage with it.

“There is a vast menagerie of misinformation,” Lazer said. “It’s particularly insidious because it undermines the legitimacy of the mainstream news.”

Who has been fooled by fake news?

Not everyone.

Some news consumers are more gullible than others, and age may have something to do with it.

A study published in January in the journal Science Advances examined the behavioral data of 1,191 Facebook users who gave researchers access to their account activity so they could see the types of links that were shared on the platform. Researchers found that fewer than 10% of Americans shared a story from a fake news domain, and those who did were more likely to be over the age of 65. In fact, senior citizens shared fake news articles on Facebook at a rate that was seven times higher than that of young adults between the ages of 18 and 29. This raises questions about the digital media literacy gap between older and younger Americans, the study authors said.

Political ideology may be a factor as well. The study also revealed that conservatives were more likely than moderates or liberals to share fake news articles in 2016. People who were deemed “conservative” shared an average of 0.75 fake news links over the last five weeks of the election season, and those who were “very conservative” shared an average of one fake news link during that period. Meanwhile, those who were “very liberal,” “liberal,” or “moderate” all shared an average of fewer than 0.1 such links in that window. It’s possible that conservative Facebook users were simply exposed to more fake news articles than their counterparts, but shared them at the same rate, the study authors said.

Fake news flooded social media before the 2016 presidential election.
(John Locher / Associated Press)

Another report about people’s behavior on Facebook found that clicking on links to fake news stories, or navigating to the websites where they were posted, was essentially limited to the 10% of Americans with the most conservative media diets.

Compared to 2016, “even fewer Americans were exposed to fake news in 2018,” said University of Michigan political scientist Brendan Nyhan, who worked on the report. In his view, that’s “an indication that Facebook doesn’t seem to be playing the same role in enabling the distribution of fake news.”

The trend was even more apparent on Twitter. Lazer and others analyzed more than 16,000 Twitter accounts over a four-month period in 2016. They found that just 0.1% of users were responsible for sharing nearly 80% of fake news on the social network. The spread of fake news was mostly concentrated among conservative users, according to their study published in January in the journal Science.

That’s in line with research conducted by a team from UCLA in 2017. They examined how personality traits and thinking styles affect people’s intake and acceptance of information, and found that conservatives are more likely than liberals to believe things that aren’t true when the possible consequences are negative, or suggest possible danger.

What’s the role of bots and trolls?

Bots and trolls accelerate the spread of misinformation. Even if they don’t convince people that something fake is real, they can sow doubt about things that shouldn’t be in question.

For instance, an examination of online discourse about vaccine safety found that Twitter bots and Russian trolls promoted arguments both for and against immunizations. That created a false equivalency that legitimized the thoroughly discredited view that vaccines are dangerous, according to a report last year in the American Journal of Public Health.



In this case, bots not only scare some people away from vaccines, they also reduce people’s overall confidence in the healthcare system, said study leader David Broniatowski, whose research examines how people make decisions that involve risk.

“Social media certainly allows some of these misconceptions to spread,” said Broniatowski, a professor at George Washington University. “It amplifies what would really be a fringe message, and makes it mainstream when it’s not.”

A 2018 study published in Nature Communications examined how social bots spread hundreds of thousands of fake news articles from May 2016 to March 2017. Researchers who tracked 389,569 unsubstantiated or debunked claims from that period found that bots were largely responsible for false information going viral.


Social bots tend to tweet and retweet fake news upon publication, a technique that amplifies a story almost immediately, the study found. The bots often tag influential people, like journalists or politicians, increasing the likelihood that the fake story will be shared further. As the study authors noted, social media platforms are designed to prioritize engaging content, and that doesn’t always equate to trustworthy posts.


Twitter has removed millions of bot and fake accounts since 2017. But there’s no guarantee that more will not surface in the future.

And none of it means that bots are entirely responsible for the rise of fake news. A 2018 study in Science found that false information was 70% more likely to be retweeted than the truth. That’s because false statements were more unusual and unfamiliar than facts, and that novelty elicited stronger emotional reactions, the study authors said.

Can fact-checking neutralize fake news?

In theory, perhaps. But in reality, the people most in need of a fact-check are not necessarily the people that fact-checkers are reaching, according to research by Nyhan and others.

In an online survey of 2,525 Americans, researchers found that none of those who visited a fake news site also came across a fact-check debunking its dubious claims. One in four of the survey-takers visited a site known to post misinformation, and 38% of respondents were not familiar with fact-checking. (Study participants submitted data to YouGov and allowed researchers to track their online activity anonymously.)

In a sense, it’s a natural extension of the filter bubbles we tend to create by reading and sharing stories that align with our viewpoints, while ignoring or dismissing stories that don’t.

“Individuals who engage in high levels of selective exposure to online news in general are also differentially likely to visit fake news websites favoring their preferred candidate,” the study authors wrote. “In general, fake news consumption seems to be a complement to, rather than a substitute for, hard news — visits to fake news websites are highest among people who consume the most hard news and do not measurably decrease among the most politically knowledgeable individuals.”


Experts still have a long way to go to rid the planet of fake news. In the meantime, diversifying your sources of information could improve your odds of identifying any misinformation you happen to come across, experts said.
