By getting into the news business, Facebook opened itself up to a new controversy
To lure more users and advertising dollars, Facebook has increasingly assumed the role of a news organization by curating and publishing articles -- and to great effect. Four in 10 U.S. adults now get their news from the social media giant.
But with recent accusations that Facebook suppressed news from conservative-leaning outlets, the company is learning there are consequences to being everything to everyone.
The allegations, laid out in Gizmodo stories over the last week citing unnamed former Facebook contractors, expose one of the country’s leading corporate juggernauts to political inquiries it would much rather ignore. They also raise questions about how Facebook designs and applies its algorithms, something it’s loath to answer in the face of competition.
On Tuesday, the U.S. Senate Commerce Committee sent a letter to Facebook Chief Executive Mark Zuckerberg asking him, among other things, to provide a full account of how the company operates its trending topics feed -- a list of popular news stories, personalized to individual tastes, that appears in the upper right side of users’ Facebook pages.
“Facebook must answer these serious allegations and hold those responsible to account if there has been political bias in the dissemination of trending news,” Sen. John Thune (R-S.D.), chairman of the Commerce Committee, said in the letter. “Any attempt by a neutral and inclusive social media platform to censor or manipulate political discussion is an abuse of trust and inconsistent with the values of an open Internet.”
Facebook holds itself up as a passive player in the media ecosystem, one that delivers articles, posts and videos to users based on the things they and their friends care about. People had always been involved in writing the largely secret software that defined “care.” But the launch of trending topics in 2014 pushed Facebook toward greater human involvement in deciding what users would encounter.
The former Facebook contractors told Gizmodo that they were instructed to select articles from preferred media sites such as the New York Times, Time and Variety and downplay right-leaning news sites, conservative topics or news about Republican Party leaders.
In the wake of the allegations, some conservatives said they always suspected a bias on the part of the company and others said they never expected Facebook to be neutral. But plenty shamed Facebook for altering what the company had described as a level playing field that connects people to what matters to them most.
“Facebook and social media have been championed as platforms without filters -- places where news and information can reach citizens directly,” said Vincent Harris, a Republican media strategist and expert in digital campaigning. “Facebook has already made it nearly impossible for nonpaid content to reach a user’s feed and now they want to act as a news god? It’s a slippery slope.”
Americans have gotten used to the idea of going to certain news organizations for certain perspectives -- say, Fox News for a conservative viewpoint or MSNBC for a liberal one.
Facebook’s entry into the news business is worth watching, not only because of its massive reach, but also because it’s unclear what responsibility it has to divulge political leanings -- if indeed it has them.
“There’s an issue with transparency,” said Kjerstin Thorson, an assistant professor at USC Annenberg’s School of Journalism. “They have a responsibility to tell us what kind of news provider they are. There is a skepticism around any news media. The difference is Facebook [isn’t] telling us what the news is. It’s trying to tell us what we’re talking about.”
Facebook Vice President Tom Stocky disputed the allegations in a post Monday, saying the company has guidelines in place to ensure neutrality.
“These guidelines do not permit the suppression of political perspectives,” Stocky said. “Nor do they permit the prioritization of one viewpoint over another or one news outlet over another.”
Facebook isn’t the first online destination to run into challenges presenting news.
Blogging service Medium, aggregation app Google News and many other platforms have also faced scrutiny over a lack of transparency. Stories appear on Google News based on computer-generated rankings, but humans help decide what sources are up for consideration.
Many more fast-growing apps want to become news sources too, which experts say will spur more allegations of bias.
“All of these different platforms are going to produce their own special controversies for what it means to produce news,” Thorson said. “Each one gets tangled up in what’s their responsibility, what’s the ethics.”
Instagram, the photo-sharing app owned by Facebook, has an editorial team that writes about the work of users, including chefs and night-sky photographers. Apple’s editorial team decides where things go in its News app, which includes stories from the Wall Street Journal, Buzzfeed and other publications. Last year, Snapchat hired a reporter away from CNN to cover politics.
Other companies’ strategies, for now, are vastly different from those of Facebook, which maintains that it’s not actually producing any news content. But with 1.5 billion users worldwide, Facebook carries more influence than anyone else.
About 36% of U.S. adults say Facebook is among several important ways that they access news, according to Pew Research Center data from last year.
Among people ages 18 to 29, about 82% of Republican survey respondents and 81% of Democrats have a Facebook account, according to a Harvard University Institute of Politics report.
Still, there’s debate about whether Facebook’s curation efforts make a difference.
Facebook researchers have studied whether the company is succeeding in presenting multiple viewpoints to users, concluding that the software that determines which stories appear in the news feed has less of an effect than people’s own choices about what to click on.
But outsiders say Facebook is shirking its responsibility and could do far more to ensure neutrality. At the very least, they argue, the company could acknowledge that the team behind trending topics is doing the work of journalists.
“Good on Facebook for highlighting [Black Lives Matter] into their news section,” Snapchat researcher and social media theorist Nathan Jurgenson said on Twitter. “Bad on them for pretending like that’s not journalism.”