
It’s Time Facebook Came Clean About How It Picks What You Read

Victor Luckerson, a journalist based in Tulsa with a biweekly newsletter about neglected Black history called Run It Back, is the author of Built from Fire: The Epic Story of Tulsa's Greenwood District, America's Black Wall Street

Technology companies generally strive to create easy-to-use experiences. Algorithms, most tell us, are the key to a seamless digital future. At Facebook, algorithms determine which of your friends’ posts will appear at the top of your News Feed. They’re the reason a YouTube video you post today has less reach than a clip uploaded directly to Facebook’s native media player. Algorithms are why Facebook’s “Trending” box may have pegged you (O.K., me) as a person who would consider Justin Bieber’s new facial tattoo one of the most important news items in the world on May 9.

Engineers would have us believe that algorithms are altruistic bits of code that make our lives better. Google defines them as “computer programs that look for clues to give you back exactly what you want.” But the truth is more complex. Facebook this week is being forced to grapple with that complexity after the technology news site Gizmodo alleged that the social network’s “Trending” box is subject to a significant amount of human curation, and, therefore, bias. According to unnamed former members of the Trending team interviewed by Gizmodo, some curators purposefully excluded articles from conservative outlets such as Breitbart and RedState from appearing in the module, even when they were popular enough to earn the trending designation.

These curators, contract workers who often had journalistic backgrounds, were essentially exercising editorial judgment to determine which stories were newsworthy enough to warrant inclusion in the Trending feature, the piece alleges. The story surprised many readers who had taken “Trending” at face value, as an impartial reflection of what’s popular on Facebook. Responding to the Gizmodo story, Facebook said it doesn’t prohibit any news outlet from appearing in the module and uses rigorous guidelines to ensure “consistency and neutrality.”

That’s fine. Determining which information is worthy of wide dissemination is a role news organizations have played for centuries. But Facebook doesn’t explain anywhere in or around the Trending box that humans are selecting some of the stories that receive such prominent placement. Nor does the company explain that the “Trending” items are tailored to each user’s interests rather than being a ranked list of the biggest news stories across Facebook on a given day. The result nudges users to place a disproportionate amount of trust in an algorithm and an unknown team of curators whose processes are completely opaque. All of this is happening at a time when traditional media outlets are held accountable by their readers more than ever, thanks in part, ironically, to social media platforms like Facebook and Twitter.

It’s long past time for Facebook to be more transparent about the methods it uses to amplify information. The Menlo Park, Calif., company, like many tech giants, wants to be seen as a nonpartisan conduit that simply delivers information from one user to another. Here’s Adam Mosseri, Facebook’s product management director for News Feed, explaining to TIME last year how the company organizes information in the feed:

“The thing that’s important to remember is we can’t start an editorialized feed. That doesn’t mean we don’t have values, but there’s a line that we can’t cross, which is deciding that a specific piece of information–be it news, political, religious, etc.–is something we should be promoting. That’s an editorial point of view the way a newspaper has, so we can’t do that.”

The News Feed and the Trending box are distinct products with separate teams. But in its official response to the Gizmodo story, Facebook took a similar tack in trying to dodge accusations of bias.

“We take allegations of bias very seriously,” a spokesperson said in an emailed statement. “Facebook is a platform for people and perspectives from across the political spectrum… These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.”

But as we come to better understand how technology platforms can shape the way people feel, spend and vote, the companies behind those platforms will have a tougher time arguing that they are inherently neutral. Stories in the News Feed are ranked using thousands of factors to produce an opaque assessment of relevance, but Facebook won’t tell users what those factors are. The potential for bias in Trending is even more obvious, with certain stories added to or removed from the box by young journalists “primarily educated at Ivy League or private East Coast universities,” according to Gizmodo.
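For readers who want a concrete picture of what “ranked using thousands of factors” means in principle, here is a minimal sketch in Python. Every factor name and weight below is invented for illustration; Facebook discloses neither its real signals nor how they are combined, which is precisely the point.

```python
# Hypothetical illustration of feed ranking as a weighted sum of signals.
# The factor names and weights are invented for this example; Facebook does
# not disclose the actual features or weights its News Feed algorithm uses.

FACTOR_WEIGHTS = {
    "friend_affinity": 0.5,     # how closely you interact with the poster
    "engagement": 0.3,          # likes, comments and shares so far
    "recency": 0.15,            # newer posts score higher
    "native_media_bonus": 0.05, # e.g., video uploaded directly to the platform
}

def relevance_score(post):
    """Collapse a post's signals into a single opaque 'relevance' number."""
    return sum(weight * post.get(factor, 0.0)
               for factor, weight in FACTOR_WEIGHTS.items())

posts = [
    {"id": "close friend's update", "friend_affinity": 0.9,
     "engagement": 0.2, "recency": 0.8},
    {"id": "viral native video", "friend_affinity": 0.1,
     "engagement": 0.9, "recency": 0.3, "native_media_bonus": 1.0},
]

# The feed shows the highest-scoring posts first; users never see the weights.
for post in sorted(posts, key=relevance_score, reverse=True):
    print(post["id"], round(relevance_score(post), 3))
```

Even in this toy version, small choices about which signals to include and how to weight them decide what rises to the top, and none of those choices are visible to the person scrolling the feed.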

Bias will emerge because Facebook is run and operated by human beings, not robots. (Sorry, M.) Facebook says it has guidelines to ensure neutrality in the Trending box. It should share those guidelines publicly to help users better assess whether Facebook’s values, as a disseminator of news, align with their own. More importantly, the company should indicate within its interface that both the Trending box and the News Feed are subject to curation. Not for the media-obsessed journalists who are following the Trending controversy closely, but for the other 1.5 billion users who log on to the site and may not be aware of the secret strings Facebook is pulling behind the scenes.



TIME Ideas hosts the world's leading voices, providing commentary on events in news, society, and culture. We welcome outside contributions. Opinions expressed do not necessarily reflect the views of TIME editors.