There are worse things than manipulated 'Trending' stories lists
The only thing worse than an allegedly manipulated 'Trending' list? An unfiltered one
The fuss over a social-networking site possibly manipulating its list of trending topics reached a predictably silly extreme yesterday when the chair of the Senate Committee on Commerce, Science, and Transportation sent a sternly worded letter demanding answers.
While we wait to see how Facebook (FB) founder, chairman and CEO Mark Zuckerberg responds to the inquiry from Sen. John Thune (R.-S.D.) over Gizmodo’s report that Facebook staffers suppressed right-wing news sources in its Trending list, here are a few things to keep in mind:
Facebook’s Trending list needs help
Facebook’s Trending list—toward the top right of the News Feed in a desktop browser, shown in its mobile apps when you tap the search field—has historically functioned as a real-time display of the poor taste of other Facebook users.
As I type this, the list in a test account and the one visible in my real account feature such info-morsels as Justin Bieber’s new tattoo and the latest uttering by Phil Robertson of “Duck Dynasty” about who should use which bathrooms.
I am about as interested in Bieber’s tattoos and Robertson’s takes on gender identity as I am in their grasp of nuclear physics.
And yet Facebook drives an enormous amount of news readership among younger users: A 2015 Pew Research Center survey found that 61% of Americans born between 1981 and 1996 get political news via Facebook in a given week.
(Facebook also maintains a less-inane trends database called Signal, but only journalists can sign up for it. I should use it more often.)
Without some oversight, Trending would be worse
Facebook’s Trending list, however, looks like The Economist next to the stuff in my News Feed, where I routinely see people share “news” that is not just vapid but fake, and deliberately so.
The Pulitzer Prize–winning PolitiFact has an entire category for “Facebook Posts.” That fact-checking site has rated 36% of them “Pants on Fire” made-up, 18% outright “False,” and 14% “Mostly False.”
The problem of hoaxes going viral on Facebook is bad enough that the Washington Post launched a weekly “What Was Fake” feature to debunk this nonsense. Less than two years later, my former employer gave up trying because people sharing these hoaxes ignored its help.
Do you want that kind of popular garbage in Trending? Facebook apparently thinks you don’t, most likely because you’d recoil in horror from such an honest mirror.
Filtering demands human judgment. But whose?
Gizmodo’s story, citing testimony from two former Facebook contractors, reports that the social network “routinely suppressed news stories of interest to conservative readers.”
Gizmodo technology editor Michael Nunez added that more of these “news curators”—whom he described in an earlier post as overworked and prone to burnout—said they’d injected other topics into the list and kept stories about Facebook itself out of this spotlight.
Facebook’s response, less than a day later: We do no such things. Search vice president Tom Stocky posted that Trending starts with an algorithmically generated list of stories that human curators then filter to remove “junk or duplicate topics, hoaxes, or subjects with insufficient sources.”
Stocky added: “Our reviewers' actions are logged and reviewed, and violating our guidelines is a fireable offense.” He did not share any of those logs.
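If it helps to picture the workflow Stocky describes—an algorithm proposes topics, humans weed out the junk, and every action gets logged—here is a rough sketch of what that could look like in code. To be clear, this is my own illustration, not Facebook’s system; the company hasn’t published its Trending code, and every name below is invented.

```python
# Purely illustrative sketch of "algorithm proposes, humans filter, actions get logged."
# None of this is Facebook's actual code; the names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    source_count: int             # how many distinct outlets are covering it
    flagged_as_hoax: bool = False

def curate(candidates, reviewer, min_sources=3):
    """Filter an algorithmically generated candidate list; log every removal."""
    approved, seen, log = [], set(), []
    for topic in candidates:
        if topic.name.lower() in seen:
            log.append(f"{reviewer} removed duplicate: {topic.name}")
        elif topic.flagged_as_hoax:
            log.append(f"{reviewer} removed hoax: {topic.name}")
        elif topic.source_count < min_sources:
            log.append(f"{reviewer} removed thinly sourced: {topic.name}")
        else:
            seen.add(topic.name.lower())
            approved.append(topic)
    return approved, log

# Example: two legitimate stories survive; a hoax and a duplicate do not.
candidates = [
    Topic("Senate letter to Facebook", 12),
    Topic("Celebrity 'death' rumor", 8, flagged_as_hoax=True),
    Topic("Senate letter to Facebook", 12),
    Topic("Local weather oddity", 4),
]
trending, audit_log = curate(candidates, reviewer="curator_01")
```

The point of the audit log in that sketch is the same point Stocky was making: if reviewers’ removals are recorded, a pattern of bias would, in theory, be visible to anyone who could see the records.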
Filtering for truth can set back sites that publish falsehoods
Here’s where the Gizmodo report and Stocky’s rebuttal could overlap. Nunez’s post cites Breitbart.com and Drudge Report as news sources that suffer from Facebook’s filtering—but they also often publish untruths.
Well before its recent turn as Donald Trump’s cheering section, Breitbart was notorious for running nonsense like a report that former Sen. Chuck Hagel (R.-Neb.) had spoken before the nonexistent group “Friends of Hamas.”
Drudge, meanwhile, uncritically links to conspiracy-theory sites like Infowars and has been known to use a photo of Palestinians with a headline about Mexican illegal immigration because… they’re all brown anyway?
But now that the Republican Party is set to nominate Donald Trump, a man with a habit of making stuff up and recycling other people’s lies, Facebook’s ambition to keep Trending a truth-first zone may be unsustainable.
A little transparency could have spared us much of this angst
Facebook’s documentation of Trending captures none of this nuance. That help page blandly states that “topics you see are based on a number of factors including engagement, timeliness, Pages you've liked and your location.”
This fits a larger pattern of Facebook being deliberately opaque about how it works: how widely any one thing we share travels and how our News Feeds get assembled remain mysteries.
That’s a striking contrast with Twitter and Google, which face the same problems of spam and abuse but have managed to provide a little more clarity about how their systems work.
With that history at Facebook, it’s not much of a surprise that people now see a conspiracy afoot in a place where nobody may have any time to conduct one.
Email Rob at [email protected]; follow him on Twitter at @robpegoraro.