Why the 2016 election may be the worst yet for journalism -- and what Facebook can do to fix it
April is a big month for journalists who like to think and talk about journalists, what with the Pulitzer finalists announced last week. But yesterday was also a big day for media navel-gazers, as Pew published its annual State of the News Media report. The sprawling study includes a number of fascinating takeaways about the audiences, revenues, and user habits of the modern news economy -- among them, that more people than ever look to Facebook -- and their phones -- to discover and consume news.
It's worth debating whether these and other trends highlighted by the study bode well for building a well-rounded and knowledgeable electorate ahead of the 2016 presidential race. After all, big national elections are as good a barometer as any for determining whether the news media -- and the new digital distribution channels -- are properly serving readers. And while it may sound like mere Luddite-ism or get-off-my-lawn syndrome to argue this, the data suggests that despite the massive wealth of information on the Internet, the way audiences access and consume that information is probably making us dumber and worse at making smart voting decisions.
The report leads with the statistic that "39 of the top 50 digital news websites have more traffic to their sites and associated applications coming from mobile devices than from desktop computers." That was probably an inevitable development, and who really cares anyway -- a screen's a screen, right?
Wrong. On 25 of the top 50 news sites, desktop readers spent more time per article than mobile readers did. The opposite was true for only 10 of those sites, while the time spent was around the same on mobile and desktop for the remaining 15 sites. And so as more and more readers shift their consumption habits to mobile, more and more will simply read the scandalous headline and lede ("<Insert Name> Is the Most Corrupt Presidential Candidate of All Time!") and miss the body of the story where the list of misdeeds is revealed to be pretty bland and garden-variety, at least when compared to the corrupt repertoire of a Warren Harding or Ulysses S. Grant. It's simply too easy on mobile to become distracted by the barista ready to take your order or the television in the background or the next piece of content within the social app you used to discover the story.
Speaking of social apps, almost half of web-using adults get their news from Facebook, making it the second-most-common news source, just a percentage point behind local television news. That number is even higher among Millennials, 88 percent of whom say they use Facebook "regularly" to find news.
On one hand, using Facebook as a news source -- one that pulls together stories from all over the web -- may be preferable to the habits of Americans who get their news solely from one highly partisan source like Fox News. That said, social media can also have a polarizing effect on a populace when the media it consumes is filtered by friends and Facebook algorithms. In a widely read 2013 New York Times column, David Carr wrote, "Unless you make a conscious effort to diversify your feeds, what you see in your social media stream is often a reflection, even amplification, of what you already believe. It’s a choir that preaches to itself." That's not the best recipe for a well-rounded electorate. And while the data shows that moderate and middle-of-the-road voters see plenty of diverse opinions on social media, that is far less true for "consistently liberal" or "consistently conservative" social media users, who Pew says "have a greater impact on the political process than do those with more mixed ideological views. They are the most likely to vote, donate to campaigns and participate directly in politics."
Even if you do diversify your feeds, algorithms like the one Facebook uses for its News Feed are there to make sure you see lots of posts already aligned with your interests or worldview. The same holds true for the algorithms used by search engines like Google -- which, according to a recent study, are trusted as news sources more than traditional publications and social media are.
And finally, the rise of Facebook as the predominant news source is becoming problematic for many smaller, less well-capitalized outlets. Facebook has continued to de-prioritize pages belonging to media sites when presenting content in its News Feed. And I fear this is leading to a "pay-to-play" system in which the biggest outlets -- either because they can afford to run a high volume of broad promoted campaigns on Facebook or because they strike explicit content deals with the platform -- will be among the only news sources to break through the noise. Meanwhile, those smaller outlets -- precisely the ones that often have the bravery or verve to investigate the most powerful political players -- may find that their work falls on deaf ears, at least on Facebook, where everybody's beginning to get their news anyway.
There is, however, one way that Facebook could greatly improve the quality of political discourse ahead of the 2016 election. I have a friend who, nearly every day, texts me URLs to outrageous stories -- most of them found on Facebook -- that are 100 percent untrue. Like, Bill Nye-getting-arrested-for-"manufacturing-marijuana" untrue. And then of course there are plenty of other articles -- some of them on very reputable sites -- that are filled with falsehoods. That's to say nothing of the misleading memes and image macros -- like the infamous Detroit Free Press photo of a crowd pushing through a door to pick up federal housing assistance forms -- that have been photoshopped and meme-ified to falsely represent #BlackBrunch protesters and other demonstrations, usually to racist effect.
Meanwhile, like it or not, Facebook is positioning itself as a major gatekeeper of content, removing images it deems inappropriate and often doing so with a measure of hypocrisy and overblown Puritanism. But if Zuckerberg and his algorithms are going to fall on a fainting couch over bare nipples, can't they do something to police false content? Like, ban accounts that repeatedly share false information? Or remove false image macros and stories? Or at least send them to social purgatory by greatly de-prioritizing them in News Feeds? Or, as I've proposed before, maybe Facebook could assign an "Impact Score" -- not unlike the metric used in academia to gauge the trustworthiness of a journal -- for news organizations, to let less savvy readers know when they're being fooled?
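To make that last proposal concrete, here's a toy sketch -- entirely my own invention, not anything Facebook has built, with every outlet name and number below purely hypothetical -- of how such a score could be derived from an outlet's fact-checking track record and used to de-prioritize, rather than remove, repeat offenders:

```python
# Hypothetical "Impact Score" sketch: score an outlet by the share of its
# fact-checked stories that held up, then scale story ranking accordingly.

def impact_score(checked_true, checked_total):
    """Fraction of an outlet's fact-checked stories found accurate (0.0-1.0)."""
    if checked_total == 0:
        return 0.5  # no track record yet: start neutral
    return checked_true / checked_total

def rank_weight(engagement, score, floor=0.1):
    """Scale a story's engagement-based rank by its outlet's score,
    never dropping below a small floor (purgatory, not censorship)."""
    return engagement * max(score, floor)

# Two equally "viral" stories end up ranked very differently.
outlets = {
    "reputable-daily.example": impact_score(95, 100),  # 0.95
    "hoax-factory.example": impact_score(2, 50),       # 0.04, clamped to floor
}
for name, score in outlets.items():
    print(name, rank_weight(1000.0, score))
```

The floor matters: the point is to bury serial fabricators beneath the noise, not to hand Facebook a delete button for speech it dislikes.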
Again, the idea of Facebook as content cop is not a happy one. But it's also the reality of the new media landscape. And with news trends moving toward the fast consumption of content that's often filtered through an echo chamber, Facebook could serve as a much-needed bullshit-eradicator in next year's presidential election. Because lord knows the bullshit will be flowing more thickly than ever.
[illustration by Brad Jonas]