May 11, 2015 · 2 minutes

Facebook is testing a new search tool that allows its users to find links to news articles and other content without ever having to leave its mobile applications.

TechCrunch reports that the tool draws on information gathered from 1 trillion posts -- data the company has kept out of reach of rivals like Google. The feature is limited to a small number of users in the United States, and for the moment it appears to be exclusive to people using Facebook's iPhone app.

Facebook is said to rank articles based on how often they've been shared, how recently they were posted, and other factors that haven't been revealed. It's not clear whether the company will also surface links that haven't yet been shared on its service, or whether original content will still have to be found elsewhere.

The feature complements another Facebook initiative: asking publishers like the New York Times to host content on its platform instead of on their own websites. Just as the company wants to make sure publishers are trapped inside the News Feed, it wants to ensure readers are all but confined to its mobile applications.

This will give Facebook untold control over how information is transferred from one person to the next. While that's a scary thought, given that Facebook is a private company that's really only obligated to give investors healthy returns, it could have some benefits. As Pando's David Holmes explained last month:

[T]he idea of Facebook as content cop is not a happy one. But it’s also the reality of the new media landscape. And with news trends moving toward the fast consumption of content that’s often filtered through an echo chamber, Facebook could serve as a much needed bullshit-eradicator in next year’s presidential election. Because lord knows the bullshit will be flowing more thickly than ever.

Facebook wouldn't only be able to monitor publishers. This search tool could also make it harder for Facebook users to share bullshit stories without even a kernel of truth. This could be subtle (prioritizing accurate stories over bullshit ones in search results) or more overt (warning users about an article's veracity).

That would create a system ripe for abuse, of course. Facebook could bury stories about its own wrongdoings, warn users about content that turns out to be true, and otherwise interfere with the free flow of information. One of the perks of being a human, after all, is the ability to be absolutely wrong in your beliefs.

But there could be some benefits to bringing publishers and readers together and making it difficult for either party to escape. Facebook's already taking control of the media, and as Holmes wrote, it's better to be shackled with golden handcuffs than to be tossed into a dungeon without getting much in return.

[illustration by Brad Jonas]