Aug 22, 2014 · 3 minutes

Earlier this week, after the Islamic militants of ISIS beheaded journalist James Foley and posted a horrific video of his murder, Twitter suspended a number of ISIS-affiliated accounts. CEO Dick Costolo clarified the company's policy in a tweet, saying, “We have been and are actively suspending accounts as we discover them related to this graphic imagery” (though, apparently, accounts associated with major news organizations are exempt from this rule).

But while centralized networks like Twitter and Facebook can remove content or ban users with a virtual flip of a switch, not all networks function like this. Take Diaspora, for example. Instead of managing and storing data on servers owned by a single organization, Diaspora lets users set up smaller, local servers called "pods." Diaspora's administrators cannot police the content on these servers; only the person, or "podmin," who set up a particular pod can.

For proponents of the open Internet, and adherents of the "free and open source software" (FOSS) philosophy, that's a good thing. The Internet, many argue, is not something that should be controlled by a handful of large companies or governments, and products like Diaspora help ensure there will always be a place where the free flow of information is unhindered.

But there's a downside to this open Internet utopia, and we're seeing it right now as ISIS, spurned by Twitter and Facebook, has begun to adopt these localized Diaspora pods to promote its extreme messages and activities. And because Diaspora is working like it's "supposed" to, there's little the social network can do about it.

"diaspora* is a completely decentralized network which, by its nature, consists of many small servers exchanging posts and messages," Diaspora's creators wrote in a blog post. "There is no central server, and there is therefore no way for the project's core team to manipulate or remove contents from a particular node in the network (which we call a 'pod'). This may be one of the reasons which attracted IS activists to our network."

That doesn't mean Diaspora's team isn't concerned about ISIS activity on its platform, however. It's working to alert individual administrators when extremist propaganda is found on one of the pods, if for no other reason than that those administrators may face legal consequences for hosting that content. But the team also says it would go against the network's FOSS philosophy to attempt to "influence" these administrators in any way.

It's hard to ask much more of Diaspora, whose very design, by the intent of its creators, makes policing this content not only philosophically problematic but virtually impossible. Diaspora is a tool, not a centralized consumer product like Twitter. It's like blaming Tim Berners-Lee, the inventor of the web, for online child pornography. For its part, Diaspora has been cataloging the accounts belonging to ISIS members and supporters to make it easier for podmins to remove the content themselves, and the project claims that "all of the larger pods have removed the IS-related accounts and posts."

This is yet another example of what's become an ongoing narrative surrounding sites that host user-generated content. The more protections sites offer to preserve the open Internet and all of the good that comes with it, the more chances there are for people to post content that runs afoul of laws and public taste. And as this story shows, there is likely no way to keep ISIS off the social web entirely. But the good news is that as larger networks like Twitter and Facebook, along with cooperating Diaspora podmins, continue to apply and enforce standards on content like ISIS' propaganda, these extremists are forced to find ever more far-flung (and less popular) networks to push their violent, hateful message.

[Illustration by Brad Jonas]