Facebook adds warning to "graphic content"
Facebook will now warn its users when they might encounter graphic content in the News Feed. Instead of automatically loading, like most other photos and videos shared to the service, this violent content must be interacted with in some way before it is displayed.
Content must meet a specific set of parameters before receiving the warning: it must be violent, not graphic in some other way; it must have been reported by Facebook's users; and it must not run afoul of the content standards governing everything shared to the site.
This means that sexual content -- or really anything depicting a woman's naked body -- will continue to simply be removed from Facebook's service instead of being covered by this new warning feature, which will apply mostly to violent images shared because of their relevance to recent news.
In addition to warning some of its users about the content's graphic nature before allowing it to be viewed, Facebook will also prevent younger users from viewing the photos or videos at all, according to a statement the company gave to the Guardian.
This new warning system perpetuates the idea that violent imagery is somehow more acceptable than nudity or other forms of graphic content. It raises the question: why delete photos of women breastfeeding while letting videos depicting mass violence remain on the service? Isn't that a little backwards?
But, then again, Facebook isn't known for being entirely logical about what it does or doesn't allow on its service. Semi-nude women? Banned. "Female nipple bulges" in a New Yorker cartoon? Gone. A girl hunting rare animals? Poof. A person being shot? A-okay!