Dec 29, 2014 · 1 minute

Facebook has apologized for the insensitivity of its "Year in Review" feature, which relied on algorithms to collect a year's worth of events, status updates, and photographs into a single presentation, after it was criticized for showing images of deceased family members.

Eric Meyer, the user who first wrote about the "Year in Review" feature's morbid callousness, has in turn apologized to Facebook for publishing his blog post without making clear that the company had tried to console him for the algorithmic fuck-up.

But his original point -- that companies should account for all their users instead of building products for an idealized version of the human condition -- still stands. It might even be more relevant now that it's clear Facebook didn't know of the problem.

The algorithm's decision to highlight a photo of a deceased family member is akin to a gun misfiring: the damage was inflicted by a machine, but the machine only failed because of the human error built into it.

This means even the smallest oversight, when built into an algorithm that reaches more than 1 billion people, can wreak havoc on some Facebook users, even though the company only intended to offer a convenient year-end highlights reel.

Facebook's engineers are essentially being asked to work knowing that a simple mistake on their part can harm untold numbers of people. The fact that the harm is inflicted by a digital machine helps deflect the blame, but the fault still lies with the people who built it.

How can anyone be expected to account for the individual tragedies of more than a billion people? A machine can only do as it's told, and it's clear Facebook can't always know when its instructions will accidentally harm users.

[illustration by Brad Jonas]