In the wake of its emotion manipulation scandal, Facebook updates research guidelines. But is it enough?
Few tech stories have united so many in outrage as the revelation that Facebook had conducted emotional manipulation experiments on users without their knowledge. The tests were designed to determine whether showing users more positive posts made them post more positive things themselves, and vice versa (it did). While some people brushed off the controversy as just another tech company running A/B product tests (a cornerstone of Silicon Valley product development), both the digital elite and non-tech, non-media people made vows, of varying degrees of seriousness, to "quit Facebook." At the time I wrote that the experiments showed us that Facebook is "even more powerful and unethical than we thought."
Now, over three months after the scandal broke, and after a series of half-hearted, mealymouthed non-apologies, Facebook has finally announced a formalized change in its guidelines surrounding "research."
"Over the past three months, we’ve taken a close look at the way we do research," Chief Technology Officer Mike Schroepfer writes in a blog post. "Today we’re introducing a new framework that covers both internal work and research that might be published."
That will include "clearer guidelines" surrounding what kind of research should be conducted, especially if it involves studying "particular groups of populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions)." Schroepfer doesn't go into much detail about what those guidelines will actually say, but I imagine they are designed to avoid not only public outcry, but also any potentially illegal discriminatory experiments that target certain ethnicities or other demographics. In any case, these more sensitive projects will be subject to internal and external review to determine whether they are worth the legal and PR headaches.
Facebook will also expand some of its training materials on research so that every employee is exposed to them. And finally, the company has launched a website dedicated to explaining individual research projects and methodologies to its users. One of the common themes of Facebook's "non-apology tour" in the wake of the controversy was that the company hadn't done anything wrong; it simply hadn't properly communicated the study to users. It's since done plenty of explaining, and that hasn't much dampened the outcry.
These are all positive steps. But the tone of the post doesn't so much suggest a comprehensive shift in how Facebook conducts research as it does a desire to merely cover its own ass. And Facebook isn't alone -- OKCupid co-founder Christian Rudder went so far as to brag about his site's softly manipulative research, proclaiming in a blog post, "We Experiment on Human Beings!"
What that means for users of practically any site is that it's time to embrace a new normal: As long as a company isn't running afoul of the law (or thinks it can get away with it anyway), it will run experiments on you that may involve manipulating your behavior. That could be something as innocent as placing an advertisement on different parts of the site for different users to determine where it gets the most clicks, or as intrusive as a dating site telling you to go on a date with somebody even though its algorithm thinks you're a bad match.
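For readers curious what the innocuous end of that spectrum looks like in practice, here is a minimal sketch of an ad-placement A/B test of the kind described above. Everything in it -- the variant names, the hash-based bucketing, the click counter -- is an illustrative assumption, not a description of Facebook's or anyone else's actual system.

```python
import hashlib

# Hypothetical ad-placement variants being tested against each other.
VARIANTS = ["sidebar", "feed", "banner"]

def assign_variant(user_id: str, experiment: str = "ad-placement") -> str:
    """Deterministically bucket a user so they always see the same placement."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

class ClickCounter:
    """Tally impressions and clicks per variant to compare click-through rates."""

    def __init__(self):
        self.impressions = {v: 0 for v in VARIANTS}
        self.clicks = {v: 0 for v in VARIANTS}

    def record(self, user_id: str, clicked: bool) -> None:
        variant = assign_variant(user_id)
        self.impressions[variant] += 1
        if clicked:
            self.clicks[variant] += 1

    def click_rates(self) -> dict:
        # Click-through rate per variant, skipping variants with no traffic.
        return {v: self.clicks[v] / self.impressions[v]
                for v in VARIANTS if self.impressions[v]}
```

The point of the deterministic hash is that each user's experience stays consistent across visits; the user never knows which bucket they landed in, which is precisely the quiet quality that makes even "innocent" experiments feel uncomfortable once disclosed.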
The best outsiders can do is work to keep these algorithms, which play an ever-greater role in our digital and non-digital lives, accountable. With that in mind, Facebook's increased transparency is a great thing -- tech firms and activists alike are figuring all this out as we go along, and so I welcome any opportunity for open dialogue over what digital communities consider acceptable levels of "manipulation." Just remember that whenever you use Facebook or any other algorithmically driven website, you're giving up a little bit of control. At least now Facebook is willing to smile and tell that to our faces.
[illustration by Brad Jonas]