Jul 28, 2014 · 4 minutes

Complaining about Facebook has become almost a sport at this point, but for a couple weeks in June and July, the disdain for the social network hit never-before-seen levels after it admitted to manipulating user emotions by tinkering with its News Feed algorithm.

The furor has largely died down, and most Facebook quitters I know have begrudgingly rejoined the service. But it's opened up a larger conversation about holding tech firms ethically and legally accountable for how they use algorithms or design to alter user behavior. Designing a website to encourage certain user behaviors, like clicking, is perfectly standard and a keystone of building user interfaces. Filling users' News Feeds with negative content just to see if it bums them out is something else entirely.

So what do we make of a recent blog post by OkCupid founder Christian Rudder titled "We Experiment on Human Beings!"? In the post, Rudder details how the dating site regularly runs experiments to gain insight into its users and to provide better matches. He outlines a couple of fairly innocuous experiments, like a "Love is Blind Day" on which profile pictures were obscured for a day in order to determine how much looks really matter to users.

But the experiment that has everyone full of Internet outrage today is one that involved telling two prospective lovebirds that they were 90 percent matches when in fact they were only 30 percent matches. Part of the objective was to determine whether OkCupid's matching algorithms actually worked. The other was to measure the power of suggestion when it comes to hook-ups -- unsurprisingly, users were 70 percent more likely to message a bad match if told that person was a good match.

(As an aside, I'm fairly certain I was part of this experiment -- how else to explain the date I went on with a woman who, despite being a 90 percent match, was an anti-science nut who believed that vaccines caused autism?)

In some ways, this experiment is even more disturbing than Facebook's. Facebook manipulated users but OkCupid outright lied to them. And while Facebook was hardly shy about treating its users like lab rats, having published its findings in an academic journal, OkCupid is almost giddy about it -- note the exclamation point after "We Experiment on Human Beings!"

So why am I, as a person who probably spent more time on OkCupid for a three or four month period last year than I did even on Facebook, not that bothered by any of this?

First off, let's be real for a second -- despite the company's best marketing efforts, there's no magic in OkCupid's algorithm. Matches are essentially the product of survey questions on everything from pets to politics to hygiene. The more answers you share, particularly on questions that matter immensely to you, the closer you will match. (Yes, I'm sure it's more nuanced than that, but at its heart, the data inputs appear to be fairly simple.) And barring a few deal-breakers like, say, not supporting gay rights or liking the Eagles, you will generally learn more about a person in three minutes of in-person conversation than from a list of survey answers. (There's also the fact that people totally finesse these answers to make themselves into the person they want to be, not the person they are.)
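To make the "no magic" point concrete: the mechanics Rudder has described publicly amount to each user answering questions, stating which answers they'd accept from a partner, and weighting how much each question matters; the match percentage is then the geometric mean of the two users' mutual satisfaction scores. Here's a minimal sketch of that idea -- the specific weight values and the toy three-question example are my own illustrative assumptions, and it ignores refinements like OkCupid's margin-of-error correction:

```python
from math import sqrt

# Illustrative importance weights (an assumption for this sketch):
# the bigger the weight, the more a question matters to you.
WEIGHTS = {"irrelevant": 0, "a_little": 1, "somewhat": 10, "very": 50, "mandatory": 250}

def satisfaction(my_prefs, their_answers):
    """Fraction of my importance-weighted points the other person earns.

    my_prefs: list of (acceptable_answers, importance) per question
    their_answers: the other person's answers, in the same question order
    """
    possible = sum(WEIGHTS[imp] for _, imp in my_prefs)
    if possible == 0:
        return 1.0  # nothing matters to me, so anyone satisfies me
    earned = sum(WEIGHTS[imp]
                 for (acceptable, imp), ans in zip(my_prefs, their_answers)
                 if ans in acceptable)
    return earned / possible

def match_percentage(a_prefs, b_answers, b_prefs, a_answers):
    # Geometric mean of how well each person satisfies the other.
    return 100 * sqrt(satisfaction(a_prefs, b_answers) *
                      satisfaction(b_prefs, a_answers))

# Hypothetical three-question survey: pets, politics, hygiene.
a_prefs = [({"dogs"}, "very"), ({"left"}, "mandatory"), ({"daily"}, "a_little")]
a_answers = ["dogs", "left", "daily"]
b_prefs = [({"dogs", "cats"}, "somewhat"), ({"left"}, "very"), ({"daily"}, "irrelevant")]
b_answers = ["cats", "left", "weekly"]

print(round(match_percentage(a_prefs, b_answers, b_prefs, a_answers)))
```

Note how one heavily weighted shared answer (politics, above) dominates the score -- which is exactly why agreeing on the handful of questions you mark as mandatory moves the needle far more than dozens of trivia questions ever could.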

So while it's certainly a breach of trust for OkCupid to play fast and loose with match percentages, the real-world consequences are pretty limited: no one's going to miss out on the love of their life or accidentally marry a psycho because OkCupid gave them a bad match.

But the biggest difference to me between the two experiments is that OkCupid is one of the few Internet companies for whom data collection is not an exercise to sell more ads (or sell you out to government authorities). It's a good faith exercise in gathering insight about human nature and, in truth, to make a product that better serves its users. More importantly, it's one that users knowingly sign up for.

Before Rudder decided to sell a book of his findings, OkTrends was one of the most fascinating data blogs on the planet, publishing charts and graphs that leveraged OkCupid's unique data set. It was probably one of the only "content arms" of a tech company that wasn't entirely awful and self-serving. On the other hand, while Facebook says its News Feed is designed to give readers what they want, don't be fooled about the company's priorities -- it's designed to give advertisers what they want.

I'm not about to tell anyone how they should feel about OkCupid's user manipulation -- people have a right to be pissed off about this, just as people have a right to consider Facebook's emotional meddling a mere exercise in A/B testing. But the Facebook experiment and the OkCupid one are worlds apart, involving two companies with very different prerogatives when it comes to data and operating at vastly different scales, both in the number of users and in the variety of data collected. And so, to equate the two situations, or to use one to justify or bury the other, simply makes no sense.

[illustration by Hallie Bateman]