Jul 7, 2014 · 3 minutes

Despite widespread condemnation over conducting a psychological experiment on more than 600,000 people without their consent, Facebook has found support from many techies, some members of the press, and essentially anyone who believes that the company's research was conducted in the name of "innovation," which should apparently allow it to do whatever it wants without fear of reprisal.

That support takes various forms. Sometimes it's direct, like when people argue that the study was no different from the algorithm tweaks Facebook and others make all the time; other times it's indirect, as when researchers worry about the company clamping down in response to the backlash or when the press excuses Facebook's actions just because it's a tech company. The direct support is worrisome because it's disingenuous -- the indirect support is dangerous.

Pando has written a lot about the argument that the study was just another product tweak. I wrote that this view is caused by the Valley's insistence on viewing people as data points that can be exploited; David Holmes wrote about the culture of contempt for its users that Facebook has cultivated over the last several years. But we've written less about the tacit approval Facebook has gotten from those who argue that researchers need Facebook's data or that the study wasn't shocking.

The first argument was highlighted by Mashable with a report on researchers' fears for their most bountiful data source (their "data dream") if Facebook responds poorly to this backlash. In it, the researchers profess to be worried not only about their access to Facebook's data, but also about the ethical standards it would be able to ignore if it conducts research in private:

Gosling noted that Facebook needs academic interaction to keep it in line with the generally recognized ethics of modern social sciences. If Facebook were to do only internal research, it could end up with even less oversight and fewer reasons to limit its social experiments.

'They have some interests but their priorities are not the same as scientific priorities,' Gosling said. 'Scientists will want to publish even if it shows that Facebook is very bad for you or bad for society.'

Gosling's argument ignores the fact that this study was published while Facebook was working with researchers, and that the company is said to have conducted other experiments with little oversight over the years. Researchers don't prevent Facebook from behaving unethically -- this argument seems like little more than an attempt to justify using Facebook's data even though it might be collected without informed consent, from minors, or without effective oversight.

The second argument was summarized in a Guardian column in which John Naughton argues that we "shouldn't expect Facebook to act ethically" because it's beholden to its shareholders:

Besides, the idea that corporations might behave ethically is as absurd as the proposition that cats should respect the rights of small mammals. Cats do what cats do: kill other creatures. Corporations do what corporations do: maximise revenues and shareholder value and stay within the law. Facebook may be on the extreme end of corporate sociopathy, but really it's just the exception that proves the rule.

Naughton is correct in thinking that companies like Facebook will always favor their business over their ideals. But thinking that this means people shouldn't be outraged about this study, or that there's nothing they can do except move on with their lives, is what allows companies like Facebook to retain their power for so long. A company's decision to experiment on people without their consent shouldn't just be a fact of modern life -- it should be a real controversy.

Pando's Yasha Levine made that argument over the weekend when he said that we need better laws to protect us from Facebook:

[...] what’s really needed here are real laws protecting real people from being constantly spied on and tested like lab rats for profit.

But, of course, that would go against the very nature of Surveillance Valley’s business model — and against the hallowed techno belief in the power of the market to sort everything out.

Arguing that researchers' access to Facebook's data should be preserved -- perhaps by halting the backlash against this study -- or that Facebook's actions are unsurprising makes it harder to introduce the laws Levine wrote about. These arguments don't directly support Facebook and its supposed right to experiment on its users, but they indirectly support the company by hiding the truth or convincing people that psychological experiments like this are just going to happen.