Facebook conducted psychological experiments on its users

Facebook being unethical – again

I think at this point it’s safe to say that ethics isn’t exactly one of Facebook’s main concerns, and this study shows it once again. What am I talking about? A covert experiment that influenced the emotions of over 600,000 people, without asking for their permission.

The entire situation is starting to look like one big Monty Python sketch. Was permission for the study granted? At first, the answer was ‘yes’, but it quickly changed to ‘no’. Then it became ‘maybe’, but the final call was still ‘no’. Initially, they said the study was funded by the US military, but that statement was then retracted without any further explanation. It’s easy to understand why this caused an uproar and massive discussion about the study’s lack of ethics. This pretty much sums it up:

“What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we’ll respond to but actually change our emotions,” wrote Animalnewyork.com in a blog post on Friday morning.

Cornell University and the head of the study quickly washed their hands of the whole thing, saying that they had nothing to do with gathering the data or running the experiment – they merely interpreted the results:

“Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any individual, identifiable data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research,” Cornell University said in a statement.

The social experiment

Facebook’s CEO, Mark Zuckerberg, hasn’t responded to the heavy accusations leveled against the company.

Here is what the experiment actually did: for one week, Facebook altered the news feeds of a random sample of over 600,000 users. For one group, it removed content containing positive words; for another group, it removed content containing negative words. The point was to see whether this biased way of presenting things had any effect on users’ emotions. Interestingly enough, it did – feeds skewed toward positive content produced more positive posts from users, and vice versa. The problem was that the users didn’t know they were participating in any research – just like with Twitter, it can be argued that the data for this study was gathered unethically.

Scientifically, the study clearly has significant value. The number of people involved is absolutely huge – quite possibly the largest sample ever used in a psychological study – so the statistical power is high. However, the effect it found was one of the smallest ever published – so the results, while detectable, are extremely small.

But the problem with this study isn’t that the results were small – it’s that, once again, Facebook didn’t obtain the consent of the participants. Informed consent has been at the core of research ethics since WWII, and this study simply goes against those principles.

“It’s completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments,” said Kate Crawford, visiting professor at MIT’s Center for Civic Media and principal researcher at Microsoft Research.

Facebook said that the study was conducted anonymously, so researchers could not learn the names of the research subjects – but the fact remains that it attempted to manipulate the feelings of its users without their consent. Facebook also doesn’t seem to care, since it didn’t even bother to clarify the situation. Gotta love Facebook!