Did Facebook’s Secret Mood Manipulation Experiment Create Harm?

Facebook Shows No Regard for Impact of Its Human Mood Experiments

Yesterday, The Atlantic broke the story about “Facebook’s Secret Mood Manipulation Experiment.” This story isn’t about Facebook manipulating news feeds for advertising – that’s not news to any of us who have been paying attention (and, I might add, their ads are labeled). This story is about Facebook subjecting people to mood experiments two years ago, without full informed consent and without a post-experiment debriefing.

I understand that this experiment was approved by some “local” Institutional Review Board (IRB), a board that is responsible for ensuring that human subjects are protected in research. But I don’t believe this would have passed an IRB review at my university (and I have served on the IRB), so in my opinion, whichever IRB reviewed this gets a huge fail. What Facebook did didn’t meet the current standard for human subjects research in three major ways: the first two relate to informed consent (#1 and #2), and the third relates to a violation of the principle of beneficence (Belmont Report, 1979):
  1. Facebook’s data and research disclosure policy never discusses any risks related to having one’s mood manipulated. And let’s be clear, there are risks. People struggling with depression, anxiety, anger problems, and suicidality use this platform, and during the study those in the “positivity reduced” condition would have had their negative mood amplified. We don’t know whether there were any negative consequences to this amplification, because no informed consent about being subjected to a mood experiment or its potential risks was ever obtained, and no follow-up debriefing (i.e., telling people what they had just been through) was ever provided. But there may well have been negative consequences for people who were vulnerable. I might add that some research suggests that people higher in neuroticism use social media more often and for longer periods than those who are low in this trait (Caci et al., 2014, as cited by Myers, 2014).
  2. As The Atlantic article notes, when people are deceived or manipulated in an experiment, ethical practice is to debrief them at some point. No such debriefing was ever provided.
  3. When an experiment creates emotional distress, participants are supposed to be given resources that they can use to seek help. Of course, this ethical guideline was never met because people didn’t know they had been in an experiment, and no one assessed the degree of emotional distress that might have been caused by the experiment.

As a colleague of mine, Mike Langlois, commented on my Facebook feed when I posted about The Atlantic article: “This. Is. Important.” It remains to be seen what consequences for Facebook may come from this unethical conduct, but my hope is this breach of ethics will result in enough negative consequences to discourage such conduct in the future.

Did Facebook’s Secret Mood Experiments Create Harm?

It’s entirely possible that people who were already feeling bad, and then felt worse as a result of the experiment, may have been harmed, perhaps through increased depression, anger, anxiety, or suicidality. There might even have been increased self-harm episodes, out-of-control anger, or, dare I say it, suicide attempts or suicides that resulted from the experimental manipulation. Did this experiment create harm? The problem is, we will never know, because protections for human subjects were never put into place. Regardless, it disturbs me that I haven’t heard this question asked enough in the current dialogue.

Had Enough of Facebook?

I am on Facebook, mostly because that’s where “people” are: friends, family, colleagues, alumni of our school. But I am hoping that more people will join me on other platforms, as I would gladly give up Facebook. For now, as a first step, I will be canceling the advertising that I do there for one of the pages that I maintain.
I think Google+ is a good alternative to Facebook if more people would make the move. It allows you to limit and vary postings for privacy, doesn’t (as far as I know) manipulate your news feed, has some awesome photo services, and offers the best interfaces for communities I’ve seen. Here’s a beginner’s guide to Google+ for those who are interested: Learn How to Use Google Plus.

References

Belmont Report (1979). The Belmont Report: Ethical principles and guidelines for the protection of human subjects of research.

Caci, B., Cardaci, M., Tabacchi, M. E., & Scrima, F. (2014). Personality variables as predictors of Facebook usage. Psychological Reports, 114(2), 528–539. Cited by Myers, L. (June 17, 2014). Facebook “Likes” and Twitter Followers Predict Personality Traits and More.

Image courtesy of Dimitris Kalogeropoylos on Flickr. 


Update 7/22/14: It appears that the article from The Atlantic, cited above, has continued to change: content has been crossed out and, it seems to me, some has been added. So I recommend checking back and rereading that article.

15 thoughts on “Did Facebook’s Secret Mood Manipulation Experiment Create Harm?”

  1. Excellent critique, and right on. I would add that it also harms science as a whole, by perpetrating an unethical human subjects experiment on such a broad scale, with such wide publicity. People will naturally be more skeptical not only of facebook, but of legitimate research.

  2. I agree with everything you say. I performed an experiment of my own today. I deactivated my Facebook account, read, and noticed my mood improved. Initial results show promise!

  3. Pingback: Did Facebook's Secret Mood Manipulation Experim...

  4. Pingback: Facebook’s secret ‘mood experiment’ ignored ethical safeguards, says expert | News Alternative

  5. Nancy, excellent commentary. Karen–WOW! You really left facebook. I have seen a lot of good things happening on facebook–people keeping in touch, getting support for an illness, and so forth. It’s disappointing, at the very least, to hear about this kind of stuff.

  6. Finally something fresh: an article that is not only angry, but explains why the experiment was wrong.
    You’re definitely right that the study was a failure, whether we assess its ethics or its scientific rigor. On the other hand, I see one great plus it might have (unintentionally?) brought: opening people’s eyes.
    Hopefully they will start taking more care of their privacy, and some might even stop sharing every detail of their lives…

  7. Facebook has intentionally subjected its viewers to false information about wars and political issues to create paranoia and increased stress levels. Many of its pages have actually provoked violence in people, especially where religion is concerned, mainly Muslims versus Christians.
    When I realized the information was false and harmful, I withdrew from the site permanently. This was sad for me, as it took away my connection to my extended family.

    • If you have links to information about some of what you mentioned, I am sure that my readers would appreciate seeing them. The one positive side to this whole Facebook experiment fiasco is that it is surfacing some information that many people haven’t heard.

      As I read your comment, I realize that I now feel like an emotional hostage when it comes to Facebook: this is where family and friends are, and those connections will be lost if I leave. And for me, I would add to the list of connections to be lost: students, alumni, and colleagues, because Facebook plays a very large role in my work.
      It’s very sad to have to give up connections to friends and extended family because of this manipulation by Facebook (in concert with government, in some cases).

  8. Pingback: An E-book (free) on the Facebook Mood Experiments Controversy | Virtual Connections

  9. Pingback: Best in Mental Health (weeks 6/23 - 7/5/2014) - SocialWork.Career
