
Bioethics Forum Essay

Facebook’s Emotion Experiment: Implications for Research Ethics

Several aspects of a recently published experiment conducted by Facebook have received wide media attention, but the study also raises issues of significance for the ethical review of research more generally. In 2012, Facebook entered almost 700,000 users – without their knowledge – in a study of “massive-scale emotional contagion.” Researchers manipulated these individuals’ newsfeeds by decreasing positive or negative emotional content, and then examined the emotions expressed in posts. Readers’ moods were affected by the manipulation. Of the three authors, two worked at Cornell University at the time of the research; all three are social scientists.

The study raises several critical questions. Did it violate ethical standards of research? Should studies conducted by social media companies be reviewed differently than they are at present? Did the journal that published the study proceed appropriately?

Federal regulations concerning the conduct of research technically apply only to certain categories of government-funded research and follow ethical principles closely paralleling those of the Declaration of Helsinki. Requirements include prospective review by an institutional review board (IRB) and (with some exceptions) subjects’ informed consent. Institutions conducting federally funded research indicate whether they will apply these guidelines to all research they conduct. These guidelines have become widely used ethical standards.

Was Facebook obligated to follow these ethical guidelines, since the study was privately funded? At the very least, it seems that the behavioral scientists involved should have followed them, since doing so is part of the professional code of conduct of their field.

The absence of consent is a major concern. Facebook initially said that the subjects consented to research when signing up for Facebook; but in fact the company altered its data use agreement only four months after the study to include the possibility of research. Even if research had been mentioned, as it is now, it is doubtful that would have met widely accepted standards. This statement now says, “we may use the information we receive about you . . . for internal operations, including troubleshooting, data analysis, testing, research and service improvement . . .” Not only do most users of online services fail to read such use agreements (perhaps not unreasonably), but mere mention of the possibility of research provides users with no meaningful information about the experiment’s nature.

Regulations stipulate that researchers must respect subjects’ rights, and describe for participants the study’s purposes, procedures, and “foreseeable risks or discomforts.” IRBs can waive informed consent in certain instances, if the subjects’ “rights and wellbeing” will not be compromised. The regulations permit such a waiver only for minimal risk research – where risks are “not greater . . . than those ordinarily encountered in daily life.” One could argue that having newsfeeds manipulated is part of the “daily life” of users, because Facebook regularly does so anyway. That argument seems dubious. While the company may regularly focus newsfeeds on content of interest to the user, not all manipulations are the same. Here, the goal was to alter the user’s mood, a particularly intrusive intervention, outside the expected scope of behavior by a social media company.

Did the study otherwise involve more than minimal risk? Seeking to alter mood is arguably less benign than simply ascertaining what type of music users prefer. Altered mood can affect alcohol and drug use, appetite, concentration, school and work performance, and suicidality. It is not clear that this experiment altered mood to that degree. However, we do not know how far Facebook or other social media companies have gone in other experiments attempting to alter users’ moods or behavior.

The study also appears to have included children and adolescents, raising other concerns. Their greater vulnerability to manipulation and harm has led to special provisions in the federal regulations, limiting the degree of risk to which they are exposed, and requiring parental consent – not obtained by Facebook. The company could easily have excluded minors from the study – since users all provide their age – but apparently didn’t do so.

Susan Fiske, who edited the article for PNAS, has stated, “The authors indicated that their university IRB had approved the study, on the grounds…[of] the user agreement.” The authors told the journal that Facebook conducted the study “for internal purposes.” Cornell’s IRB decided that its researchers were “not directly engaged” in human subjects research, since they had no direct contact with subjects, and that no review was necessary. But federal guidance stipulates that if institutions decide that investigators are not involved, “it is important to note that at least one institution must be determined to be engaged,” so that some IRB review takes place. There is no evidence that any other IRB or equivalent review occurred – highlighting needs for heightened IRB diligence about this obligation.

PNAS has since published an “Editorial Expression of Concern” that the study “. . . may have involved practices that were not fully consistent with the principles of obtaining informed consent . . .” PNAS requires all studies to follow the principles of the Declaration of Helsinki, which mandates as well that subjects be informed of the study’s “aims, methods . . . and the discomfort it may entail.” Many journals request that authors inform them about IRB review; however, one lesson from the Facebook study is that journals should ask for copies of the IRB approval. Journals should also require authors to state in the published article whether IRB approval was obtained – information often not reported – so that readers can assess these issues as well.

Scientific research, which can benefit society in manifold ways, requires public trust. Research that sparks widespread public outcry – as this study has – can undermine the scientific enterprise as a whole.

Social media companies should follow accepted ethical standards for research. Such adherence does not need to be very burdensome. Companies could simply submit their protocols to a respected independent IRB for assessment. (The objectivity of an in-house IRB, most of whose members are employees, may be questionable.) Facebook should also exclude children and adolescents from experiments that seek to manipulate subjects’ moods.

Alternatively, governments could mandate that social media and other companies follow these ethical guidelines. But public policies can have unforeseen consequences. As a first step, requesting that companies voluntarily follow these guidelines might go a long way toward improving the current situation.

Robert Klitzman is professor of psychiatry and director of the Masters of Bioethics Program at Columbia University. Paul S. Appelbaum is the Elizabeth K. Dollard Professor at Columbia University Medical Center and the New York State Psychiatric Institute. Both are Hastings Center Fellows.

Acknowledgements: Funding for this study was provided by P50 HG007257 (Dr. Paul S. Appelbaum, PI). The authors report no conflicts of interest. The authors would like to thank Patricia Contino and Jennifer Teitcher for their assistance in the preparation of this manuscript.

 
