Product Description
Abstract
June 2014 saw a media uproar over Facebook's emotional contagion study, published in the Proceedings of the National Academy of Sciences. In conjunction with researchers at Cornell, Facebook designed an experiment that altered the Facebook News Feed to explore whether emotions can spread through Facebook. These feeds, the primary activity and content list on Facebook, are populated according to a proprietary algorithm. In the experiment, the algorithms for a random subset of users were manipulated to display either proportionately more negative emotional content or proportionately more positive emotional content; a control group saw content selected by the unmodified algorithm.
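The design described above is, at bottom, an A/B test: users are randomly split between treatment conditions and a control group. As a minimal sketch of how such random assignment might be implemented (all names and the hashing scheme here are hypothetical illustrations, not Facebook's actual code):

```python
import random
from collections import Counter

# Hypothetical sketch of randomized assignment for an A/B test with a control
# group; condition names and seeding scheme are assumptions for illustration,
# not Facebook's actual implementation.
CONDITIONS = ["more_negative_content", "more_positive_content", "control"]

def assign_condition(user_id: int, experiment: str = "emotional_contagion") -> str:
    """Deterministically bucket a user into one experimental condition."""
    rng = random.Random(f"{experiment}:{user_id}")  # stable per-user seed
    return rng.choice(CONDITIONS)

# Over a large sample, users split roughly evenly across the three conditions.
print(Counter(assign_condition(uid) for uid in range(90_000)))
```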
The study met with vocal opposition not only for manipulating the moods of Facebook users, but also because users neither volunteered nor opted in to the research and were not informed of their participation. The study is a motivating example of the moral, legal, and technical questions raised as algorithms permeate society.
This case explains the parameters of the experiment, the reaction in the media, and the legal issues raised (including Federal Trade Commission standards for commercial practices and the US Department of Health and Human Services Policy for the Protection of Human Research Subjects, informally known as the "Common Rule"). To encourage participants to examine the issues algorithms present across a range of scenarios, the case uses six hypotheticals that consider the use of algorithms in different contexts, including print media, charity, and business, among others. These scenarios illustrate varying aspects of the expanding role algorithms play, and push participants to address the complicated issues surrounding algorithms with nuanced arguments. After briefly analyzing and debating all six scenarios, participants delve deeper into one hypothetical, using their position on that hypothetical to inform their stance on the Facebook Emotional Contagion study. The exercise concludes with a class-wide debate on the ethics of the Facebook Emotional Contagion study.
Learning Objectives
- Research and analyze the technical, moral, and legal issues surrounding the use of algorithms that impact daily life
- Discuss and assess the feasibility of implementing policy in a rapidly changing technology-powered landscape
- Discuss the responsibilities of interest groups, including corporations and nonprofits, in their use of algorithms
- Practice debating, writing policy briefs, and advising clients on legal options
Subjects Covered
Institutional Review Board, emotional contagion, A/B testing, social media, algorithms, terms of service, voluntary participation, ethical research, deception, business research, informed consent, Federal Trade Commission, Common Rule
Setting
Geographic: United States
Industry: Technology; Social Media; Research
Event Start Date: 2014
Accessibility
To obtain accessible versions of our products for use by those with disabilities, please contact the HLS Case Studies Program at hlscasestudies@law.harvard.edu or +1-617-496-1316.
Educator Materials
Registered members of this website can download this product at no cost. Please create an account or sign in to gain access to these materials.
Note: It can take up to three business days after you create an account to verify educator access. Verification will be confirmed via email.
For more information about the Problem Solving Workshop, or to request a teaching note for this case study, contact the Case Studies Program at hlscasestudies@law.harvard.edu or +1-617-496-1316.
Additional Information
New Mega-Hit Case Study on Tech Ethics
Copyright Information
Please note that each purchase of this product entitles the purchaser to one download and use. If you need multiple copies, please purchase the number of copies you need. For more information, see Copying Your Case Study.