
In re: Facebook (Psychological Study)

Top News

  • Following EPIC Complaint, Senator Seeks Investigation of Facebook User Manipulation Study: Senator Mark Warner has asked the Federal Trade Commission to investigate the legality of Facebook's emotional manipulation study. In a letter to the Commission, Senator Warner stated that "it is not clear whether Facebook users were adequately informed and given an opportunity to opt-in or opt-out." He asked the FTC to conduct an investigation to see "if this 2012 experiment violated Section 5 of the FTC Act or the 2011 consent agreement with Facebook," two issues raised in EPIC's earlier complaint. "The company purposefully messed with people's minds," wrote EPIC in a complaint to the Commission. EPIC charged that Facebook violated a consent decree that required the company to respect user privacy and also engaged in a deceptive trade practice. EPIC has asked the FTC to require that Facebook make public the News Feed algorithm. For more information, see EPIC: In re Facebook, EPIC: In re Facebook (Psychological Study), and EPIC: FTC. (Jul. 17, 2014)
  • EPIC Challenges Facebook's Manipulation of Users, Files FTC Complaint: EPIC has filed a formal complaint with the Federal Trade Commission concerning Facebook's manipulation of users' News Feeds for psychological research. "The company purposefully messed with people's minds," states the EPIC complaint. EPIC has charged that the study violates a privacy consent order and is a deceptive trade practice. In 2012, Facebook subjected 700,000 users to an "emotional" test by manipulating their News Feeds. Facebook did not get users' permission to conduct this study or notify users that their data would be disclosed to researchers. In the complaint, EPIC explained that Facebook's misuse of data is a deceptive practice subject to FTC enforcement. Facebook is also currently under a 20-year consent decree from the FTC that requires Facebook to protect user privacy. The consent decree resulted from complaints brought by EPIC and a coalition of consumer privacy organizations in 2009 and 2010. EPIC has asked the FTC to require that Facebook make public the News Feed algorithm. For more information, see EPIC: In re Facebook, EPIC: In re Facebook (Psychological Study), and EPIC: FTC. (Jul. 3, 2014)

Background on the Facebook Psychological Study

Facebook Represents That Its News Feed Rankings Are Based on Content-Neutral Factors

Facebook's News Feed was created in 2006. According to a post on the official Facebook blog, the News Feed was created to allow users to “get the latest headlines generated by the activity of [their] friends and social groups.” The News Feed provides Facebook users with a summary of their friends' activity on the website. Facebook represents to users that its “News Feed is an ongoing list of updates on [a user’s] homepage that shows [him] what's new with the friends and Pages [the user] follow[s].” According to Facebook’s “Help” section, the items shown on the News Feed are determined by an “algorithm [which] uses several factors to determine top stories, including the number of comments, who posted the story, and what kind of story it is (ex: photo, video, status update).”
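
To illustrate what ranking on such content-neutral factors could look like in practice, the following sketch scores stories only by engagement, poster, and story type, never by the emotional content of the post. The field names, weights, and scoring function are illustrative assumptions for this page, not Facebook's actual News Feed algorithm.

    # Illustrative sketch only -- not Facebook's actual News Feed code.
    # Ranks stories using content-neutral factors like those Facebook describes:
    # number of comments, who posted the story, and what kind of story it is.

    STORY_TYPE_WEIGHTS = {"photo": 1.5, "video": 1.4, "status_update": 1.0}  # assumed weights

    def score_story(story, close_friend_ids):
        """Score a story without ever inspecting the emotional content of its text."""
        comment_score = 1 + story["comment_count"]                      # engagement
        poster_score = 2.0 if story["poster_id"] in close_friend_ids else 1.0
        type_score = STORY_TYPE_WEIGHTS.get(story["type"], 1.0)
        return comment_score * poster_score * type_score

    def rank_feed(stories, close_friend_ids):
        """'Top stories' are simply the highest-scoring candidate stories."""
        return sorted(stories, key=lambda s: score_story(s, close_friend_ids), reverse=True)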

Some posts, called “promoted posts,” are placed higher in the News Feed and labeled as “promoted.” Other posts, called “suggested posts,” are interspersed into the News Feed and marked as “Suggested Post[s].” Posts that contain advertisements are marked as “sponsored.” Facebook asserts that users can control which posts their News Feed will show by adjusting their settings.

Facebook Represents That It Only Shares User Data With Advertisers, App Developers, and Other Facebook Users

Facebook’s Data Use Policy states that Facebook uses information it receives about its users “in connection with the services and features [Facebook] provide[s].” Facebook’s September 2011 Data Use Policy stated that Facebook “may use” information it received about its users (1) as part of its efforts to keep Facebook safe and secure; (2) to provide users with location features and services; (3) to measure or understand the effectiveness of ads; and (4) to make suggestions to users, such as suggesting that a user add another user as a friend.

In a subsection called “How we use the information we receive,” Facebook’s Data Use Policy stated, “We use the information we receive about you in connection with the services and features we provide to you and other users like your friends, the advertisers that purchase ads on the site, and the developers that build the games, applications, and websites you use.” The subsection “How we use the information we receive” also stated, “Your trust is important to us, which is why we don't share information we receive about you with others unless we have:

  • received your permission;
  • given you notice, such as by telling you about it in this policy; or
  • removed your name or any other personally identifying information from it."

Facebook’s September 2011 Data Use Policy did not mention the use of users’ data for research, testing, or analysis.

In May 2012, four months after the research at issue was conducted, Facebook made changes to its Data Use Policy. These changes included adding “internal operations, including troubleshooting, data analysis, testing, research and service improvement” to the list of things for which Facebook may use information it receives from users.

The Facebook Core Data Study Used Data From Users’ News Feeds to Manipulate Users’ Emotions

The Study was designed and written by Adam D. I. Kramer, from the Core Data Science Team at Facebook, Inc.; Jamie E. Guillory, from the Center for Tobacco Control Research and Education at the University of California, San Francisco; and Jeffrey T. Hancock, from the Departments of Communication and Information Science at Cornell University in Ithaca, New York. For one week (January 11-18, 2012), Facebook “manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed” to test “whether exposure to emotions led people to change their own posting behaviors.” Facebook conducted two parallel experiments: “One in which exposure to friends’ positive emotional content in their News Feed was reduced, and one in which exposure to negative emotional content in their News Feed was reduced.”

Prior to the experimental period, Facebook reviewed subjects’ Facebook posts to determine that the “experimental groups did not differ in emotional expression during the week before the experiment.” During the experimental period, when subjects of the experiment loaded their News Feeds, emotional posts written by family and friends “had between a 10% and 90% chance (based on their User ID) of being omitted from their News Feed for that specific viewing.” “Both experiments had a control condition, in which a similar proportion of posts in their News Feed were omitted entirely at random (i.e., without respect to emotional content).”
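
A minimal sketch of the omission procedure described above appears below, assuming a per-user omission rate derived deterministically from the User ID and a caller-supplied test for whether a post expresses the targeted emotion. The hashing scheme and function names are assumptions for illustration, not the study's actual code.

    import hashlib
    import random

    def omission_rate(user_id):
        """Map a User ID to a fixed omission probability between 10% and 90% (assumed scheme)."""
        digest = hashlib.sha256(str(user_id).encode()).hexdigest()
        fraction = int(digest, 16) / 16**64           # roughly uniform value in [0, 1)
        return 0.10 + 0.80 * fraction

    def filter_feed(user_id, posts, is_targeted_emotion, in_control_group=False):
        """Drop each qualifying post with the user's omission probability for this viewing."""
        rate = omission_rate(user_id)
        kept = []
        for post in posts:
            # In the control condition, posts are omitted at random, regardless of emotion.
            qualifies = True if in_control_group else is_targeted_emotion(post)
            if qualifies and random.random() < rate:
                continue                              # omitted from this specific viewing
            kept.append(post)
        return kept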

After manipulating the subjects’ News Feeds, Facebook analyzed “the percentage of all words produced by a given person that was either positive or negative during the experimental period.” In total, over 3 million posts were analyzed, containing over 122 million words, 4 million of which were positive (3.6%) and 1.8 million negative (1.6%). Facebook used Linguistic Inquiry and Word Count software to determine whether posts were positive or negative. Because computers, not human researchers, viewed the content of Facebook users’ posts, the researchers found the study to be “consistent with Facebook’s Data Use Policy.”
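
The outcome measure was the share of each subject's words that were positive or negative. The sketch below computes that percentage using small placeholder word lists in place of the proprietary Linguistic Inquiry and Word Count dictionaries; the word lists and tokenizer are assumptions, not the study's materials.

    import re

    # Placeholder word lists; the study used the LIWC dictionaries, which are proprietary.
    POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
    NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible"}

    def emotion_word_percentages(posts):
        """Return (percent positive, percent negative) across all words a person produced."""
        words = [w for post in posts for w in re.findall(r"[a-z']+", post.lower())]
        if not words:
            return 0.0, 0.0
        positive = sum(w in POSITIVE_WORDS for w in words)
        negative = sum(w in NEGATIVE_WORDS for w in words)
        return 100.0 * positive / len(words), 100.0 * negative / len(words)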

Facebook asserted that it obtained “informed consent for this research” because all users agree to Facebook’s Data Use Policy “prior to creating an account on Facebook.”

The Proceedings of the National Academy of Sciences Study

The findings from the Facebook study were published in the June 17, 2014 issue of the scientific journal Proceedings of the National Academy of Sciences (PNAS).

The study, written by Facebook researcher Adam D. I. Kramer and co-researchers Jamie E. Guillory and Jeffrey T. Hancock, is entitled Experimental evidence of massive-scale emotional contagion through social networks, PNAS vol. 111, no. 24 (June 17, 2014).

In an addendum to the June 17 issue, Inder M. Verma, the editor-in-chief of PNAS, published a "Correction" entitled Editorial Expression of Concern. The statement read, "Questions have been raised about the principles of informed consent and opportunity to opt out in connection with the research in this paper. The authors noted in their paper, '[The work] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.' When the authors prepared their paper for publication in PNAS, they stated that: 'Because this experiment was conducted by Facebook, Inc. for internal purposes, the Cornell University IRB [Institutional Review Board] determined that the project did not fall under Cornell’s Human Research Protection Program.' This statement has since been confirmed by Cornell University."

Professor Verma goes on to explain that "obtaining informed consent and allowing participants to opt out are best practices in most instances under the US Department of Health and Human Services Policy for the Protection of Human Research Subjects (the “Common Rule”). Adherence to the Common Rule is PNAS policy, but as a private company Facebook was under no obligation to conform to the provisions of the Common Rule when it collected the data used by the authors, and the Common Rule does not preclude their use of the data."

The Correction concludes, "Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper. It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out."

On July 16, 2014, Michelle N. Meyer, director of bioethics policy at the Union Graduate College-Icahn School of Medicine at Mount Sinai Bioethics Program, published a statement in the science journal Nature rejecting the widespread criticism of the Facebook study. In her letter, which was co-signed by 32 other ethicists, Meyer wrote, "The Facebook experiment was controversial, but it was not an egregious breach of either ethics or law. Rigorous science helps to generate information that we need to understand our world, how it affects us and how our activities affect others. Permitting Facebook and other companies to mine our data and study our behaviour for personal profit, but penalizing it for making its data available for others to see and to learn from makes no one better off."

Responses to the Facebook Study

EPIC's Complaint

On July 3, 2014, EPIC filed a formal complaint with the Federal Trade Commission regarding Facebook's deceptive trade practices. The complaint concerns Facebook’s "secretive and non-consensual use of personal information to conduct an ongoing psychological experiment on 700,000 Facebook users, i.e. the company purposefully messed with people’s minds."

The FTC Act prohibits unfair and deceptive acts and practices, and empowers the Commission to enforce the Act’s prohibitions. Under the Act, a business practice is deceptive if it "involves a representation, omission, or practice that is likely to mislead the consumer acting reasonably under the circumstances," and is "material," or meaningful to the consumer. The FTC presumes that an omission is material where “the seller knew, or should have known, that an ordinary consumer would need omitted information to evaluate the product or service, or that the claim was false . . . because the manufacturer intended the information or omission to have an effect.”

In the complaint, EPIC charged Facebook with engaging in deceptive business practices in violation of Section 5 of the Federal Trade Commission Act. EPIC explained that Facebook's misrepresentations were likely to mislead reasonable users. Further, these misrepresentations were "material," since Facebook represented to consumers that the company shared user data with users’ “friends” on the website, advertisers, and developers, but actually shared user data with third-party researchers at multiple universities. Furthermore, Facebook subjected certain users to ongoing behavioral testing by collecting user data and feeding it into a separate algorithm without informing users that they were potentially subject to behavioral testing.

EPIC also charged Facebook with violating a 2012 FTC Consent Order. On July 27, 2012, following a complaint by EPIC, the Commission entered into a consent order with Facebook regarding violations of Section 5 of the FTC Act. The Consent Order established new privacy safeguards for Facebook users and prohibits Facebook from misrepresenting the extent to which it maintains the privacy or security of covered information. Specifically, Count I of the Consent Order includes language that prohibits Facebook from misrepresenting “its collection or disclosure of any covered information,” and “the extent to which Respondent makes or has made covered information accessible to third parties.” In the Psychological Study complaint, EPIC charged Facebook with misrepresenting the extent to which it made covered information accessible to third parties, in contravention of Count I of the Consent Order.

Related Investigations

Irish Data Protection Commissioner Billy Hawkes has announced that his office will launch an investigation in the wake of the study. “We will be examining more closely the uses that Facebook is making of personal data for research purposes,” Hawkes told the Wall Street Journal. Facebook's international operations are headquartered in Ireland, and the company has been subject to privacy monitoring by Hawkes’ office since 2011. The Data Commissioner’s Office can levy fines of up to 100,000 pounds ($167,900) for each violation of the law. The Journal reported that the Commissioner’s office would be investigating this and other Facebook studies to ensure that the company complies with Irish privacy laws. Irish public media organization RTÉ reported that Hawkes contacted Facebook, asking for an explanation and more information on the study.

The UK's Information Commissioner’s Office (ICO) also plans to investigate Facebook’s actions. The UK's Data Protection Act requires companies to specify the purposes for which they collect data and to inform users if those purposes change. ICO spokesman Greg Jones told Reuters that the office could not yet say whether Facebook had violated the law, but that the agency was “aware of this issue, and will be speaking to Facebook, as well as liaising with the Irish data protection authority, to learn more about the circumstances.” The ICO has the authority to force companies to change their practices and to impose fines of up to 500,000 pounds ($839,500). However, the legal blog Information Rights and Wrongs stated that “the ICO is highly unlikely to have any power to investigate, let alone take action” against the Irish-headquartered company, due to jurisdictional constraints. Facebook spokesman Richard Allan said in a statement, “We are happy to answer any questions regulators may have.”

EPIC and Research Ethics

EPIC has previously participated in discussions about the “Belmont Report” and the future of the Common Rule:

EPIC and the FTC

EPIC is the group responsible for several of the Federal Trade Commission's major privacy decisions, including:

News Reports
