Facebook announced new guidelines on Thursday for research on the social-media giant’s 1.3 billion users. The new oversight includes enhanced reviews of any proposed studies that would focus on particular groups or populations, or that would relate to “content that may be considered deeply personal (such as emotions),” the company said in a statement written by Mike Schroepfer, its chief technology officer.
The policy change followed a months-long debate over a controversial study that manipulated users’ news feeds to see how the changes affected the emotional content of the users’ own posts. Privacy advocates and others accused Facebook of questionable ethics in conducting the experiment, and even the journal that published the study, the Proceedings of the National Academy of Sciences, later expressed concern in an editorial about whether Facebook’s collection of the data had followed best practices for obtaining informed consent and allowing participants to opt out.
Some scholars, however, including a group of 33 ethicists, have defended the study, warning that the backlash against it could chill future research by academic scientists on the troves of data collected by corporations.
Indeed, Facebook’s announcement of the new policy suggested that the company was taking a harder look at how academic researchers use its data. Projects that will be subject to enhanced review, Mr. Schroepfer’s statement says, include work that “involves a collaboration with someone in the academic community.”
The review panel, the statement says, will include some of the company’s own senior researchers and engineers, as well as legal, privacy, and policy experts. The company will also train all of its engineers in research ethics.
The study at the center of the controversy this past summer was designed in part by researchers at Cornell University. Jeffrey T. Hancock, a Cornell professor and one of the researchers, told The New York Times he was pleased that Facebook was going to train its researchers in ethics. He said it was important to know, however, what standards the company would use to judge internal research, and whether it might conduct future experiments like the emotion-manipulation study and not disclose the results.
“Will they keep doing those and not publish them?” Mr. Hancock asked. “Or does the review panel say, ‘We need to think about that’?”