Research

In Backlash Over Facebook Research, Scientists Risk Loss of Valuable Resource

By Paul Voosen July 1, 2014

It was a remarkable result: By manipulating the news feeds of thousands of Facebook users, without their knowing consent, researchers working with the goliath of social media found that they could spur a significant, if small, effect on people’s behavior in the world beyond bits.


The year was 2010. The scientists were poking at voting patterns in the U.S. midterm elections. And when the results came out two years later, in Nature, there was barely a peep about questionable ethics.

As you may have heard, a more recent study, conducted by Facebook and co-designed by researchers at Cornell University, has kicked off a vigorous debate about the influence of Facebook’s algorithms over our lives and, more specifically to academe, whether researchers should be more careful in how they collaborate with the social-media giant.

The response to the study, which examined how positive or negative language spreads in social networks, has been blistering, and has raised credible criticisms about whether Internet users should be informed about experiments that test profound questions about human behavior.

But the backlash, researchers say, also poses the risk that the corporations that govern so much of our day-to-day experience online will decide there’s less benefit in allowing academic scientists to have access to their internal data. When it comes to a choice between enlightening society and expanding the bottom line, for corporations there’s rarely a question of which side will win.

“The main consequence is that academics will be wary of collaborating with Facebook,” said Michelle N. Meyer, an assistant professor of bioethics at the Icahn School of Medicine at Mount Sinai, in New York. Facebook, she added, “will not have an incentive to collaborate with researchers motivated by publications.” The research will still happen, but in private. It’s not going to be published and discussed.

“I’m definitely worried that’s going to be the upshot,” added David M.J. Lazer, a political scientist at Northeastern University and a leader in computational social science. While the research that has come through Facebook has not fundamentally changed our view of the world, Mr. Lazer said, it’s been clever, and many have viewed it as a down payment on more work to come.

‘You Are Likely to Be Next’

Briefly stated, over one week in early 2012, Facebook randomly selected a cohort of nearly 700,000 users and divided them into four groups. Like every other person on the service, the users were exposed to a custom suite of posts on their news feeds, dependent on Facebook’s algorithms. But for two of the groups, Facebook tweaked the algorithm, making it less likely that the subjects would see posts automatically classified as containing either positive or negative language. (Such classification itself is an error-prone endeavor.)

The researchers found that users who saw fewer positive posts were less likely to post something positive, and vice versa. That was surprising: Existing research had seemed to indicate that when people on Facebook compare the positive spin that friends post about their lives with the reality of the day-to-day, they can come away disappointed or sad. The new study showed that similar emotional language could be contagious, but with a vanishingly small effect: Over the next week, as one of the researchers posted (on Facebook), there was one fewer emotional word posted for every thousand words measured.
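The design described above, random assignment followed by probabilistic suppression of posts tagged with one sentiment, can be sketched in a few lines of Python. This is a hypothetical illustration only: the keyword classifier, post text, and suppression rate are invented stand-ins, not Facebook’s actual (and, as noted, error-prone) classification or ranking code.

```python
import random

random.seed(0)

POSITIVE = {"great", "happy", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def classify(post):
    # Crude keyword tagger, standing in for automated sentiment classification.
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filter_feed(feed, suppress, rate=0.1):
    # With probability `rate`, drop posts classified with the suppressed
    # sentiment, mirroring the tweaked news-feed algorithm in spirit only.
    return [p for p in feed
            if classify(p) != suppress or random.random() > rate]

feed = ["great day", "so sad today", "lunch again", "love this", "awful news"]
# rate=1.0 suppresses every matching post, to make the effect visible.
print(filter_feed(feed, suppress="positive", rate=1.0))
```

In the actual study the measured outcome was tiny, on the order of one emotional word per thousand posted; a sketch like this only shows the shape of the intervention, not its magnitude.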

A data scientist at Facebook, Adam D.I. Kramer, conducted the research, collaborating with a Cornell researcher, Jeffrey T. Hancock, and his former postdoc on its design and subsequent analysis. Since the Cornell duo did not participate in data collection, the university’s institutional review board concluded that the study did not merit oversight from its human-subjects panel. The team published the study in early June in the Proceedings of the National Academy of Sciences, a premier journal, and made broad claims in it: Not only were the tests influencing language choices, they said, but also the emotions of the Facebook users.

That was enough.

Unlike the 2010 voting study, the notion that researchers were manipulating emotions, however questionable that conclusion, found a press and a public already wary of Facebook’s influence. In particular, the idea that Facebook was intentionally making people “sad” tapped deep fears about scientists and corporations damaging society. The experiment wasn’t about expanding democracy or encouraging organ donation, the subject of another prominent Facebook study. It became a catalyst for discussing the role Silicon Valley companies play in modern society.


Already, one prominent bioethicist, New York University’s Arthur L. Caplan, has called for curbing the intrusion of Internet companies into our lives: “When entities feel entitled to experiment on human beings without informed consent, without accountability to anyone but themselves, that’s when bad things happen to research subjects,” Mr. Caplan wrote in a commentary with Charles Seife, a journalism professor at NYU. “And it’s now clear that if we don’t insist on greater regulatory oversight of their ‘research,’ you are likely to be next.”

The Limits of Informed Consent

In particular, the reaction may spur an important debate on the limits of informed consent, a bedrock principle in experiments involving human beings. Such consent is not universal; in political science, for example, it’s common to test various methods of boosting voter turnout, including through social pressure, without consent. But those small-scale trials have never attracted much attention, while any story involving Facebook is likely to be covered on the Internet by reporters hungry for clicks.

Still, Ms. Meyer said, there is probably a middle ground between relying on Facebook’s obscure data-use policies to serve as informed consent, as the latest study did, and mounting a full-scale consent campaign every time the company seeks to conduct an experiment. Facebook could send those 700,000 people a notice describing potential research in a general way, and allow them to opt in or out. It would be tricky to do without biasing the research, but so be it.

Even so, Ms. Meyer added, “I bet a bunch of people would have still been upset.”


In the backlash to the study, there’s also been the assumption of a neutral world that, in many ways, no longer exists. We are constantly being sold and manipulated, Ms. Meyer said. This is an age when the characters on cereal boxes make eye contact with children in the supermarket. Internet companies and political campaigns regularly experiment on their users, a process known as A/B testing, and that’s not going to change.
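The A/B testing mentioned above, in which users are randomly split between two variants and an outcome is compared, can be illustrated with a minimal simulation. Everything here is invented for illustration: the conversion rates, sample size, and the `ab_test` helper are assumptions, not any company’s actual testing infrastructure.

```python
import random

random.seed(1)

def ab_test(n, rate_a, rate_b):
    """Simulate a simple A/B test: each of n users is randomly assigned
    variant A or B, and we tally how often each variant 'converts'."""
    counts = {"A": [0, 0], "B": [0, 0]}  # variant -> [conversions, exposures]
    for _ in range(n):
        variant = random.choice("AB")
        rate = rate_a if variant == "A" else rate_b
        counts[variant][1] += 1
        counts[variant][0] += int(random.random() < rate)
    # Observed conversion rate per variant.
    return {v: conv / max(seen, 1) for v, (conv, seen) in counts.items()}

# A 1-percentage-point true difference shows up reliably at this sample size.
print(ab_test(100_000, rate_a=0.10, rate_b=0.11))
```

The point of the sketch is scale: with enough users, even small differences become detectable, which is exactly why companies run such tests continuously.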

Better instead to use this storm as an opportunity for dialogue, Mr. Lazer said. Someone should put together a well-vetted platform for ethicists to establish what is acceptable and unacceptable for social-media research, as more and more young researchers pour into the field. Such a gathering might keep Facebook around, and could serve society as a whole.

So, Facebook: Don’t unfriend academe just yet.

About the Author
Paul Voosen
Paul Voosen was a Chronicle reporter. His stories have also appeared in National Geographic, Scientific American, and Greenwire, with reprints in The New York Times.