Jeffrey Hancock Wants to Keep Talking About How We Use Social Media for Research

By  Steve Kolowich
April 20, 2015
Jeffrey Hancock, a professor at Cornell, caused a furor when he used Facebook to do an experiment on emotion. (Heather Ainsworth)

The most widely read paper of Jeffrey Hancock’s career was not conceived in a university laboratory. The data were collected by machines. The subjects were unwitting. The methods were not approved by an institutional review board.

That’s because a university was not in charge of the study. Facebook was.

Mr. Hancock, a professor of communication and information science at Cornell University, and Jamie Guillory, a Cornell graduate student, wrote the paper with Adam Kramer, a data scientist at Facebook. For years the professor had been studying people’s relationships with text-based communication. Facebook, which has nearly 1.4 billion users worldwide, was uniquely positioned to track how online social networks can affect people’s emotions.

It was a natural marriage — and a portentous one. Companies are collecting more data than ever about how their customers use their products, and frequently conduct discreet experiments, called A/B tests, to see how those customers respond to tweaks in a website’s design and functionality.
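
To make the mechanics concrete, here is a minimal sketch of how a deterministic A/B assignment can work; the hashing scheme, function names, and experiment labels are illustrative assumptions, not a description of any company's actual system.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, arms=("A", "B")) -> str:
    """Deterministically assign a user to an experiment arm.

    Hashing the user ID together with the experiment name yields a
    stable, roughly uniform split without storing assignment state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# A user always lands in the same arm for a given experiment, so the
# company can compare, say, click-through rates between arms A and B.
print(assign_variant("user-12345", "new-feed-ranking"))
```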

As everyday human interaction has migrated to data-rich social-media and online-dating platforms, the companies that run those platforms have become gatekeepers. The ability of academic researchers to keep pace with corporate data scientists may depend on how well they can work with those companies. After his own collaboration with Facebook created a furor, Mr. Hancock, 41, found himself at the center of a debate about the ethical obligations of companies and their academic collaborators as they mine data from commercial web services in hopes of better understanding human behavior in the Internet age.

The story of Mr. Hancock’s collaboration with Facebook began in 2006, when he started investigating the idea that typed conversations hold less emotional charge than face-to-face ones. “I would see this assumption written into scientific papers,” he says. “I would see it written into how people would talk about tech.” Computer-mediated communication was considered inferior.

Mr. Hancock doubted this truism, and he began chipping away at it in a series of experiments. He found that the dynamics of text-based interaction were nuanced. People could detect irony. They could pick up on social cues. People who were lying via text made subtle adjustments to the kinds of words they were using. Remarkably, people who were being lied to also made adjustments, even when they couldn’t spot the deception.

People bury their noses in their iPhones not because they are socially stunted, says Mr. Hancock. If anything, they are more social than ever before.

And yet his lab experiments had limitations. The sample sizes were relatively small, and the researchers had to go to somewhat absurd lengths to control unwieldy variables like the emotional state of a test subject. “If we wanted to make somebody happy, for example, we had to show them a really funny clip,” says Mr. Hancock. “And then we’d have to continue priming them while they talked to another person. So we’d have really happy music on, for example. And we’d have them solving anagrams leading to really happy words — like ‘awesome’ and ‘fun’ and ‘happy’ and that kind of stuff.”

In 2011, Facebook was trying to battle the notion that its product was making users unhappy. A Stanford University study had reinforced the idea that people overestimate the happiness of their friends. That study wasn’t about Facebook directly, but its author said he got the idea after noticing how Facebook seemed to fuel those misapprehensions. “By helping other people look happy, Facebook is making us sad,” declared a headline on Slate.

By then, Mr. Hancock had developed a professional relationship with Mr. Kramer, the Facebook data scientist. (Mr. Kramer declined an interview request.) They agreed to work together on a project. The now-infamous study tested an existing theory called “emotional contagion.” By reducing the proportion of “positive” or “negative” posts visible to certain Facebook users, and analyzing the emotional content of the posts those users would then write, the researchers hoped to glean insights into whether emotions can spread through online social networks.
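
As a toy illustration of that kind of text analysis, the sketch below scores a post by the share of words drawn from small positive and negative word lists. The lists and scoring are invented for the example; they are not the instrument the researchers actually used.

```python
# Illustrative only: tiny hand-picked word lists stand in for the
# much larger dictionaries a real text-analysis tool would use.
POSITIVE = {"happy", "awesome", "fun", "great", "love"}
NEGATIVE = {"sad", "angry", "awful", "hate", "terrible"}

def emotion_rate(post: str) -> dict:
    """Return the share of positive and negative words in a post."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    total = len(words) or 1  # avoid division by zero on empty posts
    return {
        "positive": sum(w in POSITIVE for w in words) / total,
        "negative": sum(w in NEGATIVE for w in words) / total,
    }

print(emotion_rate("Had an awesome, fun day with great friends!"))
# {'positive': 0.375, 'negative': 0.0}
```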

The study was larger than anything Mr. Hancock could have cooked up at Cornell. During a single week in January 2012, Facebook experimented on nearly 700,000 users — over a hundred times as many test subjects as Mr. Hancock believes he has studied in 18 years as a researcher. And the company didn’t have to drag a single person into a laboratory.

The process was so hands-off, in fact, that the Facebook users did not even know they were test subjects.

Scientifically, the experiment was a success. The researchers were able to show that people who see fewer “negative” words on Facebook tend to post more positive words to the site, and that the inverse is true of people who see fewer “positive” words.

The researchers got their results without creating much of a ripple; at the level of individual users, the effects were vanishingly small. On average, people who saw a decrease in the positive words visible in their news feeds ended up typing 0.04 percent more negative words than the control group. That is the equivalent of four extra sad words for every 10,000.

When the study came out last year, however, the backlash was huge. Mr. Hancock and his co-authors faced a deluge of negativity far worse than anything they had prescribed in the study. It included not just professional criticisms but also hate mail and threats. Facebook’s platform offered a reach beyond what previous generations of social scientists could have dreamed of, but it also created enmity at a dizzying scale. The experience was traumatic enough that even now, almost a year later, Mr. Hancock is reluctant to talk about it.

Other scientists rushed to defend the study. “The vitriolic criticism of this study could have a chilling effect on valuable research,” wrote Michelle N. Meyer, a bioethicist, in Nature.

Duncan J. Watts, a former Columbia University sociologist who now works for Microsoft Research, pointed out that companies are always manipulating people’s emotions on the sly. “The only difference between the Facebook study and everyday life,” he wrote in The Guardian, “is that the researchers were trying to understand the effect of that manipulation.”

The legacy of the Facebook study might have more to do with research ethics than with social science. The paper has been cited more than 100 times since it was published in June, according to Mr. Hancock, and more than half of those citations refer to the ethical concerns it raised rather than to its insights on emotional contagion. The Cornell professor has accepted several invitations to talk about the ethical implications of “computational social science” — research that plumbs large data sets, often collected from people using commercial products and services. “I want to help with that conversation,” he says. “Talking about it seems like an important way to move forward.” (In October, Facebook introduced new policies aimed at handling some of the issues raised by the study.)

One important lesson from the study became apparent only after the analysis was published: “We have this giant study — super impersonal, machines are doing all the analysis,” says Mr. Hancock. And yet the backlash suggests that “these big data-science studies feel much more personal than traditional science in a lab at a university.”

Steve Kolowich writes about how colleges are changing, and staying the same, in the digital age. Follow him on Twitter @stevekolowich, or write to him at steve.kolowich@chronicle.com.

Read other items in this The Digital Campus: Tech Innovators 2015 package.
Steve Kolowich
Steve Kolowich was a senior reporter for The Chronicle of Higher Education. He wrote about extraordinary people in ordinary times, and ordinary people in extraordinary times.