A Professor Once Targeted by Fake News Now Is Helping to Visualize It

By Fernanda Zamudio-Suarez
December 22, 2016
Filippo Menczer, a professor of informatics and computer science, and colleagues at Indiana U. at Bloomington have built a tool called Hoaxy that demonstrates how fake news spreads on social networks.
Indiana U. at Bloomington

In 2014, Filippo Menczer, a professor of informatics and computer science at Indiana University at Bloomington, felt the effects of fake news. A partisan website used a sentence from one of his abstracts out of context, and the spread of misinformation took flight.

Long before he fell victim to fake news, Mr. Menczer studied how and why information spreads online. Now a new tool called Hoaxy, built on work led by Mr. Menczer, who is also the director of the Center for Complex Networks and Systems Research at the university’s School of Informatics and Computing, aims to show people what sites are spreading fake news and what they are posting about.

With the tool, users type in keywords and the site presents news articles — some of them fake — along with how many times the stories have been shared on Twitter and Facebook. Users can also see visualizations of how such articles spread on social media.
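
For readers who want a feel for what that kind of lookup involves, here is a minimal Python sketch of a keyword search against a Hoaxy-style service. The endpoint URL, query parameters, and response fields are placeholders invented for illustration; they are not Hoaxy's actual API.

    # Minimal sketch of a keyword search against a Hoaxy-style service.
    # The endpoint, parameters, and response fields are hypothetical
    # placeholders for illustration, not the real Hoaxy API.
    import requests

    def search_claims(keyword):
        resp = requests.get(
            "https://example.org/api/articles",  # hypothetical endpoint
            params={"query": keyword},
            timeout=10,
        )
        resp.raise_for_status()
        for article in resp.json().get("articles", []):
            yield {
                "title": article.get("title"),
                "site": article.get("domain"),
                "shares": article.get("share_count", 0),
            }

    if __name__ == "__main__":
        for item in search_claims("climate"):
            print(f'{item["shares"]:>6}  {item["site"]}  {item["title"]}')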

Mr. Menczer spoke with The Chronicle about how the project came about. The following conversation has been edited for length and clarity.

Q. When did you first become interested in fake news?

A. I’ve been interested in the abuse of social media and social networks and the web for at least 10 years. About 10 years ago we started some experiments looking at how people might fall for manipulation or deception through social networks. We also looked at how the web could be manipulated by creating fake content: how the combination of the financial incentives of ads and social media’s ability to get people to pay attention might lead to people creating junk and directing traffic toward that junk.

Around 2010 we started focusing on social media because it had become a prevalent medium for this kind of information. At that time we started looking at abuse in terms of social bots. Basically, these are accounts controlled by some entity, either directly or through software, to give the appearance of something. [For example,] to help spread misinformation, or to make it look like some opinion or some person has more support than they actually have. We spent the last few years building tools to help people understand how misinformation spreads and how you can detect it.

Q. You co-wrote a paper about fake news and people being manipulated on the internet. This was before people started talking about fake news after the election. Did you sense it would be a problem this election?

A. We think fake news started becoming really widespread around 2014. There were midterm elections in 2014. And we were even the target of a misinformation campaign about our own research in the summer and fall of 2014.

A hyperpartisan site posted an article that took a sentence out of context from a National Science Foundation abstract, from a grant that we had from the NSF. We were talking about building models to study the spread of misinformation on social media. They made it look like this was some kind of ominous, secret government program of the administration to spy on you, to suppress free speech, and to go after conservative accounts.

None of this was true, and a lot of people believed it. It made news on Fox News, and eventually even some politicians started talking about it. It was a big story and there were multiple waves, and during each wave this misinformation would be embellished and the story would mutate a little bit. It was mixed with fake news and propaganda.

The stories were debunked very quickly. We noticed that people who wanted to believe this were impervious to this debunking and fact checking. It kept going until the election.

Q. Where did the idea for Hoaxy come from?

A. We have been working on tools to visualize the spread of information in general. We have a set of tools called OSoMe, Observatory on Social Media, where we have something similar. You can track and observe the spread of information about a hashtag on Twitter, for example.

We realized that instead of tracking hashtags we should track links to fake news websites. Another part of the inspiration came from another project, called emergent.info. That was done by Craig Silverman, who is now a journalist at BuzzFeed. He did this project, Emergent, where they would look at claims that were spreading online and label them as correct or false, and then they would track how many people would share them. It was all done manually. You could only see how many people would share them, not who was sharing them and how they were spreading. That’s when we started thinking we should try to do this automatically. That’s how Hoaxy came about.
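
The step from counting shares to seeing how a story spreads is essentially the step from a tally to a diffusion network, with accounts as nodes and an edge from the account a story was reshared from to the account that reshared it. Here is a minimal sketch of that idea, assuming a made-up list of reshare records rather than any particular data source:

    # Sketch: build a diffusion network from reshare records.
    # The records are invented for illustration; they are not Hoaxy data.
    import networkx as nx

    reshares = [
        {"source": "origin_account", "resharer": "user_a"},
        {"source": "origin_account", "resharer": "user_b"},
        {"source": "user_a",         "resharer": "user_c"},
    ]

    graph = nx.DiGraph()
    for r in reshares:
        # Edge direction follows the flow of the story: source -> resharer.
        graph.add_edge(r["source"], r["resharer"])

    # A plain tally only gives the total; the graph also shows who the
    # hubs are and how far the story traveled from its origin.
    print("total reshares:", graph.number_of_edges())
    print("top hubs:", sorted(graph.out_degree(), key=lambda d: d[1], reverse=True)[:3])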

Q. Do you want people to use this as a fact-checking tool or to show there is a lot of fake news?

A. Hoaxy is not a fact-checking tool. We let people search through claims that could come from any number of websites that often post misinformation, as well as from fact-checking sites, so people can see both claims and fact checking. We don’t make a judgment as to whether something is true or false. There’s no technology right now that can look at a piece of text and say that it’s true or false. The idea is to study and observe and analyze how this fake news spreads, but it’s not about fact checking.

Q. How do you plan on expanding it or changing it in the future?

A. We’re pretty happy about the tool as it is right now. One way we already know we want to improve it is to come up with some kind of list of top spreaders of both misinformation and fact-checking information. We could look at people based on their activity. So, maybe you have an account that tweets thousands of times, and it’s probably a bot.

We could also look at influence. Like, maybe there is an account that has a lot of followers and doesn’t post thousands of times, but because it has a lot of followers it is influential and gets retweeted a lot. It acts as a big hub for these claims. It may not be so active, but it can be extremely influential.
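
Those two criteria, raw activity versus influence, can be made concrete with a small sketch. The sample posts below are invented for illustration, and the simple counts stand in for whatever measures the team eventually chooses:

    # Sketch: rank accounts by activity (how often they post a claim)
    # versus influence (how many retweets their posts earn).
    # The sample data is invented for illustration only.
    from collections import Counter

    posts = [
        {"user": "bot_like_account", "retweets": 0},
        {"user": "bot_like_account", "retweets": 1},
        {"user": "bot_like_account", "retweets": 0},
        {"user": "big_hub_account",  "retweets": 250},
    ]

    activity = Counter(p["user"] for p in posts)   # posts per account
    influence = Counter()                          # retweets earned per account
    for p in posts:
        influence[p["user"]] += p["retweets"]

    print("most active:", activity.most_common(2))        # high volume suggests a bot
    print("most influential:", influence.most_common(2))  # big hubs, even with few posts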

Q. What can be done in the classroom to educate students to help stop fake news?

A. This semester I taught a class on network science, and I had a really great discussion in that class about fake news. Just having these kinds of discussions is great. Students are smart. You don’t want to tell them what is true and what is not. It’s important to educate them about their vulnerability and about literacy in trying to make an assessment of what they look at. And what is your responsibility, not only in terms of what you believe but in terms of becoming a perpetrator of the abuse.

Like when somebody clicks on something, just based on the headline, and shares it without even reading the article, let alone checking it, they are not just the victims but they become the perpetrators. Because now they spread it to a lot of other people, who are more likely to believe that something is true because they are receiving it from a friend.

Raising awareness of how we can end up as perpetrators, and of how the information that we are exposed to is not an unbiased sample, is important. It’s carefully biased and selected, because we are using social media that are more likely to expose us to things that we like or agree with, or that come from friends with similar opinions. We live in a very biased information environment. As people become more aware of this, they become more aware of their own biases and their own vulnerability.

Fernanda Zamudio-Suárez is a web writer. Follow her on Twitter @FernandaZamudio, or email her at fzamudiosuarez@chronicle.com.

Clarification (12/22/2016, 2:50 p.m.): This article originally quoted Professor Menczer as saying that he had taught a class on “metric science.” He has since clarified that the class was on “network science.” The article has been updated accordingly.

We welcome your thoughts and questions about this article. Please email the editors or submit a letter for publication.