Percolator: Why Lies Often Stick Better Than Truth

Research that matters.


By Josh Fischman
September 20, 2012
A child cries as a nurse applies a vaccine in a health center during a National Immunization Day in Managua, on March 20, 2012. (Elmer Martinez, AFP, Getty Images)


There is no good reason to believe vaccines cause autism. A 1998 paper in The Lancet that championed the link was immediately pilloried and later withdrawn as fraudulent. Its author, the British physician Andrew J. Wakefield, was found guilty of dishonesty and abuse of developmentally disabled children by the British General Medical Council. He has been stripped of his medical license. No other researcher has been able to replicate his work, and journals have retracted his other papers. The Centers for Disease Control and Prevention, the National Academy of Sciences, and many other groups found no evidence of a link.

Yet surveys in 2002 found that as much as 53 percent of the public believed there was good evidence on both sides, as did a good number of health professionals. Politicians also bought in. In 2008 the presidential candidate Barack Obama said, “We’ve seen just a skyrocketing autism rate. Some people are suspicious that it’s connected to the vaccines … The science right now is inconclusive.” His rival, John McCain, said that “there’s strong evidence that indicates it’s got to do with a preservative in vaccines.” And in 2011 Web sites were still reporting that vaccine injury cases showed evidence of autism.

There are also many people who, even after seeing President Obama’s birth certificate, believe he was not born in the United States. And many doubt there is global warming, despite an overwhelming scientific consensus that things are heating up. Why do we like our slanted information and outright lies so much?

Because rejecting them is hard work, say psychologists in a new article in Psychological Science in the Public Interest. Making a cognitive shift means rethinking already-held beliefs. It’s much easier to slot evidence into ideas we already hold, says Stephan Lewandowsky, a professor of psychology at the University of Western Australia and an author of the report.


That’s not a new discovery. More interesting, however, are the strategies the psychologists recommend for breaking through the fog of disbelief. You need to find an alternate explanation that fits the same basic facts, says another report author, Colleen M. Seifert, a professor of psychology at the University of Michigan at Ann Arbor. Misinformation persists when “you don’t have an alternative account that works as well as does the wrong one,” she explains by e-mail.

She examined reactions to a report of food poisoning. “This is a true example,” she writes. “Suppose you hear of a family of four who died after eating at Golden Gate Chinese Restaurant. The authorities investigate and release the information that food poisoning was not the cause. Do you go out for Chinese tonight?

“You know it is not true,” she continues, “but it’s such a good explanation for what happened that you fall back on it even while knowing it is in error. So you say, ‘It was not food poisoning, but let’s have Italian tonight.’”

In her studies, the only way to get people to let go of such an idea was to give them a plausible alternative. So in fact, “the family was found to have suffered carbon-monoxide poisoning,” she notes. “Now I have an account that is wholly satisfactory and explains the circumstances, and now I am happy to eat at Golden Gate.”

One of the worst ways to offer alternatives, though, is to repeat the bad information while doing so. “There is a risk that repeating an association can make it stronger in memory,” says Ullrich K.H. Ecker, another author, in an e-mail. “Saying that ‘it’s incorrect that the flu vaccine has major side effects’ repeats and hence potentially strengthens the link between ‘vaccine’ and ‘side effects’ even though it negates it,” notes Ecker, an assistant professor of psychology at the University of Western Australia. His research and that of others have demonstrated this. Much smarter, he says, is to stick to the alternative, talking about the safety of the vaccine.


Seifert adds that the backfire effect is very common. “In later studies, we tried overemphasizing the fact that the information is wrong,” she writes. She tried saying to her skeptical Chinese-restaurant patrons that “‘it was definitely not food poisoning.’ But that made people more suspicious of the truth of the information, and didn’t make them any less likely to use it.”

Lewandowsky and another co-author, a researcher at the University of Queensland named John Cook, have collected those strategies and cautions in a publication for academics and science communicators called The Debunking Handbook.

None of this is easy, cautions Edward Maibach, director of the Center for Climate Change Communication at George Mason University. If it were, misinformation would diminish and facts would win in the marketplace of ideas. But, he wrote in a commentary in the same issue of the journal, the report from Lewandowsky and his colleagues does a great service by showing why some distortions are so “sticky” in our minds.

