Just the Facts?

June 12, 2013

Either of the following two headlines would work to describe a new course on "Statistics in Journalism" that we offered for the first time this past spring at the University of Pittsburgh:

  • "Lackluster Enrollment Plagues New Course" would describe the seven students brave enough to sign up for the one-credit honors class.
  • "Rave Reviews From Students on Groundbreaking Course" would sum up our end-of-semester teaching evaluations.

Those headlines also reflect what we were trying to teach students: Different versions of the truth can sway readers in widely divergent directions. The experience was so eye-opening that we are convinced that a course on the use and misuse of statistics in science-based news coverage would benefit all students, no matter their major. It would not only focus their attention on the importance of accuracy, precision, and word choice in writing but also help them understand the difficulties of arriving at an accurate version of the truth.

We took a cross-disciplinary approach in developing and teaching the course. One of us teaches statistics (Pfenning) and the other teaches nonfiction writing after spending most of her career in journalism (Skrzycki). We designed the course to deal with the weaknesses that can occur both in how journalists present statistical results and how research studies arrive at those results. Our goal was to take students behind the scenes of both professions.

For the first part of the course, we asked students to critique news articles in major publications. By tracking down and analyzing the underlying research, they came to understand the shortcuts and shallow reporting that can plague important stories the public eagerly consumes. For the second half of the term, they took a stab at writing stories based on recently published research, to get a sense of the questions, attributions, and facts that belong in a professional piece of writing.

We started the course with a quick review of basic statistics, and then outlined various approaches to reporting on numbers in the news media. An introductory statistics course was a prerequisite for our class, but we didn't require a journalism course. We provided handouts to guide students on stylistic conventions in journalism.

In the first half of the course, the weekly assignment was for a student to find a news report, delve into its statistical and journalistic strengths and weaknesses (with guidance from us), and give a presentation. Meanwhile, classmates would read the same articles and prepare at least three questions or comments for use in discussion following the student's report.

Week after week, students combed reputable sources such as The New York Times, Time magazine, and online science blogs for relevant news stories. They quickly found disparate interpretations of reality as we discussed stories and research studies on topics such as whether high-heel shoes cause knee problems and whether Ecstasy can help veterans cope with stress.

One student, a geology major with a love of fashion, zeroed in on a Daily Mail article titled "High Heels Are 'Good' for You." She e-mailed us a copy of the article, saying she intended to question the researchers' emphasis on a rather surprising discovery: Women who danced in high heels were less likely to have knee problems than those who did not. She argued that the relationship wasn't part of the researchers' original hypotheses; they had cherry-picked it from the data to sensationalize their results, and the Daily Mail took the bait. We approved the student's project.

Besides the flaw the student noticed, we urged her to consider another problem we had spotted: the limitations of an observational study for establishing a causal connection. There might be a simpler explanation for why women with knee arthritis were less likely to have worn high-heel shoes than women with no pain. Women who do physically demanding work like scrubbing floors tend to have knee problems, and what kind of shoes do they wear to work? Certainly not three-and-a-half-inch stilettos. Similarly, women whose knees are in good shape are the ones more likely to venture onto the dance floor in ultra-high heels.
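That lurking-variable logic can be made concrete with a toy simulation. The sketch below is not the study's data; every probability is invented. In it, heels have no effect on knees at all, yet demanding physical work, which both causes knee problems and makes stilettos unlikely, produces exactly the "protective" pattern the Daily Mail celebrated:

```python
import random

random.seed(0)

# Toy model (all probabilities invented): heels have NO causal effect
# on knees. The one confounder, physically demanding work, raises the
# chance of knee problems and lowers the chance of wearing heels.
def simulate_one_woman():
    demanding_work = random.random() < 0.5
    wears_heels = random.random() < (0.05 if demanding_work else 0.60)
    knee_problems = random.random() < (0.40 if demanding_work else 0.10)
    return wears_heels, knee_problems

n = 100_000
counts = {(h, k): 0 for h in (True, False) for k in (True, False)}
for _ in range(n):
    h, k = simulate_one_woman()
    counts[(h, k)] += 1

# Observed risk of knee problems in each shoe group.
risk_heels = counts[(True, True)] / (counts[(True, True)] + counts[(True, False)])
risk_no_heels = counts[(False, True)] / (counts[(False, True)] + counts[(False, False)])
print(risk_heels, risk_no_heels)  # heel-wearers look "protected"
```

Run it and the heel-wearers show markedly fewer knee problems, purely because of who chooses to wear heels, not because of anything the shoes do.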

After the student presented, the class reached a consensus on what was good and bad in the underlying research, as well as in the reporting. Occasionally the students called on our expertise to resolve questions such as "How do we interpret a relative risk?" or "How much detail is appropriate when a reporter cites quantitative results?"
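The relative-risk question lends itself to a back-of-the-envelope calculation. Here is a minimal sketch with invented numbers, not figures from any study we discussed:

```python
# Made-up 2x2 summary: knee arthritis among 100 women who regularly
# wore heels and 100 who did not (all numbers invented).
arthritis_heels, total_heels = 10, 100
arthritis_flats, total_flats = 20, 100

risk_heels = arthritis_heels / total_heels  # 0.10
risk_flats = arthritis_flats / total_flats  # 0.20

# Relative risk is the ratio of the two risks. A value of 0.5 reads:
# "heel-wearers had half the risk of arthritis in this sample."
# That is a statement about association, not causation.
relative_risk = risk_heels / risk_flats
print(relative_risk)  # 0.5
```

A careful reporter would also note the baseline risks themselves; "half the risk" sounds very different when the risks are 20 percent and 10 percent than when they are 0.2 percent and 0.1 percent.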

Another student began with an article from The New York Times, "Parents' Financial Support May Not Help College Grades." Next she found a news report from The Fiscal Times that took the claim one step further: "When Parents Pay for College, Kids' Grades Could Suffer." Even more forceful encouragement to stop writing tuition checks came from a blog post in Forbes: "Want Your Kids to Succeed? Don't Pay For Their Education."

The student was curious as to whether the original study had also made the leap from association to causation. Could the researcher really control for confounding variables like students' personalities? Wasn't it possible that parents were less likely to give a financial boost to children who were already very self-motivated—the same sort of children who manage to achieve higher grades?

Good questions that all the stories failed to answer. Moreover, they downplayed the more intuitive result that parental aid increases the odds of graduating. Not surprisingly, nobody opted to go with a dud headline like, "Students More Likely To Graduate If They Get Help With Tuition."

Students also discovered in their reporting that the researcher was preparing to publish a book, and that an emphasis on the counterintuitive nature of her findings could help to drum up interest and sales. That taught them to be as skeptical about the source of the information as about the poorly written headlines.

The first half of the course had laid the groundwork for students to take an active rather than passive approach to good reporting of statistical information. They were armed with a wealth of recent examples of common pitfalls in interpreting research findings. They had dissected news stories and found some journalism practices worth emulating and some to be avoided.

In the second half of the course, students played the role of reporter. One student each week chose a research article and wrote a news story about it, to be read aloud and then discussed. Again, as the instructors, we guided the presenting student through the process, and the rest of the class came to the presentation armed with questions and comments.

The lineup of topics was ambitious and intriguing. One student investigated bias in Israeli school textbooks about Palestinians, and vice versa. Others wrote about eating disorders in adolescents; a new technique for making tissue transparent to better view the brain; and evidence that the moon is shrinking (but not by much, we were relieved to hear).

For their final papers, students were charged again with writing an in-depth news story about a study of their choosing. This time, however, they went through the revision process on their own or with help from classmates, not from their instructors. They continued to select studies that sparked their personal interest, such as "Human Face Structure Correlates With Professional Baseball Performance: Insights From Professional Japanese Baseball Players" and "Atherosclerosis Across 4,000 Years of Human History."

By the time the semester was over, we had witnessed an alarming number of serious problems in both journal publications and news stories: conflict-of-interest issues and researchers ignoring obvious confounding variables; reporters focusing on topics purely for their shock value; and a sensationalistic spin put on research results in both academe and the news media.

Students began to discuss what can be done to combat misinformation that grows out of researchers' efforts to advance their careers, or out of flashy headlines and reporting that lead readers to believe half-truths.

Although our one-credit course had a minimal impact on our students' GPAs, it played an important role in their educational experience. It taught them to read skeptically. But it also had more immediate results: A student about to graduate in neuroscience began to consider a career as a science writer; a journalism-focused student realized there were plenty of fascinating stories based on scientific research. As one graduating senior put it, "I thoroughly enjoyed being in the class and have gained more practical knowledge from it than I have from any other course."

Now, that's a headline.

Nancy Pfenning is a senior lecturer in statistics at the University of Pittsburgh, and Cindy Skrzycki is a senior lecturer in the university's nonfiction writing program.