This week David Broockman received his doctoral degree. He also helped persuade one of the most respected political scientists in the country to ask a prestigious academic journal to retract one of its most buzzed-about studies from last year.
The study, published in Science, purported to show that short conversations could change people’s opinions about same-sex marriage, and many leading news outlets, including The Wall Street Journal, The New York Times, and The Washington Post, wrote about it.
Mr. Broockman and Joshua Kalla, both graduate students at the University of California at Berkeley who were impressed with the study, decided to extend the research’s findings, according to an account published by the blog Retraction Watch, which broke the story on Wednesday.
When they found they could not verify that a firm that was said to have collected the data for the study had ever done so, among other “irregularities,” they contacted the study’s authors: Donald P. Green of Columbia University and Michael J. LaCour, a Ph.D. candidate at the University of California at Los Angeles. Mr. LaCour, who had been in charge of the study’s data, could not prove that it was authentic. Mr. Green then promptly asked Science to retract the study.
For his part, Mr. LaCour said on Wednesday: “I’m gathering evidence and relevant information so I can provide a single comprehensive response. I will do so at my earliest opportunity.” Also on Wednesday, Science published an “editorial expression of concern” about the study.
Late Wednesday, Mr. Broockman, who begins an assistant-professor post at Stanford University on July 1, spoke with The Chronicle about the unusual result of his and Mr. Kalla’s decision to take a closer look at the study. His comments have been edited for length and clarity.
Q. This study made a big splash when it was published, in December. The upshot of the study seemed hopeful. One way to read it was to say that empathy, even among strangers, can override people’s deeply held political beliefs. What was your first reaction when you read this study?
A. I was probably one of the most enthusiastic boosters of the study, for a few reasons. I myself am gay. I also have spent a lot of time going door to door for causes that I care about. And we know vanishingly little about how that kind of work should be done, if it works, or how long it lasts.
The study provided incredibly clear answers to those questions. Methodologically, it indicated a way of doing research that made those questions all of a sudden much more answerable. Since I became aware of the results, I had basically changed my own research agenda to try to do some of this work.
Q. Had you talked to Michael LaCour about the design of his study as he was working on it?
A. I don’t believe I spoke with him at that time. I spoke with others who were thinking about the design. But I was aware that this study was going on from the beginning, and supported it enthusiastically until last Friday.
Q. When did you first have doubts that the findings were genuine?
A. The nature of the work that we do as quantitative researchers is that you allow the data to tell you what you think the truth should be. You don’t take your views and then apply those to the data; you let the data inform your views. I don’t think it really crossed many people’s minds that there might be some issues with the data or the procedures.
I think it was early 2015 that I started working with my colleague Josh Kalla on trying to do a set of follow-up studies — finally acting on the enthusiasm I had for their design. We started reviewing aspects of the data just to form our own expectations about what it might take to do a new study. When we looked at those statistics in this study, they just surpassed our most optimistic prior expectations. So that provided some hint. But I just shook it off because it just didn’t seem like even a proper thought to have, that the data would not be accurate.
Q. It sounds as if your own confirmation bias was at work, so it probably took a pretty overwhelming realization to overcome that.
A. These things were kind of in the back of my head, but I kind of put them to rest. On Friday, I finally decided to contact that firm [which had supposedly recruited subjects for the study] and say: Here’s what we’re trying to do, and do you think you would be able to do that? But what they in fact said was: This request is very strange because we don’t do that kind of work, and the person you’re asking for does not exist here.
Q. Is that the moment when you realized, Oh my God, something is very wrong here?
A. Yes. And I think on some level what it took was that combined with the uneasiness we had had in January. And those two things were necessary for us to say, “Holy crap, we need to take a look at the data.”
Q. Donald Green is a well-known and well-respected researcher. How did it feel to approach him and say that you thought his study was invalid?
A. One piece of context is that Josh and I wouldn’t be studying political science if it were not for Don Green. He was our undergraduate adviser at Yale. We trust his integrity more than anything else in this discipline.
Also, I knew that an earlier version of the paper had carried only Michael’s name, and that Don’s name was added later. Mike had done this as his own project and had only later invited Don to help with the data analysis. That is not at all a strange thing. The essence of graduate school is some version of working with faculty in a way where you’re both contributing something to research projects.
But because I knew the data in question had been collected by LaCour, and that Don was not involved at that stage, I knew that if our concerns were true, it would not look as though Don had been involved in any malicious activity. I felt very comfortable going to Don and raising my concerns.
Q. Could the problems with the study have been detected during peer review, or could they have been caught only by researchers who were trying to replicate the work?
A. At most journals, it’s not typical for the replication data to be provided at the same time as the draft research report, because it’s so rare that anyone suspects the data have been altered or that aspects of the data collection have been inaccurately described. One initiative I support, from an academic named Uri Simonsohn at the University of Pennsylvania, is that every paper should include a statement attesting that everything necessary to judge how credible the research is has been described in the paper.
In general I’m still comfortable saying that I’m not going to read academic papers wondering if people made false statements. It’s just that we need all the statements that allow us to understand what was done.
So there are a wide variety of procedural changes that should be made. I don’t think this one isolated incident should prompt an overreaction that leads us to always subject data to a million possible tests. But I do think it should be concerning that something like this could get as far as it did.
Q. Donald Green has said that this shouldn’t lead us to believe that the hypothesis of the study was false, only that it hasn’t been proved. Do you and/or Josh Kalla plan to try?
A. Doing a study that tests this hypothesis convincingly would be a truly heroic endeavor, which is why the study received the accolades that it did. I don’t know if I can promise that Broockman and Kalla are going to soon reveal their version of this because the fact is that studies like this just don’t come around that often. But we will certainly be trying to work on this hypothesis because it’s so important.
Steve Kolowich writes about how colleges are changing, and staying the same, in the digital age. Follow him on Twitter @stevekolowich, or write to him at steve.kolowich@chronicle.com.