Ever since word got out that a prominent Harvard University researcher was on leave after an investigation into academic wrongdoing, a key question has remained unanswered: What, exactly, did he do?
The researcher himself, Marc D. Hauser, isn’t talking. The usually quotable Mr. Hauser, a psychology professor and director of Harvard’s Cognitive Evolution Laboratory, is the author of Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong (Ecco, 2006) and is at work on a forthcoming book titled “Evilicious: Why We Evolved a Taste for Being Bad.” He has been voted one of the university’s most popular professors.
Harvard has also been taciturn. The public-affairs office did issue a brief written statement last week saying that the university “has taken steps to ensure that the scientific record is corrected in relation to three articles co-authored by Dr. Hauser.” So far, Harvard officials haven’t provided details about the problems with those papers. Were they merely errors or something worse?
An internal document, however, sheds light on what was going on in Mr. Hauser’s lab. It tells the story of how research assistants became convinced that the professor was reporting bogus data and how he aggressively pushed back against those who questioned his findings or asked for verification.
A copy of the document was provided to The Chronicle by a former research assistant in the lab who has since left psychology. The document is the statement he gave to Harvard investigators in 2007.
The former research assistant, who provided the document on condition of anonymity, said his motivation in coming forward was to make it clear that it was solely Mr. Hauser who was responsible for the problems he observed. The former research assistant also hoped that more information might help other researchers make sense of the allegations.
It was one experiment in particular that led members of Mr. Hauser’s lab to become suspicious of his research and, in the end, to report their concerns about the professor to Harvard administrators.
The experiment tested the ability of rhesus monkeys to recognize sound patterns. Researchers played a series of three tones (in a pattern like A-B-A) over a sound system. After establishing the pattern, they would vary it (for instance, A-B-B) and see whether the monkeys were aware of the change. If a monkey looked at the speaker, this was taken as an indication that a difference was noticed.
The method has been used in experiments on primates and human infants. Mr. Hauser has long worked on studies that seemed to show that primates, like rhesus monkeys or cotton-top tamarins, can recognize patterns as well as human infants do. Such pattern recognition is thought to be a component of language acquisition.
Researchers watched videotapes of the experiments and “coded” the results, meaning that they wrote down how the monkeys reacted. As was common practice, two researchers independently coded the results so that their findings could later be compared to eliminate errors or bias.
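Comparisons between two independent coders are commonly summarized with a chance-corrected agreement statistic such as Cohen’s kappa. As a minimal sketch of how that comparison works, here is a small Python implementation; the trial codes below are hypothetical examples, not data from Mr. Hauser’s lab:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Inter-rater agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of trials on which the coders match.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    p_e = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical trial codes: 1 = monkey looked at the speaker, 0 = no reaction.
coder_1 = [1, 0, 1, 1, 0, 0, 1, 0]
coder_2 = [1, 0, 1, 0, 0, 0, 1, 1]
print(round(cohens_kappa(coder_1, coder_2), 2))  # prints 0.5
```

A kappa near 1 indicates the coders agree far beyond chance; a value near 0, as a large discrepancy between coders would produce, is exactly the kind of signal that prompts a third, independent coding.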
According to the document provided to The Chronicle, the experiment in question was coded by Mr. Hauser and a research assistant in his laboratory. Mr. Hauser then asked a second research assistant to analyze the results. When that second assistant analyzed the first assistant’s codes, he found that the monkeys didn’t seem to notice the change in pattern; in fact, they looked at the speaker more often when the pattern stayed the same. In other words, the experiment was a bust.
But Mr. Hauser’s coding showed something else entirely: He found that the monkeys did notice the change in pattern—and, according to his numbers, the results were statistically significant. If his coding was right, the experiment was a big success.
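Significance in looking-time studies of this kind is often assessed with a simple paired comparison across trials, such as a sign test. The sketch below shows how that calculation goes; the counts are invented for illustration and are not the numbers from this experiment:

```python
from math import comb

def sign_test_p(successes, n):
    """One-sided sign test: P(X >= successes) for X ~ Binomial(n, 0.5)."""
    return sum(comb(n, k) for k in range(successes, n + 1)) / 2 ** n

# Hypothetical: in 12 of 15 trial pairs the monkey looked longer at the
# novel pattern than at the familiar one.
print(round(sign_test_p(12, 15), 4))  # prints 0.0176
```

Under the usual convention, a p-value below 0.05 would count as statistically significant; with coding this subjective, though, the p-value is only as trustworthy as the codes it is computed from.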
The second research assistant was bothered by the discrepancy. How could two researchers watching the same videotapes arrive at such different conclusions? He suggested to Mr. Hauser that a third researcher should code the results. In an e-mail message to Mr. Hauser, a copy of which was provided to The Chronicle, the research assistant who analyzed the numbers explained his concern. “I don’t feel comfortable analyzing results/publishing data with that kind of skew until we can verify that with a third coder,” he wrote.
A graduate student agreed with the research assistant and joined him in pressing Mr. Hauser to allow the results to be checked, the document given to The Chronicle indicates. But Mr. Hauser resisted, repeatedly arguing against having a third researcher code the videotapes and writing that they should simply go with the data as he had already coded it. After several back-and-forths, it became plain that the professor was annoyed.
“i am getting a bit pissed here,” Mr. Hauser wrote in an e-mail to one research assistant. “there were no inconsistencies! let me repeat what happened. i coded everything. then [a research assistant] coded all the trials highlighted in yellow. we only had one trial that didn’t agree. i then mistakenly told [another research assistant] to look at column B when he should have looked at column D. ... we need to resolve this because i am not sure why we are going in circles.”
The research assistant who analyzed the data and the graduate student decided to review the tapes themselves, without Mr. Hauser’s permission, the document says. They each coded the results independently. Their findings concurred with the conclusion that the experiment had failed: The monkeys didn’t appear to react to the change in patterns.
They then reviewed Mr. Hauser’s coding and, according to the research assistant’s statement, discovered that what he had written down bore little relation to what they had actually observed on the videotapes. He would, for instance, mark that a monkey had turned its head when the monkey didn’t so much as flinch. It wasn’t simply a case of differing interpretations, they believed: His data were just completely wrong.
As word of the problem with the experiment spread, several other lab members revealed that they had had similar run-ins with Mr. Hauser, the former research assistant says. There was, several researchers in the lab believed, a pattern: Mr. Hauser would report false data and then insist that it be used.
They brought their evidence to the university’s ombudsman and, later, to the dean’s office. This set in motion an investigation that would lead to Mr. Hauser’s lab being raided by the university in the fall of 2007 to collect evidence. It wasn’t until this year, however, that the investigation was completed. It found problems with at least three papers. Because Mr. Hauser has received federal grant money, the report has most likely been turned over to the Office of Research Integrity at the U.S. Department of Health and Human Services.
The research that was the catalyst for the inquiry ended up being tabled, but only after additional problems were found with the data. In a statement to Harvard officials in 2007, the research assistant who instigated what became a revolt among junior members of the lab outlined his larger concerns: “The most disconcerting part of the whole experience to me was the feeling that Marc was using his position of authority to force us to accept sloppy (at best) science.”
Update 3:47 p.m., August 20: A letter from Michael D. Smith, Dean of Arts and Sciences at Harvard University, confirms allegations against Hauser, saying, “it is with great sadness that I confirm that Professor Marc Hauser was found solely responsible, after a thorough investigation by a faculty investigating committee, for eight instances of scientific misconduct under FAS standards.”