More than 140 collaborators of Francesca Gino, the Harvard Business School professor who has been accused of data fabrication, have been scrambling to verify the research that they’ve published with her. On Monday, they started making their findings public.
The mass self-auditing effort, called the Many Co-Authors Project, has already initiated the retraction of at least one paper that Gino collected data for, according to one of her collaborators.
As of Monday, the database listed 56 papers in which Gino was named as having been involved in collecting data for at least some of the experiments. For about 60 percent of those papers, every other co-author who had responded as of Monday said they did not have access to the raw data, according to a Chronicle review.
In a statement posted online on Monday, Gino said that she felt unfairly singled out by the Many Co-Authors Project.
“Like all scholars, I am interested in the truth. But auditing only my papers actively ignores a deeper reflection for the field,” she wrote. “Why is it that the focus of these efforts is solely on me?”
Over more than a decade, Gino rose to become one of Harvard Business School’s most visible scholars: a prolific researcher, a consultant to Fortune 500 corporations, a sought-after speaker, and a go-to expert for news stories about workplace management and behavioral issues, including dishonesty. Then in June, Harvard put her on administrative leave — stripping her of her salary and endowed faculty title and barring her from campus — after finding she engaged in research misconduct. Its investigation was sparked by a trio of business-school professors who identified signs of data fabrication in four of Gino’s papers, all of which have been retracted.
Gino has strenuously denied ever fabricating data. She is suing the professors, who blog together under the title Data Colada, as well as Harvard University and its business-school dean, seeking $25 million and claiming that the defendants defamed her and that she was investigated and punished in ways that violated Harvard policy.
Since news of the allegations broke in June, several of Gino’s collaborators have been trying to pin down which papers she collected data for and whether her co-authors have access to it. Questionnaires were sent to 143 co-authors who share credit with her on about 140 papers (including 16 for which no original data was collected), according to the newly launched website.
The paper of Gino’s that is reportedly slated to be retracted found that people can reduce their anxiety and perform better on anxiety-inducing tasks, like singing in public, by performing a specific series of behaviors framed as rituals. The findings were published in 2016 in Organizational Behavior and Human Decision Processes.
Juliana Schroeder, an associate professor at the University of California at Berkeley’s Haas School of Business who co-authored the paper, wrote on the Many Co-Authors Project site that the data for four of the paper’s experiments could not be tracked down, including the pilot study, which Gino and a lab manager collected the data for. In addition, “unexplained issues” were identified in two other data sets, one of which was also characterized by “uncertainty regarding the data provenance,” Schroeder wrote, adding that “we are currently in the process of retracting this paper.” The journal’s editor in chief — herself a collaborator of Gino’s — did not return a request for comment.
In her statement on Monday, Gino wrote — without specifying which paper or scientist she was referring to — that a researcher had falsely claimed that Gino had collected data for a paper, and that the rest of the team had decided to retract said paper using “ambiguous language” about who was responsible. “The project suggests an intent to introduce a rigorous evaluation of my work but lacks guardrails to protect me without a mechanism to validate the statements or claims by co-authors,” she wrote.
She added that she and many of her collaborators are being held to an unfair standard for not keeping years-old data, when “data sharing practices have changed over the last decade or so.”
And Gino argued that the project has “potential for it to be biased,” given that one of its organizers, Uri Simonsohn of Ramon Llull University, is one of the Data Colada bloggers and is named as a defendant in her lawsuit. Simonsohn and others behind the Many Co-Authors Project did not return requests for comment.
Schroeder, a Many Co-Authors organizer who declined to comment, also reported that two other papers co-authored with Gino, for which Gino did not collect data, were found to have minor errors that are being corrected. A third paper did contain some data collected by Gino, which could not be tracked down, according to the co-authors; other aspects of the paper were also found to contain errors. All three appeared in the Journal of Personality and Social Psychology. Asked for comment, an editor said that all correction processes are confidential until they have been finalized.
So far, underlying data is available for relatively few of the papers — often because Gino’s collaborators have indicated that they do not have it. Of the 120 papers listed as of Monday that relied on original data, around 20 included links to downloadable data sets.
Lamar Pierce, a professor of strategy at Washington University in St. Louis’s Olin Business School, reported that he had replicated one paper he had co-authored with Gino, had almost finished replicating another, and planned to replicate a third. He could not locate data for a fourth paper. Concerning two of those papers, he wrote, “I will not be publicly posting the data or replication packet because I have concerns that evaluations will not be well-adjudicated in the current public sphere,” but added that “fair and reasoned” scholars could contact him directly for it. (He also noted that he is “agnostic to the truth of allegations against Francesca Gino.”)
Pierce told The Chronicle that he’d struggled with what to write on the project’s website right up to the deadline. “When you put new information out into an environment where people are looking to accuse and where people have very strong priors and aren’t open-minded about things, you don’t get an unbiased evaluation,” he said. He added that while he thinks the Many Co-Authors Project has had some good outcomes — like allowing scholars to clarify which papers Gino did not contribute data to — it may also have “a lot of unintended consequences.”
Other collaborators shared different discoveries, explanations, and questions that came up in their self-audits. One posted a string of emails showing her, as she wrote, “attempting to obtain access to the raw data used in the only paper I ever published with Dr. Gino.” Some posted detailed plans to replicate their research, writing that by the end, “We will be able to confirm not only that Fran did not commit fraud on this paper, but that none of us did.” And some defended their findings as having been “conceptually” replicated by other research teams.
A few collaborators had not responded to the questionnaires by the time the site went live, while others indicated that their responses were forthcoming but not yet public. Still others gave answers that in turn raised more questions. Dan Ariely, a professor of psychology and behavioral economics at Duke University, reported that he didn’t know whether Gino had collected data for 10 studies they worked on together, including one where he is the only other author named. Ariely did not return a request for comment.
James Heathers, a data watchdog who was not involved with the Many Co-Authors Project, said that such responses made him worry that some academics were not taking the situation as seriously as they should be.
“I think that I’m a little bit unsettled by the attitude that seems to be coming through where some of them seem to be treating it as a kind of box-checking exercise,” he said.
But he added that he was hopeful that the site would continue to evolve. “We’re talking about more than 100 papers and more than 100 people that are dealing with something that’s obviously professionally difficult for them,” he said. “What I’m hoping is this forms a template for some broadly ongoing transparency around this particular set of results.”