I hear both colleagues and students ask this question in a tone of dread: “Does this have to go through the IRB?” Except for the ones who ask it with a sense of grievance. They all hate the idea that an institutional review board gets to decide whether their research plan is good enough to proceed.
And every time, I wish I could just reach over and flick that chip off their shoulders. I’m soft on the IRB, and for a reason. Ours, which primarily deals with social-science and humanities research, has been more helpful to me than I ever expected it to be.
The IRB process in academe derives from legal requirements and is overseen by the U.S. Department of Health and Human Services. Each institution is allowed to tailor the IRB requirements; the federal guidelines are a floor, not a ceiling. If you want to find out what’s required of you, your university has a webpage for that — here’s ours. If you’re doing research that involves human subjects (i.e., people), study that webpage and walk yourself through the checklist to see what you have to do.
Or you can do what a lot of us do (even though we will deny it): Call a staff member. At American University, where I teach, that person is Matt Zembrzuski, our IRB administrator (officially: research compliance manager), whose patience appears to be unfailing.
When I started working on copyright issues a decade ago, with my colleague Peter Jaszi, I entered IRB territory. That’s because we were surveying and interviewing creative colleagues on how they did their work, given their understanding of copyright. We had a pretty open protocol — a general set of questions — but we expected the conversations to be wide-ranging. We encouraged people to think and talk about, among other things, when they thought they might have violated copyright law, and when and why they would flout the law if they could.
The two basic questions facing campus IRBs are: How is the potential harm to the human subjects balanced by the larger public good to be gained from the research results? And how can researchers mitigate any potential harm?
Both Matt and IRB members at my university helped me identify the potential harm posed by our project and then find ways to mitigate it. We all agreed that the biggest potential harm for interviewees would probably be to their reputation (among peers and potential grant agencies) if it got out that they had done something illegal — or even something legal but outside current professional practice. Your publisher, grantor, or supervisor may not want to know that you’ve been wiggling around copyright requirements. Another potential harm was that interviewees could get into legal trouble for admitting to anything illegal.
We worked out a plan. Our research team gave interviewees an informed-consent form that said in simple words what we were trying to find out and why, why we valued their time, what we thought the risks were, and how we would deal with those risks. We promised not to use their names in any published material unless they wanted to be named (a surprising number did). We kept all the information on a password-protected project-management site (Basecamp), and we deleted all the data after we completed our research.
In fact, I don’t remember hearing any self-incriminating stories in 10 years of research. But people sometimes mistakenly thought that what they were doing was dangerous, or believed that legal activities were illegal or could be viewed that way. They usually assessed their own risk at much higher levels than we would have.
So the ritual of beginning with the informed-consent form and assurances of confidentiality became a kind of stamp of legitimacy for us. It calmed fears at the start. A few interviewees wanted even more reassurances — for instance, some wanted no record of their names even in passworded documents. Some regarded the informed-consent form as an irritating formality. But all of them got a clear signal at the start of our work together that we were conscientious and considerate professionals.
After that experience, I wanted to know how our IRB approaches social-science work, which often is very different from the work for which IRBs were originally designed. Like those of other review boards, the members of American University’s IRB come from across the campus, so that the board can draw on a wide variety of research practices and experiences. The members I spoke with all described serving on the board as personally rewarding and as an opportunity to give what one of them, Derrick L. Cogburn, an associate professor of international relations, called “meaningful peer review.” He urged researchers to get over their fear of IRBs and embrace the collegiality.
Here are their collective answers to some common questions about IRBs. (All of the academics quoted below are members of my university’s IRB.)
What are the most common mistakes the IRB sees?
Some social-science researchers underestimate risks to their subjects, especially social risk — shaming, reputation damage, exposure. “I wish they would put themselves in the shoes of those participants and ask: Does this feel right?” said Cristel Russell, an associate professor of marketing.
Molly O’Rourke, a public-opinion researcher in our communications school, said that thinking about social risk is important not only when the project involves children, prisoners, the disabled, and other “vulnerable” populations. It’s also important for subjects who are “described as ‘standard’ respondents — as if there is a whole population of people who don’t have any vulnerability.”
The IRB sees this mistake more in ethnographic-style research, according to Matthew Wright, an assistant professor of government, and less in fields like psychology, where researchers have years of experience with IRB processes.
Sometimes researchers minimize the risks posed by their work — especially in the “minor risk” category — apparently in hopes of avoiding red flags that could delay or derail their research plans. Nathaniel Herr, an assistant professor of psychology, noted, “This usually prompts the IRB members to ‘fill in the blank’ and come up with possible minor risks. This leads to a back-and-forth with the researcher that is inevitably resolved positively but slows down the process.” Herr thought that having to fill out a form detailing the risks was good: “At the very least, it helps to humanize the research process, particularly when (as in internet research) researchers will not be in personal contact with participants.”
IRB members generally agreed: The review process goes more quickly when they can see that researchers have really thought about the risks. They also wanted researchers to think carefully about how seemingly ordinary questions may be challenging for subjects who carry a lifetime of associations.
“For example, asking about the number of children someone has or marital status seems very standard — but not for a respondent who lost a child or is in the process of a painful divorce or separation,” said O’Rourke, the public-opinion researcher. “Appreciating that and writing research instruments that reflect that is hard but important.” She added: “I will never forget moderating a focus group about terrorism/national security in 2002, and I had someone in the session whose brother was killed in the World Trade Center on 9/11. … We somehow missed it in the prescreening. Having been invited to a focus group about ‘policy priorities for our country,’ she (rightfully) felt misled.”
Likewise, researchers need to think about whether they really need all the information they are asking for. Gwendolyn Reece, an associate university librarian, said, “If your study is about test anxiety, you better be able to justify why you are asking someone if they have a history of sexual assault.”
Herr urged, “Don’t ask about potentially psychologically challenging topics like abuse history, illegal activities, or controversial opinions if you don’t have a plan for how you would use such data.”
How often does the IRB see poorly crafted applications?
The least prepared submitters are students. Faculty advisers need to coach their students better on the process, IRB members agreed. But by and large, they were satisfied with the quality of the research plans.
How often does the IRB reject a project?
So far, never, according to our IRB members. But one student did give up on a deeply flawed research project.
How long does the review process take?
This is what professors and students most fear: delay. And the reason is often that they did not budget enough time for IRB review.
At American University, scholars usually receive an initial response from the board within two weeks of submission. Exemption requests are usually handled within 10 days, expedited requests take three weeks, and full board approval depends on the situation but is, on average, four to five weeks.
What slows it down?
For one thing, people not reading directions. (Some things are universal.) The review process is also delayed when a research proposal uses mixed methods. In such cases, the researcher needs to examine the potential risks and possible mitigations for each method; if that hasn’t been done clearly and separately, the application can get confusing.
Researchers also might want to volunteer to attend the committee meeting. “A lot of our questions,” said Reece, the librarian, “really boil down to whether or not we have confidence that the researcher understands enough about the context to be able to manage the unexpected issues that come up in the field” — a concern that applies especially to junior researchers.
What can you do to move your research plan through the IRB process quickly?
The committee has a few standard recommendations:
- Write good research protocols — be clear, and don’t overreach.
- Make sure the informed-consent forms you will give to your human subjects include your contact information.
- Promise confidentiality to your subjects.
- Explain clearly how you are going to guarantee confidentiality — by meeting in private places, using pseudonyms in published work, and not linking research data to people’s actual identities.
- Use password-protected websites to protect the data.
- Delete the data afterward to protect the subjects from future claims (for example, in lawsuits).
- Detail your data-security plan. “Most survey software has built-in security,” said Herr, the psychologist, “but that is not enough. Researchers will be downloading, copying, and sharing their data with others — all of which needs to be addressed. Having a plan for secure local storage is important.”
- Don’t overclaim. If your research method cannot guarantee anonymity, don’t pledge it. Instead, pledge confidentiality.
- Mandatory reporting laws may conflict with confidentiality and privacy. Have a plan for that.
It’s easy to get frustrated by the bureaucracy of the review process. But don’t forget that this is a safeguard not just for your subjects, but for you, too. As Zembrzuski said, “The goal of the IRB is to protect the subjects of research, but it also protects the researchers themselves. The IRB helps look for risky behaviors.” With luck, the board will spot such behavior before you engage in it.