As the press release, "AAAS Meeting Explores Ways to Improve Ethics Panels that Oversee Social Science Research," notes, "from both researchers and administrators alike, there was general agreement that the system can and should work better," but participants disagreed about what a better system would look like.
Social researchers recounted some horror stories, the most vivid of which was Gigi Gronvall's:
Gigi Gronvall, a senior associate at the Center for Biosecurity of the University of Pittsburgh Medical Center, recalled her efforts to do a survey of scientists who do dual-use research in biology, such as work on viruses that might have military as well as civilian application. The IRB at Johns Hopkins University, where the biosecurity center was located at the time, asked Gronvall to give a three-page consent form to all those she interviewed. It included a warning that the respondent, in agreeing to answer Gronvall's questions, ran the risk of being investigated by government agencies, being "exploited by hostile entities," or even being kidnapped.
The IRB's members, Gronvall said, "were totally over-identifying with my subject population." The result was a six-month delay in the survey project, during which Gronvall almost lost her funding. "The IRB should be on your side," she said. "That's not how I felt during this."
Gronvall said that blocking or delaying research on a controversial topic can mean that it will be explored only in the news media, without any IRB-style protections for those being interviewed.
In addition to such personal experiences, some participants warned of fundamental flaws in the IRB system. I described the origins of IRB review as the work of physicians, psychologists, and bioethicists who had no understanding of the methods and ethics of social scientists, and assembled no evidence of widespread abuse by them. Joan Sieber, editor of the Journal of Empirical Research on Human Research Ethics, suggested that the horror stories are not aberrations, but common.
IRBs had their defenders, particularly from scholars and consultants with a background in medicine. Anne N. Hirshfield, associate vice president for health research, compliance and technology transfer at the George Washington University, claimed that "IRBs can only work if there is mutual attention to a common goal—conducting ethical research that protects the rights and welfare of participants." But she also argued that "it is the obligation of the P.I. to know what the IRB needs and to stage the argument. Give the citations to show that your work is not risky." In other words, researchers must prove a negative, while IRBs are free to conjure up scientist-kidnappers.
Perhaps the most impressive presentation was that of Janet DiPietro, associate dean for research at Johns Hopkins University's Bloomberg School of Public Health, who described her efforts to reform the IRB there. If all IRBs were managed by someone as thoughtful as she, we'd have far fewer complaints. But there aren't enough DiPietros to go around, so her achievements do not make a good case for granting administrators nationwide such power.
The press release concludes with a statement from Mark Frankel, staff officer for the AAAS's Committee on Scientific Freedom and Responsibility:
Some Committee members believe that the balance is awry because IRB's are imposing unwarranted and arbitrary demands on proposed research by social and behavioral scientists. This raises serious issues related to scientific freedom insofar as such actions lead to potentially valuable research that is inappropriately altered, unduly delayed, or not done at all.
While the statement is noncommittal about specific policy recommendations, its recognition that IRB review is a threat to scientific freedom is in itself an important finding. I look forward to further AAAS efforts to explore this problem and contribute to its resolution.