On July 16 I attended the second day of the open meeting of the Secretary's Advisory Committee on Human Research Protections (SACHRP, pronounced sack-harp) in my hometown of Arlington, Virginia. This was the first time I have observed such a meeting, and I am sure there is much I missed for want of context. But in this and following posts, I will record a few impressions.
The most interesting part of the meeting came at the end, when the committee's chair, Samuel Tilden, invited committee members to participate in "a systems level discussion" of today's human subjects protection regime. Not all committee members offered comments, and I was disappointed that anthropologist Patricia Marshall, the sole social scientist on the committee, did not do so. But the members who did speak displayed a range of viewpoints.
The most enthusiastic advocates of the status quo were Jeffrey Botkin and Daniel Nelson. Botkin described himself as an "unabashed advocate of current system." He noted that IRBs rose in response to documented abuses in medical research, such as those detailed by Henry Beecher in 1966 ["Ethics and Clinical Research," New England Journal of Medicine 274 (16 June 1966): 1354-1360]. Today, he argued, most researchers know the rules. While the system may let an occasional unethical project slip through, there is no "hidden underbelly of unethical research."
This is an important point, and I remain agnostic about whether IRBs are appropriate for medical research. But I am also sure that Dr. Botkin understands that even beneficial drugs can have nasty side effects, and that he would not prescribe the same drug to treat all ailments. I would be interested to know what he considers the social science analogue to Beecher's article. For if we are to judge today's system by its ability to avoid documented problems of the past, we need to know what we are trying to avoid for every type of research we regulate.
Nelson declared that the "Subpart A Subcommittee" he co-chairs decided early in its existence that "there is general consensus that the Common Rule is not 'broken.'" Yet in his system-level talk, he conceded that the power granted by the Common Rule to local IRBs results in arbitrary decisions (he called this "variability") and "well-intended overreaching." He noted that the only sure way to eliminate all risky research is to eliminate all research.
Other committee members, while not calling for changed regulations, were more explicit about current problems. Lisa Leiden, an administrator at the University of Texas, has heard from a lot of upset faculty, and she is looking for ways to relax oversight. This would include "unchecking the box," that is, declining to promise to apply federal standards to research not directly sponsored by a Common Rule agency. Without going into specifics, she suggested that the federal standards are too stringent, and that the University of Texas system, if freed from them, would craft exemptions beyond those now offered by the Common Rule. Overall, she is looking for ways to move from a "culture of compliance to one of conscience."
Liz Bankert, Nelson's co-chair of the subcommittee, also showed her awareness of the overregulation of social research, and her frustration with IRBs' emphasis on regulatory compliance. "I've gone to IRBs all over the country," she reported. "They are thoughtful, sincere, really intelligent groups. To have all this brainpower sucked into the vortex of minimal risk research is not efficient." It also contributes to what Bankert sees as a lack of mutual respect between IRBs and researchers. She blamed the problems on a "fear factor which has been developing over the past several years."
Both Leiden and Bankert implied that it was the interpretation of the regulations, not the regulations themselves, that caused the problems they have identified. Without saying so explicitly, they seemed to blame the OPRR of the late 1990s for scaring IRBs all over the country into letter-perfect regulatory compliance, at the expense of research ethics.
In contrast, two committee members seemed willing to reconsider the regulations themselves. David Strauss hoped for a system that was "clinically and empirically informed," terms that no one could apply to the regulation of social research. And he recognized that the regulations are not divine revelation. "We shouldn't be reviewing research that we don't think needs to be reviewed because some folks 30 years ago, at the end of a long, hot day, decided to use the word 'generalizable,'" he explained. "We have to have language that makes sense to us."
Finally, Tilden himself described the Common Rule as largely broken. He noted that the 1981 regulations--which have changed only slightly since--were accompanied by the promise that most social research would not have to undergo IRB review. The fact that so few social science projects escape review, he concluded, showed that the exemption system has collapsed. Rather than try to shore it up again, he suggested that concerns about confidentiality be separated from other risks, and that projects whose only risks involved breaches of confidentiality be evaluated only for the adequacy of their protections in that area.
This last proposal interests me, because when scholars talk seriously about the wrongs committed by social science researchers, they almost always come back to questions of confidentiality. If IRBs were restrained from making up other dangers--like interview trauma--and instead limited to more realistic concerns, they could potentially do some good.
In sum, I did not get the impression that, in Nelson's words, "there is general consensus that the Common Rule is not 'broken.'" Strauss and Tilden, in particular, seem to understand that the present system has wandered far from the stated intentions of the authors of the regulations, and from any empirical assessment of the risks of research or the effectiveness of IRBs. I hope they will continue to think about alternative schemes that would keep controls on medical experimentation without allowing federal and campus officials free rein to act on their fears.