The core problem with the Common Rule is the IRB's power to treat its insights and risk–benefit calculations as "right answers" that may be imposed at no cost to the IRB upon researchers whose own ethical reflection may have led to different, equally defensible conclusions.
Robert Dingwall concurs in his essay, "The Ethical Case Against Ethical Regulation in Humanities and Social Science Research," 21st Century Society 3 (February 2008): 1-12. Dingwall writes from Britain, but he notes that the ethics committees there look "very like US Institutional Review Boards, and their analogues in Canada and Australia." (4) And British boards, like British rules in general, fail to account for the costs of ethical review.
This has real consequences. Dingwall relates his own experience:
A colleague and I were recently commissioned by the NHS [National Health Service] Patient Safety Programme to study the national incidence and prevalence of the reuse of single-use surgical and anaesthetic devices, and to consider why this practice persisted in the face of strict prohibitions. Part of this involved an online survey, using well-established techniques from criminology to encourage self-reporting of deviant behaviour, so that relevant staff in about 350 hospitals could complete the forms without us ever needing to leave Nottingham. However, a change in NHS ethical regulation meant that we needed approval from each site, potentially generating about 1600 signatures and 9000 pages of documentation. Although we never planned to set foot in any site, it would also have required my colleague to undergo around 300 occupational health examinations and criminal record checks. As a result, we were unable to carry out the study as commissioned and delivered a more limited piece of work. Other estimates suggest that the practice we were studying leads to about seven deaths every year in the UK and a significant number of post-operative infections. The ethical cost of the NHS system can be measured by the lives that will not be saved because our study could not investigate the problems of compliance as thoroughly as it was originally designed to. (10)
This is a stark example, but Dingwall sees it as emblematic of a general drag on social research that has consequences for the future of free societies. Ethical regulation of humanities and social science research, he argues, contributes to "a waste of public funds, serious information deficits for citizens, and long-term economic and, hence, political decline . . . " (10)
Dingwall discounts the need for oversight, arguing that humanities and social science researchers "do nothing that begins to compare with injecting someone with potentially toxic green stuff that cannot be neutralised or rapidly eliminated from their body if something goes wrong. At most there is a potential for causing minor and reversible emotional distress or some measure of reputational damage." (3) I think this takes the case too far. See Sudhir Venkatesh's Gang Leader for a Day for a recent example of a social scientist who seriously hurt people by breaking their confidences. (The book is recent; the incident took place in the early 1990s.) And Dingwall's own research, had it exposed a physician who was illegally reusing devices, would have done irreversible harm to that physician's reputation and career. Rather than arguing that such harms are impossible, Dingwall would be better off arguing that they are a) rare, and b) unlikely to be prevented by the forms of prior review now in place.
The Belmont Report calls for "systematic, nonarbitrary analysis of risks and benefits . . . This ideal requires those making decisions about the justifiability of research to be thorough in the accumulation and assessment of information about all aspects of the research, and to consider alternatives systematically." If we were to hold regulatory regimes to the same standard, we would find ample risks, few documented benefits, and no consideration of alternatives.