Thursday, June 2, 2011

Dreger on Naming Names

Alice Dreger explains why historians are reluctant to promise either anonymity or nonmaleficence:

Real accountability requires real names.

 A colleague of mine writes about this in his most recent book, where he discusses why we historians cannot promise our IRBs that we will not harm our subjects. He points out that sometimes we go into a project pretty much knowing that we're likely to harm some of our oral history subjects, because we're tracking an uncomfortable history where – almost by definition – somebody did some dumb or bad stuff.

It's also really hard to appropriately laud those who did the right thing without naming the names of those who didn't along with those who did. For much needed inspiration and perspective, lately I've been reading the definitive biography of a particular founding father by an historian whose name you would surely know. Through it, I have been reminded how the stirring lessons we take from the history of our brave and wise founders are made possible by knowing who exactly said and did what to whom. We need to know the names of the cowards and traitors to really appreciate the heroes and martyrs.


Eloquent, but it's still hard to beat Tacitus: "This I regard as history's highest function, to let no worthy action be uncommemorated, and to hold out the reprobation of posterity as a terror to evil words and deeds."

Dreger also throws in a bit of pedagogy:



Years ago, I developed a little bit of fame at a certain Big Ten university for banning the word "society" from my course. I was teaching a course on something and something else, and I had grown weary of my students constantly saying, "Society thinks . . ." or, "Society says . . ." This was my students' way of seeing the world as hopeless in its oppression: Society was to blame for gender discrimination, for oppression of the poor, etc.

As long as Society was to blame, no one was to blame. And no one had to change the status quo, because no one could change Society. Once I forced my students to start naming who exactly thinks or says this or that, their whole view of the world changed. Suddenly they realized who was responsible for promoting this (mis)representation or that ugly norm. And they realized you just had to change the behaviors of those people. Suddenly my students had power. The giant named Society had magically shrunk; the short guy with the slingshot had magically grown.


As my students (especially in HIST 332) well know, the forbidden word in my classroom is "always." Claim that people have always behaved in a certain way, and I can find a point in time for which that claim is false. (The planet is 4 billion years old, you know.) The study of history is the study of beginnings and of endings. This, too, is the study of power: if the world was different once, it can be different again.

6 comments:

Jeffrey Cohen said...

Just a note to remind everyone that there is no requirement that research does not harm subjects. The only requirements are that risks are minimized - that is, the research inflicts the least amount of harm necessary, and that risks are reasonable in relation to the benefits of the research - that is, there is no harm inflicted that is not necessary to obtain important information. Also, there is no requirement that research participation be anonymous, only that subjects are informed about how their identity will be protected, if at all. As long as subjects know that their identity will be revealed and give their consent, then an IRB should have no problem approving the research.

Zachary M. Schrag said...

Dr. Cohen, what would it take to get you to read the Belmont Report?

Anonymous said...

Come on Zack, if you are going to make an argument, make it. Which part of the Report did you have in mind and why? Otherwise, let's just agree that Jeffrey is correct.

Zachary M. Schrag said...

Please see the following two posts, including the comments:

Australian Political Scientist: "Causing Harm . . . May Be the Whole Point"

First, Do Some Harm, Part III: Loosies in San Francisco

Jeffrey Cohen said...

Dr. Schrag, do not make this personal. I was reading and teaching the Belmont Report before you were out of elementary school.

The Belmont Report states that "do not harm" is only one part of two "complementary expressions" of the principle of beneficence, the other being "maximize possible benefits and minimize possible harms." It then goes on to explain, "The problem posed by these imperatives is to decide when it is justifiable to seek certain benefits despite the risks involved, and when the benefits should be foregone because of the risks." Thus, it is clear that the drafters of the report did not intend research to be risk free.

Zachary M. Schrag said...

Thank you for your comment. It is always helpful to quote sources.

I agree that the Belmont drafters did not intend research to be risk free. But that is not to say that they ever accepted the idea that some ethical researchers might deliberately harm their subjects. It seems that the University of California, San Francisco IRB members read the report differently from you, and I can't say I blame them.

Indeed, your understanding of the Belmont Report appears to be at odds with that of some of its authors. In Belmont Revisited, Robert Levine writes, "The principle of beneficence, as interpreted by the National Commission, creates an obligation to secure the well-being of the individuals who serve as research subjects and to develop information that will form the basis of being better able to serve the well-being of similar persons in the future. However, in the interests of securing societal benefits, one should not intentionally injure any individual." This is not consistent with the ethics described by Dreger.

Similarly, Albert Jonsen told me, "We really should have made much clearer distinctions between the various activities called research. The principles of the medical model are beneficence—be of benefit and do no harm. I simply don’t think that that applies to either the intent or the function of most people doing research."

Then there is the reaction of sociologist Albert Reiss, who attended the Belmont conference and contributed a paper to the National Commission. Having unsuccessfully called for the drafters to recognize the value of "muckraking sociology," he later denounced the Belmont Report as "ethical malpractice." [Albert J. Reiss Jr., "Governmental Regulation of Scientific Inquiry: Some Paradoxical Consequences," in Carl B. Klockars and Finbarr W. O’Connor, eds., Deviance and Decency: The Ethics of Research with Human Subjects (Beverly Hills: Sage, 1979), 67]

If you can find an explicit statement by any of the Belmont drafters calling for the "naming the names of those who didn't [do the right thing]," in Dreger's words, I would be grateful for the reference.

What the drafters of the Belmont Report really wanted was for the federal government to answer questions like this rather than leaving folks like you and me to guess what they meant. As Jonsen explains in his contribution to Belmont Revisited, "my colleagues and I fully anticipated that an Ethical Advisory Board (EAB) would be established as a standing agency within the Department of Health and Human Services. We had so recommended in almost all of our reports. We expected that such a Board could be the living oracle of Belmont's principles. Just as our Constitution requires a Supreme Court to interpret its majestically open-ended phrases, and, if I may allude to my own Catholic tradition, as the Bible requires a living Magisterium to interpret its mystic and metaphoric message, so does Belmont, a much more modest document than Constitution or Bible, require a constantly moving and creative interpretation and application."

Canada has revised its TCPS in the light of experience and debate, and in doing so it has explicitly recognized the value of critical inquiry. Australia has pledged to revisit its ethical guidelines periodically, giving researchers there the hope that they can amend the National Statement to match their principles. But in the United States, we are stuck with the Belmont Report, ambiguous in its language, inconsistent with federal regulations, in conflict with the ethics of social science, and impervious to change. It is time to rethink it.