Sunday, March 18, 2007

My Problem with Anthropologists

(In honor of the discussion going on at Savage Minds, I present some thoughts on the role of anthropologists in the IRB debates.)
In terms of methods, anthropology and oral history seem to have a lot in common. Researchers in both disciplines enjoy learning about other people’s lives by talking to those people, often with a recording device. But the two fields have different ethical approaches, and I sometimes fear this makes anthropologists unreliable allies in the struggle for the freedom of research.
The problem starts with the code of ethics of the American Anthropological Association, which states that

anthropological researchers have primary ethical obligations to the people, species, and materials they study and to the people with whom they work. These obligations can supersede the goal of seeking new knowledge, and can lead to decisions not to undertake or to discontinue a research project when the primary obligation conflicts with other responsibilities, such as those owed to sponsors or clients. These ethical obligations include to avoid harm or wrong, understanding that the development of knowledge can lead to change which may be positive or negative for the people or animals worked with or studied.

Maybe I am missing something, but I don't see anything in the American Sociological Association’s code of ethics, the Principles and Standards of the Oral History Association or the American Historical Association’s Statement on Standards of Professional Conduct that obliges members of those organizations to avoid harm or to abandon the pursuit of knowledge lest someone be hurt. In seeking a balance between truth and inoffensiveness, the anthropologists have gone much further toward physicians’ Hippocratic standard of doing no harm than have their fellow social scientists. This decision may explain some troubling behavior:

1. Knee-Jerk Anonymity

My favorite anthropologist is Kathryn Marie Dudley, author of Debt and Dispossession and The End of the Line. Both books show how Americans struggle to reconcile their faith in the free market with their often conflicting belief that hard work should be rewarded regardless of market demand. That tension is central to many of today’s cultural and political debates, and Dudley did a magnificent job getting Midwestern farmers, teachers, and automobile workers to talk about their beliefs.
My complaint is that having done so, she fabricated names for her narrators and barely explained that decision. (Each book has a two-sentence note stating that she wished to protect the privacy or confidentiality of the narrators, but not why she thought this necessary.)
This has terrible consequences. First, it prevents other researchers from learning more about the lives of the people she studies, the way Dudley’s advisor, Katherine Newman, did by following up on the lives of her informants from a previous book. (Or at least it is supposed to. When I assigned Debt and Dispossession, one of my undergraduates did a quick search of newspaper databases and identified Dudley’s pseudonymous “Star Prairie” and some of its inhabitants.) And second, it suggests that the people who are the subjects of her books are not real, important people the way that other figures in the books—Lee Iacocca and Jesse Jackson—are. Unlike Iacocca and Jackson, their backgrounds need not be explored, and their words need not have consequences.
Most significantly for this discussion, the assumption that anonymity should be the norm contributes to the idea that interviewing is a dirty, dangerous activity. Of course some narrators wish their names to be disguised, and under some circumstances that is appropriate. But to see what happens when anonymity is the exception, not the rule, compare Dudley’s books to historian Leon Fink’s Maya of Morganton, another wonderful study of work in contemporary America. Fink offered anonymity to all his subjects, but the only ones who chose it were powerful executives and lawyers, not ordinary workers. In his book, interviews are opportunities to be heard, not sources of shame.

2. Disciplinary Imperialism

A voice of moderation in the IRB debates is that of Kristine L. Fitch, Professor of Communication Studies at the University of Iowa. (Since she defines herself as an ethnographer, I am including her in this rant about anthropologists. Perhaps that is unfair, but keep reading before you decide.) Fitch has been through IRB fights on both sides. In the 1990s, she writes, “I saw firsthand the aspects of human subjects review that so frustrate social science researchers, particularly those in the qualitative/ethnographic domain: applications full of questions aimed at biomedical research, requirements to obtain written consent despite cultural barriers to doing so, board members who said, in so many words, that their goal was to put a stop to as much research as they could.” (“Ethical and Regulatory Issues,” noted below.)
Fitch joined the University of Iowa’s social and behavioral IRB as chair and developed training materials that focused on the challenges faced by social scientists, in contrast to the medically-oriented materials mandated by most IRBs. At Iowa, researchers have their choice between “a two-hour workshop [focused on social and behavioral research], or completion of the National Institutes of Health (NIH) web-based certification course.” (See Fitch, “Difficult Interactions between IRBs and Investigators: Applications and Solutions,” Journal of Applied Communication Research 33 [August 2005]: 269–276. Apparently social-science ethics require live teachers, while medical ethics can be done in multiple-choice.) To help researchers and IRBs beyond Iowa, Fitch helped develop a CD and online training course called “Ethical and Regulatory Issues in Ethnographic Human Subjects Research.”
It is encouraging to see some training materials built from the ground up for social scientists. Rather than clubbing researchers over the head with more stories of Tuskegee, the CD focuses specifically on challenges in social-science research, such as how to protect privacy when studying sensitive topics, like eating disorders and illegal drug use.
But the CD goes wrong when it lumps in other disciplines with anthropology:

An issue that frequently creates tension between ethnographic researchers and IRBs has to do with translation of the ethical principles outlined in the Belmont Report into interpretations of federal regulations governing human subjects research. Some disciplines, such as the American Anthropological Association, the Oral History Association, and others have established systems of ethical principles specific to the kinds of research most characteristic of their areas. Those ethical principles arise from particular disciplinary histories and have been crafted by respected members of those professions. As such, they are often defended as more relevant, appropriate, and in fact more stringent than the necessarily more distant philosophy of the Belmont Report. Part of the disputable territory between researchers and IRBs that becomes contentious, then, is the distinction between abstract ethical principles and the regulations that spell out particular definitions, distinctions and prohibitions. Although researchers and IRBs would probably agree on ethical principles interaction between them is usually limited to the application of regulations to particular procedures, wording of consent documents, and so forth.

Did you catch the sleight of hand? Because anthropologists “and IRBs would probably agree on ethical principles,” Fitch assumes that “researchers and IRBs would probably agree on ethical principles.” I, for one, do not, and I see nothing in the guidelines of the Oral History Association that conforms to the Belmont Report’s demands for beneficence or what it calls justice. (See “Ethical Training for Oral Historians.”)
Beyond this misperception, I think Fitch is simply naïve about the operations of IRBs. “Ethical and Regulatory Issues,” for example, states that “IRB chairs and board members often have seen firsthand the negative consequences of . . . unanticipated problems.” Talk to researchers, talk to IRB members, read the postings on IRB Forum, and I think you’ll see that IRBs generally make decisions based on guesswork and what other IRBs are doing (in Fitch’s terms, “long and thoughtful discussion among several reasonable people”), not firsthand or scholarly knowledge of the consequences of poor protocol design. If IRBs were required to support each decision with real-life examples of comparable projects gone bad, we would have many fewer restrictions on research, and essentially none on oral history.
And then there’s her claim in “Difficult Interactions” that “university administrators have a stake in human subjects oversight being carried out effectively and should be open to addressing problems within their IRB system. If they are not, the Office of Human Research Protection (OHRP) can be notified of hypervigilant regulation on the part of a local IRB. They can sanction IRBs for over-interpretation or misapplication of regulations when there is evidence that such is the case.” If OHRP has ever sanctioned an IRB for hypervigilance, I would love to hear about it.

3. Submission to the IRB Regime

What really concerns me are anthropologists in government, and here I am thinking of Stuart Plattner. Plattner served for thirteen years as the human subjects specialist for the National Science Foundation, and he worked to moderate some of the claims of IRBs. For example, the NSF’s website, "Frequently Asked Questions and Vignettes: Interpreting the Common Rule for the Protection of Human Subjects for Behavioral and Social Science Research," created on his watch, includes the clearest statement by a federal agency that the Common Rule does not apply to classroom projects.
But Plattner is too ready to apply anthropology’s delicate ethics to other fields. In his 2003 Anthropological Quarterly article, “Human Subjects Protection and Cultural Anthropology,” he complains of “biomedical hegemony,” that is, the imposition of biomedical ethics on other disciplines. Yet in the same article he promotes a sort of anthropological hegemony when he writes, “no one should ever be hurt just because they were involved in a research project, if at all possible.” That’s consistent with anthropologists’ ethical statements, but other disciplines are happy to bring malefactors to account.
Where Plattner really gets scary is in his more recent “Comment on IRB Regulation of Ethnographic Research” (American Ethnologist 33 [2006]: 525–528). There he writes,

The journalist has a mandate from society to document contemporary reality. It is expected that this may involve an exposure of wrongdoing. A reporter’s reason for getting information from a person is to establish what happened. Those who speak to reporters accept the potential for harm that publicity may bring. Social scientists have no such mandate; we document reality to explain it. Our audience is professional, and society gives us no protection in the First Amendment. Our reason for getting information from individuals is to help us explain general processes. A normal condition for an ethnographic encounter or interview is that the information will never be used to harm the respondent.

The line that “our audience is professional” is simply defeatist; my undergraduates enjoyed Dudley’s books, and I hope plenty of readers outside the academy have found them as well. The bit about seeking to “explain general processes” is at odds with historians’ study of contingency, and I hope that other social scientists would reject that notion as well. But for the issue at hand, the key statement is that “society gives us no protection in the First Amendment.”
As a matter of jurisprudence, this is simply false. First Amendment liberties are common to all Americans, and if anything scholars enjoy heightened protection. In the 1967 case Keyishian v. Board of Regents of the University of the State of New York (385 U.S. 589) the Supreme Court held that “our Nation is deeply committed to safeguarding academic freedom, which is of transcendent value to all of us, and not merely to the teachers concerned. That freedom is therefore a special concern of the First Amendment, which does not tolerate laws that cast a pall of orthodoxy over the classroom.” It’s shocking that Plattner, who served so long as perhaps the most senior social scientist involved in shaping federal human subjects policy, has such little understanding of the law and such little concern for academic freedom.
Of course many anthropologists have written critically of IRB interference with their research. In the same journal that features Plattner’s dismissal of academic freedom, Richard Shweder eloquently argues that “a great university will do things that are upsetting,” citing Socrates, rather than Hippocrates, as the best Greek model for social science. (“Protecting Human Subjects and Preserving Academic Freedom: Prospects at the University of Chicago,” American Ethnologist 33 [2006]: 507–518). Yet who best represents the discipline: Shweder or Plattner? How far a leap is it from the American Anthropological Association’s subordination of the search for knowledge to Plattner’s suggestion that the First Amendment does not apply to scholarly research? Can anthropologists fight for academic freedom while holding that research shouldn’t hurt?

4 comments:

Anonymous said...

Given that you state that historians have no ethical demands "to avoid harm or to abandon the pursuit of knowledge lest someone be hurt," could you elaborate on what kind of harm you, or oral historians in general, are comfortable causing to human subjects as a result of your research?

Or are you pointing out that the kind of research oral historians or social scientists do has no potential to cause real harm, so why worry about the consequences?

Zachary M. Schrag said...

Thanks for your question. The type of harm historians should accept is harm to reputation.

Let me give you an example. In the early 1970s, Harry Weese, the chief architect of the Washington Metro, opposed the addition of elevators to Metro stations, something demanded by people who used wheelchairs and their advocates. In his 1991 interview for the Chicago Architects Oral History Project (an absolutely superb project, by the way), Weese explained,

"what we don’t like is the government being a fall guy for the handicapped. The handicapped got very tough in Washington, slowed us down for two or three years and that added a huge amount of cost to upgrading the building. They wanted all kinds of fancy gadgets."

(Harry Mohr Weese, interview by Betty J. Blum, Chicago Architects Oral History Project, 1991)

The first question is whether this statement "could reasonably . . . be damaging to [Weese's] financial standing, employability, or reputation," the criterion for review under 45 CFR §46.101(b)(2)(ii). The first two seem unlikely, and there may be many people out there who applaud Weese's willingness to stand up to the handicapped. But the callousness of this statement certainly diminished my own admiration for Weese, and every time I have talked to audiences about the elevator debate, they side with the advocates for the handicapped. So let's stipulate that by printing this statement, Blum damaged Weese's reputation.

Good for her. Her job as a historian was to seek and report the truth, not to edit out any unpleasantness. Saying only nice things about other people makes for good manners but bad history.

Jeffrey Cohen said...

This discussion about harm is based on a misunderstanding of the principle of beneficence in the Belmont Report and in the regulations. The principle is not limited to "do no harm" but, rather, requires that the risks of the research be balanced by the benefits to come from the research. As the regulations state [45 CFR 46.111(a)(2)]: "Risks to subjects are reasonable in relation to anticipated benefits, if any, to subjects, and the importance of the knowledge that may reasonably be expected to result." Neither the Belmont Report nor the regulations require that research not harm subjects, only that any risks are justified by the potential benefits to come from the research. I can't imagine that any reputable scholar in any field would be willing to harm individuals for no good reason. Presumably, the benefits to be derived from Blum's work justify the damage to the respondent's reputation. If the respondent had inadvertently admitted during the interview that he beat his wife, would Blum have published that? I would guess not, because there would be no benefit to publishing that (although she might have felt compelled to report it to the appropriate officials to protect the wife). IRBs that impose an absolute "do no harm" requirement are misapplying both the Belmont Principles and the regulations.

Zachary M. Schrag said...

An oral historian following the Principles and Standards of the Oral History Association would not publish an inadvertent comment by Weese, whether it involved beating his wife or winning the Brunner Prize. The procedures specified by the standards are designed to make all comments as advertent as possible. This tracks the Belmont Report's call for respect for autonomy.

But this has nothing to do with beneficence, since it leaves narrators free to record comments that are damaging to themselves yet useless to society.

Dr. Cohen writes, "I can't imagine that any reputable scholar in any field would be willing to harm individuals for no good reason." If the search for truth alone is enough of a reason to justify harm to a narrator's reputation, then all oral history projects are justified, and we do not need IRBs to test for beneficence.

I challenge Dr. Cohen to provide examples of IRBs that have engaged in what the Belmont Report calls a "Systematic Assessment of Risks and Benefits" of oral history projects and come up with a meaningful result that reflects the ethics of oral history.