Blogger's note: In March the Harvard Crimson mentioned an unpublished essay by ethnographer Scott Atran of the University of Michigan, detailing his complaints about the IRB process. With Dr. Atran's kind permission, I present the complete essay here. ZMS
Scott Atran, research director in anthropology at the National Center for Scientific Research in Paris, visiting professor of Psychology and Public Policy at the University of Michigan, and presidential scholar in sociology at the John Jay College of Criminal Justice in New York City.*
Many bemoan the academic community's lack of input and influence in shaping society's understanding and actions regarding one of the most pressing issues of our time – terrorism.
On one side, those close to the "war on terrorism" often argue that academics are stuck in a post-Vietnam syndrome, basically out to lunch on the "hard" issues of helping their own society defend against aggressors or build enduring peace, or even acknowledging that there may be fundamental clashes between different cultural values that underlie and sustain conflict. In an article in The National Interest, "Thinking Outside the Tank," senior Rand analyst Steve Simon and counter-terrorism expert Jonathan Stevenson surmised that "scholars are now farther than ever from furnishing creative analytical support to policymakers," and recommended that academics should be left to their irrelevance.
On the other side, many in academia fear that their intellectual interests, integrity and independence could again be corrupted, as in the McCarthy and Vietnam years, and as in societies where governments have dictated what those interests should be, such as Nazi Germany, Soviet Russia, Saddam's Iraq, or many of our current allies in the Muslim world. Often they believe that the U.S. administration's motives in "the war against terrorism" are dishonest and that its effects disastrously corrode civil society and international relationships. They believe that academics should stay above the fray, but remain critically-minded.
Both sides have valid points, but also tend to exaggerate the naiveté or malevolence of the other side. That's not unique to these groups; it's pretty much a characteristic of adversarial groups anywhere, anytime. Research by Stanford social psychologist Lee Ross and others shows that individuals tend to misperceive differences between groups as more extreme than they really are, and also that individuals believe that their own group sees the world more objectively than other groups.
But a strong impediment to academic involvement in the great social issue of where terrorism comes from and what to do about it may not be from any lack of will or interest, but from the narrow vision of a relatively young but increasingly powerful bureaucratic institution that rigidly rules university research – the Institutional Review Board (IRB).
The roots of the IRB go back to the Nuremberg trials, where recognition of the need for guidelines dealing with human subjects in research emerged following disclosure of medical experimentation abuses by Nazi doctors. But the three guiding principles for IRBs were only formally codified in the 1979 "Belmont Report" by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (established by Public Law 93-348): Beneficence (to maximize benefits for science, humanity, and research participants and to avoid or minimize risk or harm), Respect (to protect the autonomy and privacy rights of participants), and Justice (to ensure the fair distribution among persons and groups of the costs and benefits of research). At one of my home institutions, the University of Michigan, you can't even submit research for IRB approval until you take a test showing that you've memorized these principles.
Despite these good intentions, almost anyone who has tried to do research with human subjects outside of standard contexts has horror stories to tell about IRB demands to fit research that explores unfamiliar settings into a structurally familiar mode, no matter how arbitrary or uncomfortable the fit for subjects or researchers. What's a standard context? In non-clinical psychology, for example, well over 90 percent of studies are done within a few miles of major research universities, typically with undergraduates, and then usually generalized to Homo sapiens without a blink.
Many complaints, my own included, often have more to do with conflicts over procedures and protocols than over the nature and value of the research itself. For example, it's standard procedure to have subjects sign statements of informed consent acknowledging that the researcher has told them clearly and understandably what the study is about, making provisions to safeguard subjects from physical or psychological harm or discomfort, clarifying issues of how data will be accessed and used, spelling out costs and payments, and so on. IRBs do genuinely care about protecting the people who are being studied from being exploited or coming to harm. And so do I; I've used my research to help set up a forest reserve for the last Lowland Maya of northern Guatemala and to provide backdoor channels for Middle East truce negotiations. But it took me months to convince the University of Michigan IRB that Maya who can't read or write can't sign consent forms, that if they could they wouldn't (these Maya once put their "X" on a piece of paper they couldn't read, signing away most of their land), and that collective payment to subjects in the form of supplies for the community's forest reserve was a better idea than individual payments that would cause jealousies.
On terrorism, though, my row with IRB is different. There are very, very few scholars who directly talk to terrorists or those who inspire and care for them, although there's no end to elaborate theories and voluminous books on the subject. But I'm an anthropologist who believes in the principle, first spelled out by Isaac Newton in a letter to Nathaniel Hawes, that: "If, instead of sending the observations of able seamen to able mathematicians on land, the land would send able mathematicians to sea, it would signify much more to the improvement of navigation and the safety of men's lives and estates on that element."
On the basis of this principle, joined to solid research proposals, the National Science Foundation and Department of Defense independently granted a considerable sum of U.S. taxpayer money to our cross-disciplinary, multi-university, international team (including researchers from Michigan, MIT, Harvard, Northwestern, Germany, France, Israel, Palestine and Indonesia) to interview jihadis in different settings, and run experiments with them on a range of theoretically interesting issues, including how group dynamics can trump individual personality in motivating suicide bombers, and how sacred values can limit rational choice with cultural taboos that block tradeoffs and kill attempts at compromise and negotiation. For example, our society considers that selling children or selling out one's country is immoral and sociopathic; many Native Americans believe that uninhabited burial grounds ought not be violated no matter what the majority voters decide or what material compensation is offered. But what's sacred and non-negotiable for jihadis, and what policy implications follow?
To date, the UM IRB's arguments are these: You can't interview failed suicide bombers or their sponsors who are prisoners, because prisoners cannot, in principle, freely give informed consent. Thus, the IRB stipulated that parole boards must be kept abreast of everything and lawyers had to come along into the jail cells of the Bali bombers to verify that questions weren't potentially prejudicial to the prisoners. But it's nigh impossible to do serious anthropology or psychology with a lawyer present, and there are no parole boards in Indonesian military prisons. Nor is there any reasonable likelihood that this sort of research will worsen the condition of convicted mass killers. Even if the IRB's conditions could be met, UM's IRB told me that it considers research with prisoners like these to be "in principle" incompatible with the rights of human subjects and therefore "never" likely to be approved. This, despite the fact that the prisoners and the organizations that originally sponsored their actions are more than willing to give their consent and eager to get their ideas out.
So what about interviewing freely operating jihadis and would-be suicide bombers? Initially, the IRB decided that federal funds could not be used, despite well-accepted guarantees of complete anonymity (no recording of names, physical characteristics, places, settings, etc.), because subjects might inadvertently reveal operational plans that could put them in jeopardy. Although any statement of consent would expressly instruct subjects not to talk about operations, the IRB's argument was that government intelligence services, or others, might find out about the interviews and use them to identify and act against subjects. I do believe it is reasonable for the IRB to require a researcher not to ask about current or future operations, because such information could put the interviewer in an impossible ethical bind over whether to inform authorities so that victims' lives might be spared. But it seems unreasonable to prevent research wherever avoidance of ethical dilemmas cannot be foolproof.
As it turns out, in August 2005 I did inadvertently find out about the formation of a rogue Jemaah Islamiyah suicide squad, thoifah moqatilah (fighting group), and vague plans to attack western targets, possibly tourist spots in Bali again. And I did report this to the U.S. Senate Foreign Affairs staff at a briefing on my research in September 2005, shortly before the October Bali bombing. But my receiving this information, and the moral obligation to report it, did not seriously compromise the health or welfare of the suicide bombers who carried out the attack or those who sponsored them. And if it had? If the suicide bombers had been stopped because of the information I inadvertently obtained, then by the UM IRB's moral logic I would have been ethically remiss by disrespectfully violating the bombers' wishes in helping to save their lives and the lives of their intended victims.
Other IRBs, it is true, might have balanced this ostensible ethical lapse on Respect with the values of Beneficence and Justice. But I also apparently failed to make a sufficient case for the "costs and benefits" of the research to the university and society, though I pointed out that my interviews with radical Islamist leaders resulted in fruitful contacts during a crucial Middle East ceasefire negotiation and that any lives saved should count as a net benefit of the research. More generally, helping to understand why someone would want to blow up Manhattan, London, Tel Aviv or Jakarta could help to stop Manhattan, London, Tel Aviv or Jakarta from getting blown up, and that, too, would be a pretty good benefit. But that argument was not, it appears, strong or clear enough.
After many months, the IRB decided to release emergency funds that were specifically awarded by the National Science Foundation for "high risk research" to do pilot interviews with freely operating jihadis, but with two caveats: no group identifications should be registered (this forbids comparing, say, Jemaah Islamiyah to Hamas, or to any other group in the world, which puts a serious constraint on a project aimed at comparative understanding of jihadi groups); and no personal details should be collected (this rules out asking what personally may have motivated someone to join jihad, which puts a serious constraint on understanding what motivates individual jihadis). In a penultimate round of discussions, the IRB seemed to have accepted the argument that labels like "Group A" versus "Group B" were permissible and that general descriptions of motivations without any details that could identify persons, places or events were allowed.
In the end, UM's IRB decided that the permission it had given me to carry out the truncated emergency research could not be pursued further, even on matters that had been previously approved, and against which no new objections had been raised. Although initial research results were tentative, as in almost any research project, the research itself could not reasonably have been judged shoddy or trivial: preliminary results of the research were published in reputable scientific and public outlets, including articles in Science and Nature magazines, The Washington Quarterly and Foreign Policy, and The New York Times and Wall Street Journal.
Because the IRB has dictatorial powers, with no right of representation by those being judged and no right of appeal to any higher authority in the university – or for that matter, in the land – it seemed the research was dead. It turns out, however, that after intense lobbying by several faculty and university administrators, the IRB did ultimately give temporary dry approval (approval to plan research only, with no involvement of human subjects) for one of the research projects, although the IRB declined to do so for the identical design on another project. But dry approval only means that money may be spent for travel and meetings with colleagues to discuss the results of previous research and to plan for future research, but not to have students or anyone else undertake research, or even to analyze or do any work with the results of previous research. For those who may think that sounds crazy, the IRB has a stunning reply: because previously collected data or previous findings related to human subjects may have implications for those subjects, or other humans, that are different from the implications originally foreseen, any proposal for the analysis of secondary data that does not directly involve human subjects must be treated as a new proposal involving human subjects. This is a chilling constraint that has the potential to stop a research program dead in its tracks, in the face of any politically correct wind, no matter how advanced that research or how distant from any living or dead human subjects.
My own view is that most of this is nuts: how is anybody in academia ever going to have as much as possible to offer in this whole mess – though people in academia keep complaining that the government doesn't pay attention to serious scholars – if no one can even talk to the people most involved? Now, it's getting next to impossible to even talk to people who are dying to kill in order to better understand why they die to kill, or just why they want what they want. And, of course, IRB expressly forbids even thinking about trying to stop them from actually doing what they want because that could interfere with their rights. "Don't ask, don't tell" isn't enough – IRB wants guarantees that the opportunity for discovery can never arise.
Ask any anthropologist, political mediator, or hostage negotiator worth their salt and they'll tell you that you need to show empathy and respect towards the other party to learn anything of true or lasting value. If that's what the IRB required it would be good and right. But that isn't what IRB asks for: IRB rules say nothing about promoting empathy for subjects, only about following to the letter rules that are tailored to respect subjects' individual rights to their own privacy and property as if the research were in an American university classroom or laboratory.
I'd like to be clear that I don't simply blame the members of UM's IRB. I think the major fault lies with the IRB as an institution and the rules it is required to implement, rather than the people enforcing the rules. Perhaps one remedy is that certain kinds of IRB approvals should be taken at the national rather than the university level, though institutionally protected from the political riptides of the electoral cycle. The advantage of a national board is that its sponsors could be government agencies whose interests focus more on their mission (such as national security) than on protection of undergraduate students. A national board (or boards) could then use guidelines that would differ from those designed to protect the interests of typical subjects. There are various ways to define the domain of a national board, such as "prisoners and those hiding from the law." Alternatively, it could be defined as using subjects who are relevant to national security (under a very narrow formulation). One would have to decide whether, for example, studies of urban gangs in the USA should be covered or not by a national board.
There are, to be sure, also serious disadvantages to a national IRB, including the potential for pressure from the sponsors to say "anything goes." So, another remedy might be to change the guidelines used by university IRBs that would apply to special circumstances, such as working with violent militants, where even requiring respect may be tricky. Perhaps most important, all the boards should understand and evaluate the facts to some definable standard and apply the same values, unless there were defensible differences in community standards. Lack of inter-board reliability is a guarantee of lack of validity in judgment of facts and in judgment of values.
Over three years ago, in testimony before the House Science Committee, Dr. M.R.C. Greenwood, Chancellor of the University of California, Santa Cruz, argued that "the traditions and structure of research in the U.S. today depends on replication and refutation, which means… sufficient data and methods," and that "balancing the perceived risks of open access with the risks to the health and vitality of the research community is exactly the kind of issue that calls for a new partnership between the research community and the government." That partnership is woefully lacking when it comes to dealing with terrorism, in part because universities and the government have chained themselves to an institution that not only never fathomed dealing with suicide bombers – true not just for IRBs – but which lacks the flexibility and imagination to face the problem. Yet suicide bombers are here; they've burst upon the world and, along with their sponsors and supporters, are changing how societies seek security and interact. This needs to be looked at up close. So IRBs, let the scholars go out to sea.
*I wish to thank Robert and Amy Axelrod, Richard Nisbett, Douglas Medin, Steven Pinker, Baruch Fischhoff and Charles Strozier for suggestions on an earlier draft. They bear no responsibility for any arguments presented here.