Showing posts with label Dreger.

Friday, June 19, 2015

Goffman's Tightrope

Two new articles add useful context to the debate about Alice Goffman’s On the Run. Together, they show just how narrow a path Goffman was walking between privacy and verifiability, and between scholarship and good writing. I will address the IRB issues in a separate post.


[Jesse Singal, “The Internet Accused Alice Goffman of Faking Details In Her Study of a Black Neighborhood. I Went to Philadelphia to Check,” Science of Us, June 18, 2015; Leon Neyfakh, “The Ethics of Ethnography,” Slate, June 18, 2015.]

Wednesday, May 6, 2015

OHRP Inaction Leaves IRBs Reliant on Gut Feelings

Theresa Defino’s Report on Research Compliance describes an April 14 webcast by Robert Klitzman about his new book, The Ethics Police.


[“Books, Bioethics Panel Say OHRP Inaction Weakens Protection System, Thwarts Trials,” Report on Research Compliance, May 2015.]


Most of the excerpts from Klitzman concern the way that OHRP silence hampers IRBs:


When they reach out to OHRP for support, IRB officials reported getting nowhere. In the chapter titled, “Federal Agencies vs. Local IRBs,” Klitzman wrote that one chair told him, “Many times when you call for advice, they essentially just read back the regulations.”


One recounted waiting two years to hear from OHRP on changes it had made. When federal officials respond, “they often refrain from doing so in writing, or say that the clarification does not apply more generally,” Klitzman was told.


Without assurance that they are acting correctly, IRBs act arbitrarily:


IRB chairs and members, according to Klitzman, “relied on gut feelings, intuition, the sniff test. People wanted to feel comfortable….They wanted peace of mind” about the studies they approved. Decisions were influenced by “pet peeves” and the “prudishness” of IRB members and chairs. Some IRBs are “user-friendly” or “pro-research,” he said.


In truth, such arbitrariness serves neither researchers nor research participants. Defino quotes Alice Dreger’s new book Galileo’s Middle Finger, which argues that “in practice, protections for people who become subjects of medical research may be their weakest in decades.”


The IRB system is premised on the notion that, at times, researchers and subjects have competing interests. Thanks to OHRP, they also have a common enemy.

Sunday, March 17, 2013

On Signing the Markingson Petition

By April 1942, the Pentagon was 40 percent over budget, partly because it had been enlarged since first approved, but mostly because the original estimate of $35 million had never been realistic. Lieutenant General Brehon Somervell delayed telling Congress, but in June he finally sent Colonel Leslie Groves to appear before a House Appropriations subcommittee.

Wednesday, February 20, 2013

Dreger Reviews Stark: It Is Lawyers All The Way Down

Alice Dreger reviews Laura Stark's Behind Closed Doors for the Journal of American History:

Contrary to the self-aggrandizing story bioethicists like to tell about how IRBs arose out of concern for human subjects of research, Stark shows that, when you dig into this history, it is lawyers all the way down . . . She argues that IRB work was decentralized not to make it more ethical, but to protect the NIH from lawsuits. Stark convincingly concludes that IRBs today do not primarily enact ethical principles; they manage procedures.

[Dreger, Alice. “Behind Closed Doors: IRBs and the Making of Ethical Research.” Journal of American History 99, no. 4 (March 2013): 1328. doi:10.1093/jahist/jas666.]

Monday, August 1, 2011

ANPRM's Problem Statement: Helpful but Incomplete

One of the many remarkable sections of the July 26 advance notice of proposed rulemaking (ANPRM) is its admission that the Common Rule is flawed.

(Note: I have added a link to the ANPRM at the top of the link list in the sidebar.)

Since the 1970s, IRB apologists have claimed that federal regulations are flexible enough, and that local IRBs are to blame for any problems. In 2007, for example, Jerry Menikoff quoted with approval Jeffrey Cohen's 2006 claim that "the regulations provide sufficient flexibility for the efficient and appropriate review of minimal risk research. IRB review of such research does not have to be burdensome or unreasonable if IRBs appropriately utilize the flexibility in the regulations." Menikoff reiterated his claim of "flexibility within the system" in his 2009 speech, "The Legal Assault on the Common Rule."

After thirty years of such claims, it is wonderfully refreshing that the ANPRM takes so seriously many of the critiques leveled at the federal regulations themselves. And the ANPRM helpfully organizes those critiques into seven general categories.

On the other hand, the ANPRM's problem statement (pages 44513-44514 in the Federal Register version) overlooks some major critiques. Fortunately, several of those critiques are implicitly recognized in the ANPRM's proposals.

Friday, July 29, 2011

Elliott Wants to Scrap IRBs

Carl Elliott, author of White Coat, Black Hat: Adventures on the Dark Side of Medicine, calls IRBs "incapable" and wants them replaced.

[Carl Elliott, "Useless Pharmaceutical Studies, Real Harm," New York Times, 29 July 2011.]

Tuesday, March 1, 2011

Who Should Investigate Research Misconduct?

Two recent items do not directly involve IRBs, but they raise broader issues of accountability for research misconduct.

[Erin O'Connor and Maurice Black, "Save Academic Freedom," Inside Higher Ed, 28 February 2011; Alice Dreger, "Darkness's Descent on the American Anthropological Association: A Cautionary Tale," Human Nature (published online 16 February 2011).]

Thursday, December 23, 2010

First, Do Some Harm, Part II: The AAA Ethics Task Force

In mid-October, the Ethics Task Force of the American Anthropological Association solicited comments on the following text, a section of a draft Code of Ethics now being written:


Do No Harm

Anthropologists share a primary ethical obligation to avoid doing harm to the lives, communities or environments they study or that may be impacted by their work. This includes not only the avoidance of direct and immediate harm but implies an obligation to weigh carefully the future consequences and impacts of an anthropologist’s work on others. This primary obligation can supersede the goal of seeking new knowledge and can lead to decisions not to undertake or to discontinue a project. Avoidance of harm is a primary ethical obligation, but determining harms and their avoidance in any given situation may be complex.

While anthropologists welcome work benefiting others or increasing the well-being of individuals or communities, determinations regarding what is in the best interests of others or what kinds of efforts are appropriate to increase well-being are complex and value-laden and should reflect sustained discussion with those concerned. Such work should reflect deliberate and thoughtful consideration of both potential unintended consequences and long-term impacts on individuals, communities, identities, tangible and intangible heritage and environments.


As of December 13, 33 people (presumably all anthropologists, but I'm not sure) had posted comments. The comments are often nuanced, making it hard to say whether they endorse the language or not. But they broke down roughly as follows:

Do No Harm



Significantly, the most wholehearted supporters of the "do no harm" proposal are those who uncritically embrace the Belmont Report and the Common Rule. "'Do no harm' is an IRB principle, and so it should be in our code," writes Bethe Hagens. Four other responses, from Chip Colwell-Chanthaphonh, mkline, Robert T Trotter II, and Simon Craddock Lee, all seem to suggest that the AAA code should conform to those documents, without asking much about their origins or their fit to the practices and beliefs of anthropologists.

Four other responses--from Barbara Rose Johnston, Seamus Decker, socect, and Vicki Ina F. Gloer--endorse Hagens's idea that anthropologists should "intend no harm." Despite the Belmont Report's description of "the Hippocratic maxim 'do no harm' [as] a fundamental principle of medical ethics," this form is more faithful to the Belmont's overall section on beneficence.

Do Some Harm



Eight responses--almost as many--appear to reject the "do no harm" idea on the grounds that neutrality is impossible, and anthropologists should not hesitate to harm those who deserve it. "A blanket edict to 'Do No Harm' could easily lead to a professional paralysis when one considers that a few steps away from the person giving you this interview is someone who will not like, will want or need to fight, or will suffer consequences for what is said much further down the line," writes Benjamin Wintersteen. Murray Leaf concurs. "Do no harm is fine as principle of medical practice," he writes, "where you are working with a single individual. It is nearly meaningless when you (we) work with human communities, in which what is good and what is harm is usually in contention. As some of these posts suggests, what we do is often a matter of helping some while undermining the position of others. No harm at all, in such a context, would almost always be also no help at all–and no effect at all."

Bryan Bruns offers an example. "[If] I work, in conjunction with communities and a government agency, to design and support a process in which communities are likely to, in a reasonably democratic way, act to restrain the behavior and thereby (harm) reduce the benefits of a few people (upstream irrigators, large landowners) who currently take advantage of others, it’s not clear how a principle of 'do no harm' would allow any practical engagement."

I would say that the responses by Dimitra Doukas, Joan P Mencher, Moish, Noelle Sullivan, and Ray Scupin all fall in this general category of respecting critical inquiry. Margaret Trawick's comment is harder to categorize. "I have been teaching 'Do no harm' to my students as the first ethical principle for anthropological fieldwork, for many years," she writes. "It is a difficult principle to follow, precisely because you never know what might cause harm, and therefore you have to THINK about what you are doing in the field more carefully than you might in everyday life. Good intentions are not enough. Additionally, 'harm to whom' is a good question . . . Sometimes to protect and advocate for one party (e.g., Untouchables in India) is to, at the least, offend some other party – e.g. high caste Hindus." Given her understanding of this problem, I'm not sure why she teaches "do no harm" rather than something like "think about whom you are harming."

It's the Wrong Question



An even greater number of responses suggest that, in the words of Carl Kendall, "This principle is way too vague and self-directed to be practically useful." Kendall hints, perhaps cynically, that anthropologists need one set of ethical principles to "pass IRB muster" and a second set "to protect communities and fieldworkers." Carolyn Fluehr-Lobban argues that "'Harm' should be problematized—are there agreed upon universal standards of harm, and where is there discussion of reasonable disagreement."

James Dow rejects the medical language of IRBs: "'Do no harm' is a good ethical principle to be applied to individual social relationships, which we hope that we understand; however, there is a problem when applying it to larger societies and cultures." Likewise, David Samuels writes that "The place where you need to get informed consent is at the point at which you have turned people into characters in your story. The medicalized pre-framing of the IRB process doesn’t cover that at all."

Taken as a whole, the responses suggest that only a minority of those commenting embrace the Belmont Report and the IRB process as enthusiastically as the AAA did in its 2004 statement, which presents the active involvement of IRBs as a positive good. I hope the Task Force recognizes this and takes the opportunity to reconsider the AAA's overall position on IRB review.

[Hat tip to Alice Dreger. For a historical perspective on another discipline's efforts to craft a research ethics code, see Laura Stark, "The Science of Ethics: Deception, the Resilient Self, and the APA Code of Ethics, 1966–1973," Journal of the History of the Behavioral Sciences 46 (Fall 2010): 337–370.]

Tuesday, October 26, 2010

Dreger Wants to Scrap IRBs

On the heels of Laura Stark's Los Angeles Times op-ed calling for the replacement of local IRBs with centralized boards of experts, historian Alice Dreger has published her own call for a national system of ethics review based on expertise and transparency.

[Alice Dreger, "Nationalizing IRBs for Biomedical Research – and for Justice," Bioethics Forum, 22 October 2010.]

Troubled by her IRB's approval of a project she considers unethical, and by Carl Elliott's White Coat, Black Hat: Adventures on the Dark Side of Medicine, Dreger concludes that the system of local review is ineffective:


We’ve reached the point where many people in medicine and medical ethics don’t even expect IRBs to act as something other than liability shields for their universities. But do patients who come to us only to be turned into subjects know that? Do they know that there is literally a price on their heads put there by research recruiters?

I’ve come to believe we need a radical solution. Maybe what we need is a nationalized system of IRBs for biomedical research, one that operates on the model of circuit courts, so that relationships cannot easily develop between the IRBs and the people seeking approval. This system could be run out of the Office for Human Research Protections and involve districts, similar to the federal courts system. Deliberations would be made transparent, so that all interested parties could understand (and question) decisions being made.

Think of the advantages: the possibility of actually focusing on the protection of human subjects first and foremost, free of conflicts of interest; the possibility of having nothing but trained professionals (not rotating unqualified faculty and staff) sitting on review panels; the possibility of marking biomedical research as clearly different from the social science and educational research unreasonably managed by many IRBs; the possibility of much greater transparency to those interested in seeing what’s going on; the possibility of having multi-center trials obtain a single approval from one centralized IRB, rather than trying to manage approvals from multiple local institutions. And the possibility of shutting down the deeply opaque, highly questionable private IRBs Elliott describes as being increasingly used by universities. (Go ahead, call me a Communist for caring about the Common Rule.)


Her Communist leanings aside, I don't know why Dreger presents her argument as a defense of the Common Rule, which fails to distinguish between biomedical and social research, puts ethics review in the hands of rotating unqualified faculty and staff, and keeps deliberations opaque. But her wish for the kind of coordination and transparency provided by the court system has a long lineage. I've quoted it before, and I'll quote it again:


The review committees work in isolation from one another, and no mechanisms have been established for disseminating whatever knowledge is gained from their individual experiences. Thus, each committee is condemned to repeat the process of finding its own answers. This is not only an overwhelming, unnecessary and unproductive assignment, but also one which most review committees are neither prepared nor willing to assume.

[Jay Katz, testimony, U.S. Senate, Quality of Health Care—Human Experimentation, 1973: Hearings before the Subcommittee on Health of the Committee on Labor and Public Welfare, Part 3 (93d Cong., 1st sess., 1973), 1050].


It is not lack of good intentions or hard work that leads IRBs to restrict ethically sound surveys while permitting unethical experimental surgery. It is the ignorance and isolation identified by Katz in 1973 and still in place today.

Monday, October 6, 2008

A Conscientious Objector

In a column in the Hastings Center's Bioethics Forum, historian and ethicist Alice Dreger explains why she declines to submit oral history proposals to IRBs:


To remain “unprotected” by my university’s IRB system—to remain vulnerable—is to remain highly aware of my obligations to those I interview for my work. Without the supposed “protection” of my IRB, I am aware of how, if I hurt my interviewees, they might well want to hurt me back. At some level, I think it best for my subjects that I keep my kneecaps exposed.


Compare this stance to the position put forward by Charles Bosk in 2004:


Prospective review strikes me as generally one more inane bureaucratic requirement in one more bureaucratic set of procedures, ill-suited to accomplish the goals that it is intended to serve. Prospective review, flawed a process as it is, does not strike me as one social scientists should resist.


Who takes research ethics more seriously: the researcher who submits to inane requirements, or the researcher who resists?

For more on Dreger's work, see The Psychologist Who Would Be Journalist.

Monday, June 30, 2008

The Psychologist Who Would Be Journalist

Back in August 2007, I mentioned the controversy surrounding the book The Man Who Would Be Queen (Washington: Joseph Henry Press, 2003) by J. Michael Bailey, Professor of Psychology, Northwestern University. At the time, Professor Alice Domurat Dreger, also of Northwestern, had just posted a draft article on the controversy. Now that article, along with twenty-three commentaries and a reply from Dreger, has appeared in the June 2008 issue of the Archives of Sexual Behavior.

Dreger's article, the commentaries, and Dreger's response focus on big questions about the nature of transsexuality, the definitions of science, power relationships in research, and the ground rules of scholarly debates. Only a handful take up the smaller question of whether—as a matter of law and as a matter of ethics--Bailey should have sought IRB approval prior to writing his book. But that's the question that falls within the scope of this blog.