Showing posts with label training.

Monday, May 23, 2016

CITI Program is not unique in its mortifying stupidity

Writing in Slate, L. V. Anderson condemns simplistic online training programs that are supposed to encourage regulatory compliance, but really just suck up time and money without improving behavior.


[L. V. Anderson, “Ethics Trainings Are Even Dumber Than You Think,” Slate, May 19, 2016.]


Anderson writes,


Regulators, managers, and employees are caught in a vicious cycle. Regulators pressure companies to implement training programs in hopes of reducing corporate crime and malfeasance. Executives implement training programs in hopes of protecting themselves against lawsuits and prosecution. Employees see through executives’ motivations and ignore, or even rebel against, the lessons of the trainings.

Although there’s not much research one way or the other, the online nature of compliance courses probably exacerbates this vicious cycle.


Anderson does not specifically mention the mortifyingly stupid CITI Program and its cousins in the IRB world, but everything she says applies to them.

Wednesday, December 5, 2012

NIH Policy Makes Interviewing Children Easier

Susan Ridgely, assistant professor of religious studies at the University of Wisconsin at Oshkosh, finds that IRBs can cause trouble for qualitative researchers who want to talk with children, but that IRB review has some benefits. Moreover, since the NIH started calling for children to be included in medical studies, she is finding it easier to get IRB permission to speak to children.

[Susan B. Ridgely, “Doing Ethnography with Child Consultants: Making the IRB Process Work.” Journal of American Folklore 125, no. 3 (2012): 474–485.]

Friday, September 21, 2012

U of Illinois Launches Ethics CORE

The University of Illinois at Urbana-Champaign has officially launched Ethics CORE (NationalEthicsCenter.org), an online resource for education in research ethics.

Friday, July 20, 2012

Geographer: Unnecessary IRB Delay Threatens NSF Grants

In the fifth and final Professional Geographer essay, Scott M. Freundschuh, Professor of Geography at the University of New Mexico, notes that many IRBs "unnecessarily require research protocols to be reviewed by the full IRB, therefore impeding the progress of research projects." Rather than suggesting structural changes to the IRB system, he counsels geographers to work within existing rules.

[Scott M. Freundschuh, "Institutional Review for Research in the Social Sciences from the Federal Perspective," Professional Geographer 64, no. 1 (2012): 43-48, DOI:10.1080/00330124.2011.596791]

Friday, May 11, 2012

Community Researchers Flee the CITI

An article in the latest issue of the Journal of Empirical Research on Human Research Ethics finds that standard research ethics training programs--specifically the mortifyingly stupid CITI Program--are inappropriate for Community-Engaged Research (CEnR).

[Anderson, E., S. Solomon, E. Heitman, J. Dubois, C. Fisher, R. Kost, M. Lawless, et al. "Research Ethics Education for Community-Engaged Research: A Review and Research Agenda." Journal of Empirical Research on Human Research Ethics 7, no. 2 (April 2012): 3, DOI: 10.1525/jer.2012.7.2.3]

Friday, August 12, 2011

CITI Program as Mind-Numbing, Coercive, Counterproductive McEthics

Sanjay Srivastava of The Hardest Science kindly alerted me to a newly published critique of the mortifyingly stupid CITI Program.

[Jennifer J. Freyd, "Journal Vitality, Intellectual Integrity, and the Problems of McEthics," Journal of Trauma & Dissociation, available online: 15 July 2011, DOI:10.1080/15299732.2011.602290]

Thursday, July 21, 2011

U of Michigan Reports Some Progress

The University of Michigan has released the results of a 2009 survey of investigator experiences in human research. The survey suggests that matters have improved somewhat since the university launched its HRPP Policy Innovation and Demonstration Initiative in 2007, but that more work remains to be done.

[Survey Research Center, Institute for Social Research, University of Michigan, "2009 Follow-Up Survey of Investigator Experiences in Human Research," December 2010. h/t: Human Research Protections Blog.]

Saturday, June 4, 2011

The CITI Program as Mortifyingly Stupid, Marxist Doxology

The Presidential Commission for the Study of Bioethical Issues has posted videos and transcripts of its Meeting Five, held May 18 and 19 in New York City. I earlier linked to the Commission's summary of the statement by Ronald Bayer, professor and co-chair of the Center for the History and Ethics of Public Health at the Mailman School of Public Health at Columbia University. Now that we have the verbatim text, it is worth quoting as well.

Overall, Bayer lamented that the IRB system has "turned itself into an object of ridicule and sometimes contempt in a way that I think is dangerous to those who believe in the ethical conduct of research."

Particularly choice is Bayer's description of the CITI Program, a widely used online training course in research ethics, which Columbia University requires researchers to complete every three years.

Tuesday, April 26, 2011

University of Iowa: Ask IRB Before Researching Neanderthals

Someone at the University of Iowa apparently thinks that the IRB has jurisdiction over research with dead Neanderthals.

Friday, April 8, 2011

Princeton Offers PhD Students Serious Training in Historians' Ethics

Google alerted me to an innovative effort to train historians in the responsible conduct of research.

[Angela Creager and John Haldon, "Responsible Conduct of Research Workshop, June 14-15, 2010," Princeton University.]

Friday, November 26, 2010

Survey: One-Third of UConn Researchers Dislike CITI Program

A 2007 survey of researchers at the University of Connecticut found that more than one third were dissatisfied with the Collaborative Institutional Training Initiative (CITI) program in human subjects research.

The UConn IRB and Office of Research Compliance offered the survey to about 350 researchers, of whom 114 (33 percent) returned it. Part of the survey asked respondents about the CITI Program:


7 Questions asked respondents to rate different aspects of the CITI course on a scale of 1-7 (1=least, 7=most). 4 out of these 7 questions asked if the CITI course increased understanding of risks and protections for human subjects in research. There were no statistical differences in the answers received on this group of 4 questions.

53% rated this group 5 or above
16% rated this group 4, moderate
31% rated this group 3 or below

Similar rates were received for overall satisfaction with the CITI course:

54% rated it 5 or above
9% rated it 4, moderate
37% rated it 3 or below

The course did appear to have an impact on the respondent's understanding of the Federal Regulations. On this criteria,

72% rated it 5 or above
4% rated it 4, moderate
24% rated it 3 or below

The course had a negative impact on the respondents' willingness to join an IRB:

29% rated it 5 or above
13% rated it 4, moderate
58% rated it 3 or below


These figures suggest wider dissatisfaction with CITI than one of its founders, Paul Braunschweiger, admitted in a 2006 presentation. That presentation (slide 60) reported that principal investigators gave the program an average of about 7.8 on a 10 point scale on overall satisfaction. Though the presentation did not show the distribution of researchers' responses, it would be difficult to get so high a mean if 37 percent of researchers offered negative assessments. We need more data.
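A rough back-of-the-envelope check illustrates the tension. Suppose, purely for illustration, that the 37 percent who gave unfavorable ratings would have averaged around 3 on the 10-point scale used in Braunschweiger's presentation. For the overall mean to reach 7.8, the remaining 63 percent would then have to average (7.8 − 0.37 × 3) / 0.63 ≈ 10.6, which is above the maximum possible score of 10. The two surveys used different scales and populations, so the comparison is only suggestive, but the figures are hard to reconcile.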

The UConn survey also offered researchers the chance to write open-ended comments. The most common suggestions were that the training should be shorter, and that the course content "should be limited to a researcher's area of research." Researchers were happy with the online form of the course, with 74 asking for no change, and only 12 choosing the next most popular option: video instruction.

All of these results suggest the potential for online courses that are shorter than CITI and targeted to a specific research discipline, such as Macquarie University's Human Research Ethics for the Social Sciences and Humanities.

UConn also surveyed researchers on their views of the UConn IRB. But the university has only reported the mean ratings, not the distribution of responses, so it is impossible to say if the IRB earned as many unsatisfactory grades as did the CITI program.

Friday, July 16, 2010

Librarian Urges Cooperation with IRBs

Maura Smale, information literacy librarian at the New York City College of Technology, suggests that librarians "embrace research involving human subjects" and seek IRB approval to do so.

[Maura A. Smale, "Demystifying the IRB: Human Subjects Research in Academic Libraries," portal: Libraries and the Academy 10 (July 2010): 309-321, DOI: 10.1353/pla.0.0114]

Smale notes that librarians can interact with IRBs in two ways. First, they can serve as IRB members or consultants, helping researchers and reviewers inform themselves about a proposal. Better library research, she suggests, could have prevented the 2001 death of Ellen Roche, a volunteer in a Johns Hopkins University asthma study. Smale could also have mentioned that better library research might prevent unreasonable IRB demands.

Second, librarians can act as researchers. Smale offers as examples two of her own studies of student and faculty users of her library. She found value in the approval process:


While it was a lengthy and labor-intensive process, obtaining IRB approval was an experience with real value, not simply a bureaucratic hurdle to overcome. Applying to the IRB required us to think deeply and critically about the goals for our research project while still in the early planning stages of the study; navigating the IRB approval process helped us make our research project both stronger and more relevant. Additionally, because we created all of our materials for the IRB application, we were ready to get started on our project as soon as the IRB approval came through, which saved us time at the beginning of our study. (317)


Smale does note that approval took five months, leading the skeptic to ask whether the same deep thinking could have been achieved in less time by another form of review.

Most of Smale's article is less of an argument than an introduction to IRBs for librarians new to the concept (309). While it serves reasonably well for this purpose, the article unfortunately includes some factual errors that deserve correction:


  • "Any study involving human subjects that meets the definition of research in the Belmont Report requires review by the IRB." (312) In fact, the Belmont Report has no legal force, and it is the definition of research in 45 CFR 46 that determines the need for IRB review. That this definition does not match the definition in the Belmont Report suggests the imprecision of the work of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. (More on this in Ethical Imperialism.)


  • "There are three levels of IRB review—exempt, expedited, and full. The IRB evaluates each research project and determines the level of review required; researchers may not make this determination on their own." (312) Exempt means exempt; it is not a level of IRB review. The regulations do not forbid researchers from making the exempt determination. And not even OHRP's recommendations insist that an IRB be involved in that determination.


  • "Certain types of studies automatically meet the criteria for exemption set forth in the Common Rule, including research on 'normal educational practices' such as curriculum design, instruction, and assessment. Research involving use of previously collected data is also usually exempt. In both cases the subjects' anonymity must be preserved." (313) The "normal educational practices" exemption, 45 CFR 46.101(b)(1), imposes no requirement of anonymity. The existing data exemption, 45 CFR 46.101(b)(4), does not require anonymity if the data are publicly available.


  • "Library research projects that include procedures in which the researcher is in direct contact with the subject will usually be required to undergo expedited review by the IRB." (315) Perhaps this is the practice at Smale's institution, but the regulations exempt this kind of research unless "any disclosure of the human subjects' responses outside the research could reasonably place the subjects at risk of criminal or civil liability or be damaging to the subjects' financial standing, employability, or reputation." [45 CFR 46.101(b)(2)]. This would not seem the case in the kind of research Smale proposes concerning "the use of space in the library" or "collaboration between the library and the campus writing center." (318)


  • "It is worth noting that the underlying principles used by the IRB to evaluate projects involve ethical treatment of subjects and preservation of privacy and are similar to the recommendations of many discipline-specific professional organizations, including the Oral History Association and the American Anthropological Association." (316). For over a decade, the Oral History Association has been fighting IRB requirements and insisting on the differences between the ethics of medical research and the ethics of oral history. Smale does cite the CITI Program in support of this assertion, but she fails to notice that the CITI Program offers no support for its statement.
  • [See comments for a correction.]


I am grateful to Smale for sharing her experience and for her kind citations to this blog and to my scholarship. But I fear that she has too readily accepted the claims of IRB administrators and training programs, leading her to advise librarians to tolerate months of delay when they should be demanding swift exemption.

Friday, November 6, 2009

Former IRB Chair Decries Inconsistency

Jim Vander Putten, Associate Professor of Higher Education at the University of Arkansas-Little Rock, kindly alerted me to his essay, "Wanted: Consistency in Social and Behavioral Science Institutional Review Board Practices," Teachers College Record, 14 September 2009.

Vander Putten, who chaired his university's IRB for six years, complains that IRBs fail to make decisions consistently. He accuses them of both under- and over-protection, and then offers two suggestions for reform.

Friday, June 5, 2009

Lisa Wynn's Words and Pictures

In April I commented on the ethics training program for ethnographers developed by Lisa Wynn of Macquarie University with some colleagues.

At Culture Matters, the blog of the Macquarie anthropology department, Wynn described the ideas that led her to develop the program.

Now, at Material World, a blog hosted by the anthropology departments of University College London and New York University, Wynn describes another aspect of the training program: the pictures.

Wynn explains that along with its medical-centered ethics and jargon-laden text, the standard NIH ethics training program suffers from clip art in which people are depicted as faceless cartoons--probably not the best way to get researchers thinking about others as autonomous individuals. So for her program, Wynn offers pictures of real researchers and research participants, from Laud Humphreys to Afghan school administrators.

Gathering these photos--about a hundred in all--wasn't easy, but they contribute meaningfully to the warmth and depth of the site. And it put Wynn in touch with some prominent scholars.

[Side note: Professor John Stilgoe tells his students that it's rare to have enough photos of yourself at work. That's a good admonition; you never know when someone will want to show you doing controversial research.]

In another posting on Culture Matters, Wynn describes her continuing research on research ethics. She notes that ethics-committee oversight of ethnography is a relatively recent phenomenon. While it was debated as early as the mid-1960s, only in the 1990s did it become widespread. Thus, in studying the effect of ethics committees,

We’ve got a perfect “natural” control: an older generation of researchers who spent most of their careers not seeking ethics clearance, a younger generation for whom it is standard operating procedure, and a “middle-aged” group of researchers like myself who started their research under one regime and now live under another (I swear, this is the first time I’ve thought of myself as middle-aged). By correlating responses with different regulatory regimes, we can ask questions like: do researchers who never got ethics clearance have different ideas about what is ethical than researchers who go through ethics review? Does one group consider itself more or less ethical than the other? Or do they feel like ethics oversight hasn’t made any difference to their research practice?


Wynn plans to contact scholars in Australia and the United States to see how the spread of ethics review affected ideas about research ethics. I'm quite excited by this work; in fact, I plan to publish it in a special issue of the Journal of Policy History I am editing on the general topic of the history of human research ethics regulation.

How many pictures should I demand?

Friday, April 17, 2009

Macquarie's Innovative Ethics Training

In previous posts and my 2007 essay, "Ethical Training for Oral Historians," I have complained about standardized, medicine-centric ethics training systems like the CITI Program and called for training programs better tailored to individual disciplines.

Lisa Wynn of Macquarie University (also known as MQ) has alerted me to just such a program she created with Paul H. Mason and Kristina Everett. The online module, Human Research Ethics for the Social Sciences and Humanities, has some elements that I find inappropriate. Overall, however, it is vastly superior to the CITI Program and comparable ethics programs I have seen, and it deserves attention and emulation.

Friday, April 10, 2009

Training Day

Peter Klein of the Organization and Markets blog offers a sad account of what it takes for a University of Missouri economist to gain permission to interview entrepreneurs or hand out surveys to corporate executives. Like many scholars across the country, he was directed to an online training system, which demanded that he provide correct answers to questions like the following:


32. The investigator is a 1/8th V.A. employee. She proposes to recruit MU outpatients into a study conducted exclusively at MU facilities. Which of the following groups must approve the research project before participants can be enrolled?

* The MU Health Sciences Center IRB
* The V.A. Research and Development Committee
* Both a. and b.
* Neither a. nor b.


While such knowledge may be of critical importance to health researchers at Missouri, it is irrelevant to social scientists not doing medical work. The lesson Klein takes away from such an experience is not that he must be sure to obey laws and ethics standards while doing his research, but that his campus IRB administrators do not respect him enough to provide relevant ethical training.

Administrators take note: you are making fools of yourselves, and earning your faculty's contempt.

See Comments Oppose New Regulations on Training.

Friday, February 13, 2009

AAUP's Rhoades Takes Soft Line on IRB Training

In an essay on compulsory sexual harassment training ("Sexual Harassment and Group Punishment," Inside Higher Ed, 12 February 2009), the new AAUP general secretary, Gary Rhoades, offers side comments on human subjects research training:


In research universities (where professors’ work routinely involves human subjects, though even there literary and some other scholars are not required to undergo such training), perhaps the most obvious example of this is the human subjects training surrounding research grants and activity. Prior to getting grants approved by the sponsored projects division of a university, an investigator must have undergone human subjects training. Although the training varies by university, there are common patterns nationally. Typically, for example, such training is online, and is not particularly rigorous, to put it mildly. Indeed, the format involves investigators taking an exam by reading some written passages and then answering questions about them. After each section or module the person finds out whether he or she missed too many questions in a section, and proceeds. If they have missed too many questions in a section they simply backtrack, get the same questions in a different order, and retake the quiz, until they pass. A widely used set of exams (which are specified to social/behavioral and biomedical research) are those offered by the Collaborative Institutional Training Initiative, which over 830 institutions and facilities (including a very large number of research universities, and indeed including the University of California at Irvine) utilize. The modules for the CITI quiz typically include three to six questions.

For the most part, although faculty complain about the inconvenience and irrelevance of the training, I do not know of anyone who would suggest that such training should be required only of investigators found to have violated the rights of human subjects. The more important questions of process and principle surround the institutional review board activities that regulate the approval of an investigator’s proposal. Here, serious questions have been raised about compromising investigators’ academic freedom to engage in certain types of research and to research certain subject matter. But the controversy is not, for the most part, about the human subjects training per se. Indeed, I would venture to say that for colleagues in the social and behavioral sciences, among the most common comments and complaints about human subjects training are that it is ineffective, that it does little by way of actually protecting human subjects and seems to be geared more to protecting the institution.


Apparently, Dr. Rhoades is unfamiliar with the widespread, principled opposition to CITI and other online training programs. That is worrisome, if it signals the retreat of AAUP from its longtime leadership in the fight against overly broad human subjects regulations and requirements.

Saturday, November 15, 2008

Comments Oppose New Regulations on Training

As reported by the Chronicle of Higher Education, OHRP recently released the replies it had received in response to its July call for comments on education and training requirements. I thank Chronicle reporter David Glenn and OHRP associate director Michael Carome for supplying me with copies of the comments.

As I see it, the comments pose three main questions.

Tuesday, November 4, 2008

Chronicle of Higher Education on OHRP Training Comments

The Chronicle of Higher Education quotes your faithful blogger in "Scholars Mull Rules for Training in Research Ethics," by David Glenn, 4 November 2008.

The story concerns the eighty or so comments received in response to OHRP's July call for comments on education and training requirements. Glenn notes that by and large, the comments were skeptical about the need for new guidance, and particularly skeptical about regulations. As he reports, "the American Association for the Advancement of Science, the Association of American Universities, the Association of American Medical Colleges, and a consortium of large social-science organizations [all] said that before the federal government issues new rules, it should carefully study whether training actually improves researchers' conduct."

I will offer some of my own comments on the comments in coming posts.

Monday, September 29, 2008

AHA Comments on IRB Training

The American Historical Association has posted a copy of the comments on IRB training and education it sent to OHRP in response to the July notice in the Federal Register. The AHA letter states that historians "are concerned that the proposed training program will reinforce the tendency to treat all research as if it was conducted in the experimental sciences" and that "the proposed training program would only cover what should be assessed by the review boards, and does not include room for discerning among different types of research methods."