Monday, July 12, 2010

Social Work Research Gets Few Exemptions

Stephanie Valutis and Deborah Rubin, both of Chatham University, sought "to explore the attitudes toward, knowledge about, and practices of IRBs across colleges and universities as reported by BSW [bachelor of social work] and MSW [master of social work] program directors as they pertain to faculty and student research."

[Stephanie Valutis and Deborah Rubin, "IRBs and Social Work: A Survey of Program Directors’ Knowledge and Attitudes," Journal of Social Work Education 46 (Spring/Summer 2010): 195-212, DOI 10.5175/JSWE.2010.200800059.]

They sent a survey to social work programs around the country, receiving 201 responses. They asked both factual questions about the composition and operations of the IRBs and questions about the program directors' attitudes.

Among the key findings:

  • Familiarity improves attitudes. "Respondents who reported higher levels of knowledge about their IRBs had more positive responses to several attitude questions." (201)

  • IRBs grant few exemptions for three types of social work research: closed case files (28 percent of IRBs consider them exempt from review), satisfaction surveys (23 percent), and staff interviews (16 percent). The article does not go into depth about what each type of research entails, why an IRB might choose to require review, or whether social work program directors believe such research should be exempt. (205)

  • IRBs take a long time to approve research. While about half of program directors reported that the exempt and expedited reviews took less than two weeks, 17 percent reported exempt reviews taking one month or longer, and 11 percent reported expedited reviews taking that long. Thirty-seven percent reported full reviews taking one month or longer. And this question produced many "Do not know" responses, so the true level of delay may be much higher. (205)

  • Some students aren't allowed to do research with human subjects. Seven percent of program directors reported that "social work students were not permitted to do research that required IRB approval." (206)

I have my doubts about the usefulness of this survey, for two reasons. First, the survey posed factual questions (e.g., "How long does it take for initial review of an expedited submission?") to program directors who had no easy way of finding out this information. The authors rightly note that "the many 'don't know' responses" suggest a lack of transparency in IRB operations. But a better survey would have reached IRB administrators or chairs as well, allowing for some comparison. [For an example of this type of survey, see Robert E. Cleary, "The Impact of IRBs on Political Science Research," IRB: Ethics and Human Research 9 (May-June 1987): 6-10.]

As for the attitudinal questions, they only allowed respondents to agree or disagree with positive statements about IRBs, e.g., "The IRB process helps students learn research ethics." I can't credit the conclusion that "We did not find the frustration with the process and scope of IRB reviews discussed in the broader social science literature," when the survey offered no opportunity to register such frustration. In his pioneering IRB survey of 1976, Bradford Gray understood the need to give respondents a chance to react to more critical statements, e.g., "The review procedure is an unwarranted intrusion on an investigator's autonomy--at least to some extent." [Bradford H. Gray, Robert A. Cooke, and Arnold S. Tannenbaum, "Research Involving Human Subjects," Science, new series, 201 (22 September 1978): 1094-1101.] This survey should have done the same.

Indeed, while Valutis and Rubin cite a fair amount of IRB-related scholarship, it is not clear that they read any previous surveys of this sort before designing their own. Rather, they report concerns about the use of "a new survey instrument." (209)

The article also shows some confusion about federal regulations. It states that "Calling research 'exempt' by federal guidelines means that the research poses no risk to human subjects." While it is true that the 1981 Federal Register announcement of the exemptions describes them as exempting "broad categories of research which normally present little or no risk of harm to subjects," little risk is not the same as "no risk." And the regulations themselves exempt some research, e.g., interviews with public officials, regardless of risk. Later, the article claims that "an example of criteria for exemption by federal guidelines is research that does not pose more than minimal risk to human subjects." Actually, that's the criterion for expedited review, not exemption. Finally, the article claims that "Federal regulations require that IRBs make IRB membership available by name, role on the board, and earned degrees, but this information may not be widely disseminated." Indeed, that information is included on federal assurances, but those assurances are rarely made public.

Valutis and Rubin have raised important questions about how IRB oversight affects the education of social work students. But complete answers will require further research.
