Wednesday, September 29, 2010

OHRP Issues Guidance on Withdrawal

On September 21, OHRP posted new "Guidance on Withdrawal of Subjects from Research: Data Retention and Other Related Issues." The document makes it clear that if a research participant in a social science study wishes to withdraw, the researcher is not obliged under federal regulations to throw out information that has already been collected:

May an investigator retain and analyze already collected data about a subject who withdraws from the research or whose participation is terminated by the investigator?

OHRP interprets the HHS regulations at 45 CFR part 46 as allowing investigators to retain and analyze already collected data relating to any subject who chooses to withdraw from a research study or whose participation is terminated by an investigator without regard to the subject's consent, provided such analysis falls within the scope of the analysis described in the IRB-approved protocol. This is the case even if that data includes identifiable private information about the subject.


Of course, in some cases, researchers may still choose to discard such data:

For research not subject to regulation and review by FDA, investigators, in consultation with the funding agency, certainly can choose to honor a research subject's request that the investigator destroy the subject's data or that the investigator exclude the subject's data from any analysis. Nothing in this document is intended to discourage such a practice. For example, an investigator studying social networks in a community may agree to omit all of the data they have collected from a subject of the study at the request of that subject.


(The clause about the FDA is due to the FDA's concern that withdrawals can skew the findings of clinical trials.)

This guidance strikes me as helpful. When reading sample consent forms, e.g., Cornell's, I am often left wondering what is meant by the boilerplate, "you are free to withdraw at any time," especially when it comes to interviews. If a researcher does a great interview and writes a dissertation chapter around it, can the narrator show up at the dissertation defense and pull the information? (This is apparently the case with undergraduate research at Bard College.)

Fortunately, OHRP says no.

Tuesday, September 28, 2010

Thanks, Law Professors!

Concurring Opinions, which describes itself as "a multiple authored, general interest legal blog," features an interview with your humble blogger, while the Legal History Blog also takes notice of Ethical Imperialism.

Sunday, September 26, 2010

Unfair IRBs Provoke Misbehavior

"Researchers who perceive that they are being unfairly treated are less likely to report engaging in 'ideal' behaviors and more likely to report misbehavior and misconduct," according to a survey of faculty at fifty top research universities in the United States.

[Brian C. Martinson, A. Lauren Crain, Raymond De Vries, and Melissa S. Anderson, "The Importance of Organizational Justice in Ensuring Research Integrity," Journal of Empirical Research on Human Research Ethics 5, no. 3 (September 2010): 67–83.]

As the authors note, this is mostly a quantitative confirmation of earlier findings. Most relevant for this blog, they cite a 2005 article by Patricia Keith-Spiegel and Gerald P. Koocher that found that "The efforts of some institutional review boards (IRBs) to exercise what is viewed as appropriate oversight may contribute to deceit on the part of investigators who feel unjustly treated."

Like the Singer and Couper article in the same issue, this article presents a mass of quantitative data in a difficult form. Let me suggest that the Journal of Empirical Research on Human Research Ethics invest some money in decent graphs.

Monday, September 20, 2010

Survey Consent Form Language May Not Matter Much

Eleanor Singer and Mick P. Couper of the Survey Research Center of the Institute for Social Research at the University of Michigan find that the wording used to describe the confidentiality offered to survey participants may not play a big role in their decision to participate.

[Eleanor Singer and Mick P. Couper, "Communicating Disclosure Risk in Informed Consent Statements," Journal of Empirical Research on Human Research Ethics 5, no. 3 (September 2010): 1–8.]

Singer and Couper sent out more than 150,000 e-mails to get 9,206 responses to a questionnaire about willingness to participate in a hypothetical survey. Respondents were significantly more likely to say they'd be willing to answer questions about work and leisure than about the more sensitive topics of money and sex. In contrast,


the precise wording of the confidentiality assurance has little effect on respondents’ stated willingness to participate in the hypothetical survey described in the vignette. Nor does adding a statement on the organization’s history of assuring confidentiality appear to affect stated willingness. However, these experimental manipulations do have some effect on perceptions of the risks and benefits of participation, suggesting that they are processed by respondents. And, as we have found in our previous vignette studies—and replicated in a mail survey of the general population—the topic of the survey has a consistent and statistically significant effect on stated willingness to participate.


Singer and Couper hint that researchers and IRBs should spend less time fretting about the wording of consent forms used by survey researchers, since it does not affect decisions and since it is hard to estimate the risk of disclosure. Rather, the real burden on survey organizations is to take precautions once they have collected the data.

Sunday, September 5, 2010

IRB is No Substitute for Shield Law

Education Week reports that researchers are dismayed by the release of data about teachers and students.

[Sarah D. Sparks, "L.A. and Ariz.: Will Data Conflicts Spur a Chill Effect?," Education Week, 3 September 2010.]

The article discusses the decision by the University of Arizona to release some data in response to a subpoena. It claims that "The Code of Federal Regulations for the Protection of Human Subjects delegates confidentiality decisions to university institutional review boards, or IRBs, but in Arizona, the IRBs released the full data over the researchers' opposition." I believe this is incorrect on three counts:


  1. The Common Rule gives power to IRBs to review and approve research. Once the research was complete, it was up to the universities to decide whether to comply with the subpoenas, not the IRBs. Indeed, the open letter from the researchers states that "lawyers at the University of Arizona," not the IRB, turned over information. (The letter does complain that "researchers have received little or no support from their campus IRB, lawyers, or administration," but that's a different thing.)


  2. The use of the plural "IRBs" suggests that more than one university released data. As Education Week itself made clear, Arizona State did not release any data, and the Arizona State professor involved withdrew as an expert witness.


  3. Also as reported by Education Week, the University of Arizona did not hand over "full data," but rather only the names of schools and school districts, not individuals.



That said, Gary Orfield, one of the researchers in the Arizona case, hits on a larger truth when he complains of the University of Arizona's behavior:


"I think it's tragic and very dangerous," Mr. Orfield said. "I was shocked at the way the [State of] Arizona people went after this data and that the universities just went along with it. It really calls into question not just the access to schools but the integrity of the IRB process." Mr. Orfield, Ms. Hannaway, and other researchers suggested researchers may need a federal shield law similar to state laws that protect reporters from being compelled to name sources. "We thought the IRBs served that purpose for us, but we were wrong," Mr. Orfield said.


Indeed, since the 1970s, social scientists have argued that shield laws make more sense for protecting the participants in social science research than do IRBs. [James D. Carroll and Charles R. Knerr, Jr., "A Report of the APSA Confidentiality in Social Science Research Data Project," PS 8 (Summer 1975): 258–261, and James D. Carroll and Charles R. Knerr, Jr., "The APSA Confidentiality in Social Science Research Project: A Final Report," PS 9 (Autumn 1976): 416–419.]

I haven't figured out how a shield law would apply to expert witness testimony. (Anybody looking for a good law review topic?) And even without such a law, Judge Collins's order seems to strike a good balance between the rights of research participants and those of parties to the lawsuit.

Still, it seems that in this case the IRB process left Orfield with a dangerously false sense of security.

NOTE:

The Education Week article also mentions an analysis of teacher effectiveness published by the Los Angeles Times based on 1.5 million test scores. It quotes Felice Levine, the executive director of the American Educational Research Association, on the L.A. Times study: "I think it would really have a crippling effect on all social science, education, and health enquiry if public employees in the sector couldn't be guaranteed the same confidentiality as any other research participant . . . In this economy, people are feeling pressed in a number of ways, and being a participant in a voluntary study is probably lower on one's list of priorities than is providing for oneself and one's children."

But the newspaper analysis was not based on a voluntary study; it relied on scores obtained under the California Public Records Act. Making the scores public in this manner may have been bad policy or bad journalism for other reasons, but I don't see what it has to do with voluntary participation in research.

Friday, September 3, 2010

Oral Historians Open Discussion on Principles and Best Practices

As noted on this blog, in October 2009, the Oral History Association replaced its Evaluation Guidelines with a new set of Principles and Best Practices. The new guidelines are considerably clearer in format, and they distance oral history from the biomedical assumptions of the Belmont Report.

Now the Oral History Association is further distancing itself from the Belmont Report by opening an ongoing discussion of the principles, including suggestions for additional revisions. Whereas the Belmont Report was prepared by a small group of people and has not been amended since 1978, the OHA Principles can remain a living document, revised in response to a discussion that is open to all.

Hat tip: AHA Today.