[Survey Research Center, Institute for Social Research, University of Michigan, "2009 Follow-Up Survey of Investigator Experiences in Human Research," December 2010. h/t: Human Research Protections Blog.]
In 2007, the University of Michigan launched its HRPP Policy Innovation and Demonstration Initiative (Michigan Initiative), intending "to identify and improve Human Research Protection Program (HRPP) policies and procedures to minimize administrative burden for investigators and the institution without compromising the protection of the human participants." The university conducted an initial survey of its investigators that year and this follow-up survey in 2009. The survey covered both the institutional review boards of the University of Michigan Medical School (IRBMED) and the Health Sciences and Behavioral Sciences Institutional Review Boards (HSBS). It offers some important lessons for all university programs.
Most investigators are satisfied
84 percent of HSBS researchers said they were very or somewhat satisfied with the IRB process, up from 78 percent in 2007. But only 67 percent were satisfied with full committee applications, down from 78 percent in 2007. Unsurprisingly, the higher the level of review, the less satisfied investigators were. When given open-ended questions, both medical and non-medical researchers were apt to complain about the process. (Table 20)
Customer service matters
Researchers like having messages returned. "Taken altogether, 22% of IRBMED investigators and 11% of HSBS investigators reported either an unanswered telephone call that was never, rarely, or only sometimes returned or an email message that was not returned when a reply was expected . . . Unreturned inquiries are associated with much higher levels of dissatisfaction with the IRB review and approval process: the overall level of dissatisfaction with the IRB review and approval process was 21% and increased to 51% among investigators with unreturned inquiries." (Table 8)
Researchers also like approval within four weeks. One of the biggest improvements between 2007 and 2009 was the decrease in exempt applications that took more than four weeks to get approval. "The decrease in the percentage of HSBS exempt applications approved in more than 4 weeks was especially large, dropping from 28% to 8% between the 2 survey periods (i.e., 92% were approved within 4 weeks . . .)" (Table 14)
The report explains,
When applications took more than 4 weeks to approve, investigators were much more likely to say their applications were not approved in a timely manner and to be dissatisfied with the review and approval.
Moreover, investigator attitudes towards the review and approval process for their most recent application were strongly associated with the number of weeks to obtain approval. When an application took more than 4 weeks to approve, investigators were much more likely to disagree that the changes required to their application were reasonable or clear and to agree that the changes required made it harder to achieve research goals and objectives and delayed the start of their research. (v)
It seems that the university still has work to do in cutting down the time needed to prepare an application. The report boasts that 74 percent of HSBS exempt applications took 10 hours or less to prepare. True, but 66 percent took more than four hours, which seems unreasonable for a process that should be mostly automatic. And 9 percent of exempt HSBS applications took two weeks or more to prepare. (Table 16)
Most researchers see IRB as a hurdle
79 percent of HSBS researchers agreed that "The IRB process is a hurdle to clear." 59 percent believe that "the IRB interprets regulations too strictly," and only 64 percent agree that the "IRB is an ally in my research." On the other hand, all of these numbers are improved since 2007. (Table 22)
Here's a fun one: more than 87 percent of investigators--both medical and not--in both 2007 and 2009 agreed that IRB review in general adds to the protection of human subjects. But only 44 percent of researchers believed that the changes made to their projects improved the protection of participants.
One can read this in two ways. Perhaps the IRB is adding needed protections, and researchers are unable to see their own defects, the way that only 1 percent of drivers rate themselves as worse than average. Or it could be that researchers have a better sense of the issues involved in their own research and are just guessing that the IRB does better with other kinds of research. (Table 15)
Perhaps these numbers would improve if IRBs could explain their actions. Only 50 percent of HSBS investigators agreed that the IRB explained the ethical reasons for changes, down from 54 percent in 2007. Only 59 percent said the IRB explained the regulatory reasons. That's pretty bad news: investigators should know why they are being led through hoops. (Table 15)
Many researchers are also unhappy with the training they must complete. The University of Michigan requires researchers to complete its Program for Education and Evaluation in Responsible Research and Scholarship (PEERRS), an online program that takes its content from the mortifyingly stupid CITI Program. Only 63 percent of HSBS researchers agreed that "PEERRS contributes to ethical understanding." Asked if PEERRS was useful to the content of their research, 52 percent of HSBS researchers disagreed. (Table 36)
Michigan is asking good questions, but not enough of them
I am disappointed by two silences in this report. First, the survey did not ask about investigator attitudes toward the new IRB Council, which includes faculty representatives and might be expected to shape attitudes toward the HRPP in general. I imagine that faculty like having a voice in the apparatus that shapes their research, but this survey missed the opportunity to find out.
Second, while the survey asked respondents' gender, age, rank, and years at the university, it did not ask their department or primary disciplinary affiliation. The HSBS IRB serves the A. Alfred Taubman College of Architecture & Urban Planning, the College of Engineering, the College of Literature, Science, and the Arts, the College of Pharmacy, the Gerald R. Ford School of Public Policy, the Horace H. Rackham School of Graduate Studies, the Law School, the School of Art & Design, the School of Dentistry, the School of Education, the School of Information, the School of Kinesiology, the School of Music, Theatre & Dance, the School of Natural Resources & Environment, the School of Nursing, the School of Public Health, the School of Social Work, the Stephen M. Ross School of Business, the U-M Transportation Research Institute, and even the Institute for Social Research (ISR), which ran the survey.
Breaking down the investigators into rough groupings (e.g., health sciences, behavioral sciences, social sciences, humanities) would have gone far to make sense of some of the findings here.
Overall, however, the survey is remarkable for two reasons. First, that it was done at all, and second, that it was made public on the Internet. In a system not known for transparency, the Michigan HRPP has aired its laundry--both dirty and clean--in public. By doing so, it gives University of Michigan researchers and administrators a sense of what is working and what needs attention, and it points the way for similar efforts at other institutions.
The survey and its report refute Laura Stark's claim that "like any bureaucracy, the best [IRBs] can aspire to be is well-oiled, smooth-running, and thus silent." No, the best bureaucracies--including those charged with the protection of research participants--can aspire to constant self-examination, accountability, and improvement.