[Voosen, Paul. “Researchers Struggle to Secure Data in an Insecure Age.” Chronicle of Higher Education, September 13, 2013. http://chronicle.com/article/Researchers-Struggle-to-Secure/141591/. (gated)]
As the Chronicle explains,
Rather than trying to evaluate the data security of each experiment coming across the transom—[the University of North Carolina at Chapel Hill] has some 4,500 active studies at one time—[IRBs] now instead ask a few questions: Are you collecting protected health information? Genetic code? Surveys on illegal activity, substance abuse, or sexual behavior?
The boards plug those answers into a formula that generates a security threat level for the study. Given these parameters, the IRB then says, you have a Level 3 study. Go see your designated IT contact to establish proper security.
"At the end of the process," [Daniel K. Nelson, director of UNC's Office of Human Research Ethics] said, "rather than the investigator telling us what they're going to do, and us pretending we know how many bytes of encryption are up to standard, we flipped it."
Harvard has adopted a similar system.
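The Chronicle does not spell out UNC's formula, but the idea is simple enough to sketch. Below is a minimal illustration of how such question-based tiering might work; the questions, weights, and level cutoffs are my own hypothetical stand-ins, not UNC's actual rubric.

# Hypothetical sketch of a question-based security-tiering formula.
# The questions, weights, and cutoffs here are illustrative assumptions,
# not UNC's actual rubric.

QUESTION_WEIGHTS = {
    "protected_health_information": 2,
    "genetic_code": 2,
    "sensitive_survey_topics": 1,  # illegal activity, substance abuse, sexual behavior
}

def security_level(answers: dict[str, bool]) -> int:
    """Map yes/no screening answers to a security threat level (1 = lowest)."""
    score = sum(QUESTION_WEIGHTS[q] for q, yes in answers.items() if yes)
    if score >= 4:
        return 3
    if score >= 2:
        return 2
    return 1

# Example: a study collecting PHI and sensitive survey responses
level = security_level({
    "protected_health_information": True,
    "genetic_code": False,
    "sensitive_survey_topics": True,
})
print(f"Level {level} study -- see your designated IT contact.")

The point of such a scheme is that the IRB only screens for broad risk categories; the actual security requirements for each level are left to IT professionals.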
The Chronicle describes this shift by reporting that UNC "had to stop trusting the researchers." But given Nelson's acknowledgment that the IRB had been "pretending" to know which security measures were up to standard, the article could just as well have reported that the university had to stop trusting IRBs.
As I argued in my Brigham Young University lecture, IRBs are composed of pseudo-experts. The UNC model of referring researchers to real experts marks a significant shift. Though the Chronicle article covers only data security, the same model could apply to the whole range of research risks.