“Mr Mayor,” said K., “you keep calling my case one of the smallest, yet a great many officials have put their minds to it, and while it may have been very small at first, the zeal of officials has made it into a major one. This is unfortunate since I have no ambition to see towers of files about me rise in the air and come crashing down, I just want to work at a little drawing-board in peace as a humble land surveyor.” (Kafka’s The Castle).
“[The document] must be entered in that Department, sent to be registered in this Department, sent back to be signed by that Department, sent back to be countersigned by this Department, ...” (The Circumlocution Office in Dickens’s Little Dorrit).
Navigating the research bureaucracy of any institution can be fraught with complexity for both researchers and those involved in enforcing the rules. Many have suggested reform or easing of bureaucratic burden. The consortium of Clinical and Translational Science Award institutions, for example, has made streamlining the research process one of its strategic goals. While we appreciate the moral responsibility to protect the rights and welfare of human subjects, we recently encountered institutional red tape so entangling and formidable that it made the conduct of even a minimal risk study exceedingly difficult.
In a multisite study we conducted interviews at addiction treatment programs to explore patients’ views about an emerging genetic understanding of addiction. We thought the IRB review process at the Veterans Affairs (VA) medical center included among our sites would be relatively straightforward; ours was a low-risk interview study with stringent procedures in place to protect participant confidentiality, and our protocol had been approved by IRBs at other sites without difficulty. We never anticipated that it would take a full year to gain IRB approval, or that further delays resulting from institutional rules and policies would leave us able to conduct only six of the 20 planned interviews.
Our experience brought to mind two classic literary portrayals of the perils of bureaucracy: Franz Kafka’s The Castle, in which an opaque and disorienting bureaucracy produces rules and regulations, seemingly from nowhere and unrelated to the task at hand, and the Circumlocution Office in Charles Dickens’s Little Dorrit, where paperwork is endlessly shuffled from one department to another, with the appearance of action but little real progress.
As we set out to gain IRB approval from the VA, we were immediately presented with a vast amount of paperwork, some of which was irrelevant to our interview study. But as we quickly learned, the government operates in a one-size-fits-all universe. One form -- required prior to the release of any data if research participants had a “history of drug abuse, alcoholism or sickle cell anemia or infection with the human immunodeficiency virus” -- proved to be an enigma: no one could tell us how to obtain it.
We labored for months to complete the paperwork, often receiving additional forms and questions along the way. Meanwhile, we struggled to comply with various well-intentioned institutional guidelines, such as the policy that only VA employees may have contact with patients or their data. To comply, our team had to obtain WOC (without compensation) VA appointments, which meant going through the time-consuming steps required to become an employee of the federal government, including completing mandatory VA orientation and training requirements and undergoing fingerprinting and background checks.
A major technical obstacle was the VA security requirement that all data collection and storage -- whether an HIV test result or the responses to an opinion question -- be done on VA-encrypted devices. Since no VA digital recording devices were available, we sought to record our interviews onto a VA laptop. We filled out the proper paperwork, which had to be reviewed and signed in multiple departments, including a “transportation of equipment” form that served no discernible purpose since removal of laptops from the premises was prohibited.
On our next visit, with patient interviews scheduled, we were surprised to learn that we could not use the laptop until we had each acquired a username and password, which required more paperwork and approval. Another well-intentioned obstacle concerned transcription of audio recordings and deidentification of transcripts. VA policy requires that no identifiable data leave the premises, meaning that transcription and deidentification must be done on-site by a VA employee. Since no VA transcriptionist was available, we had to go through the lengthy process of obtaining a WOC appointment for an off-site transcriptionist, as well as to travel to the site to correct and deidentify the transcripts.
IRB requirements at the VA also affected our patient interviews. At other sites we provided participants with a concise but thorough explanation of our study’s aims and the risks and benefits of participating, and obtained verbal consent without requiring a signature. Since participants were in addiction treatment for substance use and often spoke about their own experience with addiction, our goal in using oral consent was to minimize the number of documents that included subject identifiers. The VA, however, would not approve verbal consent.
In addition, the lengthy standard consent form template required participants’ names and social security numbers on each page and signatures on three separate forms. And even though participation in our study was presumed confidential, the VA required each consent form to be scanned into patients’ permanent medical records. While such documentation is certainly justified for interventional clinical trials, no treating physician needs to know whether a patient was interviewed for our study. Indeed, all the rules “made sense” in some circumstances; the problem was a lack of flexibility that might have allowed the rules to match the risk level and procedures of our study.
The Need for Flexibility
As bioethics researchers we are fully aware of the principles that undergird such rules. The history of abuse of human research participants is clear and there are good reasons to believe that veterans may need special protection. But does the current system actually accomplish its intended goals? We wondered how human subjects protection programs came to lock themselves into impenetrable layers of rules and procedures. Is it possible for systems to create flexibility while maintaining adequate protection of human subjects?
Some of what we encountered at the VA may be an institutional reaction to past scandals. For instance, the strict policies concerning laptop use and data encryption were developed because a number of U.S. federal agencies, including the VA, have been chastised for security breaches when laptops containing unencrypted data were lost or stolen. While special efforts to protect veterans and those with stigmatized disorders such as substance use are clearly important, the emphasis placed on the minute details of the process has created an endless labyrinth in which the main objective of providing protection for research participants is lost. The focus is diverted toward processes and procedures as ends in themselves. We must ask, is it right to make it so hard to do research at one of the U.S.’s most important health systems?
Furthermore, when demands that increase burden without protecting research participants are placed on researchers, important clinical outcomes or public health information may be delayed. Researchers may even forego conducting studies with certain populations. As Bozeman and Hirsch point out, creating layers of rules does not necessarily translate into an enhancement of an IRB’s ethical review process or of human subjects protection. It often has the opposite effect, producing a regulatory system that is too complex to be effectively implemented.
This complexity is amplified when the process has to be repeated at multiple sites. While we dealt with only one VA facility in our study, Joan Beder describes the trials and tribulations of conducting a research study at several VA facilities, each with its own IRB and WOC requirements.
While we on the “outside” were puzzled by what we encountered, those on the “inside,” understaffed and overburdened with mountains of regulatory paperwork, also appeared to be lost in their institution’s bureaucratic maze. Everyone involved in the oversight of our project was well-meaning and worked hard to guide us through the process. But VA employees appeared equally confused and frustrated with the institutional hurdles they were required to lay in our path.
Protecting the rights and welfare of participants should be the primary focus of research governance, and in order to ensure this, institutions must create flexible systems. Differences in types of studies should be recognized -- an interview study will not necessarily raise the same ethical issues as a drug trial -- and institutional rules and regulations should be evaluated and modified to match the methods of each study, points made in a recent Bioethics Forum post. It may even be of benefit to streamline the review of minimal risk research studies. Informed consent paperwork and procedures should reflect the potential harms of the study, focusing on enhancing participants’ understanding, and not on whether the last four digits of their social security number appear on every page. Ironically, such requirements may actually add risk for participants.
Neither Kafka’s castle nor Dickens’s Circumlocution Office completely captures what we experienced. Yet the VA bureaucracy embodies elements of both. As the federal government contemplates changes to human subjects protection requirements, we urge a renewed focus on what is essential to the protection of human subjects. Researchers and regulators alike must have the freedom to engage in moral reflection and exercise judgment about specific cases. Rules and procedures must have flexibility so that researchers can focus on actual harms. Otherwise, researchers and those who enforce the rules will spend their time and energy caught in a seemingly endless bureaucratic labyrinth, mindlessly engaged in compliance paperwork.
Jenny E. Ostergren is a study coordinator and Marguerite E. Robinson is program manager in the Biomedical Ethics Research Unit of the Mayo Clinic in Rochester, Minn. Molly J. Dingel is assistant professor in the Center for Learning Innovation at the University of Minnesota Rochester. Bradley Partridge is a National Health and Medical Research Council postdoctoral research fellow in the Centre for Clinical Research at the University of Queensland in Brisbane, Australia. Barbara A. Koenig is on the faculty of the University of California, San Francisco, and is currently a research fellow at the Brocher Foundation in Geneva. The authors wish to acknowledge Stephen Arkin, Rachel Hammer, and Jennifer McCormick for their advice and comments on this essay. The project described was funded by the National Institute on Drug Abuse, the National Center for Research Resources, and the Mayo Clinic S.C. Johnson Genomics of Addiction Program.