IRB: Ethics & Human Research

Lapse in Institutional Review Board Continuing Review Approval

The Federal Policy for the Protection of Human Subjects, also known as the Common Rule, requires that institutional review boards (IRBs) conduct continuing review of human research at intervals appropriate to the degree of risk, but not less than once per year.1 The primary purpose of this requirement is to ensure that, among other things, risks to participants are being minimized and are still reasonable in relation to the knowledge that is expected to result from the research and to the anticipated benefits, if any, to the subjects.2

Federal regulations make no provision for any grace period extending the conduct of research beyond the expiration date of IRB approval. A lapse in IRB continuing review approval of research occurs whenever an investigator fails to provide continuing review information to the IRB or the IRB has not conducted continuing review and reapproved the research by the expiration date of IRB approval. In such circumstances, all research activities involving human subjects must stop, unless the IRB determines that it is in the best interests of already-enrolled subjects to continue participating in the research. Thus, new participants may not be enrolled, and continuing participation of already-enrolled subjects may be appropriate only when the research interventions hold out the prospect of direct benefit to the participants or when withholding those interventions poses increased risk to them.3

Despite the importance of timely IRB continuing review approval, little is known about the lapse rate of IRB continuing reviews and how frequently investigators continue research activities during a lapse. It is also not clear what factors may contribute to an institution’s rate of lapse in IRB continuing review approval. In 1995, Nightingale cited inadequate or late review of active protocols as one of the most common deficiencies identified by the Food and Drug Administration (FDA).4 A 1996 Government Accountability Office (GAO) report found that IRB continuing reviews were typically either superficial or not done at all, and speculated that the underlying cause of this deficiency was that many IRBs were overworked and under-supported by their institutions.5 While considerable improvements have been made since the GAO report was published—including stronger federal oversight of research, increased institutional support for IRBs, and improved training for investigators and IRB members6—little is known about the current status of compliance with IRB continuing review requirements.

The Department of Veterans Affairs (VA) Health Care System is the largest integrated health care system in the country. Between 2010 and 2013, there were 107 to 108 VA facilities (VA Medical Centers or VA Health Care Systems facilities) conducting research involving human subjects. As part of its quality assurance program, the VA has been collecting quality indicator (QI) data on Human Research Protection Programs (HRPPs), including data on IRB continuing reviews, since 2010.7

In the current study, we analyzed VA HRPP QI data from 2010 to 2013, focusing on IRB continuing reviews. Here we report the lapse rates in IRB continuing reviews over this four-year period and examine whether the size of a facility’s human research program or the type of IRB used (a VA or an affiliated university IRB) has any effect on lapses in IRB continuing reviews.

Methods

Data Collection. As part of the VA HRPP quality assurance program, each VA research facility was required to have audits of all informed consent documents (ICDs) conducted annually and regulatory audits of all active human research protocols conducted every three years by qualified research compliance officers (RCOs).8 Protocol regulatory audits were limited to a three-year retrospective look at the protocols. Audit tools were developed for the annual ICD audits as well as the triennial protocol regulatory audits.9 Facility RCOs were then trained to use these tools to conduct audits throughout the year.

Results of the ICD audits and protocol regulatory audits conducted between June 1 and May 31 of each year were collected through a web-based system from all VA research facilities. Information collected included ICD and Health Insurance Portability and Accountability Act (HIPAA) authorization requirements, IRB and Research and Development Committee (R&DC) initial approval of human research protocols, compliance with selected informed consent requirements, for-cause suspension or termination of human research protocols, research-related serious adverse events, compliance with IRB continuing review requirements, subject enrollment according to inclusion and exclusion criteria, research personnel scopes of practice, and investigator human research protection training requirements. Because this was a VA quality assurance project and no individually identifiable personal information was collected, IRB review and approval of the project was not required.10

Data Analysis. All data collected were entered into a computerized database for analysis. When necessary, facilities were contacted to verify the accuracy and uniformity of data reported.

We used the Mantel-Haenszel chi-square test for trend to determine the trend of changes from 2010 through 2013.11 For comparisons of two means, Student’s t test with Bonferroni correction for multiple comparisons was used to determine the level of significance.12 A p value of < 0.05 was considered statistically significant.
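For readers who want to reproduce this kind of analysis, the sketch below shows one way a chi-square test for trend in proportions can be computed from aggregated yearly counts. It is only an illustration: the function implements the standard Mantel-Haenszel (Cochran-Armitage) trend statistic in Python with SciPy, and the yearly counts shown are hypothetical placeholders, not the study data.

```python
# Minimal sketch of a Mantel-Haenszel (Cochran-Armitage) chi-square test for
# trend, computed from aggregated yearly counts. The yearly figures below are
# placeholders for illustration, not the study's actual data.
from scipy.stats import chi2

def chi_square_trend(lapsed, totals, scores):
    """Chi-square test for linear trend in proportions across ordered groups.

    lapsed -- number of protocols with a lapse in each year
    totals -- number of protocols requiring continuing review in each year
    scores -- ordinal scores for the years (e.g., 0, 1, 2, 3)
    Returns (statistic, p_value); the statistic has 1 degree of freedom.
    """
    N = sum(totals)
    A = sum(lapsed)                                    # total lapsed protocols
    sum_x = sum(n * x for n, x in zip(totals, scores))
    sum_x2 = sum(n * x * x for n, x in zip(totals, scores))
    sum_xy = sum(a * x for a, x in zip(lapsed, scores))
    # Pearson correlation between the year score and the 0/1 lapse indicator,
    # computed directly from the aggregated table.
    num = sum_xy - A * sum_x / N
    den = ((sum_x2 - sum_x ** 2 / N) * (A - A ** 2 / N)) ** 0.5
    r = num / den
    stat = (N - 1) * r ** 2                            # Mantel-Haenszel M^2
    return stat, chi2.sf(stat, df=1)

# Hypothetical yearly counts (lapsed, total) for 2010-2013:
lapsed = [620, 650, 640, 660]
totals = [9500, 9800, 9900, 10100]
stat, p = chi_square_trend(lapsed, totals, scores=[0, 1, 2, 3])
print(f"trend chi-square = {stat:.3f}, p = {p:.3f}")
```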

Results

Lapse in IRB Continuing Reviews. Table 1 summarizes the data on IRB continuing reviews. The total number of human research protocols requiring IRB continuing reviews was less than the total number of protocols audited, because approximately 20% (19.9% ± 2.7%, mean ± SD) of protocols audited each year did not require IRB continuing reviews. There were 80 facilities with protocols requiring IRB continuing reviews in 2010, 95 in 2011, 93 in 2012, and 99 in 2013.

The rates of lapse in IRB continuing reviews remained relatively high and constant at 6% to 7% from 2010 through 2013. In contrast, fewer than 0.20% of investigators continued research activities during the lapses, excluding those activities that the IRBs deemed to be in the best interest of already-enrolled subjects.

For the purpose of comparison, we also included QI data on research personnel scopes of practice and human subject protection training requirements. VA policies require all research personnel who participate in human subjects research to have an approved research scope of practice defining the duties that the individual is qualified and allowed to perform for research purposes, as well as to complete initial and annual training in the ethical principles and accepted good clinical practices.13 The 2010 research personnel scopes of practice and training requirement data were derived from all human, animal, and safety research protocols audited, not just the human research protocols audited. However, we included these data for comparison with the 2011 to 2013 data because the nonhuman research protocols audited constituted less than 30% of the total. In addition, based on our routine on-site reviews of the facilities’ HRPPs, animal care and use programs, and research safety and security programs, we believed that the lapse rates for research scopes of practice and training requirements in these nonhuman research protocols were similar to those for human research protocols.14

As shown in Table 1, the lapse rate for research personnel scopes of practice in 2010 was 7.65%. However, it decreased sharply over the next three years, to 0.55% and 0.65% in 2012 and 2013, respectively. The rate of research personnel working outside of their scopes of practice was very low, at less than 0.15%, from 2010 through 2013.

The lapse rate for research personnel training requirements was 5.86% in 2010 and decreased steadily over the next three years to 1.64% in 2013. Lapses in initial training and in annual continuing training requirements showed similar trends of improvement between 2010 and 2013. However, lapses in annual continuing training requirements appeared to be largely responsible for the overall lapse rate in training requirements.

Figure 1 compares the lapses in IRB continuing reviews, research personnel scopes of practice, and training requirements from 2010 through 2013. When we started to collect VA HRPP QI data in 2010, these three parameters had similarly high rates of lapses. However, while lapses in research personnel scopes of practice and training requirements showed marked improvement over the subsequent three years, there was no improvement in lapses in IRB continuing reviews over the same period.

Effect of the Types of IRB Used. Based on the types of IRB used, VA research facilities can be categorized into three groups: those using their own VA IRBs, those using other VA facilities’ IRBs, and those using affiliated university IRBs as their IRBs of record. We analyzed our data to determine whether the type of IRB used had any effect on lapses in IRB continuing reviews.

As shown in Table 2 (see the IRB: Ethics & Human Research website), on average, approximately 48 facilities each year used their own VA IRBs, 12 facilities each year used other VA facilities’ IRBs, and 32 facilities each year used affiliated university IRBs as their IRBs of record. However, the type of IRB used had no effect on lapse rates in IRB continuing reviews.

Effect of the Size of Human Research Programs. We also analyzed our data according to the sizes of facilities’ human research programs. Based on the 2011 VA HRPP quality indicator data, we have previously reported that facilities with a large human research program (more than 200 human research protocols) did not perform as well as those with a medium (between 50 and 200 protocols) or small (fewer than 50 protocols) research program.15

As shown in Table 3 (available on the IRB: Ethics & Human Research website), on average, approximately 27 facilities each year had a small research program, 35 had a medium program, and 29 had a large program. Facilities with a medium research program had the highest lapse rate in IRB continuing reviews (7.30%), while facilities with a small research program had the lowest (3.99%). However, there were no statistically significant differences among the three groups by Student’s t test with Bonferroni correction for multiple comparisons (with n = 3 comparisons, a p value of < 0.017 is required for statistical significance). Thus, the size of a facility’s human research program had no correlation with its IRB continuing review lapse rate.
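The pairwise comparison logic described above can be illustrated with a short sketch. The per-facility lapse rates below are hypothetical placeholders; the code simply applies two-sample t tests between the three program-size groups and compares each p value against a Bonferroni-adjusted threshold of 0.05/3, or roughly 0.017.

```python
# Minimal sketch of pairwise Student's t tests between program-size groups
# with a Bonferroni-adjusted significance level. The per-facility lapse
# rates below are hypothetical placeholders, not the study data.
from itertools import combinations
from scipy.stats import ttest_ind

groups = {
    "small":  [0.0, 2.1, 5.0, 3.8, 6.5, 0.0, 4.2],
    "medium": [7.3, 9.1, 0.0, 12.4, 5.6, 8.8, 6.0],
    "large":  [6.2, 4.5, 11.0, 7.7, 3.9, 5.4, 6.8],
}

n_comparisons = 3                        # small/medium, small/large, medium/large
alpha = 0.05 / n_comparisons             # Bonferroni-adjusted threshold, ~0.017

for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
    t_stat, p = ttest_ind(a, b)          # two-sample t test, equal variances assumed
    verdict = "significant" if p < alpha else "not significant"
    print(f"{name_a} vs. {name_b}: t = {t_stat:.2f}, p = {p:.3f} "
          f"({verdict} at alpha = {alpha:.3f})")
```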

Facilities with High Rates of Lapse in IRB Continuing Reviews. Since neither the type of IRB used nor the size of the human research program had a significant effect on the lapse rates of IRB continuing review, we then focused on facilities with a high lapse rate, that is, more than 10%. As shown in Table 4 (see the IRB: Ethics & Human Research website), on average, 55 facilities (60.22%) each year reported no lapse (0%) in IRB continuing reviews, while 17 (18.27%) and 20 (21.51%) facilities each year reported lapse rates of > 0% to 10% and > 10%, respectively.

Analysis of facilities with a lapse rate of > 10% from 2010 through 2013 revealed that 25 facilities had a lapse rate of > 10% in one of the four years, 10 facilities in two of the years, 6 facilities in three of the years, and 4 facilities in all four years.

Discussion

The data presented in this report demonstrate that the lapse rates in IRB continuing review at VA research facilities remained relatively constant at above 6.0% over the four-year period from 2010 through 2013. In contrast, fewer than 0.20% of investigators continued research activities during the lapses, excluding activities that the IRBs deemed to be in the best interest of already-enrolled subjects. Thus, the majority of investigators stopped research activities when their IRB approval expired. We also found that neither the type of IRB used nor the size of the human research program had any obvious correlation with a facility’s lapses in IRB continuing reviews. However, approximately 20% of facilities with protocols requiring IRB continuing reviews had IRB continuing review lapse rates of > 10% each year. In addition, a number of facilities appeared to be repeat offenders: a total of 10 facilities had IRB continuing review lapse rates of > 10% in at least three of the four years from 2010 through 2013, suggesting systemic problems at these facilities. Consequently, we have required these 10 facilities to develop remedial action plans to improve their IRB continuing reviews.

The lack of improvement in lapse rates for IRB continuing review from 2010 to 2013 is striking, considering that marked improvement was seen during the same period in lapse rates for research personnel scopes of practice and for training requirements, even though all three rates were similarly high in 2010. We have previously reported that of a total of 19 VA HRPP performance metrics with data available for all three years, 10 showed no improvement and 9 showed marked improvement over the three-year period from 2010 through 2012. Of the 10 performance metrics that did not show any improvement, all but 2 (lapses in IRB continuing reviews and VA-specific approval requirements for international research and research involving children and prisoners) had very low lapse rates in 2010 (less than 1%), suggesting that further improvement was difficult to achieve.16 One could argue that because the protocol regulatory audits, including audits of lapses in IRB continuing reviews, required a three-year retrospective look at the protocols, the lapse rates reported each year for continuing reviews might include results from the prior two years for some protocols. The data for research personnel scopes of practice and training requirements cover only one year, so it might take longer to see improvement in lapse rates for IRB continuing reviews than for these other two metrics, even if facilities started to improve these reviews in 2011. However, this could not explain the lack of improvement reported here, as the data cover a span of four years.

In the mid-1990s, both the FDA and the GAO reported that lapse in IRB continuing reviews was one of the most common findings of noncompliance.17 Nearly 20 years later, we found that among the more than 20 HRPP performance metrics that we monitor, the lapse rate for IRB continuing reviews remains one of the highest. We believe that this observation is not unique to VA research facilities, as approximately one-third of VA facilities use affiliated university IRBs as their IRBs of record, and facilities using university IRBs had rates of lapse in IRB continuing reviews similar to those of facilities using VA IRBs.

In the 1990s, it was speculated that overworked IRBs with inadequate institutional support might be the cause of the high rates of lapse in IRB continuing review.18 This might still contribute to high lapse rates in IRB continuing review today, even though it is generally believed that IRBs’ workloads and institutional support have improved considerably since then.19 We are particularly concerned about the lack of improvement over a span of four years, as each year we provide the results of the HRPP quality indicator data to the facilities for quality improvement purposes. In addition, only a relatively small number of facilities are responsible for the overall high rates of lapse in IRB continuing reviews.

The Office for Human Research Protections of the Department of Health and Human Services recommends that, in order to comply with the regulatory requirements for continuing review and avoid lapses in approval, the IRB and the investigator plan ahead to ensure that continuing review and reapproval of research occur before the end of the approval period specified by the IRB. The IRB should have written procedures that provide sufficient advance notice to the investigator to ensure that the requirements for continuing review are met by the date on which approval would expire. The IRB should also develop administrative procedures, such as the use of computerized tracking systems, to minimize any unintended expiration of IRB approval. However, it is the responsibility of the investigator to provide in a timely manner the information the IRB needs to perform its continuing review functions, and any reminder notices from the IRB to the investigator about providing this information are a courtesy.20 Thus, successful compliance with IRB continuing review requirements depends on the collaboration and cooperation of IRBs and investigators. We hope that providing annual IRB continuing review monitoring data to the facilities will help them develop strategies to improve their compliance rates.
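As a purely illustrative sketch, and not a description of any actual VA or OHRP system, the kind of computerized tracking the guidance envisions can be as simple as flagging protocols whose approval is about to expire so that reminder notices go out well in advance. The protocol identifiers, dates, and 60-day reminder window below are hypothetical.

```python
# Illustrative sketch of a continuing-review tracking check: flag protocols
# whose IRB approval has expired or will expire within a reminder window.
# All identifiers, dates, and the window length are hypothetical.
from datetime import date, timedelta

# Hypothetical protocol records: (protocol ID, IRB approval expiration date).
protocols = [
    ("ABC-001", date(2015, 7, 15)),
    ("ABC-002", date(2015, 6, 1)),
    ("ABC-003", date(2016, 1, 10)),
]

REMINDER_WINDOW = timedelta(days=60)   # send notices 60 days before expiration
today = date(2015, 5, 20)

for protocol_id, expires in protocols:
    if expires < today:
        print(f"{protocol_id}: approval EXPIRED on {expires}; research must stop")
    elif expires - today <= REMINDER_WINDOW:
        print(f"{protocol_id}: continuing review due by {expires}; send reminder")
```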

Min-Fu Tsan, MD, PhD, was deputy executive director of the Office of Research Oversight in the Department of Veterans Affairs at the time of this study, and Yen Nguyen, PharmD, is a research pharmacist in the Office of Research Oversight in the Department of Veterans Affairs.

Tables

Tables 2, 3, and 4 are available on the IRB: Ethics & Human Research website.

Disclaimer

The views presented in this report are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs.

Acknowledgment

The authors wish to thank J. Thomas Puglisi, PhD, the executive director of the Office of Research Oversight in the Department of Veterans Affairs, for his support of this project, and all VA research compliance officers for their contributions in conducting the audits and collecting the data presented in this report.

References 

1. Department of Health and Human Services. Protection of Human Subjects. Subpart A. Basic HHS Policy for Protection of Human Research Subjects. 45 CFR 46.

2. See ref. 1; Department of Veterans Affairs. Requirements for the Protection of Human Subjects in Research. VHA Handbook 1200.05, 2010. https://www1.va.gov/vhapublications/.

3. Department of Health and Human Services, Office for Human Research Protections. Guidance on IRB Continuing Review of Research. 2010. https://www.hhs.gov/ohrp/policy/.

4. Nightingale SL. An update from FDA. Plenary address at PRIM&R IRB Conference, 20 October 1995. Boston, Massachusetts.

5. Government Accountability Office. Scientific Research—Continued Vigilance Critical to Protecting Human Subjects. GAO/HEHS-96-72. 1996.

6. Steinbrook R. Improving protection for research subjects. NEJM 2002;346:1425-1430.

7. Tsan MF, Smith K, Gao B. Assessing the quality of human research protection programs: The experience at the Department of Veterans Affairs. IRB: Ethics & Human Research 2010;32(4):16-19; Tsan MF, Nguyen Y, Brooks R. Using quality indicators to assess human research protection programs at the Department of Veterans Affairs. IRB: Ethics & Human Research 2013;35(1):10-14; Tsan MF, Nguyen Y, Brooks R. Assessing the quality of VA human research protection programs: VA vs. affiliated university institutional review board. Journal of Empirical Research on Human Research Ethics 2013;8:153-160; Nguyen Y, Brooks R, Tsan MF. Human research protection programs at the Department of Veterans Affairs: Quality indicators and program size. IRB: Ethics & Human Research 2014;36(4):16-20; Tsan MF, Nguyen Y, Brooks R. Using quality indicators to assess and improve human research protection programs: Experience of the Department of Veterans Affairs. Federal Practitioner. Forthcoming 2015.

8. Department of Veterans Affairs. Research compliance reporting requirements. VHA Handbook 1058.01, 2010. https://www1.va.gov/vhapublications/.

9. These tools are available at https://www.va.gov/ORO/Research_Compliance_Education.asp.

10. Tsan MF, Puglisi JT. Health care operations activities that may constitute research—The Department of Veterans Affairs’s perspective. IRB: Ethics & Human Research 2014;36(1):9-11.

11. Woodward M. Epidemiology: Study Design and Data Analysis. London: Chapman and Hall/CRC, 1999.

12. Matthews DE, Farewell VT. Using and Understanding Medical Statistics. 2nd ed. Basel, Switzerland: S. Karger AG, 1988.

13. See ref. 2, Department of Veterans Affairs, 2010.

14. See ref. 7, Tsan et al., Federal Practitioner, forthcoming 2015.

15. See ref. 7, Nguyen et al., IRB: Ethics & Human Research, 2014.

16. See ref. 7, Tsan et al., Federal Practitioner, forthcoming 2015.

17. See refs. 4 and 5.

18. See ref. 5.

19. See ref. 6.

20. See ref. 3.

Tsan M-F, Nguyen Y. Lapse in institutional review board continuing review approval. IRB: Ethics & Human Research 2015;37(2):14-19.