Laboratory Interventions to Eliminate Unnecessary Rapid COVID-19 Testing During a Reagent Shortage
Abstract
In the fall of 2020, US medical centers were running out of rapid coronavirus disease 2019 (COVID-19) tests. The aim of this study was to evaluate the impact of an intervention to eliminate rapid test misutilization and to quantify the effect of countermeasures to control rapid test ordering using a test utilization dashboard.
Interventions to preserve a severely limited supply of rapid diagnostic tests were made based on real-time analysis of a COVID-19 test utilization dashboard. This retrospective observational study evaluated pre- and postintervention rates of appropriate rapid test use, reporting times, and the costs and savings of the resources used.
This study included 14,462 severe acute respiratory syndrome coronavirus 2 reverse transcriptase polymerase chain reaction tests ordered during the study period. After the intervention, nonconforming rapid tests decreased by 27.3 percentage points. Rapid test reporting time from laboratory receipt decreased by an average of 1.47 hours. The number of days of rapid test inventory on hand increased by 39 days.
Performing diagnostic test stewardship, informed by real-time review of a test utilization dashboard, was associated with significantly improved appropriate utilization of rapid diagnostic COVID-19 tests, improved reporting times, implied cost savings, and improved reagent inventory on hand, which facilitated the management of scarce resources during a pandemic.
Introduction
In March 2020, the World Health Organization declared a global pandemic due to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the etiologic agent of coronavirus disease 2019 (COVID-19).1 Accurate and rapid identification of infected individuals is essential to guide appropriate therapies, determine isolation strategies, optimize patient flow in the hospital, and allow for workforce planning while also informing community surveillance and epidemiology.2 Diagnostic testing is a critical component of the pandemic response strategy to confirm infection and carrier status of patients suspected to be infected by SARS-CoV-2.3 However, persistent and unpredictable global infection rates have periodically led to shortages in test supplies, reagents, instrumentation, personal protective equipment, and personnel. These shortages have created many challenges for hospitals in meeting and managing clinical demand for both COVID-19 and non–COVID-19 testing.4
COVID-19 laboratory testing can be strategically managed using diagnostic test stewardship, which aims to optimize patient care and resource allocation through improved test ordering, collection, processing, reporting, and clinical interpretation.5,6 Diagnostic stewardship improves the quality of clinical care through reliable and standardized decision support for test selection and clinical application while reducing unnecessary testing, phlebotomy, preventable downstream errors, and cost.6-8
Our institution formed a COVID-19 Test Stewardship Committee in the spring of 2020 with broad representation from hospital administration, laboratory leadership, clinical leadership, infection prevention and control, the purchasing and strategic sourcing department, and the data analytics team. By July 2020, US COVID-19 cases were swiftly rising and rapid test kits were becoming scarce. To control rapid test orders and adhere to our hospital’s guidelines for COVID-19 testing, the Test Stewardship Committee and informatics teams implemented electronic clinical decision support (CDS) via computerized provider order entry (CPOE) that automatically mapped COVID-19 laboratory orders to rapid or routine testing based on the test indication that a provider selected via an interface with the laboratory information system (LIS). Providers could select only an indication, not a priority status, for COVID-19 testing.
At the time, there was no centralized resource to track test usage in real time. Early in the pandemic, several health care institutions began reporting the use of data analytic tools to bridge the gap between information systems for expediting COVID-19 management.2,9-13 Our laboratory leadership proposed the creation of a COVID-19 test utilization dashboard to centrally monitor the complete testing process, including CPOE responses. This dashboard went live on the same day that CPOE was implemented. Monitoring test utilization trends with the dashboard enabled rapid identification of opportunities for process improvement.
MATERIALS AND METHODS
Inclusion and Exclusion Criteria
In this observational study, we evaluated the impact of an intervention by laboratory leadership to eliminate unnecessary rapid SARS-CoV-2 diagnostic test use at the peak of the rapid test reagent shortage in the United States and to quantify the effect of the countermeasures to control rapid test ordering. The primary objective was to compare rapid test utilization before and after the implementation of interventions designed to enhance compliance with testing guidelines. Secondary objectives were to evaluate (1) the reporting turnaround time for rapid SARS-CoV-2 diagnostic tests before and after implementation of the interventions, (2) the cost of rapid SARS-CoV-2 diagnostic testing before and after implementation, and (3) the daily inventory of rapid reagent kits before and after implementation.
Indications and Restrictions for Rapid Testing During Study Period
Two months after CPOE and the dashboard were implemented, the hospital faced an extreme shortage of rapid testing reagents. Data from the dashboard demonstrated that many tests ordered and labeled as routine had been performed on rapid instruments in the laboratory. Further investigation by laboratory leadership revealed that many clinicians were calling the microbiology laboratory after specimens had been submitted to escalate the priority of their patients' tests to stat, or were adding a stat sticker to the specimen, even though the test had originally been ordered and labeled as routine. The number of rapid test orders was quickly overwhelming the system, and rapid test reagent inventory was dwindling. The laboratory was unable to adjudicate each stat request because of the staggering number of requests and a shortage of personnel.
In October 2020, real-time analysis of the dashboard was used to guide interventions devised by laboratory, clinical, and infection prevention and control leadership to preserve a severely limited supply of rapid diagnostic tests (Table 1).
Table 1. Project Milestones and Interventions for COVID-19 Rapid Test Stewardship

| Date | Project Milestones |
|---|---|
| July 15, 2020 | A nationwide shortage of rapid SARS-CoV-2 diagnostic test kits and supplies prompted hospital administration to limit rapid tests to priority groups whose clinical care would most benefit from receiving test results within a few hours. |
| July 17, 2020 | SARS-CoV-2 diagnostic test orders were updated with electronic clinical decision support via computerized provider order entry responses that automatically mapped priority groups to rapid testing based on the reason for testing. |
| July 17, 2020 | A COVID-19 test utilization dashboard was created for monitoring, analyzing, and informing diagnostic test stewardship. |
| September 2020 | An extreme shortage of rapid testing reagents developed. Dashboard analysis revealed that many routine test orders had been performed as rapid, prompting investigation of inappropriate utilization. |
| October 4-17, 2020 | Designated preintervention period (baseline). |
| October 18-31, 2020 | Designated intervention period. The following interventions were made to preserve a severely limited supply of rapid diagnostic tests: (1) an email reminder to ordering providers to follow authorized hospital guidelines for rapid testing; (2) discussions between laboratory and clinical leadership regarding rapid testing practices; (3) verbal messaging to frontline providers in team huddles; (4) discontinuation of manually placed stat stickers on specimen bags; and (5) in-service training for microbiology laboratory staff on conforming with the priority status on the specimen label. |
| November 1-14, 2020 | Designated postintervention period. |
COVID-19, coronavirus disease 2019; SARS-CoV-2, severe acute respiratory syndrome coronavirus 2.
CDS via CPOE for COVID-19 Test Ordering
COVID-19 test orders were modified to include electronic CDS via CPOE responses that automatically matched the clinical indication for testing with the appropriate rapid or routine test designation (Figure 1). Providers were responsible only for providing the clinical indication for testing; the priority status was determined through the interface between the selected indication in the electronic medical record (EMR) and the LIS. The CPOE questions were disabled for outpatient (OP) test orders because patients seen in OP settings did not qualify for any approved rapid test indications; all OP orders were routed for routine testing. The printable specimen collection label included "ST" for stat or "RT" for routine to expedite test routing in the laboratory. Although electronic CPOE CDS enabled a single test order to be efficiently triaged for rapid or routine performance, the laboratory retained the ability to assign a specimen to a specific rapid or routine test instrument.
Figure 1. Computerized provider order entry–based clinical decision support algorithm for rapid and routine coronavirus disease 2019 (COVID-19) test orders. ED, emergency department.
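As an illustration only, the indication-to-priority mapping shown in Figure 1 can be sketched as a simple lookup. The function, field, and value names below are assumptions made for this sketch; the actual logic was implemented in the EMR-LIS interface, not in analyst code.

```r
# Illustrative sketch of the indication-to-priority mapping described above.
# Function, field, and value names are assumptions for this sketch only.
map_priority <- function(visit_type, indication) {
  # Outpatient orders were not asked the CPOE question and always ran as routine
  if (visit_type == "Outpatient") return("RT")
  approved_rapid <- c(
    "ED patient being discharged to other facility",
    "Emergency procedure today",
    "Labor and delivery patient"
  )
  # Only approved indications map to stat/rapid ("ST"); everything else,
  # including "Admission from the ED", "All other indications", and missing
  # responses, maps to routine ("RT")
  if (!is.na(indication) && indication %in% approved_rapid) "ST" else "RT"
}

map_priority("Emergency department", "Labor and delivery patient")  # "ST"
map_priority("Emergency department", "Admission from the ED")       # "RT"
map_priority("Outpatient", NA)                                       # "RT"
```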
COVID-19 Testing
During the study period, rapid SARS-CoV-2 reverse transcriptase polymerase chain reaction testing was performed on the FilmArray system (BioFire) and the GeneXpert Infinity platform (Cepheid), while routine testing was performed on the Panther Fusion platform (Hologic) and the Cobas 6800 (Roche Molecular Systems). Multiple test systems were validated to accommodate the need for faster results for clinical care and to prevent service disruption in the event of instrument failure or reagent shortage. Given the operational limitations at the time, health care providers could expect results within 2 hours of laboratory receipt of the specimen for rapid tests and within 24 hours for routine tests. Our study took place from October to November 2020; no SARS-CoV-2 variants had yet been identified locally during this period.
Test cost was estimated based on reagent costs from a publicly available table (https://www.theglobalfund.org/media/10233/covid19_diagnosticsreferenceprices_table_en.pdf).
COVID-19 Laboratory Dashboard
A COVID-19 test utilization dashboard was created to help laboratory leadership understand the volumes, indications, and turnaround times for COVID-19 testing. Information from all testing instruments, the LIS (Cerner Millennium; Cerner), and the EMR (Epic; Epic Systems) flowed into a dashboard built with Tableau software (Tableau).
Relevant data metrics spanning the entire testing process were selected to summarize daily and cumulative rapid and routine test volumes (Figure 2), and operational performance rates were displayed. The dashboard was also used to determine whether orders were performed on the appropriate instrument in the laboratory, according to the order priority that had interfaced to the LIS based on the testing indication, or whether a call to the laboratory or misuse of a stat sticker had led to unwarranted stat testing. Color coding, bar charts, and line graphs allowed quick comparative analysis of frequencies, trends, and outliers.
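For a concrete sense of the kind of aggregation behind these displays, a minimal sketch in R follows. The dashboard itself was built in Tableau from instrument, LIS, and EMR feeds; the data frame `tests` and its column names are assumptions for this sketch.

```r
# Illustrative aggregation behind the dashboard's daily volume display.
# The data frame `tests` and its columns (result_date, order_priority,
# performed_priority) are assumptions; the real dashboard was built in Tableau.
library(dplyr)

# Daily test volumes split by how the test was actually performed
daily_volumes <- tests %>%
  group_by(result_date, performed_priority) %>%   # performed_priority: "ST"/"RT"
  summarise(n_tests = n(), .groups = "drop")

# Routine orders performed on rapid instruments: the pattern highlighted
# on the dashboard for investigation of possible misutilization
routine_run_as_rapid <- tests %>%
  filter(order_priority == "RT", performed_priority == "ST") %>%
  count(result_date, name = "n_nonconforming")
```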
Interventions
Based on dashboard analysis, targeted interventions were implemented starting October 18, 2020, to reduce rapid COVID-19 testing practices that did not conform to existing hospital testing guidelines. Laboratory leadership sent a global email reminder to ordering providers that, because of critically low levels of rapid diagnostic test reagents, rapid tests would be strictly limited to authorized priority groups per hospital guidelines. Laboratory leadership then held discussions with clinical leadership regarding rapid testing practice improvements in areas where the dashboard had identified misutilization. Clinical leaders disseminated this information through verbal messaging to frontline providers in team huddles, including the discontinuation of manually placed stat stickers on specimen bags. Concurrently, laboratory leaders instructed the microbiology laboratory staff to follow the STAT ("ST") or ROUTINE ("RT") priority status on specimen labels and to deny verbal requests for rapid testing when the test had been ordered as routine.
Outcomes Investigated
The primary outcomes investigated in this study were the frequency of appropriate and inappropriate rapid test utilization, the time between laboratory receipt of the specimen and results reporting, the cost of tests per patient, and the number of days of test inventory available. Conforming utilization of rapid tests was defined as tests ordered and performed as rapid (ie, conforming to the clinical guidelines for rapid testing indications at the time of analysis). Nonconforming utilization was defined as test orders mapped for routine testing (all OP orders and ED/inpatient orders with CPOE COVID-19 clinical indication responses of "admission from the ED," "all other indications," or no response given) that were performed on rapid instruments because of a verbal request from a provider to the laboratory or misuse of a stat sticker. Conforming status could not be determined for test orders with "no data" recorded for the CPOE COVID-19 test indication question.
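The conformance rule above can be expressed as a small classification function. This is an illustrative sketch only: the indication labels come from the article, but the function and its handling of edge cases are assumptions.

```r
# Sketch of the conforming/nonconforming rule defined above, applied to tests
# performed on rapid instruments. The function itself is illustrative.
classify_rapid_test <- function(indication) {
  approved_rapid <- c("ED patient being discharged to other facility",
                      "Emergency procedure today",
                      "Labor and delivery patient")
  routine_mapped <- c("Admission from the ED", "All other indications",
                      "Outpatient")
  if (is.na(indication) || indication == "No data") {
    "Unable to determine"   # no CPOE indication recorded for the order
  } else if (indication %in% approved_rapid) {
    "Conforming"            # ordered and performed as rapid per guidelines
  } else if (indication %in% routine_mapped) {
    "Nonconforming"         # mapped to routine but performed on a rapid instrument
  } else {
    "Unable to determine"
  }
}

classify_rapid_test("Emergency procedure today")  # "Conforming"
classify_rapid_test("Outpatient")                 # "Nonconforming"
```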
Statistical Analysis
Because of the study's exploratory, hypothesis-generating design, no formal sample size calculation was performed; all reported findings serve to generate hypotheses for future research. Summary statistics were reported to describe the study sample. For numerical variables, mean (SD), median (interquartile range [IQR]), and range were reported; for categorical variables, the number (percentage) of units in each category was reported. Sample characteristics were compared between the pre- and postintervention periods. The standardized mean/proportion difference was reported for each characteristic; characteristics with a standardized mean/proportion difference of less than 0.1 were considered balanced between the two periods. Performance outcomes (conforming rate of appropriate rapid test utilization, report turnaround time, and cost of tests per patient) were compared before and after the intervention. To detect associations between outcomes and time (pre- vs postintervention), Wilcoxon rank-sum tests were performed for numerical outcomes and the χ2 test for the categorical outcome. Days of inventory on hand were compared before and after the intervention: mean (SD), median (IQR), and range of days of inventory on hand were reported pre- and postintervention, and Wilcoxon rank-sum tests were performed to detect whether there was a significant difference for rapid tests. All tests were two-sided, and appropriate tests were selected based on the distribution of the data. The significance level was P < .05. All analyses were performed in R version 4.0.2.
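A minimal sketch, assuming a hypothetical analysis data set, of how the reported comparisons could be run in base R (the article states that analyses were performed in R 4.0.2):

```r
# Minimal sketch of the pre/post comparisons described above, in base R.
# `rapid` is an assumed data frame of tests performed on rapid instruments,
# with columns: period ("pre"/"post"), conforming ("Conforming"/"Nonconforming"),
# and tat_hours (hours from laboratory receipt to reporting).

# Categorical outcome (conforming vs nonconforming rapid test use): chi-square test
chisq.test(table(rapid$period, rapid$conforming))

# Numerical outcome (reporting turnaround time): two-sided Wilcoxon rank-sum test
wilcox.test(tat_hours ~ period, data = rapid, alternative = "two.sided")
```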
RESULTS
There were 14,462 total SARS-CoV-2 diagnostic tests performed at our academic hospital clinical laboratory during the 4-week study period: 6,828 tests preintervention and 7,634 tests postintervention. Basic demographics and test ordering locations did not change significantly between the two periods and are shown in Table 2.
Table 2. Characteristics of SARS-CoV-2 Diagnostic Tests Pre- and Postintervention

| Characteristic | Total (n = 14,462) | Preintervention (n = 6,828) | Postintervention (n = 7,634) | Absolute Standardized Mean/Proportion Difference^a |
|---|---|---|---|---|
| Patient age, y | | | | |
|  Mean (SD) | 47.07 (23) | 46.49 (23.17) | 47.59 (22.84) | 0.048 |
|  Median (IQR) | 48 (31, 66) | 46 (31, 65) | 49 (32, 66) | |
|  Range | (0, 102) | (0, 101) | (0, 102) | |
| Patient sex, No. (%) | | | | |
|  Female | 8,514 (59) | 4,004 (59) | 4,510 (59) | 0.009 |
|  Male | 5,942 (41) | 2,821 (41) | 3,121 (41) | |
| Patient population, No. (%) | | | | |
|  Adult | 12,580 (87) | 5,875 (86) | 6,705 (88) | 0.053 |
|  Pediatric | 1,882 (13) | 953 (14) | 929 (12) | |
| Patient visit type, No. (%) | | | | |
|  Outpatient | 10,987 (76) | 5,119 (75) | 5,868 (77) | 0.045 |
|  Emergency department | 2,160 (15) | 1,069 (16) | 1,091 (14) | |
|  Inpatient | 1,315 (9) | 640 (9) | 675 (9) | |
| CPOE response for test indication, No. (%) | | | | |
|  ED patient being discharged to other facility | 146 (1) | 60 (1) | 86 (1) | 0.192 |
|  Emergency procedure today | 294 (2) | 136 (2) | 158 (2) | |
|  Labor and delivery patient | 307 (2) | 119 (2) | 188 (2) | |
|  Admission from the ED | 1,230 (9) | 627 (9) | 603 (8) | |
|  All other indications | 1,015 (7) | 427 (6) | 588 (8) | |
|  No data | 483 (3) | 340 (5) | 143 (2) | |
|  Outpatient (not asked) | 10,987 (76) | 5,119 (75) | 5,868 (77) | |
| Order priority, No. (%) | | | | |
|  Ordered rapid | 747 (5) | 315 (5) | 432 (6) | 0.176 |
|  Ordered routine | 13,715 (95) | 6,513 (95) | 7,202 (94) | |
| Test performance, No. (%) | | | | |
|  Performed rapid | 2,043 (14) | 1,344 (20) | 699 (9) | 0.303 |
|  Performed routine | 12,419 (86) | 5,484 (80) | 6,935 (91) | |
| Test instrument, No. (%) | | | | |
|  GeneXpert Infinity | 1,334 (9) | 1,316 (19) | 18 (0) | 0.833 |
|  FilmArray | 709 (5) | 28 (0) | 681 (9) | |
|  Panther Fusion | 1,087 (8) | 268 (4) | 819 (11) | |
|  Cobas 6800 | 11,332 (78) | 5,216 (76) | 6,116 (80) | |
CPOE, computerized provider order entry; ED, emergency department; IQR, interquartile range.
^a The standardized mean/proportion difference was reported for each characteristic. Characteristics with a standardized mean/proportion difference of less than 0.1 were considered balanced between the two periods.
Figure 2. Dashboard display of cumulative and daily coronavirus disease 2019 (COVID-19) reverse transcriptase polymerase chain reaction (PCR) test volumes during the study period. In the top section, orders are split into tests performed as rapid (to the left of the y-axis) and tests performed as routine (to the right of the y-axis). Routine test orders (dark red) that were performed as rapid can easily be identified (red box) for further investigation of inappropriate utilization.
The three COVID-19 test indications that qualified for rapid testing (ED patient being discharged to other facility [n = 60, 1%], emergency procedure today [n = 136, 2%], and labor and delivery patient [n = 119, 2%]) totaled 5% (315/6,828) of the indications chosen in the preintervention sample and 6% (432/7,634) in the postintervention sample (ED patient being discharged to other facility [n = 86, 1%], emergency procedure today [n = 158, 2%], and labor and delivery patient [n = 188, 2%]). No data were provided for this question for 340 (5%) preintervention test orders and 143 (2%) postintervention test orders, all of which were ordered on paper and automatically designated for routine testing. The percentage of total tests performed as rapid decreased from 20% (1,344/6,828) in the preintervention period to 9% (699/7,634) in the postintervention period.
Table 3. Rapid Test Utilization, Reporting Time, Cost, and Inventory Pre- and Postintervention

| Characteristic | Preintervention (n = 1,344) | Postintervention (n = 699) | P Value |
|---|---|---|---|
| Rapid test conforming status,^a No. (%) | | | <.001^b |
|  Conforming | 271 (20.2) | 395 (56.5) | |
|   ED patient being discharged to other facility | 47 | 75 | |
|   Emergency procedure today | 108 | 134 | |
|   Labor and delivery patient | 116 | 186 | |
|  Nonconforming | 787 (58.6) | 219 (31.3) | |
|   Admission from the ED | 299 | 76 | |
|   All other indications | 166 | 81 | |
|   Outpatient | 322 | 62 | |
|  Unable to determine | 286 (21.3) | 85 (12.2) | |
|   No data | 286 | 85 | |
| Hours from laboratory receipt to reporting | | | <.001^c |
|  Mean (SD) | 2.95 (4.53) | 1.48 (2.16) | |
|  Median (IQR) | 1.15 (0.98, 1.74) | 1.08 (0.92, 1.48) | |
|  Range | (0, 23.92) | (0, 25.28) | |
| Cost of tests per patient, $ | | | <.001^c |
|  Mean (SD) | 40.46 (10.00) | 107.20 (11.10) | |
|  Median (IQR) | 39 (39, 39) | 109 (109, 109) | |
|  Range | (39, 109) | (39, 109) | |
| Days of rapid test inventory on hand | (n = 14) | (n = 14) | <.001^c |
|  Mean (SD) | 12.95 (6.63) | 52.02 (10.03) | |
|  Median (IQR) | 12.40 (7.30, 19.65) | 49.95 (47.00, 54.88) | |
|  Range | (5.20, 20.40) | (39.80, 73.00) | |
ED, emergency department; IQR, interquartile range.
^a Conforming test status is defined as a test that was performed on a rapid instrument and ordered with a rapid test priority according to institutional guidelines. Statistical analysis was performed without the undetermined category.
^b χ2 test.
^c Wilcoxon rank-sum test.
After the interventions to reduce inappropriate rapid testing, nonconforming rapid tests decreased by 27.3 percentage points (787/1,344 [58.6%] preintervention vs 219/699 [31.3%] postintervention; P < .001) (Table 3). The rapid test reporting time from receipt in the laboratory decreased by an average of 1.47 hours, from a mean (SD) of 2.95 (4.53) hours preintervention to 1.48 (2.16) hours postintervention (P < .001). After the interventions, the cost of rapid tests per patient increased by $66.74 (mean [SD], $40.46 [$10.00] preintervention; $107.20 [$11.10] postintervention; P < .001) because reagent shortages forced a switch to a more expensive rapid testing platform. Owing to manufacturer-specific supply constraints at the time of testing, rapid tests were performed mainly on the GeneXpert Infinity in the preintervention period (1,316/1,344 [98%] of rapid tests) and mainly on the FilmArray in the postintervention period (681/699 [97%] of rapid tests). The number of days of rapid testing inventory on hand increased by 39.07 days (mean [SD], 12.95 [6.63] days preintervention; 52.02 [10.03] days postintervention; P < .001).
Discussion
Real-time analytical findings from the COVID-19 test utilization dashboard revealed that nearly 60% of COVID-19 diagnostic tests performed on rapid instruments did not conform to the hospital clinical guidelines for rapid testing at the time of analysis, despite the use of a CPOE process designed to avoid overutilization. A common reason for the misutilization was that many providers were calling the laboratory to request that the priority be changed to rapid for a variety of unapproved reasons. The laboratory staff were overextended and did not have the resources to investigate or triage each of these cases and instead simply performed the test as rapid. Based on the findings revealed by the dashboard, laboratory administration implemented interventions to preserve a severely limited supply of SARS-CoV-2 rapid diagnostic tests at our medical center, which included an email reminder to ordering providers to follow authorized hospital guidelines for rapid testing, communications between laboratory and clinical leadership regarding rapid testing practices, verbal messaging to frontline providers in team huddles, discontinuation of manually placed stat stickers on specimen bags, and in-service training for clinical microbiology laboratory staff to conform with the testing priority status on the specimen label.
After using the COVID-19 test utilization dashboard to support targeted interventions, laboratory leadership decreased the inappropriate utilization of rapid COVID-19 diagnostic tests by 27.3 percentage points and improved the test reporting time from receipt in the laboratory by an average of 1.47 hours from the preintervention to the postintervention period. Although there was a small change in positivity rate between the pre- and postintervention periods (1.7% and 2.6%, respectively), there was no difference in instrument capacity that could account for the difference in reporting time. Postintervention, a severe limitation of supplies from one manufacturer necessitated replacement with supplies from a competing, more expensive manufacturer, resulting in a $66.74 increase in the cost of rapid tests per patient. The increase in test price does not reflect an impact of the interventions but rather the significant rise in the cost of rapid test supplies once the limited supplies from one manufacturer had to be replaced by those of a more expensive manufacturer. Despite the decreased rate of rapid test use, from 20% (1,344/6,828) in the preintervention period to 9% (699/7,634) in the postintervention period, the unavoidable increase in costs from switching to a more expensive platform further emphasized the importance of optimizing diagnostic test stewardship for cost control.
To estimate potential cost savings, we compared the total cost of rapid testing over both time periods using the Cepheid GeneXpert system. The average cost of the Cepheid rapid COVID-19 tests available at the time was estimated to be $39 per test; thus, the total cost of rapid testing was $52,416 preintervention (n = 1,344) and $27,261 postintervention (n = 699), suggesting a savings of $25,155 in reagent costs attributable to the change in ordering practice. Most important, our medical center did not run out of rapid testing reagents, and the mean days of tests available in inventory rose from 13 to 52 days on hand despite continuing local and national reagent shortages. The increase in days of rapid test inventory on hand directly reflects the impact of the interventions because the rapid test inventory included supplies for both rapid instruments, and overall reagent allocations from the manufacturers were steadily maintained during the study period. Together, these findings support an overall improvement in the rapid testing process: fewer specimens submitted for rapid testing allowed for faster results reporting (laboratory receipt to reporting) because instrument operators were more readily available.
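The reagent-cost arithmetic above can be checked directly; the figures are those reported in this study, and the script is purely illustrative.

```r
# Reproducing the reagent-cost comparison above, using the estimated $39 per
# Cepheid rapid test and the observed rapid test volumes.
cost_per_test <- 39
pre_n  <- 1344   # rapid tests performed preintervention
post_n <- 699    # rapid tests performed postintervention

pre_cost  <- cost_per_test * pre_n    # $52,416
post_cost <- cost_per_test * post_n   # $27,261
pre_cost - post_cost                  # $25,155 implied reagent savings
```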
Clinical laboratory testing generates a wealth of data that informs roughly 70% of clinical care decisions for all patient populations throughout the health care system.6,8,14,15 LISs are regularly used to manage the analytic phase of testing and drive internal operational performance efficiency.15-17 However, as EMRs become more sophisticated in capturing the provider decision-making process in the pre- and postanalytical test phases, it is necessary to aggregate data across all information systems to better understand total performance efficiency and related outcomes.18,19 As stewards of large amounts of clinical data, laboratories are increasingly using innovative informatic tools to support intelligent decision-making for managing test orders, care modeling, outcomes, costs, and operational and human resources.17,20 With expanding laboratory test menus, more sophisticated technologies with complex result interpretations, rising health care costs, and a transition to value-based reimbursement, there is escalating demand for clinical laboratories to develop their own stewardship programs.8,14,21 Comprehensive analysis of test utilization can provide the objective evidence needed for improving testing algorithms and ordering practices.16
Our study has a few limitations. It was a retrospective observational study that evaluated data pre- and postintervention; therefore, causality could not be determined. The sample was restricted to a 4-week period of observation (2 weeks preintervention and 2 weeks postintervention) because of frequently evolving guidelines affecting clinical practice for COVID-19 management; this limited period was chosen because no other identified interventions would have influenced testing practices during that time. There were also some limitations in the quality of the data feeding the dashboard. For some test orders, no data were recorded for the CPOE COVID-19 test indication question because of manual order entry, although this occurred mainly for outpatients, for whom rapid testing was prohibited. Whether the chosen CPOE response accurately reflected a patient's true clinical indication for testing could not easily be verified without reviewing individual medical records, given the complexity of clinical documentation in emergency, inpatient, and outpatient settings. However, we performed a small audit validating CPOE test indications before the study started and did not find evidence of incorrect documentation of order indications in the EMR. This study also did not address the reasons why providers did not follow the clinical decision support.
Performing diagnostic test stewardship through a targeted intervention informed by real-time review of a test utilization dashboard was associated with significantly improved appropriate utilization of rapid diagnostic COVID-19 tests, improved reporting turnaround times, implied cost savings, and improved reagent inventory on hand. The effect facilitated the operational management of scarce resources in response to the rapidly changing needs of the pandemic. The results of this study contribute to a better understanding of the use of new evidence-based intervention tools to optimize the entire testing process and improve laboratory stewardship efforts.
We acknowledge the contributions of all the dedicated laboratory staff who performed testing at New York Presbyterian (NYP) laboratories; laboratory leadership, including Noah Ginsburg; NYP laboratory information system specialists, Bulent Oral, Sarah Russell, and Kelvin Espinal; NYP financial analysts for procurement and strategic sourcing, Rebecca Alper and Kimberlee Ales; and Rutgers University Doctor of Clinical Laboratory Science program leaders, Nadine A. Fydryszewski, PhD, MS, MLS(ASCP), and Elizabeth Kenimer Leibach, EdD, MS, MLS(ASCP)SBB.