Oral Abstract Presentations - SHEA Spring 2015 Conference

May 14, 3:30 – 4:30pm Nutcracker Ballroom 1
Compliance with a four-item surgical site infection (SSI) prevention bundle reduces the risk of an SSI
Mayke B.G. Koek, MD PhD MSc, Loes C. Soetens, MSc, Titia E.M. Hopmans, BSc, Jan C. Wille, Birgit H.B.
van Benthem, PhD and Sabine C. de Greeff, PhD, National Institute for Public Health and the
Environment, Bilthoven, Netherlands
Background
Surgical site infections (SSIs) are a major complication following surgery. In 2008, the Dutch Hospital
Patient Safety Program (DHPSP) developed a so-called SSI care bundle, consisting of four interventions
aimed at preventing SSIs: timely antibiotic prophylaxis, normothermia, no hair removal, and hygiene
discipline in the OR. The bundle is implemented for selected types of surgery (index surgeries).
Reporting of compliance with the bundle is incorporated into the SSI incidence surveillance of the Dutch
PREZIES network. This paper describes the reporting of and compliance with the bundle, and
investigates the association between the incidence of SSIs and adherence to the SSI bundle.
Methods
PREZIES SSI surveillance data for the index surgeries from 2009 to 2014 were selected. The four
interventions were analysed separately as well as combined. Multilevel log-binomial regression was
used to calculate relative risks (RRs) of SSI for bundle compliance, taking into account clustering within
specialties and hospitals, and adjusting for the duration of the hospitals' participation in the DHPSP.
Complete compliance with all four interventions was compared with both incomplete compliance and
complete non-compliance.
Results
199,927 surgeries were included. Over the years, reporting of the complete bundle increased from 10%
to 57%. Complete bundle adherence among completely reported surgeries increased from 9% to 73%.
Adherence rates for the four interventions separately were higher (Figure 1). SSI incidence varied by
type of surgery (Figure 2). Taking into account differences between specialties and hospitals, the RR of
SSI for complete compliance was 0.80 (95% CI 0.67 to 0.95) compared with incomplete compliance, and
0.82 (0.67 to 1.00) compared with complete non-compliance. Corrected for the duration of the hospitals'
participation in the DHPSP, these became 0.64 (0.41 to 0.99) and 0.65 (0.40 to 1.06), respectively.
Discussion
Although compliance increased over time, compliance with the complete bundle can still be improved.
We found a significant association between adherence to the bundle and a reduced risk of SSI. If
implementation of the bundle is strengthened, a substantial decrease in SSIs can be achieved for more
patients.
A Novel Device to Reduce Airborne Colonies and Particulates at Incision Sites During Implantation of
Prostheses
Rabih O. Darouiche, MD, Baylor College of Medicine, Houston, TX, Sean Self, Nimbic Systems, Stafford,
TX and Daniel O'Connor, PhD, University of Houston, Houston, TX
Background
Airborne microorganisms (CFU) shed on skin squames from operating room personnel contribute to the
incidence of prosthesis-related infections. The airborne vector remains largely uncontrolled due to
factors such as numerous personnel, room traffic, and door openings. Methods to reduce the presence
of airborne CFU mitigate environmental factors contributing to such infections. The study objective was
to determine whether the novel Air Barrier System (ABS) device, which provides localized isolation of
incisions, would significantly reduce airborne CFU and particulates at incision sites during prosthesis
implantation.
Methods
A single-site RCT enrolled 300 patients who underwent hip arthroplasty, instrumented posterior spine
fusion, or prosthetic vascular graft implantation. Patients were randomly assigned (1:1) to either the
experimental group (ABS device) or the control group (no device). Airborne CFU and particulates were
sampled simultaneously at 10-minute intervals via sterile tubing attached at the incision, which drew air
onto agar plates and into a particle-counting device. Particle counts were skewed and underwent Box-Cox
transformation. Counts and distributions of CFU (CFU/m3), transformed particulate counts
(particles/m3), number of surgical staff, and surgery time were compared between study groups, and
associations among CFU, particulates, staff, and surgery time were tested. Generalized linear models with
appropriate link functions and distributions were used. A multivariate model tested group differences in
CFU after adjusting for particulate counts, staff, and surgery time. Robust standard errors were computed
to account for clustering of the 10-min intervals within patients.
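A minimal sketch of the transformation-plus-robust-errors step, with simulated data and hypothetical column names; the authors' exact model specification is not given in the abstract.

```python
# Hedged sketch: Box-Cox transform of skewed particle counts, then a GLM
# for CFU counts with sandwich SEs clustered on patient. Data are simulated.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1200
df = pd.DataFrame({
    "cfu": rng.poisson(3, n),                    # CFU/m3 in a 10-min interval
    "particles": rng.lognormal(8, 1, n),         # skewed particle counts
    "group": rng.choice(["ABS", "control"], n),
    "staff": rng.integers(3, 9, n),
    "minutes": rng.integers(60, 240, n),
    "patient_id": rng.integers(0, 290, n),       # intervals cluster in patients
})

# Box-Cox needs strictly positive input; lambda is estimated by MLE.
df["particles_bc"], lmbda = stats.boxcox(df["particles"])

model = smf.glm(
    "cfu ~ C(group) + particles_bc + staff + minutes",
    data=df,
    family=sm.families.Poisson(),
)
# Robust standard errors clustered within patients, as in the abstract.
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["patient_id"]})
print(result.summary())
```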
Results
There were 291 cases with complete data for all variables, comprising 3,587 10-min intervals. Both
CFU and particulate counts were significantly higher in the control group vs. the ABS group (Table 1).
Number of people present (p=0.792) and surgery time (p=0.645) did not differ between groups.
Particulate counts (p<0.025), staff count (p=0.031), and surgery time (p=0.003) were all associated with
CFU counts. Patients in the ABS group were more likely than those in the control group to have 0 CFU in
a 10-min interval (0 CFU in 70% vs. 49% of intervals, respectively; OR=0.86, p<0.001) and to have a lower
average CFU count per interval (2.2 vs. 4.4, respectively; p<0.001) after adjusting for particulate count,
staff, and surgery time.
Conclusion
The results indicate that the ABS can decrease the airborne CFU and particulates at surgical sites. A
large, multicenter, RCT is being initiated to assess the impact of this approach on the incidence of
infection in patients undergoing placement of prostheses.
Skin and Nasal Antiseptic Use in the Prevention of Post-Operative Infections
Nicholas A. Flynn, MD, University of Tennessee Nashville Internal Medicine Residency Program,
Nashville, TN and Mark Carr, MD, Assistant Professor of Medicine University of Tennessee Nashville,
Nashville, TN; Head of Infection Control - St. Thomas Midtown Hospital, Nashville, TN
Background
Nasal carriers of Staphylococcus aureus are at increased risk of developing infections after
surgery. Studies have shown that application of skin and nasal antiseptic as directed one hour prior to
surgery kills 99.5% of bacteria in the nares, reducing the risk of infection after surgery.
Method
All patients undergoing surgery were included in this study, with the exception of patients allergic
to Betadine/iodine. Patients were instructed to apply the 3M™ Skin and Nasal Antiseptic (povidone-iodine
solution 5% w/w [0.5% available iodine] USP), a preoperative prepping solution designed for use
on skin and nasal mucosal tissue, to both nostrils per manufacturer instructions at least one hour prior to
surgery. Patients were followed postoperatively and monitored for development of infection.
Result
From January 2009 to November 2011, 9,135 patients underwent surgical procedures and were
monitored for post-operative infection. The experimental intervention began in September 2010 with
all surgical patients undergoing pre-treatment with the 3M™ Skin and Nasal Antiseptic. A total of 5,154
surgical patients were followed prior to implementation of the intervention, and 63 (1.22%) of these
patients developed post-operative infection. Following initiation of the intervention, 3,981 surgical
patients were followed with 18 (0.45%) developing post-operative infection. Statistical analysis sought
to determine whether there was an infection rate trend prior to intervention and to compare pre- and
post-intervention infection rates. Statistical analysis was performed using Poisson regression with the
square root of the deviance over the degrees of freedom as the scale parameter. Analysis revealed the
following:

- No trend in infection rate prior to intervention (p-value 0.18)
- The infection rate for any pre-intervention month was 1.04 times that of the previous month (95% CI, 0.98 to 1.10)
- A statistically significant decrease in infection rate following the experimental intervention (p-value 0.0029)
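A minimal sketch of this analysis under stated assumptions: monthly counts are simulated, the intervention month is taken from the Results, and scale="dev" implements the deviance-based (quasi-Poisson) scale the abstract describes.

```python
# Hedged sketch of the quasi-Poisson trend/step analysis; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
months = pd.DataFrame({
    "month": np.arange(35),                      # Jan 2009 .. Nov 2011
    "post": (np.arange(35) >= 20).astype(int),   # intervention from Sep 2010
    "surgeries": rng.integers(200, 400, 35),
})
months["infections"] = rng.poisson(
    months["surgeries"] * np.where(months["post"], 0.0045, 0.0122))

model = smf.glm(
    "infections ~ month + post",
    data=months,
    family=sm.families.Poisson(),
    offset=np.log(months["surgeries"]),
)
# scale='dev' rescales SEs by deviance / residual df (quasi-Poisson).
result = model.fit(scale="dev")
# exp(month) ~ month-over-month rate ratio; exp(post) ~ intervention step.
print(np.exp(result.params))
print(result.pvalues)
```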
Conclusion
Pre-operative application of 3M™ Skin and Nasal Antiseptic to the nasal mucosa resulted in a statistically
significant decrease in post-operative infections.
May 15, 2015 8:30 - 10:00 am Nutcracker Ballroom 1
Frequent Contamination of the Skin and Clothing of Healthcare Personnel during Removal of Personal
Protective Equipment: A Multicenter Evaluation and Educational Intervention
Myreen E. Tomas, MD1, Sirisha Kundrapu, MD, MS2, Priyaleela Thota, MD3, Venkata Sunkesula, MD,
MS2,3, Jennifer Cadnum, BS2,3, Thriveen Sankar Chittoor Mana, MS2, Annette Jencson, CIC3, Michelle T.
Hecker, MD2,4, Amy Ray, MD2,5 and Curtis J. Donskey, MD1,2, (1)Geriatric Research Education and Clinical
Centers, Louis Stokes Cleveland VA Medical Center, Cleveland, OH, (2)Case Western Reserve University,
Cleveland, OH, (3)Cleveland VA Medical Center, Cleveland, OH, (4)Metrohealth Medical Center,
Cleveland, OH, (5)University Hospitals Case Medical Center, Cleveland, OH
Background
Wearing personal protective equipment (PPE) reduces, but does not eliminate, the risk that healthcare
personnel may become contaminated with pathogens. Such contamination may place personnel at risk,
as illustrated by recent cases in which healthcare personnel acquired Ebola virus infection. The risk for
contamination may be particularly high during PPE removal. However, limited information is available
on frequency and routes of personnel contamination during PPE removal.
Methods
In 4 hospitals, simulations were performed in which healthcare personnel used their usual technique to
remove PPE (gloves and gown) that was contaminated with a fluorescent lotion placed either on the
gloves or on the front of the gown. PPE removal was observed to assess if correct technique was used
and a black light was used to identify sites of contamination on skin and clothing. In one facility, we
determined the impact of an educational intervention on frequency of personnel contamination during
PPE removal and evaluated the frequency of contamination during removal of full-coverage PPE in Ebola
virus training sessions.
Results
Of 435 PPE removal simulations, contamination of skin or clothing occurred in 200 (46%), with more
frequent contamination when incorrect technique was used (Figure). The frequency of incorrect
technique did not differ among hospitals (P=0.13) or for different provider types (P=0.26). The hands
and neck were the most frequently contaminated sites. The educational intervention resulted in a
sustained reduction in skin and clothing contamination during glove and gown removal (73% to
5%; P<0.0001). During 25 assessments of PPE removal in Ebola virus training sessions, contamination of
skin or clothing occurred in 2 (8%).
Conclusions
Contamination of the skin and clothing of personnel occurred frequently during PPE removal, even when
no lapses in technique were observed and when only the gown was contaminated. Although less
frequent, contamination also occurred during removal of PPE following protocols designed for care of
patients with Ebola virus infection. Simple educational interventions can significantly reduce the risk for
contamination.
Figure 1. Frequency of skin and clothing contamination during removal of gloves (A) or gowns (B)
contaminated with a fluorescent lotion
Effect of Contact Precautions on Adverse Events by Patient Report and Chart Review
Michael Liquori, MD1, Lindsay D. Croft, MS2, Preeti Mehrotra, MD3, Hannah R. Day, PhD2,4, Elizabeth
Lamos, MD1, Ryan Arnold, MD5, Eli Perencevich, MD, MS6, Anthony Harris, MD, MPH2 and Daniel
Morgan, MD, MS2,4, (1)University of Maryland Medical Center, Baltimore, MD, (2)Department of
Epidemiology and Public Health, University of Maryland School of Medicine, Baltimore, MD, (3)Boston
Children's Hospital, Boston, MA, (4)VA Maryland Healthcare System, Baltimore, MD, (5)University of
Maryland School of Medicine, Baltimore, MD, (6)University of Iowa Carver College of Medicine, Iowa
City, IA
Background
Contact precautions have been reported to decrease healthcare worker visits and increase the number
of medical harms. Contact precautions may also lead to increased adverse events. Our objective was to
determine the association between contact precautions and adverse events as assessed by patient
report and the standardized, electronic Institute for Healthcare Improvement (IHI) trigger tool.
Methods
We conducted a prospective cohort study of inpatients, followed through discharge, from January to
November 2010 at the University of Maryland Medical Center. Patients were enrolled at admission and
frequency matched by hospital unit and length of stay. Patients were evaluated at admission and on
hospital days 3, 7, and 14 (until discharged). At each point, patients underwent a standardized interview to identify
perceived problems with care. After discharge the standardized interview was administered by
telephone. Responses were recorded, transcribed, and coded by two physician reviewers. Additionally,
three physician reviewers performed chart reviews to identify adverse events using the standardized IHI
trigger tool. Events identified by the IHI trigger tool were then compared to the patient responses and
assessed for congruence. Logistic regression analysis was used for our primary analysis of contact
precautions and patient adverse events.
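A minimal sketch of the primary model, mirroring the covariates reported in the Results; data and variable names are invented.

```python
# Hedged sketch of the adjusted logistic model; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 296
df = pd.DataFrame({
    "adverse_event": rng.binomial(1, 0.28, n),
    "contact_precautions": rng.binomial(1, 0.5, n),
    "age_gt_51": rng.binomial(1, 0.5, n),       # age > 51 years
    "los_gt_4": rng.binomial(1, 0.5, n),        # length of stay > 4 days
    "charlson": rng.integers(0, 10, n),         # Charlson comorbidity score
})

model = smf.logit(
    "adverse_event ~ contact_precautions + age_gt_51 + los_gt_4 + charlson",
    data=df,
).fit(disp=False)
print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```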
Results
Five hundred twenty-eight patients were enrolled, of whom 296 were frequency matched. Of these 296
frequency-matched patients, 84 (28.4%) had adverse events based on chart review. Only 12 of these
adverse events were also reported by patients. Adverse events identified by chart review were not
associated with contact precautions (OR 1.30; 95% CI 0.76-2.21; p=0.33) after adjusting for age greater
than 51 years (OR 1.95; 95% CI 1.13-3.35), length of stay greater than 4 days (OR 2.57; 95% CI
1.05-4.41), and Charlson comorbidity score (OR 1.04; 95% CI 0.90-1.19). Adverse events reported by
patients were rare and also not associated with contact precautions (OR 3.12; p=0.14).
Conclusions
Patients rarely reported adverse events. Patients on contact precautions were no more likely to
experience adverse events than patients not on contact precautions, regardless of method used to
identify adverse events.
Reinterpreting Compliance with Transmission-Based Precautions: A Sociological Approach
Julia E. Szymczak, PhD, The Children's Hospital of Philadelphia, Philadelphia, PA and Susan E. Coffin, MD,
MPH, Perelman School of Medicine at the University of Pennsylvania and the Children's Hospital of
Philadelphia, Philadelphia, PA
Background
Achieving high levels of healthcare worker (HCW) compliance with transmission-based precautions to
prevent the spread of infectious diseases within healthcare facilities remains a challenge. Even after
improving HCW knowledge and providing resources, studies demonstrate that compliance does not
reach acceptably high levels. The objective of this study was to examine HCW perceptions of the barriers
to achieving high levels of compliance with transmission-based precautions.
Method
We conducted 103 in-depth, semistructured interviews with physicians, nurses and respiratory
therapists working at a large academic pediatric hospital. Data were systematically analyzed from a
sociological perspective. Themes were identified using a modified grounded theory approach and their
frequency calculated.
Result
The majority of respondents (95, 92.2%) perceived that competing priorities were the primary barrier to
achieving high levels of compliance with transmission-based precautions. They explained how tension
often arose in their everyday clinical work between the priority to prevent transmission of infectious
diseases and other equally, or even more, pressing concerns. Further analysis revealed 3 types of
competing priorities that threaten compliance: technical, social and organizational. Technical competing
priorities, mentioned by 72 (69.9%) of respondents, are times when transmission-based precautions
come into direct conflict with other therapeutic goals or techniques, such as getting to a patient who is
rapidly decompensating. Social competing priorities, mentioned by 65 (63.1%) of respondents, are times
when transmission-based precautions conflict with the social dynamics of interaction with patients,
families, and colleagues; for example, worry about provoking fear in young children placed on droplet
precautions. Organizational competing priorities, mentioned by 61 (59.2%) of respondents, are times
when HCWs perceive that the organization's emphasis on cost reduction and increasing clinical
productivity conflicts with transmission-based precautions, such as when two patients are placed in a
single-bed room.
Conclusion
The challenge of achieving reliably high levels of compliance with transmission-based precautions is
more complex than simply a problem of knowledge or resources. This study elaborates the social
dynamics that contribute to noncompliance and suggests that novel interventions are needed to
address this problem.
In the Face of Emerging Diseases: Using Data to Guide Staff Protection
Amber H Mitchell, DrPH, MPH, CPH, International Safety Center, Apopka, FL and Ginger B Parker, MBA,
International Safety Center, Charlottesville, VA
Background
While there has been extensive research on occupational contaminated sharps incidents in healthcare
settings, little research has focused on splashes and splatters of blood and body fluids. This is especially
important in today’s world where the globalization of travel means that previously geographically
isolated infectious microorganisms like Ebola virus become a real threat to clinicians around the
world. We learned that occupational exposure to blood and body fluids deserves renewed focus and
post-September 11 public health preparedness deserves restoration.
Currently there is not a large evidence base to identify the risks associated with splashes and splatters of
blood and body fluids to mucous membranes (e.g. eyes, nose, and face). Over the last ten years, there
has been growing activity in Federal, state, and local policy regarding healthcare-associated infections,
including not just bloodborne pathogens but also vector- and contact-transmitted pathogens (e.g., MRSA,
C. difficile). There are few focused studies that evaluate and measure the occupational impact that
splashes and splatters have on workers. Enumeration of risk is more important today than ever given
revitalized focus on body fluid exposure, PPE use, and risk of disease.
Method
This study summarizes quantitative occupational incident/exposure data from an aggregate of 27
hospitals throughout the U.S. that contribute to the Exposure Prevention Information Network
(EPINet™). Incident data include reports of blood and body fluid exposures by job category,
exposure type, location, PPE worn, and procedure type.
Result
In calendar year 2012, there were 175 blood and body fluid exposures reported using EPINet in the
27-hospital surveillance aggregate. The majority of incidents involved blood or visibly bloody urine (71.4%
and 72.4%, respectively) to the face/head (76.7%). Of those exposures, 60% were to the eyes, with only
5.7% of employees wearing goggles as a form of PPE; and 26.9% were to intact skin, with only 15.5%
wearing a gown or coat as protective apparel. The majority of reported exposures occurred in the
patient's room, operating room, or emergency department (33.7%, 20%, and 18.3%, respectively). Aside
from acknowledging that they were not wearing PPE, approximately 25% of workers reporting the
incident indicated that an engineering, administrative, or work practice control could have prevented the
exposure/incident. Of those, 72.4% cited some form of eye protection that would have prevented the
exposure.
Conclusion
The routine, standardized capture of quantitative occupational exposure and incident data allows us to
identify not only risk, but also modalities and controls for the prevention and protection of staff. Now
that emerging infectious diseases like Ebola are a global reality, capturing this type of data can mean the
difference between reacting to possible or theoretical risk and proactively preventing it.
Comparison of Hand Hygiene Monitoring Methodologies in Clinical Application: ‘My 5 Moments’ versus
‘Entry/Exit’
Nai-Chung Nelson Chang, MS1, Andrew Jesson2, Heather Schacht Reisinger, PhD3, Marin Schweizer,
PhD2,4, Margaret Graham, MS5, Daniel Morgan, MD, MS6,7, Lisa Pineles, MA8, Graeme Forrest, MD9,10 and
Eli Perencevich, MD, MS2, (1)University of Iowa College of Public Health, Iowa City, IA, (2)University of
Iowa Carver College of Medicine, Iowa City, IA, (3)CADRE - Iowa City VAHCS and Carver College of
Medicine, University of Iowa, Iowa City, IA, (4)Iowa City VA Healthcare System, University of Iowa, Iowa
City, IA, (5)Iowa City VA, Iowa City, IA, (6)VA Maryland Healthcare System, Baltimore, MD,
(7)Department of Epidemiology and Public Health, University of Maryland School of Medicine,
Baltimore, MD, (8)University of Maryland, Baltimore, MD, (9)Portland VA Med Center, Portland, OR,
(10)Medicine, OHSU, Portland, OR
Objectives
Directly observed hand hygiene monitoring guidelines recommend monitoring opportunities on
“Entry/Exit” or using the WHO “My 5 Moments”. However, few studies have compared the efficiency of
each method in capturing hand hygiene opportunities. We aimed to compare the efficiency of each
method and whether it varies based on level of care: intensive care unit (ICU) vs non-ICU.
Methods
Healthcare worker hand hygiene compliance data were collected by covert observers from June 4 to
November 13, 2013 at the Baltimore, Iowa City, and Portland Veterans Affairs Medical Centers.
Observations were conducted in both the surgical/medical wards and the ICU of each facility. Data were
collected concurrently using both standard methods: “Entry/Exit” and “My 5 Moments”. “Entry/Exit”
was defined as hand hygiene upon entry or exit of a patient room, and the “My 5 Moments” were
defined as hand hygiene prior to aseptic tasks and patient contact, and after fluid exposure, patient
contact, and touching patient surroundings. Each instance of observation was limited to the first hand
hygiene opportunity for each of the seven time points in patient care listed above. Analyses were
completed using the z-test and the chi-square test.
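A minimal sketch of the z-test comparison using the counts reported in the Results below; note the two methods were recorded on the same observations, so this independent-samples test is the approximation the abstract itself uses.

```python
# Hedged sketch: two-proportion z-test on opportunity-capture rates,
# using totals taken from the Results paragraph.
from statsmodels.stats.proportion import proportions_ztest

n_obs = 2733
entry_exit = 2104      # observations yielding an "Entry/Exit" opportunity
five_moments = 1513    # observations yielding a "My 5 Moments" opportunity

stat, p = proportions_ztest(count=[entry_exit, five_moments],
                            nobs=[n_obs, n_obs])
print(f"z = {stat:.2f}, p = {p:.2e}")  # 76.98% vs 55.36%
```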
Results
2,733 observations were included, during which 2,104 “Entry/Exit” and 1,513 “My 5 Moments”
opportunities were recorded. The observation rate for the “Entry/Exit” method (76.98%, 95% CI
75.40%-78.56%) was significantly higher than for “My 5 Moments” (55.36%, 95% CI 53.50%-57.22%;
p<.0001). The likelihood of observing any “Entry/Exit” opportunities in medical/surgical wards versus
ICUs was similar (OR 0.90, 95% CI 0.74-1.08, p=0.26). For “My 5 Moments”, the odds of observing any
hand hygiene opportunities were much lower in medical/surgical wards compared to ICUs (OR 0.62,
95% CI 0.53-0.73, p<0.0001).
Conclusions
While “My 5 Moments” is considered the gold standard for hand hygiene monitoring, “Entry/Exit”
appears to be a more viable methodology in the clinical setting, especially when monitoring non-ICU
wards. When selecting between Entry/Exit and My 5 Moments, facilities should consider the efficiency
of each method and might favor one method over the other based on clinical setting, such as ICU vs
non-ICU.
Self-reported Hand Hygiene Practices, and Feasibility and Acceptability of Alcohol-based Hand Rubs
among Village Healthcare Workers in Inner Mongolia, China
Yuan Li, PhD1, Yali Wang2, Daiqin Yan2 and Carol Rao, ScD1, (1)Global Disease Detection Program, United
States Centers for Disease Control and Prevention, Beijing, China, (2)Bayan Nur Infectious Disease
Hospital, Bayan Nur, China
Background
Healthcare-associated infections result in substantial morbidity and mortality worldwide. Good hand
hygiene, including use of alcohol-based hand rubs (ABHR) and hand-washing with soap and water, is
critical to reduce the risk of spreading infections. Limited data are available on hand hygiene practices
from rural healthcare systems in China. We assessed the feasibility and acceptability of sanitizing hands
with ABHR among Chinese village healthcare workers, as well as their hand hygiene practices.
Methods
Five hundred bottles of ABHR were given to village healthcare workers who participated in a public
health program in Inner Mongolia Autonomous Region, China. After about one year, standardized
questionnaire surveys were conducted among the public health program participants to collect
information on their work load, availability and usage of hand hygiene facilities, and knowledge, attitude
and practice of hand hygiene.
Results
Three hundred sixty-nine (64.2%) participants completed the questionnaire. Although 84.5% of the
ABHR recipients believed that receiving the ABHR improved their hand hygiene practice, 78.8% of
recipients would pay no more than $1.50 USD out of their own pocket (actual cost $4 USD). The majority
(76.8%) of those who provided medical care at patients' homes never carried hand rubs with them
outside their clinics. In general, self-reported hand hygiene compliance was suboptimal, and the lowest
compliance was “before touching a patient”. The top three reported complaints with using ABHR were
skin irritation, splashing, and unpleasant residue. Village doctors with less experience, and those who
smoked, had poorer hand hygiene knowledge and practice.
Conclusions
This study indicates the need for ABHR and hand hygiene training to improve hand hygiene among
village healthcare workers. Overall acceptance of ABHR among the village healthcare workers is high as
long as it is provided free or at low cost, but their overall hand hygiene practice was suboptimal. Hand
hygiene education and training are needed in settings outside of traditional healthcare facilities.
May 15, 2015 10:30 - 12:00 pm Nutcracker Ballroom 1
Epidemiology of Community-Onset and Hospital-Onset Multi-Drug Resistant Escherichia coli Bacteremia:
A Nationwide Cohort Study at Veterans Health Administration System
Michihiko Goto, MD, MSCI1,2, Jennifer McDanel, PhD1,3, Makoto Jones, MD, MS4,5, Bruce Alexander,
PharmD1, Carrie Franciscus, MA1, Kelly K Richardson, PhD1 and Eli Perencevich, MD, MS2,6, (1)Iowa City
VA Medical Center, Iowa City, IA, (2)University of Iowa Carver College of Medicine, Iowa City, IA, (3)601
Highway 6 West 152, University of Iowa, Iowa City, IA, (4)Salt Lake City VA Medical Center, Salt Lake City,
UT, (5)University of Utah, Salt Lake City, UT, (6)Iowa City VA Medical Center, Iowa City VA Medical
Center, Iowa City, IA
Background
Emerging resistance in Escherichia coli is a great concern. We report the nationwide incidence rate
trends and the proportion of multi-drug resistant (MDR) isolates in E. coli bacteremia over a 9-year period
using Veterans Health Administration (VHA) system data.
Methods
This retrospective cohort includes all E. coli blood culture isolates from patients admitted to acute care
beds of 115 VHA hospitals in 46 states between 2003 and 2012. If the same patient had multiple
isolates during one admission, only the first isolate was included. Isolates were classified as
community-onset (CO: <48 hours after admission) or hospital-onset (HO: >=48 hours). Antimicrobial
susceptibility results were extracted from the VHA electronic medical record system. Isolates were
categorized as non-susceptible to an antimicrobial class if not susceptible to at least one agent in that
class. We defined MDR as reported resistance to at least three different classes of antimicrobials,
according to the CDC/ECDC definition. Chronological trends were assessed by the Cochran-Armitage
trend test.
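A minimal sketch of the Cochran-Armitage trend test implemented directly from its standard formula; the year-by-year counts below are invented, not the study's.

```python
# Hedged sketch: Cochran-Armitage test for a linear trend in proportions.
import numpy as np
from scipy import stats

def cochran_armitage(events, totals, scores=None):
    """Two-sided Cochran-Armitage trend test for a 2 x k table."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    if scores is None:
        scores = np.arange(len(totals), dtype=float)
    else:
        scores = np.asarray(scores, dtype=float)
    p_bar = events.sum() / totals.sum()
    t_stat = np.sum(scores * (events - totals * p_bar))
    var = p_bar * (1 - p_bar) * (
        np.sum(scores ** 2 * totals)
        - np.sum(scores * totals) ** 2 / totals.sum()
    )
    z = t_stat / np.sqrt(var)
    return z, 2 * stats.norm.sf(abs(z))

# Illustrative MDR / total isolate counts for 2003..2012 (not study data).
mdr = [140, 150, 155, 170, 180, 190, 200, 215, 225, 240]
total = [520, 525, 530, 540, 545, 555, 560, 565, 570, 575]
z, p = cochran_armitage(mdr, total)
print(f"z = {z:.2f}, p = {p:.3g}")
```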
Results
There were 18,212 E. coli blood isolates (CO: 15,475; HO: 2,737) included in the analysis. Of these, 5,819
isolates (32.0%) were MDR, with HO isolates more likely to be MDR [CO: 4,613 (29.8%) vs. HO: 1,206
(44.1%); p<0.0001]. Incidence rates of CO E. coli bacteremia increased from 2.9 per 10,000 outpatients
in 2003 to 3.6 in 2012, while HO bacteremia incidence rates were stable. Incidence rates of CO MDR
E. coli bacteremia showed a steady increase over the study period, while HO MDR bacteremia incidence
increased sharply in the first half of the study period and then declined slowly. The proportion of MDR
isolates increased among both CO (27.8% in 2003-2007; 31.4% in 2008-2012; p<0.001) and HO isolates
(42.7% in 2003-2007; 45.5% in 2008-2012; p<0.01).
Conclusion
Within the VHA system, we observed steady increases in incidence rates and in the proportion of MDR
among CO E. coli bacteremia isolates. Incidence rates of HO E. coli bacteremia were relatively stable,
although there was a slow trend towards an increasing proportion of MDR isolates among HO isolates.
Further studies are needed to analyze risk factors for MDR E. coli bacteremia, in order to guide
appropriate empiric therapy for this serious infection.
Prevalence of Invasive Gram-negative Infections due to Carbapenem-resistant Enterobacteriaceae (CRE)
Among Adult Patients in US Intensive Care Units (ICUs)
Thomas Lodise, MD1, Michael Ye, MS2, Shailja Dixit, MS, MPH, MBA, MD2 and Qi Zhao, M.D. MPH3,
(1)Albany College of Pharmacy, New York, NY, (2)Forest Research Inc, Jersey City, NJ, (3)Forest Research
Institute Inc., Jersey City, NJ
Background
CRE is an emerging public health concern but there are scant data on CRE prevalence in ICUs across the
USA. This study assessed prevalence of CRE in ICU vs. non-ICU settings across various US geographic
regions, using data from a large US hospital database.
Methods
Retrospective analysis of the Premier hospital inpatient database. Study period: 7/1/2010-12/31/2013.
Inclusion criteria: 1) age ≥ 18 y; 2) primary or secondary ICD-9 diagnosis at discharge for a complicated
urinary tract infection (cUTI), complicated intra-abdominal infection (cIAI), hospital-associated
pneumonia (HAP), or bloodstream infection (BSI); 3) positive culture for Enterobacteriaceae drawn from
a site consistent with the infection type; and 4) receipt of antibiotic treatment on the day of the index
culture or ≤ 3 days after. Carbapenem resistance was defined as non-susceptibility to meropenem,
imipenem, doripenem, or ertapenem. CRE prevalence was also stratified by region, infection type, and
pathogen.
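A minimal sketch of the carbapenem-resistance flag defined above; the column names and susceptibility codes ('S'/'I'/'R') are assumptions.

```python
# Hedged sketch: flag an isolate as CRE if non-susceptible to any of the
# four carbapenems named in the Methods. Data layout is hypothetical.
import pandas as pd

CARBAPENEMS = ["meropenem", "imipenem", "doripenem", "ertapenem"]

def is_cre(isolate: pd.Series) -> bool:
    # Treat any reported result other than 'S' as non-susceptible.
    return any(isolate.get(drug) in ("I", "R") for drug in CARBAPENEMS)

isolates = pd.DataFrame([
    {"meropenem": "S", "imipenem": "S", "doripenem": "S", "ertapenem": "R"},
    {"meropenem": "S", "imipenem": "S", "doripenem": "S", "ertapenem": "S"},
])
isolates["cre"] = isolates.apply(is_cre, axis=1)
print(isolates)  # first isolate flagged CRE, second not
```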
Results
30,875 patients met study criteria. CRE prevalence data are shown in the table and charts. CRE
prevalence was generally higher in ICU vs. non-ICU settings; the difference was most notable in the
Mid-Atlantic and Mountain regions (Charts). The highest rates of CRE were noted for Klebsiella spp and
Enterobacter spp (Table).
Conclusion
Overall CRE prevalence ranged from 1% to 9% and was higher in ICUs than in non-ICUs. The findings
highlight the importance of developing decision support systems to identify patients at high risk for CRE,
especially in ICU settings. This is critically important for clinicians, as culture results are typically not
available within the first 3 days.
Table. CRE prevalence by infection type, pathogen, and setting

Infection type         Pathogen       Setting   Patients (n)   % CRE
cIAI, cUTI, HAP, BSI   Citrobacter    ICU       550            2.6
cIAI, cUTI, HAP, BSI   Citrobacter    Non-ICU   589            1.2
cIAI, cUTI, HAP, BSI   E. coli        ICU       6,217          0.9
cIAI, cUTI, HAP, BSI   E. coli        Non-ICU   12,748         0.4
cIAI, cUTI, HAP, BSI   Enterobacter   ICU       1,854          7.7
cIAI, cUTI, HAP, BSI   Enterobacter   Non-ICU   1,386          5.0
cIAI, cUTI, HAP, BSI   Klebsiella     ICU       4,574          7.3
cIAI, cUTI, HAP, BSI   Klebsiella     Non-ICU   4,318          5.6
cIAI, cUTI, HAP, BSI   Serratia       ICU       973            4.1
cIAI, cUTI, HAP, BSI   Serratia       Non-ICU   645            4.7
cIAI                   Citrobacter    ICU       46             2.2
cIAI                   Citrobacter    Non-ICU   19             0.0
cIAI                   E. coli        ICU       494            1.8
cIAI                   E. coli        Non-ICU   254            0.4
cIAI                   Enterobacter   ICU       113            13.3
cIAI                   Enterobacter   Non-ICU   36             8.3
cIAI                   Klebsiella     ICU       234            6.8
cIAI                   Klebsiella     Non-ICU   75             0.0
cIAI                   Serratia       ICU       19             0.0
cIAI                   Serratia       Non-ICU   4              0.0
cUTI                   Citrobacter    ICU       133            3.8
cUTI                   Citrobacter    Non-ICU   411            1.5
cUTI                   E. coli        ICU       2,036          0.7
cUTI                   E. coli        Non-ICU   8,809          0.3
cUTI                   Enterobacter   ICU       231            5.6
cUTI                   Enterobacter   Non-ICU   629            5.3
cUTI                   Klebsiella     ICU       877            8.3
cUTI                   Klebsiella     Non-ICU   2,249          5.9
cUTI                   Serratia       ICU       76             5.3
cUTI                   Serratia       Non-ICU   184            2.2
Variation in carbapenem-resistant Enterobacteriaceae (CRE) active surveillance and inter-facility
communication among Chicago hospitals
Michael Y. Lin, MD, MPH1, Anne C. Bendelow, BS2, Rosie D. Lyles, MD, MSc2, Karen Lolans, BS1, Mary K.
Hayden, MD1, Robert A. Weinstein, MD1,2 and William E. Trick, MD1,2, (1)Rush University Medical Center,
Chicago, IL, (2)Cook County Health and Hospitals System, Chicago, IL
Background
The CDC's Detect and Protect strategy encourages hospitals to be aware of CRE-positive patients;
strategies include active surveillance and optimizing inter-facility communication for transfers. CRE
detection approaches may vary among short-term and long-term acute care hospitals (STACHs and
LTACHs). Since Nov 2013, Illinois hospitals have been able to query a patient's reported CRE status in the
XDRO registry (xdro.org), a public health database and information exchange.
Method
All STACHs with ≥10 ICU beds in Chicago and LTACHs in Cook County (Chicagoland) were recruited Jan –
July 2014; each facility designated 1 infection preventionist to answer a 15-item written questionnaire.
We used exact tests for statistical significance.
Result
21 of 24 STACHs and 7 of 7 LTACHs responded. 86% of STACHs and 100% of LTACHs reported ≥1 CRE
admission/month.
Active surveillance for CRE at admission was performed by 2 of 21 (10%) STACHs (high-risk transfers
only) and 5 of 7 (71%) LTACHs (all admits), P=0.004.
38% of STACHs reported receiving CRE information on CRE-positive patients “half the time or more” at
transfer versus 71% of LTACHs, P=0.20 (Figure). If communication did occur at patient transfer, STACHs
were notified via medical record (45%), phone call from the transferring infection preventionist (32%) or
care provider (9%), or standardized paper transfer form (27%); LTACHs were notified mostly via medical
record and/or standardized transfer form (71% each).
86% of STACHs and 100% of LTACHs reported access to the XDRO registry; 55% of STACHs and 43% of
LTACHs queried at least once. Most STACHs did not routinely query (59%) or queried occasionally (32%);
none queried every patient. In contrast, 2 of 7 (29%) LTACHs queried all patients at the time of
admission. 96% of hospitals indicated interest in automated CRE querying.
Conclusion
We found heterogeneity in, and a need to improve, CRE detection activity among hospitals. Among
STACHs, CRE active surveillance was uncommon and CRE communication from transferring facilities was
infrequent. Hospitals, including LTACHs, have begun using the XDRO registry to overcome the
shortcomings of traditional inter-facility communication; automation of this process has the potential to
improve communication further.
Modeling Carbapenem Resistant Enterobacteriaceae (CRE) Control – Rapid Regional Collaboration
Produces Greatest Containment
Bruce Y. Lee, MD MBA1, Kim F. Wong, PhD2, Sarah M. Bartsch, MPH1, James A. McKinnell, MD3, Loren G.
Miller, MD MPH3, Chenghua Cao, MPH4, Alexander J. Kallen, MD, MPH5, Rachel Slayton, PhD, MPH6,
Diane Kim, BS4, Shawn T. Brown, PhD7, Nathan Stone, PhD8 and Susan S. Huang, MD MPH4, (1)Public
Health Computational and Operation Research (PHICOR), Johns Hopkins Bloomberg School of Public
Health, Baltimore, MD, (2)University of Pittsburgh, Pittsburgh, PA, (3)Infectious Disease Clinical
Outcomes Research Unit (ID-CORE), Division of Infectious Disease, Los Angeles Biomedical Research
Institute at Harbor-University of California Los Angeles (UCLA) Medical Center, Torrance, CA, (4)Division
of Infectious Diseases and Health Policy Research Institute, University of California, Irvine School of
Medicine, Irvine, CA, (5)Division of Healthcare Quality Promotion, CDC, Atlanta, GA, (6)CDC, Atlanta, GA,
(7)Pittsburgh Supercomputing Center, Pittsburgh, PA, (8)Pittsburgh Supercomputing Center (PSC),
Carnegie Mellon University, Pittsburgh, PA
Background
Carbapenem Resistant Enterobacteriaceae (CRE) are resistant to all but salvage antibiotics. CRE emerged
in California in 2010, particularly in long term care facilities. Strategies for containment are uncertain.
Methods
We used our existing Regional Healthcare Ecosystem Analyst (RHEA) agent-based model of Orange
County (OC), CA to simulate the spread of CRE throughout OC's 28 hospitals and 74 nursing homes (NH).
The literature and a hospital survey of CRE cases from 2010-2013 helped parameterize CRE prevalence
and intra-ward/unit transmission. Our team used RHEA to simulate an intervention (hospital-based
rectal screening of all inter-facility transfers and contact precautions for all known cases in both
hospitals and NH) under three CRE control scenarios: 1) base case (no intervention); 2) a
hospital-triggered response, whereby the intervention was implemented once a hospital cumulatively
identified 10 known carriers; and 3) a county-triggered regional response, whereby the intervention was
applied in all facilities once 10 hospitals had identified any case of CRE.
Results
RHEA simulation experiments forecasted that unabated emergence of CRE in 2010 in CA would lead to a
10-year prevalence of CRE carriers (known and undiagnosed) of ~30% in LTACs, ~2.5% in hospitals, and
~10% in NH. Both individualized hospital and county-wide regional screen-and-isolate strategies
markedly reduced the amplification, with regional responses producing a greater and more rapid
dampening in both hospitals (including LTACs) and NH (Figure 1).
Conclusion
Simulation models highlight how intensive inter-facility screening by hospitals and contact precautions
by hospitals, LTACs, and nursing homes can blunt amplification of CRE in a region. Coordination of
county-wide cooperative strategies is an important public health intervention.
A Successful Control Program of a Carbapenem-resistant Enterobacteriaceae (CRE) in a Brazilian
Intensive Care Unit (ICU)
Claudia Vallone Silva, RN, Hospital Israelita Albert Einstein, São Paulo, Brazil
Key words: K. pneumoniae, Carbapenem-resistant Enterobacteriaceae, Infection control.
Introduction
Carbapenem-resistant Enterobacteriaceae (CRE) are increasingly prevalent pathogens worldwide and
have the potential to spread within healthcare facilities. Outbreaks and endemic dissemination of
KPC-producing K. pneumoniae pose a challenge to infection control.
Objective
To report an outbreak of colonization and infection related to CRE in an ICU of a Brazilian hospital and
to describe the major measures implemented to control and prevent it.
Methods
This study was conducted in a 40-bed medical-surgical ICU at Hospital Albert Einstein, Brazil. During
surveillance cultures aimed at identifying multidrug-resistant microorganisms, an infection control nurse
observed an increase in CRE. We investigated clonal relatedness for epidemiological comparison using
pulsed-field gel electrophoresis (PFGE).
Results
In July/August 2014 we identified a significant increase in the number of patients colonized (71%) or
infected with CRE carrying the blaKPC gene (96.4% K. pneumoniae). The rate ratio differed significantly
between periods (0.65 from Jan-Apr vs. 2.47 from May-Aug 2014), which confirmed the outbreak. PFGE
revealed genetic diversity among the 18 CRE isolates, which belonged to nine PFGE types. Five CRE
strains shared PFGE type A and six CRE strains shared PFGE type E, suggesting nosocomial transmission
(Fig. 1). During this period, a control team was organized that included infectious disease physicians,
infection control nurses, and ICU staff. A contact precautions audit was carried out, and non-adherence
was observed for hand hygiene (51%), gown use (25%), and glove use (17%); in 13% of observations the
disinfectant product was not available. The team established outbreak control strategies in September
2014 (Graph 1): testing all patients admitted to the ICU for CRE carrying the blaKPC gene to determine
the acquisition rate, placing positive patients on contact precautions, and raising healthcare workers'
awareness of hand hygiene, contact precautions, and environmental disinfection. After implementing
the outbreak control measures we observed a significant decrease in rates.
Conclusion
Cooperation among multidisciplinary team was vital for the design of appropriate infection control
measures of CRE.
Fig. 1 PFGE analysis of genomic DNA from CRE isolates
Graph 1 Rates of carbapenem-resistant Enterobacteriaceae (CRE) carrying the blaKPC gene in the ICU
May 15, 2015 3:30 - 4:30pm Nutcracker Ballroom 1
A Novel, Sporicidal Formulation of Ethanol for Glove Disinfection to Prevent Clostridium difficile Hand
Contamination during Glove Removal
Myreen E. Tomas, MD1, Michelle Nerandzic, BS2, Jennifer Cadnum, BS3, Thriveen Sankar Chittoor Mana,
MS3 and Curtis J. Donskey, MD1, (1)Geriatric Research Education and Clinical Centers, Louis Stokes
Cleveland VA Medical Center, Cleveland, OH, (2)Louis Stokes Cleveland VA Medical Center, Cleveland,
OH, (3)Case Western Reserve University, Cleveland, OH
Background
Clostridium difficile spores may be acquired on the hands of healthcare personnel during removal of
contaminated gloves. Disinfection of gloves prior to removal could therefore be an effective strategy to
reduce the risk for hand contamination. We tested the hypothesis that a novel, sporicidal formulation of
ethanol would be effective for rapid disinfection of C. difficile spores on gloves.
Methods
Reduction of toxigenic C. difficile spores inoculated on gloves of volunteers was compared after 30 or 60
second exposures to the sporicidal ethanol formulation, 70% ethanol, and 1:10 or 1:100 dilutions of
household bleach; the solutions were applied both as a liquid solution and as a wipe. We also examined
the efficacy of the sporicidal ethanol formulation for elimination of spore contamination from the gloves
of healthcare personnel interacting with C. difficile infection (CDI) patients or their environment. To
determine the potential for the disinfectants to damage clothing, the solutions were applied to pieces of
colored cloth.
Results
In 30 and 60 second liquid applications, the sporicidal ethanol formulation reduced spore levels by 1.4
logs and 2 logs, respectively (P<0.0001); 30 and 60 second wipe application resulted in 2 and >2.5 log
reductions, respectively (Figure 1). 70% ethanol applied as a liquid or wipe resulted in a <1 log reduction
in spores. Reductions achieved with a 1:100 dilution of bleach were equivalent to the sporicidal ethanol
solution, whereas a 1:10 dilution was more effective (>3 log reduction). However, both bleach solutions
stained clothing, while the sporicidal ethanol solution did not. The sporicidal ethanol solution was as
effective as a 1:100 dilution of bleach for elimination of spore contamination acquired on gloves of
healthcare personnel.
Conclusions
A novel, sporicidal ethanol formulation was effective in rapidly reducing C. difficile spores on gloved
hands and did not damage clothing. Further studies are needed to test whether use of the sporicidal
ethanol solution will reduce healthcare personnel hand contamination during removal of contaminated
gloves.
Figure 1. Effectiveness of acid alcohol for removal of C. difficile spores from gloves
Opting out of Clostridium difficile Infections
Antonia L Altomare, DO, MPH1,2, Eileen A. Taylor, BSN1, Peter Solberg, MD1,2 and John Mecchella, DO,
MPH1,2, (1)Dartmouth-Hitchcock Medical Center, Lebanon, NH, (2)Geisel School of Medicine at
Dartmouth, Hanover, NH
Background
Clostridium difficile infections (CDI) are a major cause of morbidity and mortality, and while most
healthcare-associated infections have been declining, the rate of CDI is on the rise. Hand hygiene and
Contact Precautions (CP) are the mainstay for preventing the spread of Clostridium difficile (C.
diff) spores. While we regularly track the number of CDIs, we have not had an efficient way to track and
ensure the use of CP.
Method
Dartmouth-Hitchcock is a 400-bed rural academic medical center located in Lebanon, NH. A preliminary
analysis found that CP was ordered for fewer than 50% of instances when a provider ordered a stool
study for C. diff. We developed an opt-out electronic order panel which pre-selects the order for CP at
the time the stool study for C. diff is ordered. Our aim was to improve the usage of CP in the setting of
suspected CDI.
Result
We analyzed 1,157 inpatient encounters in our pre-intervention period (3/1/12-4/16/13) and 1,008 in our
post-intervention period (4/17/13-4/1/14). We found an increase in the proportion of patient
encounters for which CP was ordered when C. diff testing was ordered from 21% pre-intervention to
96% post-intervention (p<0.001). The proportion of CP orders placed within 2 hours of the stool study
order improved from 9% to 95% (p<0.001). Post-intervention, we identified 48 failures (CP order placed
≥ 2 hours after stool study order) by 38 ordering providers who “opted out” of the order panel. While
most providers only opted out a single time, 6 providers were repeat offenders, with one provider
opting out on 5 different occasions. The majority of failures were attributable to resident providers (63%).
Conclusion
Institution of an electronic opt-out order significantly increased the proportion of patients who had CP
ordered in the setting of suspected CDI. While we acknowledge that simply ordering CP does not
necessarily reflect its implementation, it is the first step in alerting all care team members of
recommended isolation precautions. We expect that with proper hand hygiene and an increased use of
CP, the rate of healthcare-associated CDI will decline. Further education will be needed for those
providers who continue to opt-out of best practices.
Attributable Cost of Clostridium difficile Infection in Hospitalized Pediatric Patients
Preeti Mehrotra, MD1, Jisun Jang, MA1, Courtney Gidengil, MD, MPH1,2 and Thomas Sandora, MD, MPH1,
(1)Boston Children's Hospital, Boston, MA, (2)RAND, Boston, MA
Background
Attributable cost of inpatient Clostridium difficile infection (CDI) in adults has been estimated to be
$3,000-$15,000 per episode. However, the cost of pediatric CDI has not been determined and may differ
because severe disease and poor outcomes are less common in children. We sought to measure the
attributable cost of CDI in hospitalized pediatric patients.
Methods
Using the 2012 Kids’ Inpatient Database (KID) from the Agency for Healthcare Research and Quality
(AHRQ), we identified secondary diagnoses of CDI using the International Classification of Diseases,
Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic code of 008.45. Patients between 2 and 18
years of age who survived until discharge were analyzed. Charges were converted to cost using the KID
cost-to-charge ratio. All of the analyses were weighted, clustered and stratified appropriately based on
the sampling design of the KID. Propensity scores were calculated based on previously identified
predictors of CDI (bacterial and fungal infections; malignancy; solid organ transplant; primary and
secondary immunodeficiencies; cystic fibrosis; inflammatory bowel disease; presence of gastrostomy
tube). Controlling for propensity score and additional potential predictors for CDI-unrelated cost
(surgical procedures; extracorporeal membrane oxygenation; dialysis; central catheter placement; and
mechanical ventilation), we performed a regression model with log-transformed cost to estimate the
attributable cost of CDI.
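A minimal sketch of the two-stage approach under stated assumptions: a logistic propensity model for CDI, then an OLS regression on log cost whose exponentiated CDI coefficient gives the multiplicative cost difference. Data, coefficients, and column names are invented, and the KID's survey weighting, clustering, and stratification are omitted for brevity.

```python
# Hedged sketch: propensity score + log-cost regression. Data simulated;
# the KID's survey weights/clustering are omitted here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "cdi": rng.binomial(1, 0.05, n),
    "malignancy": rng.binomial(1, 0.10, n),   # a CDI predictor
    "ibd": rng.binomial(1, 0.05, n),          # a CDI predictor
    "surgery": rng.binomial(1, 0.30, n),      # a CDI-unrelated cost predictor
})
df["cost"] = np.exp(8.4 + 1.06 * df["cdi"] + 0.3 * df["surgery"]
                    + rng.normal(0, 0.9, n))

# Stage 1: propensity for CDI from its predictors.
ps_model = smf.logit("cdi ~ malignancy + ibd", data=df).fit(disp=False)
df["ps"] = ps_model.predict(df)

# Stage 2: log-cost model controlling for the propensity score and
# CDI-unrelated cost predictors.
cost_model = smf.ols("np.log(cost) ~ cdi + ps + surgery", data=df).fit()
pct = 100 * (np.exp(cost_model.params["cdi"]) - 1)
print(f"CDI-associated cost increase: {pct:.1f}%")
```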
Results
We identified 5,484 pediatric hospitalizations (0.34%) associated with CDI. Median unadjusted cost of
hospitalization for children with CDI was $19,309.40 (interquartile range (IQR): $8,198.31 - $54,123.22)
versus $4,584.92 (IQR: $2,628.38 - $8,769.64) for children without CDI. After controlling for propensity
scores and predictors for cost, the mean cost for pediatric hospitalizations associated with CDI was
188.12% greater than those without CDI (P < 0.001).
Conclusions
Hospitalizations associated with CDI in children had almost triple the cost of those without.
Given the high economic burden associated with pediatric CDI, it should remain an infection prevention
priority.
Optimizing National Healthcare Safety Network Laboratory-Identified Event Reporting for Hospital-Onset
Clostridium difficile Infection: Does Delayed Diagnosis Make a Difference?
Michael J Durkin, MD, Arthur W. Baker, MD, Kristen V Dicks, MD, Luke Chen, MBBS, MPH, Daniel J
Sexton, Sarah Lewis, MD, MPH, Deverick Anderson, MD, MPH and Rebekah Moehring, MD, Duke
University Medical Center, Durham, NC
Background
Rates of hospital-onset (HO) Clostridium difficile infection (CDI) are higher using the Laboratory ID
(LabID) method compared to traditional surveillance. Delay in sending laboratory tests in patients with
community-onset diarrhea may be a modifiable reason for discordance.
Methods
We performed a prospective observational cohort study of patients admitted to 29 community hospitals
over 6 months from the Duke Infection Control Outreach Network. HO CDI cases were identified using
both traditional and LabID surveillance definitions to quantify the total number of discordant HO CDI
cases. “Delayed diagnosis” LabID HO CDI cases were defined as those with diarrhea onset within 3 days
of admission but laboratory testing delayed until 4 or more days after admission. We calculated a
“corrected” LabID surveillance rate by subtracting the delayed diagnosis cases from the total
LabID-identified cases and expressing the result per 10,000 patient-days (ptd). We calculated and
compared incidence rates according
to traditional surveillance, LabID surveillance, and “corrected” LabID surveillance for the cohort as a
whole and for each hospital. Finally, we ranked individual hospitals by the three calculated rates to
determine the effect of delayed diagnosis cases on hospital rankings.
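A minimal sketch of the "corrected" rate calculation using the cohort-level totals from the Results; note the abstract's 3.40 figure is a median of per-hospital rates, so this pooled number differs.

```python
# Hedged sketch of the corrected-rate arithmetic, on pooled cohort totals.
labid_cases = 425      # total LabID HO CDI events
delayed_cases = 106    # "delayed diagnosis" LabID cases
patient_days = 708_551

corrected = (labid_cases - delayed_cases) / patient_days * 10_000
print(f"Pooled corrected LabID HO CDI rate: {corrected:.2f} per 10,000 ptd")
# The abstract reports the MEDIAN of per-hospital corrected rates (3.40),
# so this pooled figure is not expected to match it exactly.
```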
Results
A total of 425 LabID and 311 traditional HO CDI events were observed over 708,551 ptd. The median
LabID HO rate per 10,000 ptd was 5.70 vs. 3.40 for traditional surveillance (p<0.001). There were 126
(26%) discordant HO CDI events between the two surveillance methods; the majority were due to delay
in diagnosis (n=106; 85%). Differences between the two surveillance methods were also observed when
we ranked hospitals by HO CDI rates.
The rate and rank differences resolved after removal of cases due to delayed diagnosis. For example, the
median “corrected” LabID HO CDI rate was 3.40, identical to the median traditional surveillance rate.
Similarly, “corrected” LabID HO CDI rates and rankings approximated traditional surveillance rates and
rankings for individual hospitals.
Conclusion
Interventions that reduce delays in diagnostic testing for community-onset diarrhea may lead to LabID
HO CDI rates similar to traditional CDI surveillance.
May 16, 2015 8:30 - 10:00am Nutcracker Ballroom 1
Lack of Patient Understanding of Hospital Acquired Infection Data on CMS Hospital Compare
Max Masnick, BA1, Daniel Morgan, MD, MS1,2, John D. Sorkin, MD, PhD3, Elizabeth Kim, MSPH1, Jessica P.
Brown, PhD1, Penny Rheingans, PhD4 and Anthony Harris, MD, MPH1, (1)Department of Epidemiology
and Public Health, University of Maryland School of Medicine, Baltimore, MD, (2)VA Maryland
Healthcare System, Baltimore, MD, (3)Veterans Affairs Maryland Healthcare System Geriatrics Research,
Education, and Clinical Center, Baltimore, MD, (4)Department of Computer Science and Electrical
Engineering, University of Maryland Baltimore County, Baltimore, MD
Background
Public reporting of hospital quality data is a key element of healthcare reform in the US. Data for
hospital-acquired infections (HAIs) can be especially complex to understand, but no prior studies have
quantified the public's ability to understand these data. We assessed the interpretability of HAI data as
presented on CMS Hospital Compare among patients who might benefit from access to these data.
Methods
We randomly selected inpatients at a large tertiary referral hospital from June to September 2014.
Participants compared hospitals using hypothetical HAI data presented in four different ways, including
in the same tabular formats used by CMS. Data included numerators, denominators, standardized
infection ratios (SIRs), and interpretations of the SIR for the two hospitals. Accuracy of comparisons was
assessed.
Results
Participants (n=110) correctly identified the better of two hospitals when given written descriptions of
the HAI measure 72% of the time (95% CI 66%, 79%). Adding the underlying numerical data to the
written descriptions reduced correct responses to 60% (95% CI 55%, 66%). When the written HAI
measure description was not informative (identical for both hospitals), 50% answered correctly (95% CI
42%, 58%). When no written HAI measure description was provided and hospitals differed by
denominator, 38% answered correctly (95% CI 31%, 45%).
Conclusion
Current public HAI data presentation methods may be inadequate. When presented with hypothetical
numeric HAI data, study participants incorrectly compared hospitals based on HAI data >40% of the
time. Further research is needed to identify better ways to convey these important data to the public.
Figure 1. Percent correct answers (bars indicate 95% CI) among study participants when comparing
hypothetical HAI data for two hospitals. Y-axis labels indicate presentation method for HAI data.
"Written descriptions" summarize SIRs.
Comparative Risk of Bloodstream Infection among Various Types of Intravascular Catheters
Payal K. Patel, MD, Sharon B. Wright, MD, MPH, Linda M. Baldini, RN and Graham M. Snyder, MD, SM,
Beth Israel Deaconess Medical Center, Boston, MA
Background
Few published studies have compared the catheter-associated bloodstream infection (CA-BSI) rate
attributable to various catheter types within the same population. The National Healthcare Safety
Network surveillance definition for these infections may not capture nuances of those truly attributable
to an intravascular catheter due to misattribution or underestimation. In this study, we sought to
qualitatively compare the incidence of CA-BSI across catheter types, considering both CA-BSI due to any
catheter and CA-BSI due to the implicated catheter type.
Methods
In this retrospective observational study, institutional databases were used to analyze all unique patient
first intensive care unit (ICU) admissions 4/2009 – 3/2014. The insertion and removal time were
recorded for all catheter types except peripheral intravenous catheters (PIV). Catheter duration was
counted using status at midnight. CA-BSI were included in the analysis if occurring within 14 days of ICU
discharge. Follow-up was censored at ICU discharge or CA-BSI outcome. CA-BSI incidence was calculated
serially per 1,000 catheter days for study subpopulations with ≥1 of each specific catheter type, and
using CA-BSI due to any catheter type and due to the implicated catheter type.
Results
22,803 unique first ICU admissions were analyzed, including 12,953 (56.8%) with ≥1 non-PIV placed
during follow-up. 101 CA-BSI occurred due to any catheter type. The incidence of CA-BSI due to any
catheter and due to the implicated catheter type among study subpopulations is presented in the table.
Additionally, 9 CA-BSI were PIV-associated.
                                         Catheter-   CA-BSI (n)              CA-BSI per 1,000 catheter-days
Study subpopulation, by catheter type    days        Present    Implicated   Present    Implicated
Central venous catheter                  23582       59         31           2.50       1.31
Peripherally-inserted central catheter   12677       38         27           3.00       2.13
Hemodialysis                             5393        29         9            5.38       1.67
Port                                     1390        2          0            1.44       0
Arterial line                            32742       74         22           2.26       0.67
Other large-bore procedural catheters    8668        20         3            2.31       0.35

(“Present”: CA-BSI due to any catheter type while the specified catheter was in place; “implicated”:
CA-BSI attributed to the specified catheter type.)
Conclusions
Data from this observational study suggest there is possible misattribution, including underestimation,
of CA-BSI using the current surveillance definition, particularly for hemodialysis, arterial, and procedural
catheters.
Descriptive epidemiology of patients with central vascular lines in the outpatient setting at an academic
tertiary care center
Steven S Spires, MD1, Mickie Miller, LCSW2, Katie Koss, RN, MSN2, Patty W Wright, MD1 and Thomas
Talbot III, MD, MPH1, (1)Vanderbilt University School of Medicine, Nashville, TN, (2)Vanderbilt Home
Care Services, Inc., Nashville, TN
Background
An increasing number of medical conditions are treated in outpatient settings, resulting in more vascular
devices being used in the outpatient population. Data on central line utilization and central
line-associated bloodstream infections (CLABSIs) in the outpatient setting are limited. In an effort to
better understand this population, we performed a retrospective descriptive analysis.
Methods
We retrospectively collected data on patients discharged from Vanderbilt University Medical Center
with a central vascular catheter (CVC) who were admitted into the care of our affiliated home care
service agency. All patients who left the hospital with a CVC in place, requiring home health skilled
nursing provided by our affiliated home care agency from 7/1/2012 to 9/30/2013 were
reviewed. Patients were identified first by surgical codes suggesting a CVC had been implanted or
inserted, then by querying the electronic clinical documentation database for annotation indicating
skilled nursing for infusion therapy had been performed. We defined outpatient CLABSI utilizing the
National Healthcare Safety Network (NHSN) definition of CLABSI with a modified timeframe of 2
calendar days after discharge to identify outpatient events.
Results
Overall, 917 admissions into home care met our initial screening criteria and 177 admissions were
excluded after further review (CVC placed at an outside facility or there was no information on the
CVC). These 740 admissions represented 654 unique patients. The majority of CVCs, 81%, were
peripherally inserted central catheters (PICC). The CVCs were accessed frequently, with 79% having
scheduled access at least once daily and 24% requiring access 3 or more times a day. In total, 167
(26%) patients required an unexpected healthcare encounter due to their CVC, 45 in the emergency
department and 122 in an outpatient clinic. There were 20 CLABSIs identified in 18 unique
patients. Time to event differed depending on the type of line.
Conclusion
While CVCs and outpatient infusion therapy allow earlier discharge, they carry a small but likely underappreciated morbidity. Better surveillance practices are necessary to fully understand the
healthcare burden associated with outpatient CVCs.
Is There an Association Between Patient Safety Culture Measures and Catheter-associated Infections?
Results from Two National Collaboratives
Jennifer Meddings, MD, MSc1, Heidi Reichert, MA1, M. Todd Greene, PhD1, Helen McGuirk, MPH1, Nasia
Safdar, MD, PhD2,3, Sarah L. Krein, PhD, RN1,4, Russell Olmsted, MPH, CIC5, Sam Watson, MSA, MT
(ASCP), CPPS6, Barbara Edson, RN, MBA, MHA7, Mariana Lesher, MS7 and Sanjay Saint, MD, MPH1,4,
(1)University of Michigan, Ann Arbor, MI, (2)University of Wisconsin School of Medicine and Public
Health, Madison, WI, (3)The William S. Middleton Veterans Affairs Medical Center, Madison, WI,
(4)Veteran Affairs Ann Arbor Healthcare System, Ann Arbor, MI, (5)St. Joseph Mercy Health System, Ann
Arbor, MI, (6)Michigan Health & Hospital Association, Okemos, MI, (7)Health Research & Educational
Trust, Chicago, IL
Research objective
The Agency for Healthcare Research and Quality has funded national efforts to reduce hospital rates of
central-line associated blood stream infection (CLABSI) and catheter-associated urinary tract infection
(CAUTI). These efforts combine evidence-based interventions with socio-adaptive approaches to foster a
culture of safety. We hypothesized that hospital units with better safety culture scores would have
lower infection rates.
Study design
Prospective cohort study of acute care intensive care units (ICUs) and non-intensive care units (non-ICUs) participating in the CLABSI or CAUTI projects. CLABSI and CAUTI data were collected at baseline and quarterly post-implementation using the National Healthcare Safety Network definition of catheter-associated infections per 1,000 catheter days.
instruments: Readiness Assessment (RA), Hospital Survey on Patient Safety Culture (HSOPS) and Team
Checkup Tool (TCT). Safety culture data were collected at baseline for all three instruments, again roughly one year later for HSOPS, and quarterly for TCT.
Methods
17 culture items focusing on leadership, teamwork and clinical care were selected for analysis based upon a priori hypotheses of a significant association with CLABSI and CAUTI rates in ICUs and non-ICUs. Multilevel models were used to predict outcome rates over time as a function of culture scores measured throughout the project. Due to multiple testing, a p<0.01 criterion was used.
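As a rough illustration of a clustered analysis of this kind (a sketch only: synthetic data, hypothetical column names, and a linear mixed model on log-transformed rates rather than whatever count model the study actually fit):

```python
# Sketch, not the study's model: random intercepts for hospital capture
# clustering of units within hospitals; log-rate linearization is a
# simplification for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "hospital": rng.integers(0, 20, n).astype(str),   # hypothetical hospital IDs
    "quarter": rng.integers(0, 8, n),                 # project quarter
    "culture_score": rng.normal(3.5, 0.5, n),         # e.g., mean of HSOPS-style items
    "rate": rng.gamma(2.0, 0.5, n),                   # infections per 1,000 catheter days
})
df["log_rate"] = np.log(df["rate"] + 0.1)

fit = smf.mixedlm("log_rate ~ culture_score + quarter", df,
                  groups=df["hospital"]).fit()
print(fit.summary())                                  # culture_score coefficient of interest
```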
Results
1487 units from 903 hospitals (CLABSI) and 1030 units from 952 hospitals (CAUTI) were considered for
analysis. Response rates for the 3 culture instruments ranged from 20-24% for CLABSI and 38-43% for
CAUTI. Infection rates declined during the projects (41% decline for CLABSI, 14.5% decline for CAUTI). No
significant associations were found between safety culture measures and outcome rates.
Conclusions
Using 3 instruments of safety culture in 2 national collaboratives for prevention of CLABSI and CAUTI, we
found no association between culture measures and device-associated infection rates. However, the
validity of these safety culture measures remains unclear when assessed periodically, at the unit level,
with low response rates.
Reducing central line-associated bloodstream infections through a socioadaptive intervention focused on
maintenance practices
Christopher Crnich, MD, PhD1,2, Erin Bailey, MS1, Rosa Mak, MS3, Timothy Hess, PhD4, Nasia Safdar, MD,
PhD4, Stevens Linda, DNP, RN3, Dawn Berndt, RN, MS3, Lyndsy Huser, MSN, RN3, Elise Arsenault
Knudsen, MS, RN3, Wendy Curran, MSN, RN3, Betts Troy, MSN, RN3, Diane Mikelsons, MSN, RN3 and
Deborah Soetenga, MS, RN3, (1)University of Wisconsin School of Medicine & Public Health, Madison,
WI, (2)William S. Middleton Memorial VA Hospital, Madison, WI, (3)University of Wisconsin Hospital and
Clinics, Madison, WI, (4)University of Wisconsin School of Medicine and Public Health, Madison, WI
Background
Efforts to reduce central-line associated bloodstream infections (CLABSI) have largely focused on
improving central line insertional practices in the ICU. However, a majority of CLABSIs now occur outside
the ICU, most likely through gaps in maintenance practices. An ongoing quality improvement initiative at
our hospital created an opportunity to examine the impact of non-insertional interventions on facility
CLABSI rates.
Methods
Unit-specific and housewide CLABSI rates were tracked monthly and cross-sectional assessments of
maintenance practices were performed during the study (01/2011 - 08/2014). A new needleless
connector as well as several preventative interventions (including: 1) education; 2) post-event review
process; 3) CHG bathing; and 4) a line-maintenance culture change process [LMCC]) were introduced
during the study period. Control charts were used to identify shifts in the hospital trended CLABSI rate.
Multivariate Poisson regression models using a first-order autoregressive covariance structure were
used to assess the impact of different interventions on observed CLABSI rates.
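One way to express a Poisson model with a first-order autoregressive working covariance is via generalized estimating equations in statsmodels; this sketch uses synthetic monthly counts and an assumed intervention start, not the study's data:

```python
# Sketch, not the authors' code: Poisson GEE with an AR(1) working covariance
# over a single monthly series, with device-days as the exposure offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(44)                                   # 01/2011 - 08/2014
df = pd.DataFrame({"month": months,
                   "lmcc": (months >= 35).astype(int),   # assumed LMCC start, late 2013
                   "device_days": rng.integers(4000, 6000, 44)})
mu = 0.002 * df["device_days"] * np.where(df["lmcc"] == 1, 0.6, 1.0)
df["clabsi"] = rng.poisson(mu.to_numpy())

res = smf.gee("clabsi ~ lmcc + month", groups=np.zeros(len(df)), data=df,
              family=sm.families.Poisson(),
              cov_struct=sm.cov_struct.Autoregressive(grid=True),
              offset=np.log(df["device_days"].astype(float))).fit()
print(np.exp(res.params["lmcc"]))                        # rate ratio for the intervention
```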
Results
Despite the introduction of a number of preventative strategies, the hospital's trended CLABSI rate
remained stable (2.0 per 1,000 device-days) through November 2013. Implementation of the LMCC
intervention was associated with improvements in observed maintenance practices and a downward
shift in the hospital's trended CLABSI rate (1.2 per 1,000 device-days). In multivariate analyses, there
was a statistically significant association between the LMCC intervention and observed CLABSI rates
(rate ratio = 0.59, P = 0.005). Based on our model, none of the other interventions implemented during
the study period were associated with significant changes in facility CLABSI rates.
Conclusions
These results demonstrate that a culture change intervention centered on line maintenance practices
can reduce hospital CLABSI rates. Our next steps are to compare results of staff interviews and practice
observations on high- and low-performing units. These data will be critical for achieving a better
understanding of how this socioadaptive intervention achieved its success and to better disseminate our
findings to other settings.
May 16, 2015 10:30 - 12:00pm Nutcracker Ballroom 1
Middle East Respiratory Syndrome Coronavirus infection among Healthcare Workers in a Tertiary Care
Center in Saudi Arabia
Ahmed Hakawi, MD, King Fahad Medical City, Riyadh, Saudi Arabia
Background
To describe the socio-demographic and clinical characteristics and outcome of infection due
to Middle East Respiratory Syndrome Coronavirus (MERS-CoV) among healthcare workers in King Fahad
Medical City, Riyadh, Saudi Arabia.
Methods & Materials
This is a retrospective chart review of healthcare workers infected with MERS-CoV from April 2014 to June 2014. The IRB of KFMC approved the study.
Throat-swab, nasopharyngeal-swab, tracheal-aspirate, or bronchoalveolar-lavage specimens were obtained, placed in viral transport medium (Vircell), and subjected to real-time reverse-transcriptase polymerase chain reaction (RT-PCR) amplification of consensus viral RNA targets (upE and ORF1b) to detect MERS-CoV.
Results
During the outbreak of MERS-CoV infection that occurred at KFMC between April 2014 and June 2014, 23 staff members were infected. Seventy percent of the cases were female nurses with no prior chronic medical conditions. Staff members from critical areas were most severely affected (table 1), and 74% had fever at presentation. Interestingly, gastrointestinal and, to a lesser degree, articular features were prominent among the cases, in addition to respiratory symptoms (table 2). Fifteen (65%) cases were admitted, while the remainder were isolated at home or in a charity building specifically designated for infected asymptomatic staff or those with mild disease. Five were admitted to the intensive care unit (ICU). Mean length of stay for those admitted to the hospital was 9 days (5-17 days), while for those who stayed in the charity building it was 10 days (3-19 days). The mean period of infectivity (the time between first positive PCR and first negative PCR) was 10 days (6-21 days). Treatment was mainly supportive. Only one staff member died, while 96% recovered.
Conclusion
Prior reports of MERS-CoV infections were predominantly among older patients with chronic medical conditions. By contrast, secondary cases among healthcare workers in this outbreak were less severe.
Identification of mild and asymptomatic cases among healthcare workers and
implementation of strict infection control measures can decrease the risk of transmission.
Ebola Virus Disease, Infection Control and Environmental Safety in a Biocontainment Unit
Jay B. Varkey, MD1, Colleen S. Kraft, MD1, Aneesh K. Mehta, MD1, G. Marshall Lyon, MD, MMSc1, Sharon
Vanairsdale, MS, APRN, ACNS-BC, NP-C, CEN1, Patricia L. Olinger, RBP2and Bruce S. Ribner, MD, MPH1,
(1)Emory University School of Medicine, Atlanta, GA, (2)Emory University, Atlanta, GA
Background
In 2014, four patients with Ebola Virus Disease (EVD) were treated in the Serious Communicable
Diseases Unit (SCDU) at Emory University Hospital. Strict infection control practices were implemented
to avoid environmental contamination and prevent nosocomial transmission of EVD to healthcare
workers.
Methods
As part of standard operating protocol while caring for patients with EVD, all high touch surfaces were
wiped at least every four hours using commercially available hospital disinfectant wipes. Body fluids that
contacted the patient care area were immediately contained and disinfected using a quaternary
ammonium compound (MicroChem, Westborough, MA).
After patients with EVD were discharged from the hospital, but before terminal decontamination using
vaporized hydrogen peroxide, multiple environmental samples were obtained from high-touch surfaces
using a dacron-tipped swab moistened with bacteriostatic saline. Environmental swabs were tested for
the presence of Ebola virus (EBOV) using RT-PCR (Biofire Defense, Salt Lake City, UT).
All health care providers who entered the SCDU, laboratory technologists, and anyone managing the
waste stream were required to measure their temperature and complete a symptom questionnaire
twice daily.
Results
All 9 samples obtained on 3 separate dates tested negative for EBOV by RT-PCR. Samples tested
included patient’s personal belongings (phone, tablet computer), high-touch areas (call button and
bedrails) and the bathroom environment (commode, toilet seat, sink handles, etc.).
Forty-three direct care providers, laboratory technologists and environmental services personnel
underwent twice daily temperature checks and symptom monitoring. No employee developed Ebola
virus disease.
Conclusions
With appropriate protocols and preparation, Ebola virus disease can be managed successfully without environmental contamination or occupation-related infections. Meticulous attention to
infection control practices by a highly motivated, trained and competent staff is critical to the safe care
of patients with EVD.
Vancomycin Consumption and Resistance Trends in Staphylococcus aureus Across the Military Health
System
Michael E. Sparks, PhD1, Uzo Chukwuma, MPH2, Robert Clifford, PhD1, Emma Schaller2, Paige Waterman,
MD3, Charlotte Neumann, BSN2, Michael Julius, PMP1, Mary Hinkle, MD1 and Emil Lesho, DO1, (1)Walter
Reed Army Institute of Research, Silver Spring, MD, (2)EpiData Center Department, Navy and Marine
Corps Public Health Center, Portsmouth, VA, (3)Global Emerging Infections Surveillance, Armed Forces
Health Surveillance Center, Silver Spring, MD
Background
Staphylococcus aureus is one of the most important community and healthcare associated pathogens.
Having a vancomycin minimum inhibitory concentration (MIC) >1.5µg/mL, although still in the
susceptible range, is prognostic for mortality in S. aureus bacteremia, regardless of resistance to
methicillin or the treatment administered.
Objectives
We sought to: 1) determine whether an upward trend in incidence of isolates having therapeutically
problematic vancomycin MICs was occurring; 2) learn if resistance patterns correlated with vancomycin
usage across a large national healthcare system; and 3) generate near-term predictions for problematic
isolate rates.
Materials and Methods
Monthly counts of de-duplicated isolates having a vancomycin MIC = 2µg/mL (problematic) and MIC ≤
1.5µg/mL (tractable) were tabulated from Jan 2010 through Dec 2014. Rates of problematic isolates
per 1,000 isolates tested were analyzed (“pMIC data”). Local regression-based decomposition
disaggregated series into season, trend and remainder components, and Mann-Kendall tests checked for
upward or downward monotonic trends. An autoregressive integrated moving average (ARIMA) model
fit to the pMIC data was used for forecasting. pMIC data was simultaneously compared with the
monthly vancomycin prescription rate per 1,000 inpatient encounters during Jan 2010 through Dec 2013
(“usage data”). Spearman correlation measured the association between series.
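The analysis pipeline maps onto standard Python time-series tools; this sketch uses synthetic series and an assumed ARIMA order, not the study's data or fitted models (the Mann-Kendall test is expressed as Kendall's tau against time):

```python
# Sketch, not the authors' code: decomposition, monotonic-trend testing,
# ARIMA forecasting, and Spearman correlation between two monthly series.
import numpy as np
from scipy.stats import kendalltau, spearmanr
from statsmodels.tsa.seasonal import STL
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
t = np.arange(60)                                        # Jan 2010 - Dec 2014
pmic = 150 + t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, 60)
usage = 80 - 0.5 * t + rng.normal(0, 3, 60)              # Rx per 1,000 encounters

trend = STL(pmic, period=12).fit().trend                 # season/trend/remainder
tau, p = kendalltau(t, trend)                            # Mann-Kendall-style trend test
print(f"trend component: tau={tau:.2f}, p={p:.2g}")

forecast = ARIMA(pmic, order=(1, 1, 1)).fit().forecast(steps=24)
print(f"forecast for Dec 2016: {forecast[-1]:.0f}")

rho, p = spearmanr(pmic[:48], usage[:48])                # overlap Jan 2010 - Dec 2013
print(f"pMIC vs usage: rho={rho:.2f}, p={p:.2g}")
```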
Results and Conclusions
An upward trend in the aggregate pMIC data was observed with good statistical support (τ = 0.67; p =
1.54E-8); this was also seen in the series' trend component (τ = 0.63; p ≤ 2.22E-16). The ARIMA-based
predictive model suggested problematic isolate rates of 228 and 254 by Dec 2015 and 2016,
respectively, exceeding 2014's mean rate of 205 by 11% and 24% (see Figure 1). A downward
trend in the usage data was seen, both in the aggregate series (τ = -0.56; p = 8.77E-5) and its trend
component (τ = -0.86; p ≤ 2.22E-16). Figure 2 presents trend lines for the pMIC and usage data, which
were negatively correlated (ρ = -0.83; p < 2.2E-16). Vancomycin consumption does not seem to explain
vancomycin MIC trends in S. aureus, thereby warranting study of the roles of other antibiotics.
Figure 1: ARIMA-based forecast of problematic isolate rates (image not reproduced).
Figure 2: Trend lines for the pMIC and usage data (image not reproduced).
Multi-Site MRSA Colonization: Single Strains Predominate but Multiple Strains More Likely with
Healthcare-Associated vs. Community-Associated MRSA
Michael S. Calderwood, MD, MPH1,2, Christopher A. Desjardins, PhD3, Michael Feldgarden, PhD4,
Yonatan Grad, MD PhD2,5, Diane Kim, BS6, Raveena Singh, MS6, Samantha J. Eells, MPH7,8, James A.
McKinnell, MD7,9,10, Steven Park, MD PhD6, Ellena Peterson, PhD11, Loren G. Miller, MD MPH7,10 and
Susan S. Huang, MD MPH6, (1)Harvard Medical School and Harvard Pilgrim Health Care Institute,
Department of Population Medicine, Boston, MA, (2)Brigham and Women's Hospital, Division of
Infectious Diseases, Boston, MA, (3)Broad Institute of MIT and Harvard, Cambridge, MA, (4)National
Center for Biotechnology Information, National Institutes of Health, Bethesda, MD, (5)Center for
Communicable Disease Dynamics, Department of Epidemiology, Harvard School of Public Health,
Boston, MA, (6)Division of Infectious Diseases and Health Policy Research Institute, University of
California, Irvine School of Medicine, Irvine, CA, (7)Infectious Disease Clinical Outcomes Research Unit
(ID-CORE), Division of Infectious Disease, Los Angeles Biomedical Research Institute at Harbor-University
of California Los Angeles (UCLA) Medical Center, Torrance, CA, (8)Department of Epidemiology, UCLA
Fielding School of Public Health, Los Angeles, CA, (9)Torrance Memorial Medical Center, Torrance, CA,
(10)David Geffen School of Medicine, Los Angeles, CA, (11)Department of Pathology and Laboratory
Medicine, University of California, Irvine School of Medicine, Irvine, CA
Background
MRSA colonization at multiple body sites is associated with a higher risk of MRSA infection. In addition,
non-nasal colonization is important in transmission. This study compares the genetic relatedness of
MRSA isolates collected concurrently from nasal and non-nasal sites.
Methods
We analyzed isolates from Project CLEAR, a clinical trial of patients with MRSA carriage or infection
during a hospital stay. Participants were randomized to post-discharge education plus 6 months of serial
decolonization with 5-day courses of chlorhexidine baths and nasal mupirocin 2x/month (intervention)
vs. education alone (control). Participants had swabs of their nares, throat, skin, and any wound at
recruitment and months 1, 3, 6, and 9. Isolates from those completing all visits by June 2013 were
sequenced and assayed for single nucleotide polymorphism (SNP) variation. Multiple site isolates from
the same visit were compared. Isolates differing by ≤10 SNPs were considered the same.
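The matching rule is easy to make concrete; this sketch assumes a hypothetical variant encoding (SNP distance as the symmetric difference of variant calls against a shared reference) and is not the study's pipeline:

```python
# Sketch: two isolates are called the same strain when their pairwise SNP
# distance is <= 10. Variants are encoded as (position, alt_allele) tuples.
SAME_STRAIN_THRESHOLD = 10

def snp_distance(a: set, b: set) -> int:
    return len(a ^ b)              # variants present in one isolate but not the other

def same_strain(a: set, b: set) -> bool:
    return snp_distance(a, b) <= SAME_STRAIN_THRESHOLD

nares = {(1021, "A"), (50432, "T"), (77810, "G")}
wound = {(1021, "A"), (50432, "T"), (91007, "C")}
print(same_strain(nares, wound))   # True: distance is 2
```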
Results
974 MRSA isolates were sequenced from 328 participants, of which 318 (97%) had at least one positive
MRSA nasal culture. Of these 318 participants, 36 (11%) were colonized concurrently in both the nares
and at least one non-nasal site. Over the course of the study, we collected a total of 88 paired isolates
from these 36 participants, 21 pairs from 12 participants in the intervention group and 67 pairs from
24 participants in the control group. We found the same MRSA strain in 80% of nares vs. throat
isolates (n=35, 95% CI 67-93%), 84% of nares vs. skin isolates (n=40, 95% CI 73-95%), and 78% of nares
vs. wound isolates (n=9, 95% CI 51-100%). Clonal complex 8 (USA 300/500) strains were more likely to
match than CC5 (USA 100) strains (96% vs. 71%, p=0.04) when comparing nares vs. skin isolates. This
difference between CC8 and CC5 strains was also present, albeit not significant, when comparing nares
vs. throat isolates (88% vs. 74%, p=0.42).
Conclusion
We found a high degree of genetic relatedness between MRSA strains isolated concurrently from nasal
and non-nasal body sites on the same patient, but strains isolated from the skin were more likely to
differ from those isolated from the nares for hospital-associated MRSA (CC5) compared to community-associated MRSA (CC8).
Risk Factors for Persistent Colonization with Methicillin-Resistant Staphylococcus aureus
Valerie C Cluzet, MD1, Pam Tolomeo, MPH1, Jeffrey S Gerber, MD, PhD, MSCE2, Susan E. Coffin, MD,
MPH2, Theoklis E Zaoutis, MD, MS2, Irving Nachamkin, DrPH, MPH1, Warren B. Bilker, PhD1 and Ebbing
Lautenbach, MD, MPH, MS1, (1)University of Pennsylvania Perelman School of Medicine, Philadelphia,
PA, (2)Perelman School of Medicine at the University of Pennsylvania and the Children's Hospital of
Philadelphia, Philadelphia, PA
Background
Colonization with methicillin-resistant Staphylococcus aureus (MRSA) can be classified as intermittent or
persistent. Persistent carriers have a higher risk of developing infection and may serve as a source of
transmission.
Objective
To identify factors associated with persistent MRSA colonization
Methods
We conducted a prospective cohort study between 1/1/2010 and 12/31/2012 at five adult and pediatric
academic medical centers. Adults and children presenting to ambulatory settings with an acute
community-onset MRSA skin and soft tissue infection (SSTI) (i.e. index cases), along with their household
members, performed self-sampling for MRSA colonization every two weeks for six months. Clearance of
colonization was defined as two consecutive sampling periods with negative cultures. Subjects who did
not meet this definition by the end of the study period were considered persistently colonized. Subjects
with persistent MRSA colonization were compared to those without on the basis of demographics,
antibiotic exposure and household member colonization status at baseline. Factors associated with
persistent colonization were determined using logistic regression analysis.
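The clearance rule translates directly into a small classifier; a sketch with hypothetical biweekly culture sequences (not the study's code):

```python
# Sketch: clearance = two consecutive negative sampling periods; subjects who
# never meet the rule by the end of follow-up are classified as persistent.
def is_persistent(cultures):
    """cultures: biweekly MRSA results over six months (True = positive)."""
    run_of_negatives = 0
    for positive in cultures:
        run_of_negatives = 0 if positive else run_of_negatives + 1
        if run_of_negatives == 2:
            return False           # cleared colonization
    return True                    # persistently colonized

print(is_persistent([True, True, False, True, False, False]))  # False (cleared)
print(is_persistent([True, False, True, False, True, False]))  # True (persistent)
```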
Results
A total of 243 index cases were enrolled, of which 48 (19.8%) met the definition of persistent
colonization. In multivariable analyses, persistent colonization was associated with older age (odds ratio
(OR) for each 10 years, 1.19, 95% confidence interval (CI), 1.03-1.37), prescription of mupirocin within
14 days of SSTI diagnosis (OR, 2.38, 95% CI, 1.08-5.27) and increased proportion of household members
with MRSA colonization at baseline (OR for each 10%, 1.15, 95% CI, 1.02-1.29). Conversely, subjects
with persistent colonization were less likely to have been treated with clindamycin for the MRSA SSTI
(OR, 0.32, 95% CI, 0.15-0.70).
Conclusions
Nearly 20% of patients with MRSA SSTI will remain persistently colonized. MRSA colonization among
household members may contribute to persistence of colonization, indicating a possible role for total
household decolonization. The specific effect of clindamycin on MRSA colonization needs to be further
elucidated.
Determination of Risk Factors for Recurrent Methicillin-resistant Staphylococcus aureus Bacteremia in a
Veterans Affairs Healthcare System Population
Justin Albertson, MS1, Jennifer McDanel, PhD2, Ryan Carnahan, PharmD MS1, Elizabeth Chrischilles, PhD1,
Eli Perencevich, MD, MS3, Michihiko Goto, MD, MSCI4, Lan Jiang, MS5, Bruce Alexander,
PharmD5 and Marin Schweizer, PhD6, (1)University of Iowa, Iowa City, IA, (2)601 Highway 6 West 152,
University of Iowa, Iowa City, IA, (3)Iowa City VA Medical Center, Iowa City VA Medical Center, Iowa City,
IA, (4)SW54-GH, University of Iowa Hospitals and Clinics, Iowa City, IA, (5)Iowa City VA Medical Center,
Iowa City, IA, (6)University of Iowa Carver College of Medicine, Iowa City, IA
Background
Recurrent methicillin-resistant Staphylococcus aureus (MRSA) bacteremia is associated with significant
morbidity and mortality and is a potential target for quality improvement. We aimed to identify
important risk factors for recurrent MRSA to assist clinicians in identifying high-risk patients for
continued surveillance and follow-up.
Method
In this retrospective cohort study, we examined patients with MRSA bacteremia at 122 VA medical
facilities from 2003 to 2010. Recurrent bacteremia was identified by a positive blood culture between 2
days and 180 days after index hospitalization discharge. Subset analyses were performed to evaluate
risk factors for early-onset recurrence (2-60 days after discharge) and late-onset recurrence (61-180
days after discharge). An additional subset analysis was performed in which the time period of
recurrence was limited to 7-180 days after the index discharge. Risk factors were evaluated using Cox
proportional hazards regression.
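A model of this kind can be fit with the lifelines library; the sketch below uses synthetic data and assumed column names, not the study's dataset:

```python
# Sketch: Cox proportional hazards for time to recurrent MRSA bacteremia,
# with follow-up censored at 180 days after index discharge.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "days_to_event": rng.integers(2, 181, n),       # event or censoring time
    "recurrence": rng.integers(0, 2, n),            # 1 = recurrent bacteremia
    "bacteremia_duration": rng.integers(1, 15, n),  # days, index episode
    "vancomycin_only": rng.integers(0, 2, n),
    "chf": rng.integers(0, 2, n),                   # congestive heart failure
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_event", event_col="recurrence")
cph.print_summary()                                 # hazard ratios per risk factor
```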
Results
A total of 1,159 (6.3%) out of 18,425 patients had recurrent MRSA bacteremia. The median time to
recurrence was 63 days. Longer duration of index bacteremia, increased severity of illness, receipt of
only vancomycin, community-acquired infection, and several comorbidities were risk factors for
recurrence. Congestive heart failure, hypertension, and rheumatoid arthritis/collagen disease were risk
factors for early-onset recurrence but not late-onset recurrence. Geographic region and cardiac
arrhythmias were risk factors for late-onset recurrence but not early-onset recurrence. Results did not
change when the limited time period of recurrence was used.
Conclusion
Risk factors for recurrent MRSA bacteremia included comorbidities, severity of illness, duration of
bacteremia and receipt of only vancomycin. Awareness of risk factors may be important at patient
discharge for implementation of quality improvement initiatives including surveillance, follow-up, and
education for high-risk patients.
May 16, 2015 3:30 - 4:30pm Nutcracker Ballroom 3
Healthcare-Associated Infections in U.S. Nursing Homes: Results from a Prevalence Survey Pilot
Lisa LaPlace, MPH1, Lauren Epstein, MD1, Deborah Thompson, MD, MSPH, FACPM2, Ghinwa Dumyati,
MD3, Cathleen Concannon, MPH4, Gail Quinlan, RN, MS, CIC5, Jane Harper, MS, RN6, Linn Warnke6, Ruth
Lynfield, MD6, Meghan Maloney, MPH7, Richard Melchreit, MD7, Nimalie D. Stone, MD, MS1 and Nicola
Thompson, PhD, MS1, (1)Centers for Disease Control and Prevention, Atlanta, GA, (2)New Mexico
Department of Health, Santa Fe, NM, (3)Department of Health, University of Rochester, Rochester, NY,
(4)Emerging Infections Program, Rochester, NY, (5)CCH - Emerging Infections Program, Rochester, NY,
(6)Minnesota Department of Health, Saint Paul, MN, (7)Connecticut Department of Public Health,
Hartford, CT
Background
The national burden of healthcare-associated infections (HAIs) among residents of US nursing homes
(NHs) is unknown. Lack of detailed medical record documentation and surveillance capacity are barriers
to performing infection surveillance in NHs. Our objective was to pilot the point prevalence survey (PPS)
method for HAI surveillance using the CDC's Emerging Infections Program (EIP).
Methods
A single-day PPS of HAIs was conducted in 9 NHs from 4 EIP sites. In each NH, facility staff collected resident demographic and risk factor data on the survey date. EIP staff conducted retrospective chart reviews of residents to collect lab test data and infection signs and symptoms. HAIs were defined using revised 2012 McGeer definitions. HAI prevalence per 100 residents, overall and by resident characteristics, was calculated.
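Using the counts reported in the Results, the point-prevalence arithmetic looks like this in Python (the interval method shown, Wilson, is an assumption; the abstract does not state which was used):

```python
# Sketch: HAI point prevalence per 100 residents with a binomial 95% CI.
from statsmodels.stats.proportion import proportion_confint

residents_with_hai, eligible = 67, 1272            # counts from the Results
prevalence = 100 * residents_with_hai / eligible
lo, hi = proportion_confint(residents_with_hai, eligible,
                            alpha=0.05, method="wilson")
print(f"{prevalence:.1f} per 100 residents (95% CI {100*lo:.1f}-{100*hi:.1f})")
```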
Results
Among 1272 eligible residents (median age 85 years, 30% male, 14% short stay, 8% device use), the HAI
prevalence was 5.3 per 100 residents (95% CI 4.0-6.4%) with 67 residents having 70 HAIs. Most were gastrointestinal tract (n=26; 37%), skin, soft tissue and mucosal (n=21; 30%), or respiratory tract (n=16; 23%) infections. Most HAIs (70%) were in long-stay residents without devices, but HAI prevalence was higher in short-stay residents and those with devices (Table). Device use was 3 times higher among short-stay (19.1%) than long-stay residents (6.2%); when stratified by resident stay, device use remained associated with HAIs in short-stay (p=0.016), but not long-stay residents (p=0.549).
Resident characteristic, n                  HAI prevalence   Chi-square p-value
Stay: Long, 1089                            4.7              0.031
      Short, 183                            8.7
Device: No, 1170                            4.8              0.018
        Yes*, 102                           10.8
   Indwelling urinary catheter, 55          16
   Vascular device, 25                      16
   Ventilator or tracheostomy, 3            0
   PEG/J tube, 30                           0
Conclusion
Given a prevalence of 5.3%, there is a large burden of HAIs among the ~1.5 million residents receiving care in NHs each day. Device use was significantly associated with HAIs in short-stay but not long-stay residents, indicating that the epidemiology and prevention of HAIs differ between these groups. These findings will
be used to inform surveillance efforts to estimate HAI burden in NHs.
Transmission of MRSA to Healthcare Personnel Gowns and Gloves during Care of Nursing Home
Residents
Mary-Claire Roghmann, MD, MS1, J. Kristie Johnson, PhD1, John D. Sorkin, MD, PhD1, Patricia
Langenberg, PhD1, Alison Lydecker, MPH1, Brian Sorace, BS1, Lauren Levy, JD, MPH1and Lona Mody, MD,
MSc2, (1)University of Maryland School of Medicine, Baltimore, MD, (2)University of Michigan School of
Medicine, Ann Arbor, MI
Objective
To estimate the frequency of MRSA transmission to gowns and gloves worn by healthcare personnel
(HCP) interacting with nursing home residents in order to inform infection prevention policies in this
setting
Design
Observational study
Setting and Participants
Residents and HCP from 13 community-based nursing homes in Maryland and Michigan
Methods
Residents were cultured for MRSA at the anterior nares and perianal or perineal skin. HCP wore gowns
and gloves during usual care activities. At the end of each activity, a research coordinator swabbed the
HCP’s gown and gloves.
Results
403 residents were enrolled; 113 were MRSA colonized. Glove contamination was higher than gown
contamination (24% vs. 14% of 954 interactions, p<0.01). Transmission varied greatly by type of care
from 0% to 24% for gowns and 8% to 37% for gloves. We identified high-risk activities (OR >1.0, p<0.05), including dressing, transferring, providing hygiene, changing linens, and toileting the resident. We identified low-risk activities (OR <1.0, p<0.05), including giving medications and performing glucose monitoring. Residents with chronic skin breakdown had significantly higher rates of gown and glove
contamination.
Conclusions
MRSA transmission from MRSA-positive residents to HCP gowns and gloves is substantial, with high-contact activities of daily living conferring the highest risk. These activities do not involve overt contact with body fluids, skin breakdown, or mucous membranes, suggesting the need to modify current standards of care involving the use of gowns and gloves in this setting.
Risk Factors for Infection or Colonization with Carbapenem-Resistant Klebsiella pneumoniae in a Long-term Acute Care Hospital
John P Mills, MD1, Naasha J Talati, MD, MSCR1, Kevin Alby, PhD2 and Jennifer Han, MD, MSCE1,
(1)University of Pennsylvania, Philadelphia, PA, (2)University of Pennsylvania Health System,
Philadelphia, PA
Background
Carbapenem-resistant Klebsiella pneumoniae (CRKP) infections are associated with high mortality rates
and have become increasingly prevalent in long-term acute care hospitals (LTACHs). Few data exist on
the epidemiology of CRKP in the LTACH setting. The objective of this study was to identify risk factors
for infection or colonization with CRKP in LTACH patients.
Methods
A case-control study was conducted in a university-affiliated LTACH from July 2008 to July 2014. All
patients with clinical cultures positive for CRKP were included as cases, and all patients with clinical
cultures with carbapenem-susceptible Klebsiella pneumoniae served as controls. A multivariable logistic
regression model was developed to identify risk factors for infection or colonization with CRKP.
Results
Ninety-nine (45%) of 222 patients with a K. pneumoniae clinical culture during the six-year study period
were colonized or infected with CRKP. The median age of study patients was 71 years and 48% were
male. Mean length of stay prior to culture isolation was 12.3 days and 9.3 days for cases and controls,
respectively (P=0.32). The sources of culture for cases were: 56 (57%) respiratory, 40 (40%) urine, and 3 (3%) blood. The sources of culture for controls were: 69 (56%) urine, 42 (34%) respiratory, 10 (8%) blood, and
2 (2%) wound. On multivariable analyses, the following risk factors for CRKP infection or colonization
were identified: solid organ or hematopoietic stem cell transplantation (odds ratio [OR] 5.1, 95%
confidence interval [CI] 1.22 – 20.79, P=0.03), mechanical ventilation (OR 2.6, 95% CI 1.24 –
5.28, P=0.01), fecal incontinence (OR 5.8, 95% CI 1.52 – 22.02, P=0.01), meropenem use (OR 3.6, 95% CI 1.04 – 12.07, P=0.04), vancomycin use (OR 2.9, 95% CI 1.18 – 7.32, P=0.02), and metronidazole use (OR 4.2, 95% CI 1.28 – 13.95, P=0.02).
Conclusion
Nearly half of all study LTACH patients with infection or colonization with K. pneumoniae had cultures
with CRKP. Patients with prior transplantation, mechanical ventilation, fecal incontinence, and exposure
to meropenem, vancomycin, and metronidazole represent a population in LTACHs at increased risk for
infection or colonization with CRKP. These high-risk groups may represent populations who merit active
surveillance for CRKP.
Designing a 'change-score' metric to characterize antibiotic-resistant Gram-negative bacilli (R-GNB)
colonization in post-acute care facilities
Sara E McNamara, MPH1, Mohammed Kabeto, MA1, Lillian Min, MD1 and Lona Mody, MD2, (1)University
of Michigan, Ann Arbor, MI, (2)University of Michigan and Ann Arbor VA Healthcare System, Ann Arbor,
MI
Background
Prevalence of resistant Gram-negative bacilli (R-GNB) surpasses methicillin-resistant Staphylococcus
aureus and vancomycin-resistant enterococci rates in post-acute and long-term care facilities,
particularly among residents with indwelling devices. R-GNB colonization is often transient, and there
are many different species of R-GNB, which makes understanding this diverse group of organisms
challenging. We sought to design a ‘change-score’ to characterize this dynamic nature of R-GNB
colonization.
Methods
As a part of the Targeted Infection Prevention cluster-randomized study, we evaluated the presence of
R-GNB among residents with indwelling devices in 12 facilities. Cultures from multiple anatomic sites
were obtained at enrollment, at 15 days, and every 30 days thereafter. The gain or loss of each R-GNB
species at each follow-up visit was scored as 1 change. The change-score was calculated as the total
number of changes occurring during follow-up.
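The change-score reduces to a symmetric-difference count over consecutive visits; a minimal sketch with hypothetical species sets (not the study's code):

```python
# Sketch: each R-GNB species gained or lost between consecutive visits
# counts as one change; the change-score is the total over follow-up.
def change_score(visits):
    """visits: set of R-GNB species detected at each serial culture visit."""
    return sum(len(prev ^ curr) for prev, curr in zip(visits, visits[1:]))

visits = [
    {"E. coli"},                      # enrollment
    {"E. coli", "P. aeruginosa"},     # day 15: gained one species (+1)
    set(),                            # day 45: lost two species (+2)
    {"K. pneumoniae"},                # day 75: gained one species (+1)
]
print(change_score(visits))           # 4 -> falls in the ">=4 changes" group
```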
Results
304 residents with indwelling devices were cultured at more than 1 visit (34,567 follow-up days). 112
(37%) residents had no change in R-GNB colonization during follow-up (100 (33%) were never colonized,
12 (4%) were persistently colonized over the entire follow-up); 102 (34%) had 1 to 3 R-GNB changes, and
90 (30%) had ≥4 changes. Residents with ≥4 R-GNB changes were younger, more functionally impaired, and had a longer follow-up time; they were also more likely to have multiple indwelling devices, a history of R-GNB, and antibiotic use, infection, hospitalization, or wounds during follow-up. Adjusting for facility-level
clustering and follow-up time, age (RR 0.98, 95% CI (0.97, 0.99), p<0.001) and functional status (RR 1.06,
95% CI (1.01, 1.11), p=0.01) were associated with R-GNB change-scores.
Conclusions
We describe a metric to further characterize R-GNB epidemiology in post-acute and long-term care
facilities. Using this change-score, we show that R-GNB colonization is dynamic with multiple episodes of
R-GNB gain and loss. The prevalence of common R-GNB risk factors increased with increasing change-score. This
change-score metric may be useful in designing infection prevention strategies in this frail and
functionally impaired population with prolonged stays in an institutional setting.
May 17, 2015 8:30 - 10:00am Fantasia Ballroom G
Impact of Service Age on the Effectiveness of Reprocessing for Flexible Gastrointestinal Endoscopes
Marco Bommarito, PhD1,2, Julie Stahl, BS2 and Dan Morse, MS2, (1)3M, St. Paul, MN, (2)3M Infection
Prevention Division, St. Paul, MN
Background
The effectiveness of cleaning and disinfecting patient-used endoscopes was characterized by measuring adenosine triphosphate (ATP), colony-forming units (CFUs), and protein. The study included two phases and an intervention. The intervention consisted of replacing all scopes in the inventory (in service for many years) with brand-new devices.
Method
For both phases, the concentrations of three markers in a total of 27 colonoscopes and 28 gastroscopes
reprocessed (using the same protocol) in the endoscopy suite of an East Coast hospital were measured at three points during reprocessing: pre-manual cleaning (after bedside flushing), post-manual cleaning, and after high-level disinfection (HLD). ATP levels in Log(RLUs) (relative light units) were determined
using a luminometer. Protein concentrations in mg/mL were determined using a BCA assay. Total
aerobic Log(CFUs) were determined by culture.
Result
Figure 1 (Phase 1, old endoscopes) and Figure 2 (Phase 2, new endoscopes) show the mean amounts of
ATP in Log(RLUs) (A), Log(CFUs) (B) and protein in mg/mL (C) found for each reprocessing step by type of
scope. Also shown are the Log reductions for the ATP and CFU components achieved for manual
cleaning and HLD (D, asterisks show statistically significant reductions). Error bars are one standard
error and raw data values are in parentheses. After manual cleaning, Phase 2 endoscopes had 0.58 logs lower ATP levels than Phase 1 endoscopes (p<0.0001); similarly, Phase 2 scopes had 0.94 logs lower CFU contamination than scopes in Phase 1 (p<0.0001). Furthermore, 95%
of the endoscopes were culture positive after manual cleaning in Phase 1 compared to 68% in Phase
2. After HLD, Phase 2 scopes had 0.44 logs lower ATP levels than Phase 1 endoscopes (p<0.0001). 14%
of the endoscopes were culture positive after HLD in Phase 1 compared to 9% in Phase 2. Protein
results were highly correlated with the ATP results for each process step.
Conclusion
Replacing endoscopes after many years of service with brand-new devices resulted in significant improvements in the effectiveness of reprocessing, indicating that device age is a significant variable in the outcome of this workflow.
Defining Ventilator Associated Conditions in Neonates & Children
Noelle Cocoros, DSc, MPH1, Ken Kleinman, ScD1, Gregory P. Priebe, MD2, James E Gray, MD, MS3, Gitte
Larsen, MD, MPH4, Latania Logan, MD5, Philip Toltzis, MD6, Sarah B. Klieger, MPH7, Michael Klompas,
MD1,8 and Grace Lee, MD, MPH1,9, (1)Harvard Pilgrim Health Care Institute & Harvard Medical School,
Boston, MA, (2)Children's Hospital Boston, Boston, MA, (3)Beth Israel Deaconess Medical Center,
Boston, MA, (4)University of Utah Healthcare, Salt Lake City, UT, (5)Rush University Medical Center,
Chicago, IL, (6)Rainbow Babies & Children's Hospital, Cleveland, OH, (7)Children's Hospital of
Philadelphia, Philadelphia, PA, (8)Brigham and Women's Hospital, Boston, MA, (9)Boston Children's
Hospital, Boston, MA
Background
National surveillance definitions for ventilator-associated conditions (VAC) in adults were developed and
implemented beginning in 2013. We sought to develop a pediatric VAC (PVAC) definition suitable for use
in neonates and children by exploring the ability of potential definitions to discriminate patients with
worse outcomes.
Methods
We retrospectively identified consecutive children ≤18 years ventilated for ≥1 day in pediatric, cardiac or
neonatal ICUs in 5 US hospitals. We evaluated alternative thresholds for increases in daily minimum
FIO2 (by 0.20, 0.25, 0.30) or daily minimum mean airway pressure (MAP) (by 4, 5, 6, or 7 cm H2O). As with
the adult definition, we sought increases sustained for ≥2 days after ≥2 days of stable or decreasing
settings. We matched up to 4 patients without a PVAC to each patient with a PVAC and used Cox
proportional hazard models with frailties to estimate the association between hospital mortality, length
of stay and duration of ventilation with PVAC, stratified by ICU type.
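The threshold screening step can be sketched as follows (thresholds from the abstract; the exact baseline convention is an assumption, and this is not the study's implementation):

```python
# Sketch: flag a PVAC when one daily-minimum setting (FiO2 or MAP) rises by
# at least `rise` for >= 2 days after >= 2 days of stable/decreasing settings.
from typing import Optional

def pvac_onset(daily_min: list, rise: float) -> Optional[int]:
    """Return the index of the first PVAC day, or None."""
    for d in range(2, len(daily_min) - 1):
        stable = daily_min[d - 1] <= daily_min[d - 2]    # baseline stable/decreasing
        baseline = daily_min[d - 2]                      # first baseline day (assumed anchor)
        if stable and all(v >= baseline + rise for v in daily_min[d:d + 2]):
            return d
    return None

fio2 = [0.40, 0.40, 0.35, 0.60, 0.60, 0.50]              # daily minimum FiO2
print(pvac_onset(fio2, rise=0.20))                       # 3: sustained 0.20 rise
```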
Results
Our cohort included 10,209 hospitalizations and 77,751 ventilator days from 4 NICUs, 4 PICUs and 3
CICUs. All PVAC definitions were significantly associated with greater risk for death. Hazard ratios (HR)
for death varied from 1.6 to 6.8 depending on thresholds and ICU type. PVAC was also associated with
prolonged hospitalization and ICU stay, particularly for PICU and CICU patients, and lower hazard for
extubation (i.e. longer duration of ventilation; HR for extubation, 0.23 to 0.59) among survivors. For the
FIO2 0.20/MAP 4 definition (i.e., an increase in minimum daily FIO2 by 0.20 or MAP by 4), PVAC rates in the full cohort ranged from 3.3 to 4.6 per 1,000 ventilator days depending on the ICU type; in comparison, the FIO2 0.30/MAP 7 definition yielded PVAC rates of 1.1 to 1.3 per 1,000 ventilator days.
Conclusions
Pediatric patients with PVAC are at substantially higher risk for mortality and morbidity in PICUs, CICUs
and NICUs, regardless of thresholds used. Tradeoffs in sensitivity, specificity and opportunities for
improvement will need to be considered when selecting a PVAC definition for use in national
surveillance.
Genomic analysis demonstrates that decolonization with chlorhexidine and mupirocin reduces both
colonization with existing MRSA strains and re-colonization with new MRSA strains
Michael S. Calderwood, MD, MPH1,2, Christopher A. Desjardins, PhD3, Michael Feldgarden, PhD4,
Yonatan Grad, MD PhD1,5, Diane Kim, BS6, Raveena Singh, MS6, Samantha J. Eells, MPH7,8, James A.
McKinnell, MD7,9,10, Steven Park, MD PhD6, Ellena Peterson, PhD11, Loren G. Miller, MD MPH7,10 and
Susan S. Huang, MD MPH6, (1)Brigham and Women's Hospital, Division of Infectious Diseases, Boston,
MA, (2)Harvard Medical School and Harvard Pilgrim Health Care Institute, Department of Population
Medicine, Boston, MA, (3)Broad Institute of MIT and Harvard, Cambridge, MA, (4)National Center for
Biotechnology Information, National Institutes of Health, Bethesda, MD, (5)Center for Communicable
Disease Dynamics, Department of Epidemiology, Harvard School of Public Health, Boston, MA,
(6)Division of Infectious Diseases and Health Policy Research Institute, University of California, Irvine
School of Medicine, Irvine, CA, (7)Infectious Disease Clinical Outcomes Research Unit (ID-CORE), Division
of Infectious Disease, Los Angeles Biomedical Research Institute at Harbor-University of California Los
Angeles (UCLA) Medical Center, Torrance, CA, (8)Department of Epidemiology, UCLA Fielding School of
Public Health, Los Angeles, CA, (9)Torrance Memorial Medical Center, Torrance, CA, (10)David Geffen
School of Medicine, Los Angeles, CA, (11)Department of Pathology and Laboratory Medicine, University
of California, Irvine School of Medicine, Irvine, CA
Background
MRSA carriers are at higher risk of MRSA infection. Decolonization with chlorhexidine and mupirocin
decreases MRSA colonization.
Methods
We examined the genetic relatedness of MRSA nasal isolates collected from Project CLEAR, a
randomized clinical trial of recently hospitalized MRSA carriers and MRSA infected patients. Participants
were randomized to education alone or education plus 6 months of serial (twice monthly)
decolonization with 5 days of chlorhexidine bathing and nasal mupirocin, and had multiple sites
swabbed for MRSA culture at recruitment and follow-up visits at 1, 3, 6, and 9 months. We analyzed
nasal isolates from patients who completed all visits through June 2013. MRSA isolates underwent
whole genome sequencing and were assayed for single nucleotide polymorphism (SNP) variation.
Isolates that differed by ≤10 SNPs were considered to represent the same MRSA strain. Fisher's exact
tests were used to compare differences between groups.
Results
We found MRSA nares colonization at enrollment in 112/235 (48%) of the intervention group and
130/244 (53%) of the control group (p = 0.24). At the 1-month visit, 18 (16%) in the intervention group
were colonized with the same MRSA strain in their nares, compared with 53 (41%) in the control group
(p <0.01, Figure). In addition, 7% of patients in both groups were colonized with a new MRSA strain (8 in
the intervention group, 10 in the control group). By the 6-month visit, 7 (6%) in the intervention group
were colonized with the same MRSA strain in their nares, compared with 37 (28%) in the control group
(p <0.01). The number of patients colonized with a new MRSA strain was 10 (9%) in the intervention
group, compared with 26 (20%) in the control group (p-value 0.02). There was no difference in same-strain nasal carriage in the intervention group 3 months after stopping the decolonization intervention
(13 patients at month 9 vs. 7 patients at month 6, p-value 0.24).
Conclusion
Decolonization with chlorhexidine/mupirocin reduced same-strain MRSA nares colonization compared
to controls in a long-term trial of MRSA carriers. No significant rebound in same-strain colonization
occurred 3 months after stopping decolonization. In addition, decolonization reduced nares colonization
with new MRSA strains.
Comparison of explicit criteria for determining appropriateness of antibiotic prescribing in nursing homes
Christopher Crnich, MD, PhD1,2, Jill Miller1, Mozhdeh Bahrainian, MS1 and Sowmya Adibhatla1,
(1)University of Wisconsin School of Medicine & Public Health, Madison, WI, (2)William S. Middleton
Memorial VA Hospital, Madison, WI
Background
Antibiotic use is a major driver of antibiotic resistance in nursing homes (NHs) and there is increasing
interest in understanding how much antibiotic overuse contributes to this problem. Many studies of
appropriateness of antibiotic prescribing in NHs have employed rule-based criteria. The explicit criteria
most commonly employed in these studies are the McGeer and Loeb criteria. To our knowledge, level of
agreement between these two sets of criteria is unknown.
Method
We performed a comparative analysis of the McGeer and Loeb criteria using data abstracted from the
health records in five community NHs. Antibiotic courses initiated in the hospital and emergency
department (ED) were excluded from further analysis. Appropriateness of antibiotics initiated for a
urinary tract infection (UTI), respiratory tract infection (RTI) and skin/soft tissue infection (SSTI)
indication was assessed separately using both the McGeer and Loeb criteria. The overall level of
agreement between the two criteria was assessed using kappa statistics, stratified by indication for
initiating antibiotics (UTI vs. RTI vs. SSTI).
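Agreement between two sets of binary determinations is conventionally computed as Cohen's kappa; a minimal sketch with hypothetical appropriateness labels:

```python
# Sketch: kappa between McGeer and Loeb appropriateness calls for the
# same antibiotic courses (1 = appropriate, 0 = not).
from sklearn.metrics import cohen_kappa_score

mcgeer = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]
loeb   = [1, 1, 0, 1, 1, 1, 0, 1, 1, 0]
print(f"kappa = {cohen_kappa_score(mcgeer, loeb):.2f}")
```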
Result
524 of the observed 1108 antibiotic courses (47.3%) were initiated in the hospital or ED. These events,
as well as 60 antibiotic courses (5.4%) prescribed for non-applicable conditions, were excluded from
further analysis. Of the 504 evaluable antibiotic courses initiated in the NH, resident signs and symptoms
meeting either the McGeer or Loeb criteria were documented in less than half the cases (48.2%). Overall
level of agreement between the two sets of explicit criteria was low (k = 0.35) but varied by prescribing
indication: 1) UTI (k = 0.56); 2) SSTI (k = 0.35); and 3) RTI (k = 0.258). The greatest discrepancy in
appropriateness was seen with SSTI where 68% of events were appropriate when using Loeb criteria but
only 31% were appropriate when using McGeer criteria.
Conclusion
Documentation supporting the appropriateness of antibiotic prescribing in NHs is frequently absent,
regardless of explicit criteria employed. However, overall level of agreement between the McGeer and
Loeb criteria is low. Delineating which set of criteria has greater utility for monitoring quality of
antibiotic prescribing in NHs requires additional study.
Comparing Hospital-Onset Bacteremia to Central Line-Associated Bloodstream Infection as a Hospital Quality Measure: Is It Time for a New Quality Measure?
Clare Rock, MD, MS1, Kerri Thom, MD, MS2, Anthony Harris, MD, MPH2, Shanshan Li, PhD3, Daniel
Morgan, MD, MS2, Aaron Milstone, MD MHS4, Brian Caffo, PhD5 and Surbhi Leekha, MBBS, MPH2,
(1)Department of Medicine, Division of Infectious Diseases, Johns Hopkins University, Baltimore, MD,
(2)Department of Epidemiology and Public Health, University of Maryland School of Medicine,
Baltimore, MD, (3)Department of Biostatistics, Indiana University Fairbanks School of Public Health,
Indianapolis, IN, (4)Johns Hopkins University, Baltimore, MD, (5)Department of Biostatistics, Bloomberg
School of Public Health, Johns Hopkins University, Baltimore, MD
Background
Central line associated bloodstream infection (CLABSI) is an important quality measure tied to hospital
reimbursement, yet the measure suffers from subjectivity and inter-rater variability. Further, with
decreases in CLABSI nationally, its power to discriminate between hospitals is potentially compromised.
A common scenario is hypothetical Hospital A’s reported CLABSI Standardized Infection Ratio (SIR) of 1.2
(95% Confidence Interval (CI) 0.8, 2.1), and Hospital B’s CLABSI SIR of 0.8 (95% CI 0.5, 1.3). With these
large overlapping CIs, how is it possible to discern if these SIRs and hospital infection prevention
performance are different from each other? The objective of this study was to evaluate hospital-onset
bacteremia (HOB), i.e., any positive blood culture obtained ≥48 hours post-admission, as a substitute for
CLABSI. We assessed the association between CLABSI and HOB rates in ICUs, and their power to
discriminate between ICUs. We hypothesized that HOB is associated with CLABSI, and has greater power
to discriminate between ICUs.
Methods
We conducted a multisite cohort study via the SHEA research network. Hospitals performing ICU CLABSI
surveillance provided monthly CLABSI and HOB rates for 2012 and 2013. A Poisson regression model was
used to assess the association between the two rates. We compared the power of the two measures to
discriminate between ICUs using SIRs with 95% CIs. A measure was defined as having greater power to
discriminate if more of the SIRs (with surrounding CIs) were different from 1 (the benchmark SIR).
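The discrimination rule can be made concrete with an exact Poisson interval for the SIR (standard formulas; the counts below are hypothetical, and the network's actual CI method is not specified here):

```python
# Sketch: SIR = observed/expected with an exact Poisson 95% CI; a unit
# "discriminates" when the CI excludes the benchmark SIR of 1.
from scipy.stats import chi2

def sir_with_ci(observed: int, expected: float, alpha: float = 0.05):
    lo = chi2.ppf(alpha / 2, 2 * observed) / 2 if observed else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2
    return observed / expected, lo / expected, hi / expected

sir, lo, hi = sir_with_ci(observed=12, expected=8.5)
print(f"SIR {sir:.2f} (95% CI {lo:.2f}-{hi:.2f}); "
      f"different from 1: {lo > 1 or hi < 1}")
```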
Results
80 ICUs from 16 hospitals in the US and Canada reported a total of 663 CLABSIs, 475,420 central line
days, 11,280 HOB and 966,757 ICU patient days. HOB was strongly associated with CLABSI; an absolute
change in HOB of 1 / 1,000 ICU patient days was associated with a 2.5% change in CLABSI rate (P<0.001).
For the 18 medical and neonatal ICUs in our study, the CLABSI SIR was different from 1 in 4/18 cases
(22%), compared to 12/18 cases (67%) for HOB (Fisher’s exact P= 0.02).
Conclusions
HOB and CLABSI rates are strongly associated. HOB has advantages over CLABSI, including greater power to discriminate between ICUs' performance. Consideration should be given to using HOB as an outcome measure for infection prevention quality.