Small Area Variation in the Public Health Response to H1N1
Glen P. Mays, PhD
Department of Health Policy and Management
Fay W. Boozman College of Public Health
University of Arkansas for Medical Sciences
Acknowledgements
Funded by CDC grant to NC Preparedness & Emergency Response Research Center
Also supported by RWJF Public Health PBRN Program
Coauthors and Collaborators
John Wayne, PhD, UAMS
Cammie Marti, MPH, UAMS
James Bellamy, MPH, UAMS
Mary Davis, DrPH, NC Institute for Public Health
Brittan Woods, MPH, NC Institute for Public Health
Edward L. Baker, MD, MPH, NC Institute for Public Health
North Carolina Division of Public Health
Local health department leaders and staff
Motivation
The 2009 national H1N1 outbreak provided a chance to explore variation in public health response
The role of accreditation was of particular interest
to our research team
Do accreditation and performance standards promote standardization, interoperability, and response capacity?
Two vehicles for study: PERRCs and PBRNs
PERRCs and PBRNs
[Map legend]
PBRN NCC
1st PBRN cohort (n=5)
2nd PBRN cohort (n=7)
PERRCs (n=11)
The Need for Practice-Based Research
Examine effectiveness, impact, and value in real-world practice settings rather than controlled or simulated settings
Position research to address information needs
and decision challenges of practicing professionals
Detect and characterize emerging issues rapidly
Accelerate the research-to-practice-and-back
translation cycle
The Challenges of Practice-Based Research on Preparedness
Few existing, valid, and reliable metrics
Public health emergencies as rare events
Heterogeneity in the scale, scope, and timing of events across communities
Heterogeneity in risk for events
Difficulties in collecting data during event response
Theory of action: preparedness & accreditation
H1N1 as a Stress Test
Common perceived threat facing local public health systems across the US
Large uncertainties about the magnitude, severity, and timing of the threat at local levels
Large uncertainties about the effectiveness and value of specific response strategies
Research Objectives
Describe the nature and timing of the public health response to the H1N1 outbreak in NC (and KY)
Test for differences in local response between
accredited and non-accredited health agencies
Identify factors that facilitated and inhibited
H1N1 response activities
Use findings to create After Action Reports
(AARs) and identify improvement areas for
public health agencies
Study Design & Methods
Observational, cross-sectional study of 9
communities selected to contrast accreditation
status
Structured interviews capture key elements of
the nature & timing of investigation & response
Factor analysis used to group survey items into
domains and construct composite measures of
scope and timing
Multivariate models used to test for differences by accreditation status, controlling for domain and community fixed effects
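The item-grouping step above can be sketched in code. This is a minimal illustration, not the study's actual instrument: the data are synthetic, and the item counts and number of domains are assumptions chosen for the example.

```python
# Sketch: using factor analysis to reduce many survey items to a few
# domain-level composite scores per community. Synthetic data; the
# study's actual items, domains, and loadings differ.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_communities, n_items, n_domains = 9, 20, 4

# Simulate item responses driven by a few latent domain factors
latent = rng.normal(size=(n_communities, n_domains))
loadings = rng.normal(size=(n_domains, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_communities, n_items))

fa = FactorAnalysis(n_components=n_domains, random_state=0)
scores = fa.fit_transform(items)  # one composite score per community per domain
item_domains = np.abs(fa.components_).argmax(axis=0)  # item-to-domain assignment

print(scores.shape)        # (9, 4)
print(item_domains.shape)  # (20,)
```

The factor scores serve as the composite measures of scope and timing that feed the multivariate models.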
Accreditation in North Carolina
Initial accreditation 2014
Initial accreditation 2013
Initial accreditation 2012
Initial accreditation 2011
Initial accreditation 2009, reaccreditation 2014
Initial accreditation 2008, reaccreditation 2013
Initial accreditation 2007, reaccreditation 2012
Initial accreditation 2006, reaccreditation 2011
Initial accreditation 2005, reaccreditation 2010
Initial accreditation 2004, reaccreditation 2009
SOURCE: http://nciph.sph.unc.edu/accred/about_nclhda/progress.htm
Study Communities
Accredited
Non‐Accredited
Measuring H1N1 Response
Nelson, Lurie, Wasserman, Zakowski 2007
Measuring H1N1 Response
Review existing assessment approaches
– Homeland Security Exercise and Evaluation Program
– CDC H1N1 Model Case Report Form
Brief Delphi process with preparedness experts
Desktop review of draft instrument
with leaders of five public health
PBRNs
Piloting and refinement in initial
study community
Measuring H1N1 Response
Initial closed-form responses obtained via
structured in-person interviews with designated
preparedness coordinators at local health
departments
On-site “look-back” focus groups held with full
range of LHD partners involved in H1N1
response effort:
– Confirm & refine closed-form responses
– Collect qualitative information on
response decision-making and action
Data collected August–September 2009
Survey Items

NUMBER OF ITEMS
DOMAIN                         SCOPE: was activity    TIMING: days
                               performed?             since outbreak*
Planning                       45                     ‐‐
Communication                  105                    14
Incident command               9                      4
Surveillance & Investigation   21                     6
Response and mitigation        27                     13
Total                          207                    37
*Initial outbreak onset defined as 15 April 2009; other timing measures based on local events (e.g., receipt of case report)
Example Survey Items
Item                                                                                  Pct/Mean
Planning: local plan was in place for enforcing isolation and quarantine orders       89% (scope)
Communication: physician guidelines were disseminated about acquisition of supplies   44% (scope)
Incident command: local EOC was activated                                             43% (scope)
Investigation: days to initiation of hospital case-finding activities                 16.5 days (timing)
Response: notification sent via health alert network                                  33% (scope)
Mitigation: contact notification initiated                                            67% (scope)
Analytic strategy
Problem: small # communities, large # measures
Desire to summarize patterns across measures, but also
maximize power to detect differences across
communities
Need signal enhancement and noise reduction
Several analytic strategies are possible:
– Factor analysis to group measures into domains
– Multiple Indicators Multiple Causes model
– IRT models
– Rasch models
– Bayesian hierarchical latent variable models
Analytic strategy: BHLV
Assume an unobserved, latent quality of H1N1 response exists in each community: θi
Assume multiple indicators of quality (q) are correlated within each of j domains of activity (Dj)
True value of performance on indicator q in community i is given by:

logit(tqi) = a0 + aq·θi + aj·Dj + aA·Ai + sq

where:
a0 = intercept/baseline rate for indicator
aq·θi = association between q and latent quality in community i
aj·Dj = association between q and domain of activity
aA·Ai = association between q and accreditation status
sq = random error
Landrum MB et al. 2000. Analytic methods for constructing cross-sectional profiles of
health care providers. Health Services & Outcomes Research Methodology 1:23-47
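The latent-variable logit above can be sketched numerically. This is an illustrative toy, not the study's fitted model: all coefficient values are made up, and the random error term sq is omitted for clarity.

```python
# Sketch of the slide's latent-variable logit: probability that
# community i performs indicator q, given latent quality theta_i,
# a domain effect, and accreditation status. Coefficients are
# hypothetical; the error term s_q is dropped for clarity.
from math import exp

def indicator_prob(a0, a_q, theta_i, a_j, D_j, a_A, A_i):
    """logit(t_qi) = a0 + a_q*theta_i + a_j*D_j + a_A*A_i"""
    logit = a0 + a_q * theta_i + a_j * D_j + a_A * A_i
    return 1.0 / (1.0 + exp(-logit))

# Hypothetical case: accredited community (A_i = 1) with
# above-average latent response quality (theta_i = 0.8)
p = indicator_prob(a0=-0.5, a_q=1.2, theta_i=0.8, a_j=0.3, D_j=1, a_A=0.6, A_i=1)
print(round(p, 3))  # 0.796
```

In the Bayesian hierarchical fit, the community-level θi and the coefficients are estimated jointly rather than plugged in as here.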
Scope and Timing of H1N1 Response Activities: Composites from All 9 Communities

[Chart: scope of activities (%) and timing of activities (days after outbreak), by community]
Scope and Timing of H1N1 Response Activities: by Agency Accreditation Status

[Chart: scope of response (%) vs. timing of response (days after 15 April 2009), accredited vs. non-accredited agencies]
Multivariate-adjusted Scope of H1N1 Activities

[Chart: scope (%) by domain, nonaccredited vs. accredited]
Planning (p<0.01), Communication (p<0.01), Incident command (p<0.01), Investigation (p<0.05), Response/mitigation (p<0.05)
Controlling for domain-level heterogeneity and community-level heterogeneity
Multivariate-adjusted Timing of H1N1 Activities

[Chart: days after outbreak by domain, nonaccredited vs. accredited]
Communication, Incident command (p<0.05), Investigation (p<0.01), Response/mitigation
Controlling for domain-level heterogeneity and community-level heterogeneity
Further analytic issues
Nonrandom selection of agencies into accredited vs. unaccredited status
Agencies undergoing accreditation prior to 2010 were larger and had greater financial and human resources
Two-stage Heckman selection correction model used to test and correct for selection bias
41–62% of differences in scope and timing attributable to selection
Conclusions and implications
Wide variation in the scope and timing of local public health responses to H1N1
Accredited agencies implemented a broader scope of responses
Accredited agencies implemented incident command and investigation activities more rapidly
Accreditation may confer and detect enhanced capacity for H1N1 response
Limitations
Cross-sectional analysis
Self-reported measures subject to recall bias and role specificity
Process rather than outcome measures
Timing indicators measured in days, not hours
Limited opportunity to examine how agencies
react and respond to accreditation
What’s next
Ancillary study of local variation in vaccine distribution
Larger-scale data collection on preparedness activities and capacities in LHDs
– statewide in NC
– propensity-matched comparison group of US agencies outside NC
Longitudinal data collection before and after accreditation
Difference-in-difference estimation to support stronger inferences
Research design for longitudinal analysis
Group                                               Pre-accreditation   Post-accreditation
Early NC agencies                                   ‐‐                  Opost Opost Opost
Late NC agencies                                    Opre Opre           Opost Opost Opost
Propensity-matched comparison agencies outside NC   Cpre Cpre           Cpost Cpost Cpost
Effect = (Opost - Opre) - (Cpost - Cpre)
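The effect formula above can be computed directly; the scores below are hypothetical, purely to show the arithmetic.

```python
# Sketch of the difference-in-differences estimator on the slide:
# Effect = (Opost - Opre) - (Cpost - Cpre).
def diff_in_diff(o_pre, o_post, c_pre, c_post):
    """Change in accredited NC agencies (O) minus change in
    propensity-matched comparison agencies (C)."""
    return (o_post - o_pre) - (c_post - c_pre)

# Hypothetical mean scope-of-response scores (0-100 scale)
effect = diff_in_diff(o_pre=55.0, o_post=70.0, c_pre=52.0, c_post=58.0)
print(effect)  # 9.0
```

Differencing out the comparison group's change removes secular trends common to all agencies, which is what supports the stronger causal inference named above.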