Learning from H1N1 about
public health systems
Michael A. Stoto
AcademyHealth PHSR methods session
June 28, 2010
Linking Assessment and
Measurement to Performance
in PHEP Systems
Lessons learned one year out

A year after the 2009 H1N1 outbreak, many of us have thought about “lessons learned”:
• News articles, editorials, etc.
• Harvard/Georgetown LAMPS
• Federal, state and local health departments’ AAR/IPs
• School closings
• Surveillance
• MRC unit effectiveness
• State and local lookbacks
• Other PERRCs
Surveillance and Epidemiology:
What can we learn from H1N1?


• Impact of investments in lab capacity, notification, and syndromic surveillance
• Limitations of case-based surveillance for characterizing severity, populations at risk, and situational awareness


• Challenges of early detection in the “fog of war”
  → population-based statistical surveillance
• Outbreak of a new pathogen intrinsically characterized by uncertainty that takes weeks to months to resolve
  → expect uncertainty
• Measuring public health system’s capability to detect outbreaks and characterize pathogen goes beyond current surveillance and lab capacity measures
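To make “population-based statistical surveillance” concrete, here is a minimal sketch of a moving-baseline exceedance rule in the spirit of CDC’s EARS C2 algorithm; the daily counts, parameter values, and function name are hypothetical and are not taken from the H1N1 experience discussed here.

```python
# Minimal sketch of population-based aberration detection (EARS C2-style):
# flag a day when its count exceeds the mean of a recent baseline window
# by a fixed number of standard deviations. All numbers are invented.
from statistics import mean, stdev

def flag_aberrations(daily_counts, baseline=7, lag=2, threshold=3.0):
    """Return (day, z-score, flagged) for each day with a full baseline."""
    results = []
    for t in range(baseline + lag, len(daily_counts)):
        window = daily_counts[t - lag - baseline : t - lag]   # baseline days
        mu, sd = mean(window), stdev(window)
        z = (daily_counts[t] - mu) / sd if sd > 0 else 0.0
        results.append((t, z, z > threshold))
    return results

# Hypothetical daily ILI visit counts with a late surge
counts = [12, 15, 11, 14, 13, 16, 12, 14, 13, 15, 40, 55]
for day, z, flagged in flag_aberrations(counts):
    print(day, round(z, 1), "ALERT" if flagged else "")
```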
School closing:
What can we learn from H1N1?

• Excess variability
  – Whether to close, triggers, when to reopen
  – What’s meant by closure
  – Who decides

• Goals
  – Limit spread in the community
  – Protecting children from consequences
  – React to staff shortages or children kept home
• PHEP system improvement will require
  – Clarification of goals
  – Clarification of authority
  – Expectation of uncertainty
Questions for discussion

• What’s a “lesson learned”?
• How do we “learn” it (rigor)?
  – The truck we planned to use for the vaccine …
  – Plan for extended events with many uncertainties
  – Statistical approaches vs. probing case studies
• Is this research (generalizable)? Or quality improvement?
New approaches for
PHEP research

• Realistic Evaluation (Pawson & Tilley, 1997)
• “Realist Synthesis” (Pawson, Evaluation, 2002)
• “Science of Improvement” (Berwick, JAMA, 2008)
• “Positive deviance”
Public Health Emergency Preparedness
(PHEP) system capabilities






• Surveillance and epidemiology
• Disease control and prevention
• Mass care
• Communication with the public
• Information sharing within the “public health system”
• Leadership and management
Approaches to measuring PHEP

• Unstructured assessments: Are you prepared?
• Inventories & capacity assessments: Did you do what’s recommended?
• Actual events: How did it go?
• Drills and exercises: How would it go?
The first two approaches typically address capacity building; the latter two typically address functional capabilities.
Learning from exceptional events:
Critical events and positive deviants
• Identifying exceptional events
  – Valid outcome measures
  – Reliable measurement
  – Adjusting for covariates
• What led to the +/- outcomes (n of 1 study)?
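One way to make “adjusting for covariates” operational when screening for positive deviants is to regress a jurisdiction-level outcome on available covariates and flag unusually large positive residuals. The sketch below uses simulated data; the covariates, coefficients, and cutoff are hypothetical illustrations, not values from any H1N1 analysis.

```python
# Illustrative sketch: identify candidate positive deviants as jurisdictions
# whose outcome (e.g., vaccination coverage) is much better than covariates
# alone would predict. All data here are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([
    np.ones(n),               # intercept
    rng.uniform(0, 1, n),     # e.g., per-capita preparedness funding (hypothetical)
    rng.uniform(0, 1, n),     # e.g., share of population under 18 (hypothetical)
])
coverage = X @ np.array([30.0, 20.0, 15.0]) + rng.normal(0, 5, n)

beta, *_ = np.linalg.lstsq(X, coverage, rcond=None)   # covariate adjustment
residuals = coverage - X @ beta
z = (residuals - residuals.mean()) / residuals.std()
positive_deviants = np.flatnonzero(z > 2)              # unusually good performers
print("Candidate positive deviants (row indices):", positive_deviants)
```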
Influenza A (H1N1) 2009 vaccination coverage among children and adults, January 2010
State   Children   High-risk adults
RI      84.7%      24.0%
MA      60.3%      35.9%
MRC Toolkit for flu clinics



• Performance Scales
  1. ICS awareness
  2. Confidence in own role and as part of a team
  3. Confidence in interacting with the client/patient
  4. Motivation driven by community service
  5. Motivation driven by personal/career development
• Personal motivation scale
  – Unit 4: 55.4 (SD = 21.3)
  – Unit 7: 68.5 (SD = 20.1)
• Level of confidence in own role and in working as part of a team
  – Unit 1: 88.5 (SD = 4.0)
  – Unit 2: 67.9 (SD = 26.5)
  – Unit 10: 73.3 (SD = 16.9)
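A minimal sketch of how per-unit scale means and standard deviations like those above could be tabulated from the toolkit’s survey responses; the unit labels echo the slide, but the individual volunteer scores and dictionary keys below are invented for illustration.

```python
# Illustrative only: tabulate mean and SD of 0-100 performance-scale scores
# by MRC unit. The scores are made up; only the idea of per-unit summaries
# mirrors the slide.
from statistics import mean, stdev

responses = {                       # unit -> scale -> individual volunteer scores
    "Unit 1": {"confidence_role_team": [92, 85, 89, 88]},
    "Unit 2": {"confidence_role_team": [95, 40, 70, 66]},
    "Unit 4": {"personal_motivation": [30, 55, 75, 62]},
}

for unit, scales in responses.items():
    for scale, scores in scales.items():
        print(f"{unit} {scale}: mean={mean(scores):.1f}, SD={stdev(scores):.1f}")
```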
MRC Toolkit for flu clinics: Comparing 2009-10 to 2008-9 volunteers
[Chart: difference in average factor score, 2009-10 vs. 2008-9 volunteers, for five scales: Incident Command System; Motivation and Community Engagement; Motivation and Personal Development; Confidence in Own Role and Team; Confidence in Interacting with Patients]
Learning from exceptional events:
Critical events and positive deviants
• What led to the +/- outcomes (n of 1 study)?
  – HSEEP AAR/IPs
  – Root cause analysis
  – Facilitated lookbacks
  – Context + Mechanism = Outcome (C+M=O)
  – Critical event registries
Problems with typical HSEEP AARs




• Focus on Target Capability Lists (TCL)
  – Many TCLs are capacities rather than capabilities
  – One TCL “observation” at a time
  – Assumes that the plan applies to the situation
    · Vaccine available all at once, demand high
    · NIMS model for communication
• More focus on form than probing objective analyses
  – Root causes triggered by EEG, but not in AAR
  – “Lessons learned” optional
• Improvement plans
  – Often missing altogether
  – Little analysis
• Prepared by emergency planners/responders
Facilitated Look Backs (RAND TR320, 2006)



• Facilitated, post-event discussion with PH leaders and key staff, and relevant community stakeholders
• Open, candid, no-fault systems-level analyses by asking a diverse group of participants involved to critically evaluate their management of the event
• Led by an independent, objective facilitator, tracing
  – the series of events that unfolded
  – key decisions that were made by various stakeholders
  – how decisions were perceived and acted upon by others
Fall 2009 H1N1 issues (Arlington, VA)
• School clinic operations
  – Staffing
  – Use of volunteers
  – ICS
  – Scheduling
  – Vaccine storage and handling
  – Data collection
  – Quality assurance
• Consent/communications
  – “Backpack” communication
  – Call center
  – Communication about
    · second dose
    · vaccine risks
• Coordination with Schools
  – Consent forms
  – Document vaccine receipt
  – MOUs
Selected findings (Arlington, VA)

• School clinic operations
  – Consistency of teams
  – School configuration
• Communications between public health and schools
  – Regarding overall schedule and changes
• Proper and consistent use of ICS
• Public communications
  – Priority groups, school vs. public clinics
  – Call center issues
Selected findings (Arlington, VA)

• Consent forms
  – Electronic, separate first and last names
  – Central tracking database
  – Separate injections and mist
  – More local → state input
  – Lesson learned: need for a management database at the start
• Consent process
  – Shorten time from signing to administration
  – Pre-vaccination review by PH staff
  – Quality control at time of vaccination
Fall 2009 H1N1 issues (Massachusetts)

• Vaccine
  – Distribution
    · Allocation
    · Delays
  – Administration
    · Vaccine types
    · Priority groups
    · Clinic setup and management
    · Safety issues
  – Prioritization
    · Risk groups
    · Health and emergency workers
• Surge issues
  – N-95 supplies
  – Antivirals
• Communication
  – With the public
  – With public health partners
  – “Regional” coordination
Interviews, focus groups & look backs

• Information gathering
  – Key informant interviews
  – Focus groups with MDPH staff
  – Look-back meetings: Amherst, Boston, Brockton
• Questions
  – What went well? Why?
  – What could have gone better?
    · What could have been done differently?
    · What were the underlying problems?
  – What systems changes are needed to improve future performance?
Major strengths (“success stories”)



• Increased #s of vaccination sites and vaccinators
  – New partnerships with schools and providers
• > 3.7 million doses of vaccine delivered
  – 37% of the population
  – 60% of those aged 6 months to 17 years
  – 28% of those 18 years and older
  – 51% of initial priority groups
• N-95 mask guidance
Areas for improvement (draft)

• Would have improved the 2009 response
  – More transparency regarding priority groups, registration process, vaccine allocations, etc.
  – More proactive role regarding vaccine safety
• For future events: system improvements
  – Improve communications within the “public health system”
  – Maintain and build relationships with
    · healthcare providers, community health centers
    · schools, other trusted channels of communication
“Lessons learned” (might be generalizable)





• Managing when vaccine supply is uncertain and distributed over time
• Balancing clear policies with flexible implementation
• More “local” involvement in decision-making and program management
• Increasing transparency
• Building trusting relationships
Methodological challenges



• Inconsistent beliefs about what actually happened
• How to put people at ease to learn from what happened, especially given
  – Long and complex histories
  – Existing relationships
  – Personalities
• How to identify
  – Root causes
  – Credible recommendations/improvement plans
Realistic Evaluation (Pawson and Tilley)
Science of Improvement (Berwick)
• Context: particular public health system and event
• Mechanisms: situational awareness, mass dispensing, policy guidance, communication with the public and with the public health system, leadership
• Outcomes: reduction in cases and deaths, work days lost, social disruption, etc.
Context + Mechanism = Outcome (C+M=O)
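To suggest how a single C+M=O observation could be captured as structured data, here is a minimal sketch of a record whose fields follow the slide; the class name and example values are hypothetical.

```python
# Illustrative sketch of a Context + Mechanism = Outcome (C+M=O) record.
# Field names follow the slide; the example content is invented.
from dataclasses import dataclass, field

@dataclass
class CMOObservation:
    context: str                                   # public health system and event
    mechanisms: list[str] = field(default_factory=list)
    outcomes: dict[str, str] = field(default_factory=dict)

obs = CMOObservation(
    context="Mid-size county health department, fall 2009 H1N1 school clinics",
    mechanisms=["situational awareness", "mass dispensing", "policy guidance",
                "communication with the public and the public health system",
                "leadership"],
    outcomes={"cases and deaths": "reduced (not quantified)",
              "work days lost": "moderate",
              "social disruption": "limited"},
)
print(obs)
```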
What is generalizable (in context)?

• Refining “middle range theories”
  – MRC driver diagram
  – Robust databases to manage the vaccination process
    · Screening and consent process
    · 2nd dose
    · Geographic coverage
  – More transparency and “local” involvement in
    · Decision-making
    · Program management
    · Public communication
• Evidence, or at least credible expert opinion
MRC driver diagram (1st draft)
[Driver diagram: process changes drive the secondary (2°) drivers, which cause the primary (1°) drivers, which affect the outcome]
• Outcome: Improved MRC unit effectiveness
• 1° drivers: MRC member mobilization and participation; MRC member performance at flu clinics, PODs, health fairs, etc.
• 2° drivers: Member motivation; Logistical constraints (day care, transportation); Member knowledge and skills; Relationships with strategic partners
• Process changes: Recruitment/retention efforts; Logistical solutions; Role assignment; Training; Organizational development & strategic planning
[Map (n = 15): Franklin County, Dukes County, Martha’s Vineyard]
Realist Synthesis (Pawson, Evaluation, 2002)



• Statistical approaches that don’t acknowledge C and M variability will fail
• Case studies don’t generalize
• Knowledge accumulates in successive observations of C+M=O
Realistic evaluation of PHEP
based on critical events


• Embrace a wider range of scientific methodologies
  – RCTs, OXO designs, and other statistical methods → credible evidence of C+M=O
• Learn from practitioners about “middle range theories”
  – Stakeholders as survey “respondents” → teacher/learner interviews to identify C+M=Os
  – Reconsider practitioner stakeholder bias
  – Hotwashes → facilitated lookbacks
PHEP critical event registry


• Critical incident registries: use root cause analysis to
  – encourage thoughtful analysis of routine and rare events
  – enable sharing of lessons learned to other settings
• Protocol to facilitate consistency and depth of reports
  – Narrative description of event
  – Probing assessment of system’s PHEP capabilities
    · assessment (situational awareness)
    · policy development, assurance (mass care)
    · information sharing, communication, leadership
  – Attention to successful and unsuccessful aspects of response
  – Identification of lessons learned/middle range theories
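As a sketch of what a structured registry entry following this protocol might look like, the example below uses invented keys and content; nothing in it describes an actual registry or event.

```python
# Illustrative only: one critical-event registry entry structured around the
# protocol elements on this slide. Keys and values are hypothetical.
import json

report = {
    "narrative": "School-based H1N1 vaccination clinics, fall 2009",
    "capability_assessment": {
        "assessment (situational awareness)": "adequate after week 2",
        "policy development / assurance (mass care)": "priority groups unclear at first",
        "information sharing, communication, leadership": "ICS used inconsistently",
    },
    "successes": ["New partnerships with schools and providers"],
    "problems": ["Consent-form tracking done on paper"],
    "lessons_learned": ["Stand up a management database before clinics begin"],
}
print(json.dumps(report, indent=2))
```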
Questions for discussion



• What’s a “lesson learned”?
  – Valid and reliable performance measures to identify outliers and outcomes
  – “Middle range theories” re C+M=O
• How do we “learn” it (rigor)?
  – Wider range of scientific methodologies to learn from experience and from practitioners
  – Hotwashes → facilitated lookbacks
• Is this research or quality improvement?
  – Academics with IRBs vs. practitioners
Acknowledgements
This presentation was developed in collaboration with a number of partnering organizations, and with funding support awarded to the Harvard School of Public Health Center for Public Health Preparedness under cooperative agreements with the US Centers for Disease Control and Prevention (CDC), grant number 5P01TP000307-01 (Preparedness and Emergency Response Research Center). The content of this presentation as well as the views and discussions expressed are solely those of the authors and do not necessarily represent the views of any partner organizations, the CDC, or the US Department of Health and Human Services, nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. Government. The authors are also grateful for support from the O’Neill Institute for National and Global Health Law at Georgetown University.