Slides - Healthcare Analytics Summit

Session #20
How to Drive Clinical Improvement That Gets Results
Tom Burton
And the Catalyst Academy Education Team
What is a Clinical Program?
• Organized around care delivery processes
• Permanent integrated team of clinical and analytics staff
• Creates an iterative continuous learning environment
• Focus is on sustained clinical outcome improvement (not revenue growth)
• Not a Clinical Service Line (although you can leverage Service Lines as a good start)
Organizational AGILE Teams
• Permanent teams that meet weekly
• Integrated clinical and technical members
• Supports multiple care process families
[Diagram: Women & Children’s Clinical Program Guidance Team. The Guidance Team (MD lead; RN, Clin Ops Director) oversees three care process family teams (Pregnancy, Normal Newborn, Gynecology), each with an MD Lead and an RN SME, supported by a shared Knowledge Manager, Data Architect, and Application Administrator. A color legend marks each role as Data Capture, Data Provisioning & Visualization, Data Analysis, or Subject Matter Expert.]
Incorporating the most effective learning methods
• Teach Others – 90%
• Practice by Doing – 75%
• Discussion Group – 50%
• Demonstration – 30%
• Audiovisual – 20%
• Reading – 10%
• Lecture – 5%
Percentages represent the average information retained through the particular learning method. ‒ Duke University
Session Objective
4 Learning Experiences
Clinical Programs that Get Results Principles
• Choose the right initiative
• Understand variation
• Improve data quality
• Choose the right influencers
Choose the right initiative
Deal or No Deal Exercise
DEAL or NO DEAL
First Principle
• Picking an improvement opportunity randomly is like playing traditional DEAL or NO DEAL
• You might get lucky
• Choosing the loudest physician, or choosing based on non-data-driven reasons, can disengage other MDs and spend scarce analytical resources on projects that may not be the best investment
• It takes about as much effort to work on a large process as it does on a small process
Pareto Example: Resources Consumed
Key Findings:
• 50% of all in-patient resources are represented by 7 Care Process Families
• 80% of all in-patient resources are represented by 21 Care Process Families
[Chart: Pareto curve of cumulative % of total resources consumed versus the number of Care Process Families (e.g., ischemic heart disease, pregnancy, bowel disorders, spine, heart failure); the curve crosses 50% at 7 CPFs and 80% at 21 CPFs.]
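This kind of Pareto ranking is easy to reproduce. A minimal sketch, assuming a simple mapping of Care Process Families to resources consumed (all cost figures below are illustrative placeholders, not the data behind the chart):

```python
# Pareto sketch: how many Care Process Families (CPFs) account for a given
# share of total inpatient resources. All cost figures are hypothetical.

cpf_costs = {
    "ischemic heart disease": 42.0,   # resources consumed, e.g., $M per year
    "pregnancy": 35.0,
    "bowel disorders": 28.0,
    "spine": 21.0,
    "heart failure": 18.0,
    "pneumonia": 12.0,
    "sepsis": 10.0,
    "joint replacement": 7.0,
}

def cpfs_to_reach(target_share, costs):
    """Count the largest CPFs needed to reach target_share of total cost."""
    total = sum(costs)
    cumulative = 0.0
    for rank, cost in enumerate(sorted(costs, reverse=True), start=1):
        cumulative += cost
        if cumulative / total >= target_share:
            return rank
    return len(costs)

costs = list(cpf_costs.values())
print(cpfs_to_reach(0.50, costs), "CPFs cover 50% of resources")
print(cpfs_to_reach(0.80, costs), "CPFs cover 80% of resources")
```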
Cost Per Case, Vascular Procedures
Mean Cost per Case = $20,000
• Dr. J: 15 cases at a $60,000 average cost per case; the $40,000 excess over the mean × 15 cases = $600,000 opportunity
• Next physician: $35,000 excess over the mean × 25 cases = $875,000 opportunity
• As each above-mean physician is added, the total opportunity grows: $600,000 → $1,475,000 → $2,360,000 → $3,960,000
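The same arithmetic generalizes across the whole panel of physicians. A small sketch; Dr. J's figures come from the slide, while the other physicians and their numbers are hypothetical:

```python
# Improvement opportunity: for each physician whose average cost per case
# exceeds the mean, opportunity = (average cost - mean cost) * case count.
MEAN_COST_PER_CASE = 20_000

# (name, average cost per case, number of cases)
physicians = [
    ("Dr. J", 60_000, 15),  # from the slide: $40,000 excess x 15 = $600,000
    ("Dr. K", 55_000, 25),  # hypothetical: $35,000 excess x 25 = $875,000
    ("Dr. L", 18_000, 40),  # hypothetical: below the mean, no opportunity
]

total_opportunity = 0
for name, avg_cost, cases in physicians:
    excess = max(avg_cost - MEAN_COST_PER_CASE, 0)  # only above-mean counts
    opportunity = excess * cases
    total_opportunity += opportunity
    print(f"{name}: ${opportunity:,} opportunity")

print(f"Total opportunity: ${total_opportunity:,}")
```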
Improvement Approach – Prioritization
[Chart: 2×2 matrix with Variability (low to high) on the y-axis and Resource Consumption (low to high) on the x-axis; each quadrant shows a distribution of cases from poor to excellent outcomes. The quadrants are numbered by priority: 1 = high variability and high resource consumption, 2 = low variability and high consumption, 3 = high variability and low consumption, 4 = low variability and low consumption.]
Internal Variation versus Resource Consumption
[Bubble chart: x-axis = resources consumed; y-axis = internal variation in resources consumed; bubble size = resources consumed; bubble color = clinical domain. Bubbles fall into the same four priority quadrants as the matrix above.]
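A sketch of how the quadrant assignment could be scripted once each care process family has been scored; it assumes variation and consumption have been normalized to a 0–1 scale, and the scores and cutoff below are hypothetical:

```python
# Assign each Care Process Family (CPF) to a priority quadrant:
# quadrant 1 (high variability, high consumption) is the best target.
def quadrant(variability: float, consumption: float, cut: float = 0.5) -> int:
    if variability >= cut and consumption >= cut:
        return 1  # high variability, high consumption: top priority
    if variability < cut and consumption >= cut:
        return 2  # already standardized, but large spend
    if variability >= cut and consumption < cut:
        return 3  # variable, but small spend
    return 4      # low variability, low consumption: lowest priority

# Hypothetical normalized scores: (variability, consumption)
cpfs = {
    "pregnancy": (0.8, 0.9),
    "heart failure": (0.7, 0.4),
    "gynecology": (0.3, 0.2),
    "spine": (0.2, 0.8),
}

for name, (var, cons) in sorted(cpfs.items(), key=lambda kv: quadrant(*kv[1])):
    print(f"quadrant {quadrant(var, cons)}: {name}")
```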
DEAL or BETTER DEAL
Understand Variation
The Popsicle Bomb Exercise
[Timer: 1 minute]
When you're finished, note your time and enter it in the HAS app – Poll Question 1
Variation in Results
• Corp Analytics – shows results
Less Effective Approach to Improvement: "Punish the Outliers"

Current Condition
• Significant Volume
• Significant Variation
[Histogram: # of cases from poor to excellent outcomes; 1 box = 100 cases in a year]

Option 1: "Punish the Outliers" or "Cut Off the Tail"
Strategy
• Set a minimum standard of quality
• Focus improvement effort on those not meeting the minimum standard
[Histogram: the same distribution, with the focus on a minimum standard metric below the mean]
Effective Approach to Improvement: Focus on "Better Care"

Current Condition
• Significant Volume
• Significant Variation
[Histogram: # of cases from poor to excellent outcomes; 1 box = 100 cases in a year]

Option 2: Identify Best Practice: "Narrow the curve and shift it to the right"
Strategy
• Identify an evidence-based "Shared Baseline"
• Focus improvement effort on reducing variation by following the "Shared Baseline"
• Often those performing the best make the greatest improvements
[Histogram: focus on a best-practice Care Process Model; the distribution narrows and its mean shifts toward excellent outcomes]
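"Narrow the curve and shift it to the right" can be checked numerically: after a shared baseline is adopted, the outcome distribution should show a higher mean and a smaller spread. A minimal sketch with made-up outcome scores (higher = better):

```python
# Compare outcome distributions before and after adopting a shared baseline.
# "Shift right" = higher mean; "narrow the curve" = smaller standard deviation.
from statistics import mean, stdev

before = [52, 60, 45, 70, 38, 65, 55, 48, 72, 41]  # made-up outcome scores
after  = [68, 72, 65, 75, 62, 74, 70, 66, 77, 64]

for label, scores in (("before", before), ("after", after)):
    print(f"{label}: mean = {mean(scores):.1f}, sd = {stdev(scores):.1f}")
# Expect both effects: the mean rises and the standard deviation falls.
```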
Round 2
[Timer: 1 minute]
When you're finished, note your time and enter it in the HAS app – Poll Question 2
Reduced Variation in Results
• Corp Analytics – shows results
Improve Data Quality
The Water Stopper Exercise
Information Management

DATA CAPTURE (Fix it Here)
• Acquire key data elements
• Assure data quality
• Integrate data capture into operational workflow
Roles: Subject Matter Experts; Application Administrators (optimization of source systems)

DATA PROVISIONING (Not Here)
• Move data from transactional systems into the Data Warehouse
• Build visualizations for use by clinicians
• Generate external reports (e.g., CMS)
Roles: Data Architects (infrastructure, visualization, analysis, reporting)

DATA ANALYSIS (Not Here)
• Interpret data
• Discover new information in the data (data mining)
• Evaluate data quality
Roles: Knowledge Managers (data quality, data stewardship and data interpretation)
Data Capture Quality Principles
• Accuracy
  – Does the data match reality?
  – Example: Operating Room time stamps
• Timeliness
  – What is the latency of the data capture?
  – Example: billing data delay; end-of-shift catch-up
• Completeness
  – How often is critical data missing?
  – Example: heart failure (HF) ejection fraction
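Each of these principles can be turned into a simple automated check. A sketch over hypothetical heart-failure encounter records (the field names and records are illustrative, not a real schema):

```python
# Data capture quality checks: completeness, timeliness, and a crude
# accuracy screen, over hypothetical heart-failure encounter records.
from datetime import datetime

records = [
    {"ejection_fraction": 55,
     "event_time": datetime(2019, 9, 10, 8, 0),
     "recorded_time": datetime(2019, 9, 10, 8, 5)},
    {"ejection_fraction": None,  # critical field missing
     "event_time": datetime(2019, 9, 10, 9, 0),
     "recorded_time": datetime(2019, 9, 10, 19, 0)},  # end-of-shift catch-up
]

# Completeness: how often is the critical field missing?
missing = sum(r["ejection_fraction"] is None for r in records)
print(f"EF completeness: {1 - missing / len(records):.0%}")

# Timeliness: latency between the clinical event and its documentation.
worst = max(r["recorded_time"] - r["event_time"] for r in records)
print(f"worst documentation latency: {worst}")

# Accuracy: flag values that cannot match reality (EF is a percentage).
implausible = [r for r in records
               if r["ejection_fraction"] is not None
               and not 0 <= r["ejection_fraction"] <= 100]
print(f"implausible EF values: {len(implausible)}")
```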
Challenges with Data “Scrubbing”
• Analyst time spent on re-working scrubbing routines
• Root cause never identified
• Early binding vs. late binding: what you consider dirty data may actually be useful to others analyzing process failures
• Using data to punish vs. using data to learn: a punish strategy encourages hiding the problem so clinicians don't look bad
Choose the right influencers
Paul Revere's Ride Exercise
Revere vs. Dawes

Paul Revere
"Revere knew exactly which doors to pound on during his ride on Brown Beauty that April night. As a result, he awakened key individuals, who then rallied their neighbors to take up arms against the British."

William Dawes
"In comparison, Dawes did not know the territory as well as Revere. As he rode through rural Massachusetts on the night of April 18, he simply knocked on random doors. The occupants in most cases simply turned over and went back to sleep."

‒ Diffusion of Innovations (Free Press, 2003) by Everett M. Rogers
• Innovators: recruit innovators to redesign care delivery processes (like Revere)
• Early adopters: recruit early adopters to chair improvement and to lead implementation at each site (key individuals who can rally support)
[Diagram: Rogers adoption curve showing innovators, early adopters, "The Chasm", early majority, late majority, and laggards (never adopters)]
N = number of individuals in the group; √N = number needed to influence the group, but they must be the right individuals (e.g., for a group of 100 physicians, about 10 of the right people)
* Adapted from Rogers, E. Diffusion of Innovations. New York, NY: 1995.
Guidance Team (Prioritizes Innovations): W&N Early Adopters
• Meet quarterly to prioritize allocation of technical staff
• Approves improvement AIMs
• Reviews progress and removes roadblocks

Small Teams (Designs Innovation): Innovators for OB, Newborn, and GYN
• Meet weekly in iteration planning meeting
• Build DRAFT processes, metrics, interventions
• Present DRAFT work to Broader Teams

Broad Teams (Implements Innovation): W&N Early Adopters
• Broad RN and MD representation across system
• Meet monthly to review, adjust and approve DRAFTs
• Lead rollout of new process and measurement
Organizational AGILE Teams
• Permanent teams
• Integrated clinical and technical members
• Supports multiple care process families
• Choose innovators and early adopters to lead
[Diagram: the same Women & Children’s Clinical Program Guidance Team as before, with Early Adopters on the Guidance Team and Innovators leading the Pregnancy, Normal Newborn, and Gynecology teams]
How to identify innovators and early adopters
• Ask
  – Innovators (inventors): "Who are the top three MDs in our group who are likely to invent a better way to deliver care?"
  – Early Adopters (thought leaders): "When you have a tough case, who are the top three MDs you trust and would go to for a consult?"
• Fingerprinting selection process
  – Invite innovators to identify their top three MD choices from the early adopters to lead the Clinical Program
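The "ask" step can be tallied mechanically: gather each respondent's top-three nominations and count how often each MD is named. A small sketch; the names and ballots below are hypothetical:

```python
# Tally peer nominations to surface likely early adopters (thought leaders).
from collections import Counter

# Each list is one respondent's answer to "who are the top three MDs
# you trust and would go to for a tough consult?"
ballots = [
    ["Dr. Patel", "Dr. Kim", "Dr. Ortiz"],
    ["Dr. Kim", "Dr. Patel", "Dr. Lee"],
    ["Dr. Patel", "Dr. Ortiz", "Dr. Kim"],
]

counts = Counter(name for ballot in ballots for name in ballot)
for name, votes in counts.most_common(3):
    print(f"{name}: named {votes} times")
```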
Conclusion – TEACH OTHERS
Teach Others Exercise
• Deal or No Deal
  – Choose the right initiative
  – Prioritize based on process size and variation
• Popsicle Bomb
  – Understand variation
  – Measure variation and standardize processes
• Water Stopper
  – Improve data quality
  – Fix the problem at the source
• Paul Revere's Ride
  – Choose the right influencers
  – Identify Innovators and Early Adopters to accelerate diffusion of innovation
[Timer: 1 minute]
Take 1 minute and describe the purpose of each exercise to your neighbor, then swap and let them teach you.
Exercise Effectiveness Q1
Overall, how effective were the exercises in
explaining the principles?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q2
How effective was the Deal or No Deal Exercise
at teaching the principle of prioritizing based on
process size and variation?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q3
How effective was the Popsicle Bomb Exercise
at teaching the principle of understanding
variation and standardizing processes?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q4
How effective was the Water Stopper Exercise
at teaching the principle of fixing data quality
issues at the source?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q5
How effective was the “Paul Revere Ride”
exercise at teaching the principle of choosing
the right influencers based on their capabilities
as innovators and early adopters?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q6
Are you interested in running these same exercises in your organization?
a) Yes
b) No
Questions & Answers
Session Feedback Survey
1. On a scale of 1-5, how satisfied were you overall with this session?
   1) Not at all satisfied
   2) Somewhat satisfied
   3) Moderately satisfied
   4) Very satisfied
   5) Extremely satisfied
2. What feedback or suggestions do you have?
3. On a scale of 1-5, what level of interest would you have for additional, continued learning on this topic (articles, webinars, collaboration, training)?
   1) No interest
   2) Some interest
   3) Moderate interest
   4) Very interested
   5) Extremely interested