Measuring and Reporting Patients’ Experiences with Their Doctors:
Process, Politics and Public Reports in Massachusetts
Melinda Karp
MHQP Director of Programs
June 26, 2006
Stakeholders at the MHQP “Table”
• Provider Organizations
– MA Hospital Association
– MA Medical Society
– 2 MHQP Physician Council
representatives
• Government Agencies
– MA EOHHS
– CMS Region 1
• Employers
– Analog Devices
• Health Plans
– Blue Cross Blue Shield of
Massachusetts
– Fallon Community Health Plan
– Harvard Pilgrim Health Care
– Health New England
– Tufts Health Plan
• Consumers
– Exec. Director Health Care for All
– Exec. Director NE Serve
• Academic
– Harris Berman, MD, Board Chair
“1st Generation” Questions: Moving MD-Level Measurement into Practice
✓ Provide brief background on MHQP as important context for measurement and reporting efforts
✓ Describe the evolution of the MHQP agenda for measuring and reporting patient experiences: key methods questions in moving from research to large-scale implementation
✓ Describe stakeholder perspectives and decision points around key reporting issues
The Evolution of MHQP’s Patient Experience Measurement Agenda
2002-2003
• Demonstration project in partnership with The Health Institute (funded by the Commonwealth and RWJF)
2004-2005
• Development of a viable business model for implementing a statewide patient experience survey
2005-2006
• Fielding and reporting of the statewide survey
Sample Size Requirements for Varying Physician-Level Reliability Thresholds
• Is there enough performance variability to justify measurement?
• What sample size is needed for a highly reliable estimate of patients’ experiences with a physician?
• How much of the measurement variance is accounted for by physicians, as opposed to other elements of the system (practice site, network organization, plan)?
• What is the risk of misclassification under varying reporting frameworks?
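The sample-size question above is conventionally answered with the Spearman-Brown prophecy formula, which relates the reliability of a physician's mean score to the number of patient responses. A minimal sketch; the single-response reliability (physician-level ICC) used below is illustrative, not a value from this deck:

```python
import math


def spearman_brown(n, icc):
    """Reliability of a physician's mean score from n patient responses,
    given the single-response reliability (physician-level ICC)."""
    return n * icc / (1 + (n - 1) * icc)


def n_for_reliability(target, icc):
    """Smallest n such that spearman_brown(n, icc) >= target."""
    return math.ceil(target * (1 - icc) / (icc * (1 - target)))


# Illustrative only: with a single-response ICC of 0.05, reaching 0.70
# physician-level reliability requires 45 completed surveys.
print(n_for_reliability(0.70, 0.05))  # → 45
```

Inverting the same formula for different reliability targets (e.g., 0.80 or 0.90) is what produces the kind of sample-size-by-threshold chart the slide title refers to.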
Allocation of Explainable Variance: Doctor-Patient Interactions
[Figure: Percent of explainable variance attributable to doctor, site, network, and plan for Communication, Whole-person orientation, Health promotion, Interpersonal treatment, and Patient trust; the doctor accounts for the largest share on each of these measures. Source: Safran et al. JGIM 2006.]

Allocation of Explainable Variance: Organizational/Structural Features of Care
[Figure: Percent of explainable variance attributable to doctor, site, network, and plan for Organizational access, Visit-based continuity, and Integration; the practice site accounts for the largest share on these measures. Source: Safran et al. JGIM 2006.]
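The figures express each system level's contribution as a share of explainable (non-residual) variance. A small sketch of that computation; the variance components below are made up for illustration, not the deck's values:

```python
def explainable_shares(components):
    """Percent of explainable (non-residual) variance attributable to each
    level of the system. `components` maps level name -> variance component
    estimated from a nested model (patient-level residual excluded)."""
    total = sum(components.values())
    return {level: round(100 * var / total, 1) for level, var in components.items()}


# Hypothetical variance components for a doctor-patient interaction measure:
shares = explainable_shares({"doctor": 0.40, "site": 0.10, "network": 0.05, "plan": 0.05})
print(shares)  # the doctor holds about two-thirds of explainable variance here
```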
Risk of Misclassification
[Table: Percent of physicians misclassified, by measure reliability (αMD), under two reporting frameworks with categories ranging from “Substantially Below Average” to “Substantially Above Average.” One framework sets cutpoints at the 10th, 50th, and 90th percentile scores (approximately 52.9, 64.6, and 76.3); the other adds cutpoints at the 25th and 75th percentiles (approximately 58.5 and 70.8). Misclassification risk grows as measure reliability falls and as the number of performance categories increases. Source: Safran et al. JGIM 2006; 21:13-21.]
MHQP 2005 Statewide Survey
• Physician-level survey format
• Site-level sampling to support site-level reporting
• Estimated samples required to achieve > 0.70 site-level reliability
Site Reliability Chart: Integration
Sample sizes needed to achieve site-level reliability for the integration domain.

Practice Size        A          B            C             D
(No. of Doctors)     > 0.7      0.5 - 0.69   0.34 - 0.49   < 0.34
3                    38         16-37        8-15          7
4                    48         20-47        10-19         9
5                    57         24-56        12-23         11
6                    65         28-64        14-27         13
7                    72         31-71        16-30         15
8                    79         33-78        17-32         16
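Charts like the one above can be generated by applying the Spearman-Brown relation at the site level and bucketing sample sizes into the four reliability bands. A sketch; the site-level ICC below is an illustrative assumption, not a value reported by MHQP:

```python
def site_reliability_band(n, site_icc):
    """Place a site's survey sample size into the chart's reliability bands:
    A (> 0.7), B (0.5-0.69), C (0.34-0.49), D (< 0.34)."""
    r = n * site_icc / (1 + (n - 1) * site_icc)  # Spearman-Brown reliability of the site mean
    if r > 0.70:
        return "A"
    if r >= 0.50:
        return "B"
    if r >= 0.34:
        return "C"
    return "D"


# Illustrative site-level ICC of 0.058: a large sample lands in band A,
# a single response in band D.
print(site_reliability_band(200, 0.058), site_reliability_band(1, 0.058))
```

In practice the ICC itself varies with practice size (more doctors per site changes the between-site variance), which is why the chart carries a separate row per practice size.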
Setting the Stage for Public Reporting:
Key Issues for Physicians
• What measures get reported
• How measures get reported
Percent of Sites with A-Level Reliability by Measure and Survey-Type

Measure                                      Adult PCP %   Pediatric %
MD-Patient Interactions
  Communication                              98            97
  Knowledge of patient                       91            86
  Health promotion                           46            97
  Integration of care                        79            61
Organizational/Structural Features of Care
  Access                                     99            100
  Visit-based continuity                     100           100
  Office staff                               95            99
  Clinical team                              37            86
Willingness to recommend                     62            59

Framework for Public Reporting
[Figure: Sample measure scores (e.g., Integration of Care) displayed against statewide cutpoints at the 15th, 50th, and 85th percentiles, with symbol ratings assigned by band.]
Summary and Implications
• With sufficient sample sizes, the C/G CAHPS approach yields MD- and site-level reliability > 0.70
• For site-level reliability, the number of MDs per site influences required sample sizes
• Risk of misclassification can be held to < 5% by
  – Limiting the number of performance categories
  – Creating a buffer (“zone of uncertainty”) around performance cutpoints
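The buffer idea in the last bullet amounts to a classification rule: a physician or site is moved out of the middle category only when its confidence interval clears a cutpoint. A minimal three-category sketch; the cutpoints and standard error below are illustrative, not MHQP's actual values:

```python
def classify(score, se, low_cut, high_cut, z=1.96):
    """Three-category classification with a 'zone of uncertainty':
    a score is labeled above/below average only when its ~95% confidence
    interval lies entirely beyond the cutpoint; otherwise it stays average."""
    lower, upper = score - z * se, score + z * se
    if lower > high_cut:
        return "above average"
    if upper < low_cut:
        return "below average"
    return "average"


# Illustrative cutpoints of 60 and 80 on a 0-100 scale:
print(classify(90, 1.5, 60, 80))  # → above average
print(classify(79, 1.5, 60, 80))  # → average (interval straddles the cutpoint)
```

Widening the buffer (larger z) trades statistical safety for discrimination: fewer misclassifications, but more performers pooled into "average."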
The Continuing Evolution…
2006-2007
• Engagement around QI activities
– Participation in Commonwealth Fund grant to study highest
performing practices
– Grant proposal to Physician Foundation to develop and pilot
integrated clinical-patient experience QI curriculum
• Determining achievable benchmarks
• Fielding of Specialist Care Survey in 2006/2007
• Repeat Primary Care Survey in 2007
Trade-offs are likely around data quality standards (e.g.,
acceptable “risk”) vs. data completeness
For more information …
Melinda Karp, Director of Programs
mkarp@mhqp.org
617-972-9056
www.mhqp.org