Reporting Post Graduate Medical Program Performance

Using an Institutional Report Card to
Support Evidence-Based GME Decision
Making
Conference Session: SES46
2010 ACGME Annual Education Conference
Ann Dohn, MA, DIO, Alice Edler, MD, MPH, MA (Educ),
Nancy Piro, PhD, and Bardia Behravesh, EdD,
Program Managers/Ed Specialists
Department of Graduate Medical Education
Stanford Hospital & Clinics
AGENDA
• This workshop session will:
– Discuss the need for comparative programmatic
evaluation
– Describe existing “report card/scorecard” models from
industry that could be used in our institutions
– Present an example of the “Stanford Report Card” for
comparative program evaluation
– Facilitate exercises that will support participants in
developing “report cards” and their uses based on
individual needs.
Session Objectives
• At the end of this session, participants will be
able to:
1. Understand the basis of organizational
performance assessment
2. Describe some different models for
organizational report cards
3. Identify key areas to include in GME “Report
Cards”
4. Understand key considerations for using and
distributing programmatic evaluation data.
Stanford Background
Stanford University Medical Center currently
sponsors 82 ACGME-accredited training programs
with over 1000 enrolled residents and fellows.
Stanford University Medical
Center Mission
• Dedication to pursuing the highest quality of patient care and
graduate medical education, recognizing that one of its major
responsibilities is the provision of organized educational
programs.
– Support of quality graduate medical education programs and
excellence in residency training and research.
– Guidance and supervision of the resident while facilitating the
resident’s professional and personal development and ensuring
safe and appropriate care for patients.
– Provision of adequate funding of graduate medical education to
ensure support of its faculty, residents, ancillary staff, facilities, and
educational resources to achieve this important mission.
– Ensuring that all of its graduate medical education programs meet
or exceed the Institutional and Program Requirements promulgated
by the Accreditation Council for Graduate Medical Education.
Why Do This?
• We know we’re great… our residents love us!
– Every Program Director will tell you so…
Can We Wait?
• Can we afford to be slow moving?
• Can we wait for ACGME site visits?
• Can we wait for Internal Reviews?
But….
• Our goal is a five year ACGME cycle
• Internal reviews at the 2-1/2 year mark…
• A lot can happen in 2-1/2 years
We Think We Need This
• ACGME and Institutions are increasingly holding
DIOs and GME Committees accountable for their
utilization of institutional resources.
• Actions / decisions must be based on
documented real-time analyses of needs.
DIOs need to be able to make evidence-based
programmatic decisions based on comparative data.
Few Models Exist Today
For GME
• Prior to the era of competency-based outcomes,
educational quality was perceived solely as test-score
measurement and credentialing accomplishment.
– With the introduction of core competency education,
medical educators, learners, and patients are
demanding a more holistic approach to quality
medical training.
Few Models Exist Today
For GME
• The concept of Institutional Accountability is
relatively new.
– Until the ACGME Outcome Project, there was no
centralized curriculum oversight in GME, unlike
medical schools or UME.
The Report Card Vision
• In 2005, Stanford hired
its first PhD in GME
– The vision was to develop tools to support
evidence-based decision-making for Graduate
Medical Education, consistent with our mission
– “We needed a Report
Card”…
SIGH….
• It wasn’t as easy as first thought!
Our First Attempt …
Background on Institutional
Report Cards
• Government and Industry Models
– Multiple models exist and can be chosen to fit a
specific purpose:
• GPRA (Government Performance and Results Act)
• Organizational Report Cards
• Balanced Scorecard
• Benchmarking
• Program Evaluations
• Social Indicators
Report Card vs. Balanced Scorecard

                                 Report Card   Balanced Scorecard
Org Focus                             +               +
Regular Data Collection               +               +
External Assessment Data              +               -
External Audience                     +               +
Transformation                        +               -
Aligned with Mission Statement        +               +
Which Model to Choose?
• We needed a model that:
– Was organizationally focused and managed
– Had a track record of effective use
– Fit our existing structure of multiple programs and
organizations
– Was flexible enough to be adapted for annual use
(regular data collection), not just an accreditation cycle
– Offered “easily digestible” internal and external
measurement dimensions
Our Choice
Balanced Scorecard
Framework
in an Organization
Report Card
(Scorecard) Tool
Best of Both Worlds
Stanford Hospital & Clinics
Report Card
• The SHC Report Card is built on the Balanced
Scorecard conceptual framework for
translating an organization’s vision into a set
of performance indicators distributed among
four perspectives adapted for GME:
1. Resident Perception Measurements
2. Program Processes
3. Learning Outcomes
4. Financial/Growth
Stanford Hospital & Clinics
Report Card
1. Resident Perception Measurements
“Guidance and supervision of the resident while
facilitating the resident’s professional and personal
development and ensuring safe and appropriate care for
patients.”
2. Program Processes
“Ensuring that all of its graduate medical education
programs meet or exceed the Institutional and Program
Requirements promulgated by the Accreditation Council
for Graduate Medical Education.”
Stanford Hospital & Clinics
Report Card
3. Learning Outcomes
“Support of quality graduate medical education
programs and excellence in residency training and
research.”
4. Financial/Growth
“Provision of adequate funding of graduate medical
education to ensure support of its faculty, residents,
ancillary staff, facilities, and educational resources
to achieve this important mission.”
The Balanced Scorecard Approach
• The Balanced Scorecard is a performance
measurement and performance management
system developed by Robert Kaplan and David
Norton (1992, 1996)
– adopted by a wide range of leading edge
organizations, both public and private.
(“The Balanced Scorecard: Measures That Drive Performance,” Harvard Business Review,
Jan-Feb 1992; and The Balanced Scorecard: Translating Strategy into Action, Harvard
Business School Press, 1996)
Stanford Hospital & Clinics Report Card
• Some indicators are designed to measure SHC’s progress
toward achieving its vision; others measure the long-term
drivers of success.
• Through the balanced scorecard, SHC :
– Monitors its current performance (finances, resident
satisfaction, learning outcomes and program process
results)
– Monitors its efforts to improve processes and educate
residents
– Enhances its ability to grow, learn and improve the quality
of its fellowship and residency educational programs.
Balanced Scorecard
Strategic Perspectives
[Diagram: Mission, Vision, and Strategy sit at the center; four
perspectives surround them, each with a guiding question.]
• Resident: How do our residents see us?
• Institutional / Financial Growth: Are we putting our
resources in the right places?
• Program Processes: Are our programs excelling?
• Learning: Do we continue to improve (outcomes)?
Measurement Across The Continuum
• PRE: Measuring events that occur before the
trainee arrives
– NRMP Results
• PERI: During Residency
– ACGME Survey
• POST: After they have left training to start
their career.
– Alumni Survey
Selection of Report Card Measures

RESIDENT PERCEPTIONS
• PRE: # Applicants/Open Positions; Match Depth; % Top Medical Schools
• PERI: GME Internal HS Survey (Overall Satisfaction, Recommendation of
Program, Teaching Quality, Curriculum Quality, Educational Leadership,
Wellness Index); ACGME Survey (Compliant Responses)
• POST: Alumni Survey (Overall Satisfaction)

PROGRAM PROCESSES
• PERI: Faculty Eval of Program; Resident Eval of Program; Faculty
Publications; Duty Hr Violations; ACGME Cycle Length; # ACGME Citations

LEARNING OUTCOMES
• PRE: Core Competency Self-Assessment
• PERI: ITSE Scores; Annual Resident Publications; Annual Resident
Presentations; # Safety Incident Reports
• POST: Specialty Board Scores; Core Competency Post-Assessment

FINANCIAL / GROWTH
• PERI: # Res in Program; Grants Awarded; Subspecialties/Program;
Physical Space/Facilities
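The grouping above can be expressed as a simple data structure. The sketch below is illustrative only (the names mirror the slide, but this is not Stanford's actual data model); it shows how measures can be organized by perspective and PRE/PERI/POST phase, and queried by phase.

```python
# Measures keyed by scorecard perspective, then by measurement phase.
# Names are taken from the slide above; the structure itself is an assumption.
MEASURES = {
    "Resident Perceptions": {
        "PRE": ["# Applicants/Open Positions", "Match Depth", "% Top Medical Schools"],
        "PERI": ["GME Internal HS Survey", "ACGME Survey Compliant Responses"],
        "POST": ["Alumni Survey Overall Satisfaction"],
    },
    "Program Processes": {
        "PERI": ["Faculty Eval of Program", "Resident Eval of Program",
                 "Faculty Publications", "Duty Hr Violations",
                 "ACGME Cycle Length", "# ACGME Citations"],
    },
    "Learning Outcomes": {
        "PRE": ["Core Competency Self-Assessment"],
        "PERI": ["ITSE Scores", "Annual Resident Publications",
                 "Annual Resident Presentations", "# Safety Incident Reports"],
        "POST": ["Specialty Board Scores", "Core Competency Post-Assessment"],
    },
    "Financial/Growth": {
        "PERI": ["# Res in Program", "Grants Awarded",
                 "Subspecialties/Program", "Physical Space/Facilities"],
    },
}

def measures_for_phase(phase):
    """Collect every measure captured in a given phase, across all perspectives."""
    return [m for phases in MEASURES.values() for m in phases.get(phase, [])]
```

Querying by phase (e.g., `measures_for_phase("PRE")`) gives the slice of the report card that can be assembled before a trainee ever arrives.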
“Voice of the Residents”

Programs A–R are each scored against the benchmarks below
(program-level data not reproduced in this copy):

Measure (phase)                              Benchmark
# Applicants/Open Positions (PRE)            >20:1
Match Depth (PRE)                            > 2 sd
% Top Medical Schools (PRE)                  >90%
HS Survey: Overall Satisfaction (PERI)       >=5.0/6.0
HS Survey: Recommend Program? (PERI)         >=5.0/6.0
HS Survey: Teaching Quality (PERI)           >=5.0/6.0
HS Survey: Curriculum Quality (PERI)         >=5.0/6.0
HS Survey: Educational Leadership (PERI)     >=5.0/6.0
HS Survey: Wellness Index (PERI)             >=5.0/6.0
ACGME Survey: % Compliant Responses (PERI)   >80
Alumni Survey: Overall Satisfaction (POST)   >80%
Program Processes

Programs A–R are each scored against the PERI benchmarks below
(program-level data not reproduced in this copy):

Measure                                  Benchmark
Faculty program evals                    >=5.0/6.0
Resident program evals                   >=5.0/6.0
# Faculty publications (last 5 years)    > Inst Avg
Total # duty hr violations               0
ACGME cycle length                       >= Current Inst Avg
ACGME cycle length (* new program)       >= 2.0 yrs
# ACGME citations, last RRC review       0
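Benchmarks like these can be encoded as pass/fail predicates and applied to each program's raw values to produce the `+`/`-` flags a report card displays. A minimal sketch in Python; all program values and the institutional average below are hypothetical, not real Stanford data:

```python
# Each Program Processes benchmark becomes a predicate over (value, context);
# the context dict carries institution-wide comparison values.
BENCHMARKS = {
    "Faculty program evals":       lambda v, ctx: v >= 5.0,                 # >=5.0/6.0
    "Resident program evals":      lambda v, ctx: v >= 5.0,                 # >=5.0/6.0
    "Faculty publications (5 yr)": lambda v, ctx: v > ctx["inst_avg_pubs"], # > Inst Avg
    "Duty hr violations":          lambda v, ctx: v == 0,                   # target: 0
    "ACGME citations (last RRC)":  lambda v, ctx: v == 0,                   # target: 0
}

def score_program(values, ctx):
    """Return a {measure: '+'/'-'} flag for each benchmarked measure."""
    return {m: ("+" if check(values[m], ctx) else "-")
            for m, check in BENCHMARKS.items()}

# Hypothetical program (illustrative numbers only):
program_x = {"Faculty program evals": 5.3, "Resident program evals": 4.8,
             "Faculty publications (5 yr)": 120, "Duty hr violations": 0,
             "ACGME citations (last RRC)": 2}
flags = score_program(program_x, {"inst_avg_pubs": 90})
```

Keeping the thresholds as data rather than scattered `if` statements makes it easy to tighten a benchmark for all programs in one place.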
Program Matrix

[Matrix slide mapping each report card measure to the ACGME core
competencies it informs; the cell-level mappings are not recoverable
from this copy.]

Competencies (columns): Medical Knowledge; Patient Care;
Practice-Based Learning and Improvement; Professionalism;
Interpersonal and Communication Skills; Systems-Based Practice

Measures (rows): Patient safety notes; Resident publications; ITE and
board scores; Growth in # of subspecialty programs; Duty hours
violations; Program evaluations; Teaching quality; Curriculum quality;
Wellness score; ACGME survey compliance
Learning Outcomes

Programs A–R are each scored against the benchmarks below
(program-level data not reproduced in this copy):

Measure (phase)                                    Benchmark
Core Competency Self-Assessment (PRE)              Baseline
ITSE Scores (PERI)                                 > Nat Avg
Annual Resident Publications (PERI)                > Inst Avg
Annual Resident Presentations (PERI)               > Inst Avg
# Valid & Serious Safety Incident Reports (PERI)   > 0
Specialty Board Scores (POST)                      > Nat Avg
Core Competency Post-Assessment (POST)             > Pre Score
Financial / Growth

Programs A–R are each scored against the PERI benchmarks below
(program-level data not reproduced in this copy):

Measure                           Benchmark
# of Res in program               > Prior 10-Yr Avg
Grants Awarded                    > Inst Avg
# Subspecialties / Program        > Prior 10-Yr Avg
Expansion in Clinical Programs    > Prior 5 years
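Once every measure carries a `+`/`-` flag, a program's standing in each of the four perspectives can be rolled up as the fraction of benchmarks met, one simple way to support the cross-program comparisons the case study below describes. A sketch with hypothetical flags, not real program data:

```python
def perspective_summary(flags_by_perspective):
    """Summarize {perspective: ['+', '-', ...]} as the fraction of benchmarks met."""
    return {p: round(flags.count("+") / len(flags), 2)
            for p, flags in flags_by_perspective.items()}

# Hypothetical example for one program:
summary = perspective_summary({
    "Resident Perceptions": ["+", "+", "-", "+"],
    "Program Processes":    ["+", "-", "-"],
    "Learning Outcomes":    ["+", "+", "+"],
    "Financial/Growth":     ["-", "+"],
})
```

A rollup like this makes discrepancies visible at a glance, e.g. strong Learning Outcomes alongside weak Resident Perceptions, which is exactly the "voice of the resident" comparison discussed next.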
Case Study - Stanford
• How the DIO uses the Report Card
How Do We Use this Data?
• Look at Indicators that are Resident Driven –
“Voice of the Resident”
– Is there a discrepancy between the voice of the
resident and the other indicators?
• For example: would the majority of residents not choose
the program again, even though Board Scores are high?
How Do We Use this Data?
• How Do the Programs Compare Against Each
Other?
• How do they compare against their ACGME
Cycles?
What’s Next?
• GME Staff Brainstorming Session
• The Why’s – Why are programs where they
are?
• Where do we need to focus our resources?
Presenting the Data
• Program Directors Monthly Forum
– Protect the Name of the Program
• Growth and Change not Blame
Presenting the Data
• Individual Meetings with Program Directors
– Share Complete Data
Action Planning
• GME Staff working with Program Directors
• Sharing Findings with GMEC and
Administration
Political Fallout
• No Program Director wants to be at the
bottom…
• Defensiveness
• Bragging Rights
And by the way… this will help
you answer:
COMMON INSTITUTIONAL REVIEW DOCUMENT
Question 30b:
“Describe how the sponsoring institution monitors
that each program provides effective educational
experiences for residents that lead to measurable
achievement of educational outcomes of the ACGME
competencies.”
Questions