Postgraduate Medical Training – Evaluation and Audit

Copenhagen Nov 2013
Professor Wendy Reid
Medical Director, Health Education England
Past Vice President (Education), RCOG
© Royal College of Obstetricians and Gynaecologists
UK Specialty Training &
Education Programme
A model of clinical competence
Miller's pyramid: Knows → Knows how → Shows how → Does, with professional authenticity increasing towards the top of the pyramid. The upper levels (Shows how, Does) reflect behaviour (skills + attitude); the lower levels (Knows, Knows how) reflect cognition (knowledge).
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
UK Specialist training programme
• Basic – years 1&2, part 1 MRCOG
• Intermediate – years 3,4&5, part 2 MRCOG
• Advanced – years 6&7, requires 2 ATSMs
minimum, career development and
‘independent’ competencies
• 19 core modules, subject based, includes
professional skills and leadership
Basic Training
• Exposure to the specialty
• Basic emergency obstetric and gynaecology
skills
• Understanding role of the doctor
• Team work – multi-professional, develop leadership skills
• Pass Part 1 MRCOG
Intermediate training
• Builds on basic skills
• Leadership – clinical, administrative
• Competences for normal practice, i.e. day-to-day obstetrics, emergency gynae and core gynae skills
• Pass Part 2 MRCOG
• Workplace-based assessments
• More clinical responsibility, labour ward
leadership and acute gynaecology, develop
interests and choose advanced modules
Advanced training
• Core continues throughout programme!
• Advanced Training Skills Modules
(minimum x2)
• Designed to produce a workforce for the
service and give individuals scope to
develop clinical expertise in specific area
• New ATSMs in development, some
academic, some ‘professional’
Advanced training skills
modules (ATSMs)
• Fetal Medicine
• Benign Vaginal Surgery
• Advanced Labour Ward
Practice
• Advanced Laparoscopic Surgery for the Excision of Benign Disease
• Benign Gynaecological
Surgery: Laparoscopy
• Labour Ward Lead
• Benign Gynaecological
Surgery: Hysteroscopy
• Maternal Medicine
• Colposcopy
• Advanced Antenatal Practice
• Vulval Disease
• Acute Gynaecology and Early
Pregnancy
• Abortion Care
• Gynaecological Oncology
• Sexual Health
• Subfertility and Reproductive
Endocrinology
• Menopause
• Urogynaecology
• Paediatric and Adolescent
Gynaecology
• Benign Abdominal Surgery
• Medical Education
• Domestic violence
Workplace Based Assessments
• All trainee grades in UK
• Varied names but similar principles
• Ongoing challenge of ‘formative vs
summative’
• Monitoring through Royal Colleges/Faculties
Testing formats
Mapped onto Miller's pyramid: the upper levels (Shows how, Does) call for performance / hands-on assessment, while the lower levels (Knows, Knows how) can be tested by written / computer-based assessment; professional authenticity again increases towards the top.
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
Drivers for WBA
• New curricula – trainees need to prove
‘competence’
• GMC – the regulator (and the public) want explicit evidence of competence
• Professional examinations do not test ‘real
life’ skills and performance
• Learning from other systems
• One way of evaluating quality of training
UK experience of WBA
• Began with Foundation Programme (years 1&2 after graduation)
• Launched 2005, integrated assessment
process
• Regardless of post or geography
• Outcomes collated by Sheffield University
• Each training area (Deanery) informed of
‘outliers’
• Large cohort
• Centralised faculty training
Specialty training
• From end of F2 to CCT
• New curricula, launched August 2007
• Assessment tools based on FP
• Many ‘in development’ and specialty specific
• Trainees in mixed programmes, mostly using log books to capture evidence of progress
• Most curricula mandate ‘minimum numbers of
assessments’
• Summarised annually in ARCP (previously RITA)
Challenges of WBA in Specialty Training
• Does it really measure the doctor’s performance?
• Are we sure we are measuring the right
things?
• How often do they need to be done?
• Are they a good measure of continued
competence?
• How do we involve patients?
• How do we ensure trainers are trained and
have the time to do WBAs properly?
• Providing QA-level evidence takes large numbers of assessments – individual assessments have poor reliability (see the illustrative calculation below)
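An illustrative calculation (the single-assessment reliability of 0.2 is an assumption for the example, not a figure from the slides): the Spearman–Brown formula gives the reliability of a composite of n assessments as R_n = n·r / (1 + (n − 1)·r), where r is the reliability of a single assessment. With r ≈ 0.2 for one observed encounter, roughly 16 assessments per trainee are needed to reach a composite reliability of 0.8: R_16 = (16 × 0.2) / (1 + 15 × 0.2) = 3.2 / 4.0 = 0.8.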
Other tools for QA of Training
• Longitudinal analysis of MRCOG results –
cohort comparison, demographic data
required
• Trainee doctor ‘user’ surveys
• Trainee feedback at end of training episodes
• Population wide survey of trainee doctors by
the GMC
Whole QA system
Formal requirements in UK for
Training QA
• Royal College annual report to GMC –
specialty specific
• Deanery (regional) annual report of Education
and Training – all specialties
• Trainees must complete the Annual GMC
survey
• All curriculum changes assessed by GMC
• All examination changes and examination data
submitted to the GMC
Whole QA system
• “The GMC expects medical schools and deaneries to
demonstrate compliance with the standards and
requirements that it sets. To do this, they will need to work in
close partnership with the medical Royal Colleges and
Faculties, NHS trusts and health boards and other LEPs. This
means that QM should be seen as a partnership between
those organisations because it is only through working
together that medical schools, deaneries, Royal Colleges and
Faculties, with LEPs, can deliver medical education and
training to the standards required.” (GMC Quality
Improvement Framework, para. 29)
Whole QA system
• “The GMC quality assures medical education and training
through the medical schools and deaneries but day-to-day
delivery is at LEP level. This delivery involves medical staff,
medical education managers, undergraduate and
postgraduate medical centre staff, other health professions
and employers. Clinical placements, student assistantships,
individual foundation programme and specialty including GP
training are delivered through careful supervision and
assessment by specialists in the relevant discipline advised
and overseen by regional and local staff from the UKFPO, the
Academy of Medical Royal Colleges and the relevant medical
Royal College or Faculty.” (GMC Quality Improvement
Framework, para. 46)
Role of medical royal colleges
• Set curriculum and workplace-based assessments for trainee
doctors according to GMC standards
• Set criteria for progression between stages of training
• Engage with a range of stakeholders to assure quality of
training, particularly 16 UK deaneries
• Provide fora for making policy, sharing best practice and
developing training requirements as clinical practice develops
• Provide specialist faculty development
• Assure the quality of individual trainees (recommendation for
CCT/CESR, MRCOG)
Role of medical royal colleges
• Colleges can also raise concerns about
patient or trainee safety directly with the
GMC or CQC
• Colleges work together on national
medical education policy through
Academy
Governance
• College committees agree national policy on various
aspects of specialty education (e.g. exams,
curriculum, ARCP)
• Network of College Tutors coordinate training in
individual Trusts
• Specialist educational management and leadership
roles created in Colleges (e.g. committee chair)
• Heads of Deanery Specialty Schools jointly appointed
with Colleges
• Colleges report to GMC via Annual Report
QA processes
• ARCP – colleges send specialist assessor to assure
deanery process for progressing trainees
• Quality visits – colleges provide specialist assessor on
request to join deanery visit team
• CCT/CESR(CP) and CESR – recommendation of
individual doctor to GMC for inclusion on specialist
register
• Examination – standard-setting
• Curriculum approval – changes to curricula approved
by GMC
Data on quality
• GMC Trainee Survey
• ARCP outcome data – summary of achievements
annually for every trainee
• Examination data
• Colleges’ own surveys (e.g. Training Evaluation Form)
• Reports from external assessors on local/regional QA
processes (ARCP and quality visits)
• Increasingly linked with quality of care and patient
safety reviews
GMC Trainees’ survey – O&G
Perspectives
• Three specific elements:
  o How O&G trainees compare with other specialties
  o How the results from this year for specific questions compare with those in previous years (looking at the areas previously considered)
  o Specialty Specific Questions
• Total number of trainees responding: 49,000 (95%)
Trainee Evaluation Forms
• Not mandatory
• Might work effectively if based on MSF ‘360’
feedback
• Should be real-time tool for local training
quality management
• Best discriminator is ‘would you recommend
this job to a friend?’
Programme Groups
Indicator score by programme group – National vs This Report:
Programme Group | National Mean | Min | Q1 | Median | Q3 | Max | Lower CI | Upper CI | N | This Report Mean | Lower CI | Upper CI | N
ACCS | 79.66 | 24 | 72 | 80 | 92 | 100 | 78.75 | 80.57 | 1114 | 79.66 | 78.75 | 80.57 | 1114
Acute Internal Medicine | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 77.6 | 75.97 | 79.24 | 302
Allergy | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 82.8 | 74.7 | 90.9 | 10
Anaesthetics | 82.68 | 20 | 76 | 84 | 92 | 100 | 82.16 | 83.2 | 2409 | 82.58 | 82.05 | 83.11 | 2358
Anaesthetics F1 | 75.46 | 20 | 68 | 76 | 84 | 100 | 75.12 | 75.79 | 7077 | 89.92 | 88.5 | 91.33 | 198
Anaesthetics F2 | 78.67 | 20 | 72 | 80 | 88 | 100 | 78.33 | 79.01 | 7138 | 87.79 | 86.32 | 89.27 | 232
Audio vestibular medicine | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 81.6 | 74.11 | 89.09 | 15
Cardiology | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 81 | 79.75 | 82.26 | 550
Cardio-thoracic surgery | 83.67 | 20 | 76 | 84 | 96 | 100 | 83.19 | 84.14 | 3514 | 82.78 | 79.28 | 86.28 | 95
Chemical pathology | 84.93 | 20 | 80 | 84 | 96 | 100 | 83.93 | 85.92 | 672 | 80.45 | 76.81 | 84.09 | 62
Child and adolescent psychiatry | 86.46 | 20 | 80 | 88 | 96 | 100 | 85.75 | 87.18 | 1232 | 87 | 85.32 | 88.67 | 211
Clinical genetics | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 86.78 | 83.84 | 89.73 | 46
Clinical neurophysiology | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 85.22 | 80.66 | 89.77 | 23
Clinical oncology | 84.53 | 20 | 76 | 84 | 96 | 100 | 83.85 | 85.2 | 1332 | 82.3 | 80.79 | 83.81 | 285
Clinical pharmacology and therapeutics | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 78.86 | 71.5 | 86.22 | 21
Clinical radiology | 84.53 | 20 | 76 | 84 | 96 | 100 | 83.85 | 85.2 | 1332 | 85.13 | 84.38 | 85.89 | 1047
CMT | 74.55 | 20 | 64 | 76 | 84 | 100 | 74 | 75.11 | 2730 | 74.55 | 74 | 75.11 | 2730
Community Sexual and Reproductive Health | 78.59 | 20 | 68 | 80 | 88 | 100 | 77.93 | 79.25 | 1915 | 78.46 | 71.5 | 85.43 | 13
Core Anaesthetics | 85.28 | 20 | 80 | 88 | 96 | 100 | 84.48 | 86.08 | 1052 | 85.28 | 84.48 | 86.08 | 1052
CPT | 81.77 | 24 | 76 | 84 | 92 | 100 | 81.06 | 82.47 | 1529 | 81.77 | 81.06 | 82.47 | 1529
CST | 74.52 | 20 | 64 | 76 | 88 | 100 | 73.67 | 75.38 | 1463 | 74.52 | 73.67 | 75.38 | 1463
Dermatology | 81.72 | 20 | 76 | 84 | 92 | 100 | 81.32 | 82.12 | 4766 | 84.29 | 82.38 | 86.21 | 191
Emergency medicine | 80.15 | 28 | 72 | 80 | 92 | 100 | 78.98 | 81.32 | 550 | 80.15 | 78.98 | 81.32 | 550
Emergency Medicine F1 | 75.46 | 20 | 68 | 76 | 84 | 100 | 75.12 | 75.79 | 7077 | 87.81 | 86.05 | 89.57 | 169
Emergency Medicine F2 | 78.67 | 20 | 72 | 80 | 88 | 100 | 78.33 | 79.01 | 7138 | 82.45 | 81.75 | 83.14 | 1199
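The Lower CI / Upper CI columns behave like conventional confidence intervals around the mean score; the slide does not state the method, so the following back-calculation for the ACCS row is only a sketch assuming a normal-approximation 95% interval: half-width = (80.57 − 78.75) / 2 = 0.91, implied standard error ≈ 0.91 / 1.96 ≈ 0.46, implied score SD ≈ 0.46 × √1114 ≈ 15.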
O&G Programme Group
Comparison
Supervision (1)
How would you rate the quality of (clinical) supervision in this post?
Rating | O&G Trainees 2012 | 2011 | 2010 | 2009
Very Poor | 0.26% | 0.91% | 1.31% | 1.38%
Poor | 3.00% | 3.81% | 5.09% | 4.31%
Fair | 15.83% | 18.02% | 20.08% | 22.61%
Good | 50.05% | 48.95% | 49.78% | 49.22%
Excellent | 30.86% | 28.31% | 23.74% | 22.49%
Did you have a designated educational supervisor (the person responsible for your appraisal) in this post?
Yes
2012 – 99.3% (2011 – 99.5%, 2010 – 99.5%, 2009 – 99.8%)
In this post did you have a training/learning agreement with your educational supervisor, setting out your respective responsibilities?
Yes
2012 – 86.4% (2011 – 91.9%, 2010 – 92.6%, 2009 – 91.1%)
In this post did you use a learning portfolio?
Yes
2012 – 92.4% (2011 – 94.7%, 2010 – 89.9%, 2009 – 91.2%)
In this post were you told who to talk to in confidence if you had concerns, personal or educational?
Yes
2012 – 71.2% (2011 – 77.7%, 2010 – 72.2%, 2009 – 68.8%)
Supervision (2)
Did you have a formal meeting with your supervisor to talk about your progress in this post?
Response | O&G Trainees 2012 | 2011 | 2010 | 2009
No, but I would like to | 1.63% | 1.71% | 1.81% | 3.79%
No, but this will happen | 6.47% | 4.17% | 6.65% | 13.83%
Yes, but it wasn't useful | 10.15% | 11.94% | 11.82% | 10.04%
Yes, and it was useful | 81.13% | 82.17% | 79.71% | 72.34%

Did you have a formal assessment of your performance in the workplace in this post?
Response | O&G Trainees 2012 | 2011 | 2010 | 2009
No, but I would like to | 4.84% | 5.17% | 6.94% | 8.27%
No, but this will happen | 8.94% | 6.50% | 10.30% | 20.22%
Yes, but it wasn't useful | 7.26% | 8.94% | 8.17% | 7.60%
Yes, and it was useful | 76.97% | 79.38% | 74.59% | 63.91%
Access to Training (1)
How would you rate the practical experience you were receiving in this post?
Rating | O&G Trainees 2012 | 2011 | 2010 | 2009
Very Poor | 1.05% | 1.02% | 1.64% | 1.79%
Poor | 5.10% | 4.09% | 7.11% | 7.60%
Fair | 22.50% | 25.13% | 26.53% | 30.86%
Good | 42.95% | 47.53% | 44.69% | 42.34%
Excellent | 28.39% | 22.23% | 20.02% | 17.40%

In this post, how often have you worked beyond your rostered hours?
Frequency | O&G Trainees 2012 | 2011 | 2010 | 2009
Never | 2.05% | 3.92% | 4.98% | 3.89%
Rarely | 21.50% | 28.03% | 28.17% | 27.45%
Monthly | 16.82% | 16.43% | 17.72% | 16.15%
Weekly | 46.90% | 38.15% | 36.71% | 37.44%
Daily | 12.72% | 13.47% | 12.42% | 15.07%
Access to Training (2)
How confident are you that this post will help you acquire the competencies you needed at that particular stage of your training?
Response | O&G Trainees 2012 | 2011 | 2010 | 2009
Very confident | 26.60% | 22.85% | 19.04% | 16.81%
Fairly confident | 47.42% | 48.95% | 48.91% | 47.31%
Neutral | 16.40% | 17.57% | 18.27% | 20.63%
Not very confident | 7.89% | 9.10% | 10.89% | 12.62%
Not at all confident | 1.68% | 1.53% | 2.90% | 2.63%

How good or poor was access to each of the following in your post? (2012 question only)
Items rated: simulation facilities, equipped rooms for group teaching, space for private study, internet access, e-learning resources, online journals, library (response categories shown: Very Good, Good).
Working Beyond Competence (1)
In this post how often did you feel forced to cope with clinical problems beyond your competence or experience?
Frequency | O&G Trainees 2012 | 2011 | 2010 | 2009
Never | 44.11% | 26.49% | 27.90% | 26.67%
Rarely | 48.37% | 57.70% | 54.60% | 55.74%
Monthly | 4.89% | 8.93% | 10.28% | 10.11%
Weekly | 2.21% | 6.03% | 6.24% | 6.16%
Daily | 0.42% | 0.85% | 0.98% | 1.32%

In this post how often have you been expected to obtain consent for procedures where you feel you do not understand the proposed intervention and its risks?
Frequency | O&G Trainees 2012 | 2011 | 2010 | 2009
Never | 66.14% | 61.11% | 59.03% | 59.63%
Rarely | 28.50% | 32.58% | 33.64% | 32.66%
Monthly | 3.36% | 3.81% | 4.38% | 4.19%
Weekly | 1.58% | 1.82% | 2.13% | 2.81%
Daily | 0.42% | 0.68% | 0.82% | 0.72%

In this post how often, if ever, were you supervised by someone who you felt wasn't competent to do so?
Frequency | O&G Trainees 2012 | 2011 | 2010 | 2009
Never | 79.71% | 77.17% | 75.01% | 34.83%
Rarely | 16.56% | 20.28% | 22.20% | 33.01%
Monthly | 1.10% | 1.85% | 2.06% | 17.93%
Weekly | 0.21% | 0.64% | 0.61% | 11.75%
Daily | 0.11% | 0.06% | 0.11% | 2.48%
Working Beyond Competence (2)
In this post did you always know who was providing your clinical supervision when you were working? (2009 – 2011 inclusive)
Response | O&G Trainees 2009 | 2010 | 2011
No, there was no-one I could contact | 0.00% | 0.55% | 0.23%
No, but there was usually someone I could contact | 6.96% | 8.34% | 6.03%
Yes, but they were not easy to access | 6.72% | 8.34% | 6.60%
Yes and they were accessible | 86.32% | 82.78% | 87.14%

In this post did you always know who your available senior support was during on call? (2012)
Yes – accessible: 88.70%; the remaining responses (Yes – not easy access; No – but someone to contact; No – no one to contact; N/A) accounted for 6.89%, 3.89%, 0.37% and 0.16% between them.
Undermining – 2012 (1)
How often, if at all, have you been the victim of bullying and harassment in this post?
How often, if at all, have you witnessed someone else being the victim of bullying and harassment in this post?
In this post, how often if at all, have you experienced behaviour from a consultant/GP that undermined your professional confidence and/or self esteem?
Frequency | Victim of bullying/harassment | Witnessed bullying/harassment | Undermined by consultant/GP
Every Day | 0.63% | 0.47% | 0.53%
At least once per week | 1.74% | 2.58% | 2.00%
At least once per fortnight | 2.00% | 2.42% | 2.79%
At least once per month | 3.73% | 7.26% | 5.47%
Less often than once per month | 18.82% | 24.76% | 26.81%
Never | 64.51% | 51.95% | 54.15%
Prefer not to answer | 8.57% | 10.57% | 8.25%
Undermining 2012 (2)
Overall
• 96.0% of trainees said they had never been bullied and/or harassed in their post, or if they had, it happened less than once a month. 1.1% said it happened every day or at least once per week (n=48,512).
• 1.6% said they had witnessed someone else being the victim of bullying and/or harassment in their post every day or at least once per week (n=48,464).
• 92.4% said they had never experienced behaviour from a consultant or GP that undermined their professional confidence and/or self-esteem or, if they had, it happened less than once a month. 1.7% said it happened every day or at least once per week (n=48,785).
O&G (1,902 respondents)
• The equivalent figures for O&G are 83.33% and 2.37%.
• The equivalent figure for O&G is 3.05%.
• The equivalent figures for O&G are 80.96% and 2.53%.
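These O&G figures can be read off the 2012 distributions on the previous slide: 64.51% (never) + 18.82% (less often than once per month) = 83.33% and 0.63% (every day) + 1.74% (at least once per week) = 2.37% for being a victim of bullying; 0.47% + 2.58% = 3.05% for witnessing bullying; and 54.15% + 26.81% = 80.96% and 0.53% + 2.00% = 2.53% for undermining behaviour.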
O&G versus other specialties
Next Steps
• Specialty specific questions to be further analysed
• Breakdown by training level may also be available –
need to discuss with GMC.
• RCOG will be involved with preparation for the 2014 survey.
• Results must be published more quickly.
• An updated trainers’ survey has been discussed – something may be in place within 12 months.
• TEF – a potential method of triangulation?
• RCOG has appointed a Workplace Advisory Officer to combat undermining
Who does what in governance
of training?
• Education Board – RCOG
• Heads of Schools – joint between college and
local regional Postgraduate Dean
• Local Education and Training Boards
(Deaneries)
• Individual hospitals (Local Education
Providers, LEPs) – DMEs or Clinical Tutors
• Individual doctors through trusts or
organisations
Future developments
• Outcome of review of QA system by GMC in Autumn 2013
(note: GMC became regulator in 2010 for PG medical
education)
• Growing recognition of need to clarify role of colleges in QA
• Increased focus on sharing data between deaneries, colleges
and GMC
• Increased emphasis on the role of educational and clinical
supervisors/trainers with consequent impact on service
delivery
• Impact of national policy changes, e.g. Shape of Training, new
English healthcare structure
• Role of HEE, relationship with Colleges, GMC, devolved
nations