Project Outline

HEA-funded collaborative project
(Sandra Dunsmuir, UCL; Sarah Wright, University of Southampton; Cathy Atkinson, University of
Manchester)
Developing objective structured professional assessment and feedback protocols
for professional training of practitioner psychologists
Abstract
The Children and Families Bill will pass into law by 2014, extending statutory protections for
young people with Special Educational Needs and Disabilities until age 25. Consequently, the
core curriculum for training educational psychologists (EPs) will need to be developed
beyond its current focus on work with early years and school-age children. This project
proposes a collaboration between three postgraduate doctoral educational psychology
training programmes, with five distinct phases: developing this new element of the
curriculum; defining relevant professional competencies; developing an objective structured
professional assessment protocol; providing student feedback delivered by podcast; and
evaluating outcomes.
Project aims and rationale
This project proposal is based on the recognised need to develop a comprehensive
framework to assess professional skills/competence/performance in educational psychology,
following Miller’s (1990) model of hierarchical learning. This project aims to improve
assessment and feedback and enhance the student experience with regard to knowledge
acquisition, development of professional competence, performance in a novel assessment
situation and transfer of learning to practice in the workplace.
Trainee educational psychologists undertake placements in local authority settings and it is
in this context that professional competence is currently assessed. A number of national and
international surveys of practitioner psychologists have reported that many students rate the
assessment of their professional competencies as only adequate (Scott, Pachana
& Sofronoff, 2010; Woods, 2013). Such assessment is typically based on a sample of
observations of the student in practice undertaken by placement or clinical supervisors. This
method is not standardised and relies to a great extent on subjective supervisor judgement. It
has been argued that a combination of continuous assessment in practice and the use of an
objective structured protocol is required for fair, reliable and valid assessment of
competence (Rao, 2005).
This project aims to:
1) Focus on developing an evidence-led, high quality curriculum and competency
descriptors for working with the 16-25 population, along with novel assessment
and feedback processes, with student learning and experience at the heart.
2) Work collaboratively across HEIs, with university tutors and local authority
placement supervisors to reshape approaches to assessment, adopting an
innovative, structured and reliable approach that will enhance professional
training, in the form of Objective Structured Professional Assessments (OSPAs).
3) Provide audio feedback via podcasts to students on their performance in the
OSPAs to assist them to self-assess and reflect on their learning through
provision of detailed, individualized, high quality and comprehensible feedback.
4) Promote academic integrity through clear integration of research, theory and
practice, and through dissemination via publications and presentations.
The objectives are to:
1. Engage employers in defining requisite professional competencies for working
with young people aged 16-25 with learning difficulties and disabilities and mental
health needs in the broad contexts in which they live, learn and work.
2. Adapt Objective Structured Clinical Examinations (OSCEs), an assessment
method widely used in medical training, to assess practice competence in
trainee educational psychologists (to be called OSPAs)
3. Develop a range of scenarios, delivered by actors, that will enable key
competencies to be demonstrated by students and assessed by training providers
4. Develop a protocol for assessors on structuring feedback (strengths and
improvement suggestions) in relation to key competencies
5. Deliver audio feedback to students via podcasts, enabling them to reflect on their
video recorded performance in the OSPA whilst contemplating the audio
feedback.
6. Gather data on the reliability of assessments within controlled, simulated
scenarios that have high professional authenticity.
7. Increase the employability of students and ensure that they are equipped and
competent within the changing workplace.
This innovative project has the potential to transform assessment and feedback in
educational psychology and beyond. However, incorporating these new assessment and
feedback methods into professional training, and developing and disseminating the
associated materials and resources, will not be possible without funding.
Methodology
The participants in this project will be Year 2 trainee educational psychologists studying on
three-year postgraduate professional doctorate courses at UCL (15 students), the University
of Manchester (10 students) and the University of Southampton (12 students).
There will be 5 distinct phases:
1. Consulting and developing a curriculum for educational psychology training
identifying key knowledge required for work with young people aged up to 25 years,
their families and educators in the settings in which they live and learn.
2. Consulting and defining the relevant professional competencies required for this area
of work, in terms of both performance and action (Miller, 1990).
3. Developing the OSPA protocol and scenarios in order to assess competencies using
video (Rao, 2005).
4. Devising and implementing an audio feedback system, based on assessment of
competence in videoed OSPA scenarios framed as qualitative comment by local
authority supervisors and delivered electronically to students by podcast (Merry &
Orsmond, 2008).
5. Evaluation activities, to include: calculation of the reliability of the competence
framework for assessing videoed OSPAs; evaluation of student perspectives on
participation, quality of learning and the OSPA as an assessment tool; analysis of
student usage of audio feedback; and qualitative analysis of the nature of the
feedback provided.
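The reliability calculation in phase 5 could, for example, use Cohen's kappa, which corrects raw percentage agreement between two assessors for agreement expected by chance. A minimal sketch in Python; the ratings, the 3-point scale and the variable names are hypothetical, purely for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical ratings."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of items on which the two raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of ten OSPA stations on a 3-point competence scale (0-2)
assessor_1 = [2, 1, 2, 0, 2, 1, 1, 2, 0, 2]
assessor_2 = [2, 1, 1, 0, 2, 1, 2, 2, 0, 2]
print(round(cohens_kappa(assessor_1, assessor_2), 3))  # → 0.677
```

A kappa well above zero indicates the assessors agree more often than chance alone would produce; values near 1 indicate strong consistency across the marking protocol.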
OSPAs are a method that will increase the likelihood that variability in candidate scores can
be attributed to differences in the competence of the students being tested, by ensuring that
all students are examined under exactly the same conditions (same clients, same problems,
same questions) and marked against explicit competence-based criteria. This evidence-based
approach will involve a number of time-limited activities (stations) that each student
completes. It is envisaged that stations will present a range of professional scenarios relating
to practice with young people aged 16-25 years (e.g. history taking, advice giving,
information analysis and interpretation, assessment skills and communication skills). The
running of the OSPAs will be coordinated across all participating universities by a member
of the project team (Jane Lang, a Research Fellow at UCL). Actors will be employed to
enact the scenarios and provide the student with a simulated and controlled situation within
which to demonstrate the required knowledge and competencies. A university tutor will be
present at each station to oversee, monitor performance and ask questions in relation to
the task. Scenarios will also be videoed. The OSPA coordinator will download and edit the
videos and send to the assessors, who will be recruited from local authority educational
psychology services.
There will be three training days, one held at each participating university. These will be
attended by local supervisors who have expressed an interest in becoming OSPA assessors.
In the morning, there will be presentations about the project and interactive training
activities relating to assessment of competence. During the afternoon, there will be live
assessment of video. Assessors will be given footage of OSPA scenarios featuring students
from a different course (to ensure that there are no pre-existing relationships and that the
anonymity of students is protected), along with marking protocols and digital voice
recorders. By the end of the day, all video scenarios will have been watched by assessors
and audio feedback recorded. The OSPA coordinator will be responsible for collating the
video and feedback and ensuring that arrangements are made for the delivery to students at
a pre-determined time.
There will be sessions for students on watching the video, reflecting on the audio-feedback
and generating professional development objectives from this. These will be discussed with
individual students in a series of professional development tutorials at each participating
university.
A wide range of outputs will be generated at each stage of the project.
Curriculum content and competencies
• Consultation across the profession about the content of the curriculum (via
meetings with professional groups, training providers, PEPs, and professional bodies,
e.g. the BPS Division of Educational and Child Psychology (DECP))
• Consultation with two external expert advisors (experienced specialist educational
psychologists who have researched and delivered services to young adults up to 25
years of age)
• Reviewing a range of existing competency frameworks, agreeing structure and
defining competencies
• Choosing the skills through definition of scenarios
• Using an expert panel to standardise the assessment
Materials that will be generated:
1. Guide for students
2. Guide for actors involved in OSPA scenarios
3. Training for assessors (on video appraisal and feedback)
4. Guide for video assessors on framing audio feedback in relation to
competencies (strengths and areas for development), including guidance on the
technicalities of transferring digital audio files
5. Guide for students on how to self-evaluate their performance in the videoed OSPAs,
reflect on the assessor’s audio feedback and derive professional development
objectives from this information
Timescales and outputs:
June–Sept 2013 – Attend meetings and conduct a survey of views across the EP
profession about curriculum content and core competencies for 16-25 work. Publicise the
project and set up meetings to recruit volunteer assessors.
Early Sept 2013 – Recruit student representatives.
Sept–Oct 2013 – Meetings with experts to review the consultation and draft the core
curriculum, competencies, key scenarios and assessment criteria.
Nov 2013 – Consult with a medical advisor/educator experienced in OSCEs about the
content of scenarios and the practical management and co-ordination of the stations
during the OSPAs.
Jan–Feb 2014 – Brief supervisors and recruit them as assessors.
March 2014 – Brief students; practice OSPAs.
April–May 2014 – Finalise arrangements for running the OSPAs.
June 2014 – Run OSPAs for Year 2 TEPs at the three HEIs.
July 2014 – Training days for OSPA assessors. Conduct inter-rater reliability checks on
OSPA assessments.
July–Aug 2014 – Students receive video files and audio feedback from the OSPAs.
Aug 2014 – Feedback questionnaire circulated online to TEPs, with completion requested
via Opinio.
Sept 2014 – Students to run focus groups.
Oct–Nov 2014 – Data analysis and writing up.
Dec 2014 – Dissemination event at UCL for delegates from UK educational psychology
postgraduate professional training programmes.