PAPER 12/PC/33
Assessment and Academic Feedback
Student Views at Annual Lectures 2012
A summary is given first, followed by more detailed findings and an Appendix describing the format of the interactive lectures.
Between June and December 2012, in the 2012-13 academic year, David Hope (DH) and Helen Cameron (HSC)
met with every student year group to discuss their assessment and to facilitate a discussion on feedback. This
innovation is part of an ongoing effort by the School and University to better understand student
perspectives on assessment and feedback, and to advise students of work in progress. It is expected that in
future all students will have a similar opportunity to discuss assessment and feedback every year.
Summary of principal findings
1. Students were unclear on many aspects of their assessment, especially how we arrive at their marks.
2. There were many complaints about poor feedback, particularly after MCQ and OSCE exams but also
during and at the end of clinical attachments.
3. However, students were enthusiastic about much of the work that is now underway.
4. The causes of some problems – such as the inability to take away exam papers, or the variability of
feedback from tutors – have often not been explained to students, and doing so may help.
Action points and priorities from findings
1. All Y4 modules will develop formative exams to be delivered online for students to undertake in
their own time in 2013-14. Detailed formative feedback will be offered via OSCA-FM.
Development will be taken forward to allow students to see how their own performance compares
with that of peers.
2. Y2 will pilot a formative exam in Semester 2 of 2012-13, using OSCA and OSCA-FM, delivered
online for students to undertake in their own time.
3. The Y1 formative exam will compare use of tags and OSCA-FM.
4. All OSCA exams will have 2 or 3 questions asking students to indicate the fairness of the exam and
to offer feedback on troublesome or confusing questions.
5. Summative exams will in time have every question tagged with enough information to give students
a helpful breakdown of performance.
6. A formative approach to miniCEX assessments will be further debated with staff and students in Y3-5.
7. Further exploration of ways to assess PPD in Y3-5 more reliably is required.
8. The balance between global (tutor-dependent) marks and those from more objectively marked tests
should be reviewed.
9. Research / development funding will be sought for administrative help to trial the return of detailed
feedback from OSCE stations, based on tags.
10. Discussion threads on assessment and feedback will be opened on EEMeC for each year group to
encourage and support discussion on these matters.
11. Information to clinical tutors needs to emphasise the School’s principles of assessing students against
the standards expected at their stage, and of awarding an A-grade to 10-15% of all assignments.
12. Meetings with whole year groups will be repeated next year.
******************************************************************
Findings
Participation
- Participation tended to be extremely good for years 1-4, but was very poor for year 5.
- Students suggested two reasons for this: many year 5 students felt they were unlikely to benefit
personally from new feedback innovations (because of timing), and year 5 students were much more
likely to be dispersed over a large area.
- Participation was active, and many students were willing to discuss the topics in depth and at length.
Assessment
- Most students were aware of the general purpose of assessment.
- Students tended to become more aware of standard setting (both conceptually and on a practical
level) as they advanced through the course, with Y1s knowing very little and Y5s having some
detailed knowledge.
- However, knowledge of specifics was very scarce, and none could name the standard-setting
technique used in most of our MCQ exams.
- Some students – especially in years 1 and 3 – seemed to be under the impression that standards were
adjusted so that a certain number of candidates always failed, which is untrue.
- Students – especially in later years – noted that there were frequent grammatical/proofreading errors
in exam papers and that this caused unnecessary worry.
- Students wanted access to past papers and were worried that leakage of questions meant a small
number of people knew in advance what would be in the exams.
- Students in clinical years frequently emphasised concern over variability in the marks given at the
end of a module. They were particularly concerned that some tutors would give high global
judgements even when the student went on to fail the written (MCQ) or OSCE exam, and wanted
tutors to try to be more accurate.
- Students wanted, wherever possible, to receive feedback only from those who had observed them,
regardless of whether that person was a consultant or registrar.
- Variation among tutors was a particularly frustrating point for many students, especially where
such global marks comprised a significant fraction of the total module mark. Some tutors always
gave a particular grade to the entire group, or claimed never to give ‘A’ grades. It would appear that
such tutors did not appreciate that medical students should be assessed against the standard
expected of them at that stage, and that 10-15% of students should be awarded an A-grade for any
one assignment or component. Although students recognised these staff misconceptions, they felt
unable to question them.
- Students frequently suggested converting the Personal Professional Development (PPD) assessment
to a Pass/Fail decision with a narrative commentary, though several other students remarked that
the professionalism and personal skills assessed through the PPD mark were key to being a good
doctor.
Feedback
- Feedback on summative assessment should be timely, but only after the Board of Examiners (BoE)
sits. Immediate feedback was considered too stressful.
- There was some disagreement on feedback to failing students. While some felt it was appropriate
that such students received more support, those who had just passed felt that, in comparison, they
had little to guide future improvement.
- Students felt the ability to see their own answers was critical, as their knowledge and understanding
might have changed since the exam.
- Tagging – providing scores for individual domains of the exam – was considered useful, but
perspectives on the appropriate specificity varied. Generally, students felt that more general tags
would help when they did poorly, while more specific tags would help identify the relatively smaller
domains of weakness when they did well overall.
- Being able to relate their performance to that of the rest of the class (especially with tagging) was
seen as important.
- In whole-class feedback sessions, many discussions around questions were seen as irrelevant and a
waste of time when students had answered them correctly. On the other hand, other groups noted
that time constraints tended to force a focus on a very small number of questions.
- Students could not remember which answers they had given in the exam, did not always have
their own responses at the feedback session, and were not permitted to write things down. Some
groups (especially year 2) noted that where there was an attempt to give them their own mark sheets,
these were often difficult to follow and understand, and suggested better methods be developed.
- As a result, most students found class-wide feedback sessions unhelpful and claimed this was the
reason for low attendance at such events.
- Most students were not aware of the level of concern about leakage expressed by many staff, so
were unclear why they were unable to write things down or receive their exam papers back.
- Students suggested that personal tutors might go over summative feedback with them and show them
their answers (but not let them take anything away).
- The idea of organised PBL-style groups discussing feedback was popular in the early years but much
less so in the later years.
- Students responded very negatively to feedback being organised out of hours (evenings or
weekends), as some student commitments (especially work) were difficult to move.
- Notably, the only exception to this was Y1: most did not offer an opinion on out-of-hours feedback
when asked to vote on it.
- The idea of a formative exam for each module was very positively received by all years, especially if
it mimicked the final exam.
- Students noted that in the past formative exams were often much easier, and had a much narrower
focus, than the real thing.
- Students were made aware of the feedback log and asked to complete the feedback questionnaires
when they appeared, so as to share best practice.
Drs David Hope and Helen Cameron
Centre for Medical Education
4 January 2013
APPENDIX
Format of Meetings
Each meeting was held in a lecture theatre, lasted an hour, and took the form of an interactive
lecture, with many discussions, addressing the following:
- Range of purposes of assessment
- Overview of assessment methods and components for the year
- Deciding the pass score (concentrating on the Modified Angoff Method) and the number of A
grades
- Calculating the academic score for the allocation of Foundation posts, and the award of
MBChB with Honours
- Academic feedback as an integral part of teaching
- The tutor’s dilemma: challenging feedback improves performance but may not be
appreciated by students, and staff have difficulty presenting it constructively
- An overview of recent research on feedback in the Edinburgh MBChB
- Open discussion on the problems with assessment and/or feedback
- More focussed discussion on draft proposals for feedback on summative MCQ exams
and formative exams using OSCA-FM (Online System for Clinical Assessment in
Feedback Mode) and the tagging of MCQ questions
Contemporaneous notes were taken by DH and HSC and were used to construct a record of
each meeting and then this report.