External Examiners' Overview Report 2012-13

UNIVERSITY OF BRADFORD
Overview of the External Examiners’ Annual Reports 2012/2013
Prepared for University Learning and Teaching Committee
Introduction
This paper provides a summary of External Examiners’ reports for undergraduate
and postgraduate taught provision submitted to the University of Bradford for the
academic year 2012/13.
The University of Bradford currently employs 203 External Examiners for
undergraduate and postgraduate taught programmes (6 of these did not see any
work in 2012/13 as the modules they examine did not have any students enrolled on
them).
All External Examiners are required to provide a written report, using a standard
university template (appendix 1) at the end of the year.
The External Examiners' annual report template was amended for 2012/13 to
enhance the clarity of information collected in relation to provision delivered in
partnership with others. The amended template was distributed via email to External
Examiners in June 2013 for completion by 30 September 2013 for undergraduate
programmes and by 30 November 2013 for postgraduate programmes.
For each report an Action Plan has been developed by programme teams that
details the response to External Examiner comments. Completed Action Plans have
been sent to External Examiners along with a response from the Pro-Vice-Chancellor (Learning and Teaching) on any issues of institutional concern.
Institutional oversight of the processes related to External Examiner reports and
responses has been maintained by the Academic Quality & Partnerships Office
(AQPO). University Learning and Teaching Committee oversees the governance of
these arrangements and reports annually to Senate.
A central record of current External Examiners including contracts and fees is
maintained by AQPO. In addition a central database of University staff holding
external examining posts is maintained in AQPO to facilitate the identification of
potential conflicts of interest when considering appointments and to support
University staff in those roles.
The Guide to External Examining for Taught Programmes provides essential
information for examiners and staff on all aspects of external examining at Bradford
and is available in hard copy, as a PDF or a Word document, or on the AQPO
website.
The Board of Examiners Information Pack is updated annually and sent
electronically to all External Examiners. This contains all the information they need
for Assessment Committee/Board of Examiners processes along with related policy
and regulations.
The External Examiners & External Experts Sub-committee, a sub-committee of
University Learning and Teaching Committee, recommends for approval External
Examiner and External Expert appointments, and oversees the production of the
annual overview of reports and amendments to the Guide as required.
External Examiner reports and actions are shared with student representatives
through Staff Student Liaison Committees and Student Forums and will be included
in the Programme Enhancement Planning process for all programmes.
Summary of Actions from 2011/2012 Overview Report
Following approval by Learning and Teaching Committee (LTC) of the 2011/12 Overview Report and Update Report, an Action Plan was created and has been updated on completion of the actions (appendix 2).
Outstanding actions:
Action 1: Review Fee Structure
It was planned to review the fee structure during the 2012/13 academic year for
implementation in 2013/14. However, LTC approved the recommendation from the
External Examiners & External Experts Sub-committee (EEEESC) to widen the
review and to produce an enhanced, more comprehensive, External Examiner Policy
during the 2013/14 academic year for implementation in the 2014/15 year.
This will include clarification of the roles, fees and workloads of External Examiners,
improvements to the mentoring system, clarification of the nature of the work that
External Examiners are required to carry out in relation to assessment types, student
meetings etc. and will incorporate the information currently in the Guide.
Action 2: Distribution of Annual Monitoring Reports
Annual Monitoring Reports have been replaced in 2012/13 by twice yearly
Programme Enhancement Plans and the process for the distribution of these to
External Examiners for both home and partner provision will be included in the
External Examiner Policy.
Action 5: Enhancement of Guide and Module Pack
The Guide to External Examining for Taught Programmes was updated in 2012/13
and distributed to External Examiners, staff and partners at the start of the 2013/14
academic year.
The guide now includes an enhanced checklist of information for External
Examiners, the academic calendar for the year, the questions included in the
updated report template as well as an update of processes. Consideration of a
module pack for documentation for External Examiners had been suggested by
some External Examiners in their 2011/12 reports and this will take place as part of
the Policy discussions for 2014/15 implementation.
Summary of findings from External Examiners’ Reports 2012/2013
The following table details the number of External Examiner reports, by School,
received for the 2012/13 academic year.
School      Reports Received*                   Reports Outstanding
            Total    PG     UG     Both         Total    PG     UG     Both
EDT         11       3      4      4            2        0      0      2
SCIM        20       2      12     6            1        1      0      0
SOH         38       14     17     7            1        0      0      1
SLS         43       12     30     1            2        2      0      0
SOM         34       9      14     11           0        0      0      0
SSIS        49       11     38     0            2        1      1      0
CS/LSS      3        2      1      0            0        0      0      0
TOTAL       198      53     116    29           8        4      1      3
*Including second reports submitted by 9 examiners, but excluding 6 examiners who did not review
any work in 2012/13 as the modules they examine did not have any students enrolled on them.
The submission rate of External Examiners' annual reports at this stage is higher than in 2011/12 (96.1% in 12/13; 93.1% in 11/12), although the submission rate for 2011/12 increased to 99.4% by June 2013, with just one report not received that year.
The missing 11/12 report was the final report for an examiner whose contract ended
in September 2012. This external examiner has not been re-appointed and has not
been paid a fee for 2011/12. The External Examiner was responsible for a range of
modules within the School and had been in contact throughout the year with reports
to the module teams at the time of assessment. They did not have responsibility for
any programmes in the School and were not required to attend the Board of
Examiners meetings. Their module feedback was incorporated into the
enhancement planning for the School.
For 2012/13, External Examiners were reminded of the deadline dates when the
online report was launched in June 2013 and again near to the submission dates.
External Examiners who had not submitted their report by the deadline were
contacted by the Academic Quality Officer (External Examiners) by email, on a
monthly basis.
Despite this rigorous management process, 8 reports (1 undergraduate and 7
postgraduate) remain outstanding for 2012/13. These cases are being investigated
by the Academic Quality & Partnerships Office. The examiner comments will be
added to the overview once their reports are received and the School plans will be
amended if necessary as a result of these extra reports.
External Examiner reports are acknowledged by email on receipt and an Action Plan
is created and forwarded to Schools for their response to the issues. The completed
Action Plan is sent to the External Examiner along with a response from the Pro-Vice-Chancellor (Learning & Teaching).
The deadlines for return of the action plans were detailed clearly when the reports
were issued to Schools to ensure a timely response. Schools are working on the
responses to the Action Plans but there are at present 15 plans that have not been
completed and returned to the External Examiner:
School    Responses awaited
SOM       1
EDT       5
SCIM      7
SSIS      1
GS        1
Total     15
The outcomes of the analysis of the available data from External Examiners' Annual Reports 2012/13 are largely positive and demonstrate that the standards of the University of Bradford's awards are set and maintained at the appropriate level and that the assessments are valid, reliable and fair.
Once again External Examiners have confirmed that the standard of the University
Awards and student performance is in line with the standards of awards in other UK
Higher Education Institutions.
External Examiners agreed that assessments were appropriate in relation to learning
outcomes and that marking schemes were in use. External Examiners reported that
they saw good evidence of feedback to students and that this feedback was more
consistent and related to the intended learning outcomes.
External Examiners who examine programmes in Partner institutions noted some
improvement in the consistency of marking and the standard of papers produced.
Most Examiners did not see any significant differences in design or delivery of
programmes at Partner institutions and recognised the University's efforts to
enhance this provision.
Examiners are positive about the support they receive to assist them in fulfilling their
roles and responsibilities.
An Action Plan will be compiled from the 2012/13 findings to inform future planning
for the University.
Recommendations to Learning and Teaching Committee:
• Examination Feedback
  A. Consideration of annotation of examination scripts and return of these scripts to students.
  B. Consideration of provision of Model Answers for markers/examiners which could also be used as general feedback to students.
• Consideration of provision of IT support and Blackboard training for External Examiners accessing work online.
• Consideration of removal of the requirement for External Examiners to attend all Boards of Examiners in favour of compulsory attendance at the main board with the choice of attending subsequent or supplementary meetings.
• Review of the process for approval of assessment drafts by External Examiners prior to issue to students.
Sue Ledger
Academic Quality Officer (External Examiners)
7 May 2014
2012/13 Report Findings
At its February 2013 meeting Senate approved changes to the University academic
regulations for undergraduate and postgraduate taught programmes for
implementation from 27th February 2013 for the 2012/13 academic year (SA22/1213). The changes were communicated to all staff, students, partners and External
Examiners prior to the implementation. These changes took cognisance of previous
feedback from External Examiners.
The changes included fixing the previously flexible award boundaries at 48%, 58%
and 68%, amending the weighting of the calculation for Honours to a 20/80
(previously 30/70) split between level 5 and level 6 modules, discounting the lowest
20 credit module mark in the calculations at both levels 5 and 6, amending the
number of attempts permitted and clarifying the carry forward rules.
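To make the revised Honours calculation easier to follow, the sketch below works through the rules described above: discount the lowest 20-credit module mark at each of levels 5 and 6, then combine the level averages on a 20/80 split. It is an illustrative reading only; the function names, the credit handling and the example marks are assumptions for demonstration, not the University's actual classification algorithm, which also applies the fixed award boundaries, attempt limits and carry-forward rules set out in the regulations.

# Illustrative sketch only: a simplified reading of the Honours calculation
# described above. Names, credit handling and example marks are assumptions,
# not the University's actual algorithm.

def weighted_level_average(module_marks):
    """module_marks: list of (mark, credits) tuples for one level.
    Discards the lowest-marked 20-credit module, then returns the
    credit-weighted average of the remaining modules."""
    twenty_credit = [m for m in module_marks if m[1] == 20]
    remaining = list(module_marks)
    if twenty_credit:
        remaining.remove(min(twenty_credit, key=lambda m: m[0]))
    total_credits = sum(credits for _, credits in remaining)
    return sum(mark * credits for mark, credits in remaining) / total_credits

def honours_mark(level5_marks, level6_marks):
    """Combines the level averages using the 20/80 split between level 5 and level 6."""
    return 0.2 * weighted_level_average(level5_marks) + 0.8 * weighted_level_average(level6_marks)

# Example: level 5 averages 63.4 and level 6 averages 68.0 after discounting,
# giving a weighted mark of 67.1, above the fixed 2:1 boundary of 58 but
# just below the First boundary of 68.
level5 = [(55, 20), (62, 20), (64, 20), (66, 20), (60, 20), (65, 20)]
level6 = [(70, 20), (66, 20), (72, 20), (64, 20), (60, 20), (68, 20)]
print(round(honours_mark(level5, level6), 1))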
Some External Examiners have commented upon these changes in their reports, but
over 95% agreed that overall the processes for assessment and determination of
awards were sound and fairly conducted.
Maintaining academic standards
8. In your view are the standards set for the awards appropriate for qualifications at this level, in
this subject and against the national benchmarks and the Framework for Higher Education
Qualifications?
Yes: 98.0% (194)    No: 2.0% (4)
Overall External Examiners again confirmed that the standards of the University
awards are in line with national benchmarks, the Framework for Higher Education
Qualifications and where appropriate meet PSRB requirements.
Further investigation was undertaken regarding the four responses that indicated No
to question eight. One examiner remained unconvinced about the use of the 40%
pass mark for Masters which has been in operation since 2006. However, following a
discussion and clarification from the Dean of School the examiner was happy to
confirm the standards of postgraduate awards as well as undergraduate awards.
The other three examiners who replied No to question eight were concerned that
since the classification boundaries had been fixed at 48, 58, 68, to remove the
previous inconsistent use of the 2% discretion that Boards of Examiners had, the
marking criteria should also be amended.
The University position is that although the degree boundaries have been fixed and
the method of calculation of the final degree classification amended, the marking
criteria should not reflect degree classifications but should enable students to
achieve across the full range of marks. Degree classifications represent the final
categorisation of the complete range of desirable learning outcomes and should not
be applicable to single pieces of work.
External Examiners confirmed that the assessments were set appropriately for the
intended module learning outcomes and curriculum objectives and were pleased to
see a large range of targeted, well planned and imaginative assessments that would
allow students to demonstrate their knowledge but also the application of that
knowledge.
“Assessments are clearly pegged to learning outcomes and, both within individual modules and
across the diet, offer students a variety of opportunities to develop and demonstrate their learning
and skills. The mixture of examined and coursework assessed modules, and of assessment lengths
and requirements, is healthy and appropriately diverse. Assignment titles and questions are
consistently sufficiently open to invite complex responses and to allow students to negotiate their
own levels of engagement effectively.”
External Examiners highly praised some modules and offered specific advice to
further enhance other modules. This advice has been acknowledged by the Schools
for future planning.
External Examiners were impressed with the variety of assessments employed, but a few questioned whether there was too much variety and whether this created overlap, with learning outcomes being assessed more than once.
This will be considered by Schools in the review of programme specifications in line
with the new curriculum framework and the specific instances reported by External
Examiners have been addressed in the Action Plan responses.
The External Examiner for the MA Research Methods programme has proposed that
the Graduate School should oversee the assessment of the research proposal
aspect of assessment on the grounds of equity and consistency. This will be
considered as part of the review of Postgraduate research.
Comparability of standards
9. In relation to your comments and judgements in this report, are you satisfied that the
standards of student performance are comparable with similar programmes or subjects in other UK
institutions with which you are familiar?
Yes: 98.0% (194)    No: 2.0% (4)
External Examiners were happy to confirm the standards of performance of the
University’s students in comparison with other UK institutions.
“Standards of student performance are comparable with similar programmes in the UK. The best
work is impressive, professional, technically assured and sophisticated. Work at the lower end
demonstrates appropriate minimum standards.”
Of the four responses that indicated No, two examiners had also answered No to the
Academic Standards question eight above.
With the exception of one External Examiner there were no issues raised regarding
the assessment of clinical modules. That External Examiner was concerned that, although students were performing at a high level in the "Clinical Practice" modules, the "Clinical Practice" assessment was not comparable with the other forms of assessment and the higher marks awarded could inflate the overall degree classification. This was addressed in the Action Plan
response and the Dean contacted the examiner concerned to discuss the response
in relation to the “Clinical Practice” modules.
The CPD Strategy External Examiner recognised that the modules within the CPD
framework were evaluated by the External Examiners associated with those
programmes but that “there is no single mechanism or collective analytical report
which scrutinises the outcomes of the School’s CPD provision as a combined entity”
and did not feel able to comment on standards of student performance or gauge
trends. This was discussed at a meeting with the School, AQPO and the Examiner
and a process was agreed to enable the Examiner to compare performance between the component parts of the framework as well as with other
CPD frameworks.
Working with Partners
External Examiners who examine work from Partner institutions were asked to
comment specifically about the processes and student performance in comparison
with home provision.
Two thirds of the External Examiners overseeing provision at Partner institutions
reported that they identified no difference in the performance of students compared
with students studying at Bradford.
Where differences were identified, student performance at the Partner institutions was reported to be weaker than home student performance, with module averages slightly lower, a narrower range and fewer top-end results. In some programmes there appeared to be a longer tail of resitting students, and in others marking was occasionally over-generous.
Examiners suggested that differences in cohort sizes and modes of attendance,
differing expectations in different education systems or student ability could account
for some of the differences rather than any assessment bias, and commented that
these differences did not seem dissimilar from comparable cohorts in the sector.
Examiners recognised the support provided by Bradford staff and noted
improvements from previous years.
“MDIS has had some issues as explained above but have made some good progress. Hong Kong
programmes are on the route to become well managed but attention to the quality of students
recruited is still needed. India and Oman sites have had some signs of under-performing and
issues in assessments, but there have been signs of improvement and I attach this to the age of
these programmes.”
External Examiners also noted that poor use of English, poor grammar and
referencing and less time spent reading material in libraries could also be a factor.
Over 90% of External Examiners did not see any difference in the design or delivery
of programmes and modules as compared with Bradford and three quarters of
External Examiners did not identify any difference in the assessment, marking or
feedback processes at Partner institutions.
Of the differences identified by External Examiners, one Examiner did note the use
of outdated programme materials in one location and some Examiners suggested
that further clarity of guidance and contextualisation was needed for some modules
at the Partner institutions.
Examiners reported that the standard of marking and feedback had significantly
improved this year but suggested that staff at some Partner institutions would still
benefit from further training in the following areas:
• assessment processes and marking technique – some marking was identified as over generous or erratic
• feedback technique – some feedback was inconsistent and of lower quality
• dealing with plagiarism and poor academic practice – paying attention to sources rather than just the percentage match in the originality report
Examiners recognised the efforts made by the Bradford teams to ensure students
were awarded appropriate and fair marks and that the feedback was consistent and
useful and noted the extra workload for Bradford staff to do this.
“There is still the need for UoB staff to amend marks and remark some scripts which is a burden
upon staff. However it is notable that this is showing signs of improvement. Once again I would
like to commend UoB staff for their diligence in making sure that all students do actually get what
they deserve in assessments.”
The comments and suggestions made by External Examiners will be included in
programme enhancement planning as appropriate and will again be the focus of
Partnership Board meetings during 2013/14.
External Examiners were mostly provided with adequate information about Partner
programmes and samples of work were clearly identified.
“The programme leader was extremely helpful and provided me with all the information that the
students have access to via the VLE.”
However, some Examiners only received the information over the course of the year
and noted that it would have been more helpful to receive the information about the
programmes and the partnership at the start of the year.
The Programme teams concerned have been made aware of this in the Action Plans
and they will address this during the 2013/14 year.
A small number of programme specific issues have been raised by External
Examiners and these have been addressed by the Programme teams at Bradford in
their enhancement planning process.
Measuring achievement, rigour and fairness
22. In your view are the overall processes for assessment, examination and the determination of
awards sound and fairly conducted?
Yes: 96.0% (190)    No: 4.0% (8)
Overall, External Examiners were confident that the academic processes were
sound and fairly conducted.
Of the responses that indicated No, three External Examiners had concerns about
the accuracy of the data presented at a specific meeting but this was resolved at the
time by the School concerned and had been caused by an error in the progression
and award rules on the database.
Five Examiners made comments regarding the changes to the academic regulations
but these comments raise no cause for concern regarding academic standards.
Once again, over 95% of External Examiners received sufficient details of
programme structure, content and assessment methods and had access to an
appropriate number and range of scripts and marking schemes to enable them to
carry out their duties effectively.
“All documents were made available to me from the start, plus appropriate module and
assessment information are provided again with scripts requiring review”
External Examiners reported being able to contact academic and administrative staff
throughout the year and found this particularly useful in addition to the written
information.
Some examiners found the information in relation to Partner programmes to be less
comprehensive than for home programmes and some found the programme
structures unclear.
Examiners who accessed student work via Blackboard and Pebblepad found it very
useful to be able to choose their own samples. Some examiners did encounter
technical problems at first, and the University needs to recognise this in future
initiatives, but these were resolved quickly and to the satisfaction of the examiners.
The checklist of documentation required by External Examiners has been enhanced
in the 2013/14 Guide to External Examining for Taught Programmes and individual
issues have been raised with the Programme teams through the Action Plans.
85% of External Examiners were sent some or all drafts of proposed assessment
tasks including almost 100% of proposed examination papers.
“Proposed assessments and exams were sent in sufficient time to be able to comment on the
content and where required offer advice”
Coursework drafts not received were either as a result of the assessment brief not
changing from the previous year, the draft being generic as for the dissertation, or
the External Examiner being appointed after the briefs had been approved by their
predecessor.
External Examiners requested that draft original and supplementary examination
papers be sent for consideration at the same time.
Again some External Examiners recommended that more robust checking of these
papers takes place prior to submission to the Examiner.
90% of External Examiners had sufficient time to examine assessments, although some External Examiners did report that the turnaround was very tight and felt that the requirement now also to examine level 4 work increased both the workload of the External Examiner and the pressure on internal staff.
“I wish to commend the team for the timely nature of their marking process allowing scripts to be
despatched well in advance of the Assessment Committee/Board of Examiners. Overall, this
process was much improved this session.”
A few External Examiners examined samples on site before meetings and found this
to be a useful exercise as it removed the need to post large amounts of work out and
enabled discussion with module leaders if required.
All Schools had been advised to produce an Assessment Schedule at the start of the
year to enable staff and External Examiners to manage their workloads in relation to
marking and attendance at assessment meetings and this will be reinforced in
2013/14.
Comments made by External Examiners on assessments mainly related to the
information supplied by Schools, for example module pack content or submission
methods, and these have been referred to the Schools concerned in the action
plans.
Student performance
Overall, External Examiners were impressed by the quality of the knowledge and
skills demonstrated by students. They reported a range of student abilities with the
best students displaying a professional attention to detail, a positive engagement
with their topic and a range of critical thinking skills. These students were able to
synthesize knowledge and present it in a readable form and to apply theory to practice.
“The quality of work demonstrates students' ability and willingness to apply their acquired
knowledge to real life issues. This is evident from the level of analysis carried out on case studies
and group assignments. Moreover, the quality of group assignments is testimony of good team-working, a skill that will be useful in work situations."
External Examiners report that, in general, weaker students typically demonstrate
those weaknesses seen in students at other institutions offering similar programmes.
Examiners concurred with markers’ comments about areas for improvement in
student work and commended the feedback given to students. In particular, for those students who struggled, some module teams provided clear feed-forward guidance on how to develop further.
Particular areas for enhancement include further support for students in areas such
as academic writing skills, examination preparation, early identification of underperformance and guidance on the use of web resources.
The University has already reinforced its support for students in these areas and
provides workshops and one-to-one support, via the English Language Centre,
Learner Development Unit and Library, as well as a range of online tutorials in
Blackboard.
The Student Experience and Success team is committed to enhancing the student
experience and supporting attainment and employability of all students.
External Examiners were very confident that the intended learning outcomes had
been achieved by the majority of students and that a wide variety of assessment
methods had been employed. The successful students understood and were able to
articulate and apply the knowledge learned.
“There is diverse range of achievement across the modules and profiles that I have viewed. This,
in my opinion demonstrates student capability to achieve the intended programme outcomes.
Again, this is comparable to my experience across other institutions.”
Where students did not meet the intended learning outcomes, Examiners felt that this was due to a lack of engagement by the students with the module and assessment criteria, poor English skills (although it was noted that this had improved since last year) or lower entry levels via Clearing.
These concerns and the comments will be addressed in the Action Plans and
discussed at University Learning and Teaching Committee as required.
Assessment and Feedback
External Examiners confirmed that work was marked thoroughly and to a high
standard, stating that comments made were justified in a professional and impartial
manner. They were pleased to see feed forward comments for the best work as well
as for poorer work.
“Marking was at all times thoroughly professional in terms of impartiality. The anonymising of all
coursework is an excellent practice. All the work I saw had been double marked and the
negotiation process to agree grades was always clear. Marking and feedback is thorough and full,
providing students with an excellent resource for the improvement of their work.”
External Examiners were confident that robust moderation of assessments had
taken place but some Examiners found that this was not always well documented on
the work, creating difficulties for them. Some External Examiners also felt that, where first class work was identified, higher marks could be given for excellent work.
Some External Examiners noted that not all work was anonymous, and it was reported as best practice where work had been marked anonymously. It was suggested that the Board of Examiners sheets should also be anonymous; whilst that option is now available on the Board sheets, it is not generally utilised by Schools.
Examiners felt that students would benefit from also receiving feedback on their examinations and suggested that the University consider this. In the meantime, it was also suggested that markers put a one-line explanation of the marks awarded at the end of each script.
These concerns will be addressed in the Action Plan from the School and the
Examiners’ suggestions discussed.
In general, External Examiners found no issues with the use of assessment criteria in the marking of student work. Although the criteria were well put together and sound, Examiners felt that there was room to enhance them by making them more specific and clearer, showing where and how students were expected to gain marks, and by ensuring greater consistency in approach.
It was suggested that clarifying the boundaries between "outstanding" and "excellent" work in the 70–80% and 80–100% brackets would be helpful for module markers and would encourage them to use the top range of marks.
External Examiners were generally pleased with the quality and thoroughness of
feedback to students, noting improved consistency in feedback from the previous
year with typed comments showing the best practice and evidence of more formative
feedback being provided.
Examiners supported the provision of electronic feedback, in a few cases finding
handwritten feedback difficult to read.
“Feedback is clear, thorough and extensive. It always provides a clear rational for the mark
awarded and is effective in its balance of formative and summative information.”
Examiners reported that there was some excellent feedback to students, particularly
feed forward comments, with one marker prefacing their feedback with a statement
about feed forward and how students should use the summary feedback to improve.
The External Examiner saw this as good practice.
“Feedback was comprehensive with staff using online feedback to students. Summative feedback
was clear to the students indicating what they had done well. The quality of formative feedback
was excellent and clearly guided students to what they needed to improve, and how to improve
their performance in the future. The results of this could be seen in the improvement of
referencing as students progressed on the programme.”
Some Examiners reported, however, that there was still room for further
enhancement of feedback given to students and the University has as a continuing
priority the provision of timely, good quality and effective feedback to students.
During 2013/14 the University will seek to further enhance the quality of assessment
feedback provided to students and this will be addressed by University Learning and
Teaching Committee with Schools, the Centre of Educational Development and
students.
A few External Examiners questioned the use of the online feedback tool and its
usefulness for students in terms of retrieving their feedback and using that feedback
in future work.
External Examiners agreed with all the marks and grades awarded to students.
“Across all 25 modules, I have had no reason to disagree with the level of marking and its
consistency. Indeed, I would like to commend the School for achieving such a high degree of
consistency across the array of modules, modes and geographical locations.”
Examiners did advise that markers ensure the grades awarded match the feedback
given and that the full range of marks is used. This had improved from previous
years but it was noticeable that markers were prepared to utilise the highest marks
but not the lowest.
Examiners suggested that good practice would be to hold refresher workshops for
staff on overall process and criteria.
Board of Examiners Meetings
The University of Bradford regulations do not permit Boards of Examiners to take
place without the participation of an External Examiner at the meeting.
The role of the External Examiner at the Boards of Examiners is detailed in the
Board of Examiners Information Pack and is outlined in the standard statement
which is read out at each Board.
Participation in Boards of Examiners meetings:
Yes: 161 (94.2%)    No: 10 (5.8%)    Not applicable: 26
Participation in Boards of Examiners has increased to 94.2% in 2012/13 (74.7% in
10/11, 78.3% in 11/12). For 2012/13 the figure includes participation by Skype and
video/telephone conference as well as attendance in person. (Skype 18, telephone
conference call 24, video 2, in person 131)
Of the 10 Examiners who did not participate:
5   health related (1 SOH, 1 SOM, 1 SSIS, 2 SCIM)
2   clash of dates (1 SSIS, 1 SOH)
3   not aware that they needed to attend due to the nature of the awards they examine (3 SOH)
However, Schools confirmed External Examiner participation at every meeting; where one Examiner was not able to participate, the meeting took place only if another Examiner was in attendance to confirm the process and interpretation of the regulations. A verbal or written report was received from the External Examiners not able to attend prior to the confirmation of awards to students.
26 External Examiners examine modules only and are not required to attend the
Board of Examiners meetings.
Module External Examiners are invited to attend the Assessment Committees for the
modules they examine and are required to confirm the marks prior to release to
students.
External Examiners' comments were mainly positive about the assessment process
at Bradford. They were confident that the Assessment Committees and Boards of
Examiners were conducted professionally, within the regulations and with due
consideration of all relevant details. The administration of the meetings was efficient
and well organised with fair and consistent decisions made throughout.
“The board itself was well administered, efficient and professionally conducted. Regulations were
obviously well understood by the administrators of the board. There was discussion where
necessary and consistent recommendations were made across students. Tutors were
knowledgeable about their personal tutees.”
External Examiners who participated by Skype reported that this worked well with
just a few Examiners who had minor difficulties with the technical aspects or sound.
The use of Skype will be monitored further during 2013/14, and in particular the
resources required, and will be reported in the 2013/14 overview report.
The University implemented changes to the regulations in February 2013 which fixed the boundaries for awards. The University will continue to ensure that External Examiners understand their role and responsibilities, especially in the context of the changes to the University Academic Regulations that removed the element of discretion related to degree classification boundaries, and to communicate its expectations in relation to Board of Examiners participation.
Additionally, several External Examiners were unhappy about the reinforcement of the attendance requirement in 2012/13 for supplementary Boards of Examiners and requested that the University revise this regulation to require attendance at the main Board only, with the choice of attending subsequent or supplementary Boards. This will be considered against the QAA Quality Code indicator at a Learning and Teaching Committee meeting in 2013/14.
There was a specific issue with accuracy of data at one Board of Examiners meeting
and this has been addressed and resolved by the School concerned.
Suggestions made by External Examiners regarding further enhancement of the
Assessment Committee/Board of Examiners process, for example for extra
information on Board spreadsheets, have been referred to the appropriate
department for action.
Support and Enhancement
29. Is there anything that the University could do to support you in your role as External
Examiner?
Yes: 28.8% (57)    No: 71.2% (141)
Overall External Examiners felt supported to carry out their duties both by the
University and by Schools and new External Examiners found the Induction session
welcoming, informative and useful.
“I feel very well supported and have clear access to administrative and academic guidance
whenever this is required.”
“I have felt well supported in this first year. I found the induction session welcoming and useful.”
Suggestions made by External Examiners, for example for Blackboard training, have
been highlighted for consideration in the University’s planning process.
Examiners highlighted other specific enhancements related to their programmes/modules, for example earlier confirmation of deadlines and flexibility of sample provision, and these will be addressed by the relevant Schools in the Action Plan responses.
30. Are you satisfied that any comments made in your previous reports have been responded to,
as appropriate?
Yes: 61.1% (121)    No: 8.1% (16)    Not applicable - new examiner: 30.8% (61)
Although the majority of External Examiners were satisfied that their comments had been acted on, the embedding of the Action Plan process should further reassure Examiners that their comments and suggestions have been considered in the planning process.
Of the 16 responses that indicated No, ten had received partial responses and two
had not received a detailed enough response to their issues from the Schools
concerned.
Two External Examiner reports had raised issues with the University database last year and, although a response had been received, it was noted this year that the situation had not improved. This will be investigated by the SAINT team.
Two External Examiners had specific issues that had been reported in 11/12 and in 12/13 that they felt had not been addressed: one concerned financial support for a programme in terms of working environment and equipment; the other concerned reviewing the marking around the pass mark. The Schools concerned are aware of these issues.
External Examiners whose term of office ended in 2012/13 have commented on the
welcoming and valuable experience they gained at Bradford and have commended
the professionalism and commitment of academic and support staff in supporting
students to achieve excellent awards. They commented on the improvements made
over their term of office, particularly in the quality of assessment and feedback and
recognised the commitment of staff to working with partners.
Examiners offered encouraging advice and comment for the incoming External Examiner, and one External Examiner was particularly pleased to be able to meet with the incoming External Examiner, an activity they regarded as good practice.
“It has been a pleasure to serve as an External Examiner on this programme. The academic staff
at Bradford (admirably supported by administrative and Registry staff) have an obvious
commitment to this franchise agreement. This dedication is also excellently upheld by core staff at
Singapore. The Programme Director shows an admirable dedication to MDIS students and I am
very grateful for his support throughout my tenure. I wish them all well for the future.”
Other areas of good practice identified by External Examiners include:
Teaching and Learning
• Engagement with industry and placing employability at the heart
• Good spread of modules and mix of creative assessments
• Engagement in critical reflection
• Relevant, well designed and supported placements/work based learning opportunities
• Team based learning approach
• Hands on research opportunities
• Peer group learning opportunities
• Self-assessment by students enhancing ability to reflect on own performance and identifying areas needed to develop
• MCQ assessment enabling good students to perform highly but discriminating good from weak students and allowing large cohorts to be marked efficiently
• Student University Ambassador Scheme building contacts with local schools
• Design Show showcasing student work to the outside world and also acting as a marketing and recruitment event
• Dragons Den exercise on Pharmaceutical Innovation module
• Group work project for Organisation and Capacity Building module is a good example of development of teamwork, consultancy and research skills
• Buddying system for new mentors
Processes
• Drop Box providing quick and easy access for Examiners to all course material, background information and full access to student work
• Attending poster presentations and meeting students
• Fixing of boundaries making classification easier and decisions more consistent
• Small panel of markers overseeing the marking of dissertations ensuring consistency of marking
• Conduct of Boards of Examiners and Assessment Committee meetings
• Good communication with the External Examiner
Feedback
• Constructive and comprehensive feedback with clear feed-forward
• Electronic marking and feedback
• Online portfolio system – Pebblepad
• Audio feedback
• Blackboard feedback
Further areas for enhancement identified by External Examiners in their final
comments:
• Further Language support for International Students – The English Language Centre and International Study Centre provide specific support for international students as well as support for home students.
• Earlier identification and intervention for non-attenders – The University is embedding the new approach to attendance monitoring and engagement and the new "check in" system will be implemented during 13/14. The project team are also working to enhance the reporting mechanisms associated with the system.
Quantitative Data
7. It is your responsibility to make a declaration of interest where circumstances may give rise to a
reasonable apprehension that a potential conflict of interest could be construed so as to threaten
the quality assurance processes of the University. Do you have a potential conflict of interest to
declare?
Yes: 1.0% (2)    No: 99.0% (196)
7.a. If YES please give details:
I have received honoraria and research support from many companies
One of the candidates on the module works on the Leeds Neonatal Unit. I declared this at the
examiners board meeting. She was not a borderline candidate, so no discussion required.
8. In your view are the standards set for the awards appropriate for qualifications at this level, in
this subject and against the national benchmarks and the Framework for Higher Education
Qualifications?
Yes: 98.0% (194)    No: 2.0% (4)
9. In relation to your comments and judgements in this report, are you satisfied that the
standards of student performance are comparable with similar programmes or subjects in other UK
institutions with which you are familiar?
Yes: 98.0% (194)    No: 2.0% (4)
13. Did you receive sufficient details of the programme structure, content and methods of
assessment etc. to enable you to carry out your duties?
Yes: 94.9% (188)    No: 5.1% (10)

14. Were you sent drafts of proposed assessments?
Some: 30.3% (60)    All: 54.0% (107)    None: 15.7% (31)

15. Did you have access to a sufficient number and range of scripts to enable a view to be formed on the marking?
Yes: 95.5% (189)    No: 4.5% (9)
16. Did you have sufficient time in which to examine proposed assessments and scripts?
Yes: 89.9% (178)    No: 10.1% (20)

17. Did you see evidence of the use of marking schemes?
Yes: 94.9% (188)    No: 5.1% (10)
21. Were you satisfied with the administration of the assessment process as a whole?
Yes: 96.0% (190)    No: 4.0% (8)

22. In your view are the overall processes for assessment, examination and the determination of awards sound and fairly conducted?
Yes: 96.0% (190)    No: 4.0% (8)
Working with Partners
The number of External Examiners working with Partner Institutions and the number of responses to questions 23 to 28 differ, and so the statistics here are suspect. However, the comments made have been included in the summary.
23. Did you see any difference in the performance of students at the Partner Institution(s)
compared with students studying at the home University?
Yes: 32.9% (26)    No: 67.1% (53)
24. Did you see any difference in the design or delivery of the programmes/module(s) compared
to similar ones offered at Bradford?
Yes: 8.2% (6)    No: 91.8% (67)
25. Did you identify any differences in the assessment, marking or feedback processes at the
Partner Institution(s)?
Yes: 26.0% (19)    No: 74.0% (54)
26. Were you provided with adequate information about the Partnership e.g. handbooks, guides
etc?
Yes: 73.0% (54)    No: 27.0% (20)

27. Was the Partner Institution work clearly identified in the sample?
Yes: 85.3% (58)    No: 14.7% (10)
29. Is there anything that the University could do to support you in your role as External
Examiner?
Yes: 28.8% (57)    No: 71.2% (141)
30. Are you satisfied that any comments made in your previous reports have been responded to,
as appropriate?
Yes: 61.1% (121)    No: 8.1% (16)    Not applicable: 30.8% (61)
The Annual Report form will be updated for 2013/14 to further identify comments in relation to Partner institutions and Board of Examiners data, as well as separating the quantitative data and qualitative comments sections.
Trend Data
Trend Data from External Examiner Overview Reports (% answering Yes, unless stated otherwise)

1. In your view are the standards set for the awards appropriate for qualifications at this level, in this subject and against the national benchmarks and the Framework for Higher Education Qualifications?
   2009/10: 97.5   2010/11: 98.8   2011/12: 97.1   2012/13: 98.0

2. In relation to your comments and judgements in this report, are you satisfied that the standards of student performance are comparable with similar programmes or subjects in other UK institutions with which you are familiar?
   2009/10: 94.2   2010/11: 95.3   2011/12: 95.9   2012/13: 98.0

3. Did you receive sufficient details of the course structure, content and methods of assessment to enable you to carry out your duties?
   2009/10: 97.5   2010/11: 94.7   2011/12: 93.6   2012/13: 94.9

4. Were you sent drafts of proposed assessments?
   2009/10: 76.7   2010/11: 74.7   2011/12: 72.1   2012/13: 84.3

5. Did you have access to a sufficient number and range of scripts to enable a view to be formed on the marking?
   2009/10: 97.5   2010/11: 100    2011/12: 94.8   2012/13: 95.5

6. Did you have sufficient time in which to examine proposed assessments and scripts?
   2009/10: 88.3   2010/11: 94.1   2011/12: 86.6   2012/13: 89.9

7. Did you see evidence of the use of marking schemes?
   2009/10: 97.5   2010/11: 95.9   2011/12: 94.8   2012/13: 94.9

8. Did you attend the Board of Examiners meeting(s)?
   2009/10: 69.2   2010/11: 74.7   2011/12: 78.3   2012/13: -

9. Were you satisfied with the administration of the assessment process?
   2009/10: 97.5   2010/11: 98.2   2011/12: 93.6   2012/13: 96.0

10. In your view are the processes for assessment, examination and the determination of awards sound and fairly conducted?
   2009/10: 97.5   2010/11: 99.4   2011/12: 96.5   2012/13: 96.0

11. Are you satisfied that any comments made in your previous reports have been acted on, as appropriate?
   2009/10: 61.7   2010/11: 65.9   2011/12: 71.5   2012/13: 61.1

12. Is there anything that the University could do to support you in your role as external examiner? (% answering No)
   2009/10: 72.5   2010/11: 64.2   2011/12: 74.2   2012/13: 71.2
The trend data mainly shows an upward trend.
Question 10 shows a slight downturn, most likely due to nervousness around the implementation of the amended regulations, which should even out as the regulations are embedded. In addition, an individual one-off event at a Board of Examiners in one School may have affected the percentage this year.
Question 11 shows a downturn, most likely due to the embedding of the Action Plan process, which formally records all issues and responses, reminding External Examiners year on year of previous issues.
Question 12 shows a downturn, with the majority of requests relating to IT support and Blackboard training as electronic marking and sampling become embedded in the system, requests to remove the necessity to attend all Boards of Examiners, and individual administrative requests.
External Examiners Annual Report 2012/13
Appendix 1
Submission Date:
Section 1. General Information
1. Name of External Examiner:
2. Institution or workplace of External Examiner:
3. Please list individually all Home programmes/modules being examined: (if None please indicate
this)
4. Please list individually all Partner Institution programmes/modules being examined: (if None
please indicate this)
5. Type of provision examined:
Postgraduate
Undergraduate
Both (Postgraduate & Undergraduate)
6. Academic School:
School of Engineering Design and Technology
School of Computing, Informatics and Media
School of Health Studies
School of Life Sciences
School of Management
School of Social and International Studies
Corporate Services/Learner Support Services
7. It is your responsibility to make a declaration of interest where circumstances may give rise to a
reasonable apprehension that a potential conflict of interest could be construed so as to threaten
the quality assurance processes of the University. Do you have a potential conflict of interest to
declare?
Yes
No
7.a. If YES please give details:
Section 2. Maintaining Academic Standards and the Comparability of
Standards and Student Performance
8. In your view are the standards set for the awards appropriate for qualifications at this level, in
this subject and against the national benchmarks and the Framework for Higher Education
Qualifications?
Yes
No
8.a. Please give details:
9. In relation to your comments and judgements in this report, are you satisfied that the
standards of student performance are comparable with similar programmes or subjects in other UK
institutions with which you are familiar?
Yes
No
9.a. Please give details:
10. Please comment on the appropriateness of the assessments to the learning outcomes of
modules:
11. Please comment on the extent to which students are achieving intended programme learning
outcomes as observed in the assessments you have seen:
12. Please comment on the quality of knowledge and skills (general and subject specific)
demonstrated by students, identifying any particular strengths and weaknesses that the students
have:
Section 3. Measuring achievement, rigour and fairness
13. Did you receive sufficient details of the programme structure, content and methods of
assessment etc. to enable you to carry out your duties?
Yes
No
13.a. Comments:
14. Were you sent drafts of proposed assessments?
Some
All
None
14.a. Comments:
15. Did you have access to a sufficient number and range of scripts to enable a view to be formed
on the marking?
Yes
No
15.a. Comments:
16. Did you have sufficient time in which to examine proposed assessments and scripts?
Yes
No
16.a. Comments:
17. Did you see evidence of the use of marking schemes?
Yes
No
17.a. Comments:
18. Please comment in each section on the marking of scripts and other assessed work in relation to:
18.a. the impartiality and thoroughness of marking
18.b. the assessment criteria
18.c. the thoroughness of feedback to students
18.d. your agreement with the marks and grades
19. For Programme External Examiners only: Did you participate in the Board of Examiners meeting(s)?
Main Board only
Main Board and Supplementary meeting(s)
Supplementary meeting(s) only
None
19.a. If NONE please state why you did not participate:
19.b. If you participated, how did you do this?
In person
By Skype
Video conference
Telephone conference
Other
19.c. If you participated please comment on the operation of the Board of Examiners meeting(s):
20. For all External Examiners: Did you attend the Assessment Committee(s)?
Yes
No
Not applicable
20.a.
21. Were you satisfied with the administration of the assessment process as a whole?
Yes
No
21.a. Any further comments:
22. In your view are the overall processes for assessment, examination and the determination of
awards sound and fairly conducted?
Yes
No
22.a. Any further comments:
Section 4. Working with Partners
23. Did you see any difference in the performance of students at the Partner Institution(s)
compared with students studying at the home University?
Yes
No
23.a. Comments: (Please comment separately for each partner)
24. Did you see any difference in the design or delivery of the programmes/module(s) compared
to similar ones offered at Bradford?
Yes
No
24.a. Comments: (Please comment separately for each partner)
25. Did you identify any differences in the assessment, marking or feedback processes at the
Partner Institution(s)?
Yes
No
25.a. Comments: (Please comment separately for each Partner)
26. Were you provided with adequate information about the Partnership e.g. handbooks, guides
etc?
Yes
No
26.a. Comments:
27. Was the Partner Institution work clearly identified in the sample?
Yes
No
27.a. Comments:
28. Please add any further comments you would like to make: (Please comment separately for each Partner)
Section 5. Final Comments
29. Is there anything that the University could do to support you in your role as External
Examiner?
Yes
No
29.a. Comments:
30. Are you satisfied that any comments made in your previous reports have been responded to,
as appropriate?
Yes
No
Not applicable - new examiner
30.a. Comments:
31. If this is your FIRST report did you receive a copy of the previous External Examiner's report?
Yes
No
Not applicable - new programme
31.a. Comments:
32. If this is your FINAL report as an External Examiner, please provide a brief overview of your term of office, which may be passed on to the incoming External Examiner.
33. Please provide any additional comments for the attention of the School or the University under
the following headings:
33.a. Exemplary Practice -- Please add any comments not identified elsewhere in this report
33.b. Commendations -- Please add any comments not identified elsewhere in this report
33.c. Recommendations -- Please add any comments not identified elsewhere in this report
Appendix 2

Action 1 (b/f 10/11 plan): Review Fee Structure; 11/12 plan: include workload review
Evidence: a. Report to May LTC for discussion; b. Amend Guide; c. Inform appropriate external examiners
By Whom: AQU    By When: For 13/14 session; now 14/15

Action 2 (b/f 10/11 plan): Review the process for distributing and monitoring external examiners' receipt of AMRs for collaborative provision
Evidence: a. Collate current process from Schools and Partnerships Office; b. Agree process for distribution
By Whom: AQU    By When: ongoing

Action 3: Update 11/12 Overview with missing reports comments and statistics
Evidence: a. Chase missing reports and review when received; b. Overview update to go to May LTC; c. Report in 12/13 overview
By Whom: AQU    By When: May 2013    Completed: Y

Action 4: Continue to monitor external examiner attendance at Boards of Examiners
Evidence: a. Collate and monitor attendance sheets from Boards of Examiners meetings
By Whom: AQU    By When: 12/13    Completed: Y

Action 5: Enhance information sent to external examiners
Evidence: a. Update Guide with enhanced checklist of information; b. Consider module pack for documentation for external examiners
By Whom: AQU    By When: For 13/14 session    Completed: Y

Action 6: Communicate regulations changes to external examiners
Evidence: a. Update Guide with new regulations and associated processes; b. Email external examiners with details of changes
By Whom: AQU    By When: Spring 2013    Completed: Y

Action 7: Ensure external examiner access to University IT systems
Evidence: a. Agree process with HR/IT services; b. Include User ID and password in appointment letter with instructions for registration
By Whom: AQU    By When: For 13/14 session    Completed: Y

Action 8: Review and enhance the external examiner online report
Evidence: a. Amend questions and order for clarity; b. Add areas covered in report to Guide; c. Enhance instructions for completion of online report
By Whom: AQU    By When: May 2013    Completed: Y