
Review of UCL’s Quality Management and Enhancement Structures and Processes

Report to Academic Committee (via Quality Management and Enhancement Committee) by the Quality Strategy Review Group

Summary

The Quality Strategy Review Group was charged by the Vice-Provost (Academic and International) to conduct the review of UCL’s quality management and enhancement structures and processes envisaged in the Provost’s June 2005 document ‘UCL: the White Paper - one year on’. This report:

- sets out the background to and context of the review [paragraphs 1-12]
- comments on the specific areas of QME operations that the review has addressed [paragraphs 13-60]
- concludes with proposals for ‘efficiency gains’ for consideration by the December 2005 meetings of Quality Management and Enhancement Committee and, subsequently, Academic Committee [paragraphs 61-72] – including (i) maximising the scope of the new system of Annual Monitoring Reports to help streamline QME processes, (ii) specific changes to the Internal Quality Review process, and (iii) discontinuation of the process of Quinquennial Programme Review.

November 2005


Key to abbreviations

AC  Academic Committee
ACWGEAP  Academic Committee Working Group on Examinations and Assessment Policy
AM  Annual monitoring
AMR  Annual Monitoring Report
CBE  College Board of Examiners
DAPs  Degree-awarding powers
DfES  Department for Education and Skills
DTC  Departmental Teaching Committee
EISD  Education & Information Support Division
ESCILTA  Executive Sub-Committee on Innovations in Learning, Teaching and Assessment
FTC  Faculty Teaching Committee
HEFCE  Higher Education Funding Council for England
HEI  Higher education institution
ILTS  Institutional Learning and Teaching Strategy
IQA  Internal Quality Audit
IQR  Internal Quality Review
LTS  Learning and Teaching Strategy
PDExSCo  Programme Development Executive Sub-Committee
PIQ  Programme Institution Questionnaire
PoT  Peer observation of teaching
PRSG  Programme Review Steering Group
QA  Quality assurance
QAA  Quality Assurance Agency for Higher Education
QME  Quality management and enhancement
QMEC  Quality Management and Enhancement Committee
QPR  Quinquennial Programme Review
QSRG  Quality Strategy Review Group
SEQ  Student Evaluation Questionnaire
SES  Self-evaluative Statement
SMT  Senior Management Team
SR  Subject Review
TQA  Teaching Quality Assessment
UPR  Unified Programme Record
UPRSG  Unified Programme Record Steering Group
WPOYO  UCL: The White Paper – one year on


Background

1 The Provost’s June 2005 document ‘UCL: the White Paper - one year on’ includes the following statements:

… UCL is committed to teaching excellence and innovation and ensuring that our curricula have an international outlook. As Heads of Department suggested at the recent meeting [on 1 June 2005], there are several opportunities to achieve greater efficiencies in terms both of academic time and administrative overhead. …

… Professor Worton [Vice-Provost (Academic and International)] will … facilitate urgent consideration of the ideas that arose at the meeting on 1st June with regard to quality assurance measures. …

2 The ‘Next Steps’ action schedule which concludes the WPOYO charges Professor Worton to conduct a ‘review of UCL’s teaching quality assurance mechanisms with a view to achieving efficiencies’ with a timescale of ‘Immediate commencement; report by end-November 2005’. Professor Worton convened a preliminary meeting in July 2005 - with key committee Chairs (Professors Chris Carey, Vince Emery and Peter Mobbs) and officers - to discuss the parameters of and process for the review. The meeting agreed that it made sense for the review to be conducted in conjunction with the (ongoing) development of a UCL Quality Management and Enhancement Strategy and, therefore, for the review to be undertaken by the existing Quality Strategy Review Group of QMEC.

3 The QSRG is chaired by Professor Chris Carey (Head, Department of Greek and Latin and Chair of QMEC). The other academic members of the Group are Professor Vince Emery (Pro-Provost and Chair of several committees with responsibilities for teaching quality management and enhancement) and Professor David Saggerson (Head, Department of Biochemistry and Molecular Biology). The Group’s membership also includes Jason Clarke (Academic Services) and Gary Smith (Registry) from the Corporate Support Services. Tim Perry (Director of Academic Services) was co-opted to membership of QSRG for the purposes of conducting this review of QME structures and processes.

4 The timescale for the review indicated in the WPOYO was subsequently adjusted: while Professor Worton is being asked to give quarterly reports to the Provost’s Senior Management Team on ‘improvements in efficiency [in] … teaching, quality review and timetabling’, an interim report from AC on ‘Review of teaching quality (and enhancement) processes’ is due to be submitted by Professor Worton and Tim Perry for consideration at the SMT meeting on 11 January 2006. QSRG has therefore worked to produce a report for submission to the QMEC meeting on 5 December 2005 – for transmission to the AC meeting on 14 December 2005 and thence to the Provost’s SMT meeting of 11 January 2006.

Context, scope and underlying principles for the review

5 The scoping meeting convened by Professor Worton in July 2005 noted the following national developments in recent years and their internal consequences at UCL:


- direct external QA burden on departments/disciplines has reduced with the demise of universal Subject Review by the QAA
- the shift away from universal SR was part of a ‘deal’ in which DfES/HEFCE agreed to discontinue SR as HEIs had claimed that they had in place their own robust QA procedures – that claim to be tested in future in QAA Institutional Audit. (UCL was one of the HEIs which lobbied vociferously for the end of universal SR on those grounds)
- emerging evidence of a widespread view within the UCL academic community (and common to other research-intensive HEIs) that, with the demise of universal SR, the requirement for internal QA has diminished
- a continuing need for robust internal QA procedures to withstand external scrutiny by the QAA in Institutional Audit – since failure to provide evidence of these could lead to renewed debate on the scope of external QA review with the risk that this could jeopardise the autonomy of HEIs.

6 The scoping meeting also noted the concerns that had been expressed in recent senior academic discussion fora at UCL, including the Heads of Departments meeting of 1 June 2005 referred to in the WPOYO. These included concerns, repeatedly expressed during the academic year 2004-05, about certain aspects of UCL’s programme development processes and, in particular, the Programme Institution Questionnaire [see paragraphs 19-21 below]. UCL’s policy on double marking of examinations and assessed work had also been questioned at the meeting on 1 June 2005 [see paragraphs 41-42 below]. The scoping meeting understood that other issues raised on 1 June 2005 were already the subject of separate reviews or action plans (eg UCL’s student admissions processes, rationalisation of the number and structure of teaching programmes).

7 The scoping meeting agreed that both the national developments etc and the internal concerns about specific existing QME processes, as outlined above, should form the context of the QSRG review.

8 A further issue raised on 1 June 2005 was the division of administrative workload within departments between academic staff and support staff. The scoping meeting saw this as a matter of departmental management rather than a consequence of existing QA policies or other requirements – and therefore as falling outside the scope of the QSRG review. Nevertheless, the scoping meeting agreed that a fundamental issue for review by QSRG was the question of whether UCL should continue to profess and implement an academic-led QA strategy.

9 The scoping meeting in July 2005 was followed by two meetings of QSRG devoted to the review of QME processes and structures (on 5 October 2005 and 10 November 2005).

10 At the first of these meetings QSRG noted the above contexts and other main features of UCL’s existing QME processes and structures and agreed that the following specific areas should be included in its review:

- the Programme Institution Questionnaire
- Internal Quality Review
- Quinquennial Programme Review
- the proposed new system of Annual Monitoring Reports on teaching and learning operations in departments
- double marking of examinations and other assessed work
- student feedback processes, including student evaluation questionnaires
- peer observation of teaching
- discipline-based learning and teaching strategies
- operation of the pyramidal structure of Academic Committee, Faculty Teaching Committees and Departmental Teaching Committees.

11 QSRG also confirmed at the first of its two meetings what it saw as essential characteristics of UCL’s QME structures and processes – ie that these should be designed:

- to be fit for purpose
- to ensure maximum integration and minimum duplication of effort for academic departments and faculties
- to ensure that UCL retains the increased degree of independent responsibility for its QA operations that has resulted from the demise of universal subject-level review by the QAA.

12 Except where a clear distinction is being made between (i) quality management and (ii) quality enhancement processes, the term ‘QA’ (rather than QME) is used for convenience in this report hereafter.

An academic-led QA strategy

13 The critical self-analysis which formed the basis of UCL’s successful application for taught and research degree-awarding powers in November 2004 included the following statement:

UCL’s quality management and enhancement agenda is driven by the fundamental aim of providing high quality programmes of study, delivered through first rate research-led teaching and resulting in demonstrably excellent qualifications, based on demonstrably valid assessments. UCL’s commitment to disciplinary focus of the quality agenda has led us to establish two strategic principles in our approach to QME: that it should be led by the academic staff who are active in teaching and research and on whom the delivery of high quality programmes and qualifications depends; and that ownership of quality processes should be devolved to as close as possible to programme level.


14 The implications and consequences of an academic-led approach to QA which impinge on UCL academic staff time include the following:

- discussion on policy development is conducted and major decisions are taken by academics working through the committee system
- there is active involvement of academic staff at all grades in QA operations at department, faculty and institutional levels (eg membership of teams for Internal Quality Review)
- UCL’s central QA process – Internal Quality Review - aspires to be more than mechanical compliance-testing and to promote a constructive dialogue between the reviewers and the academic department being reviewed
- UCL has continued to eschew the ‘quality unit’ approach espoused by many other HEIs, where key decisions on QA matters are taken by a central QA unit staffed by administrative ‘professionals’ with those decisions simply notified to the academic community and reported for information to the relevant committee: UCL Academic Services’ role is essentially (i) to co-ordinate rather than direct QA operations and (ii) to support the academic staff involved in QA processes.

15 The QMEC’s IQR Panel provides a good illustration of the current arrangements. The Panel is responsible for monitoring the IQR process and, in particular, follow-up action taken by departments in the light of IQR reports. Reading these reports and the action plans devised by departments in response to the reports’ recommendations is time-consuming for the membership of the Panel, which is predominantly academic. The Head and/or another representative of the department which has produced the action plan attend a meeting of the Panel to discuss the plan and the department’s perceptions of the IQR process. The modus operandi of the Panel could be changed so that follow-up to IQR became a largely paper-based process, administered and monitored purely by officers, and this would certainly lead to a saving of academic staff time. It would also mean the loss of what seems to be mutually valued dialogue between academic departments which have recently been reviewed and their academic colleagues on the IQR Panel.

16 A distinctive feature of UCL’s QA structures and processes – and which is arguably an indirect consequence of the academic-led approach to QA – is that these are sufficiently flexible to take account of the real pedagogical etc differences which exist between different subject areas. For example: in summer 2003 Academic Committee agreed to introduce a harmonised scheme of award for undergraduate course-unit degree programmes across UCL; the next steps in implementing this decision were summarised in the November 2004 DAPs application as follows:

Movement towards harmonisation of schemes has required the reconciliation of differing and firmly held views across the UCL faculties but good progress has now been made and a new harmonised scheme of award for course unit-based undergraduate degree programmes has been approved by AC and will be implemented with effect from the academic year 2005-06. … CBE, discharging its responsibilities for implementing the policy defined by AC, has discussed with each faculty whether it is content to adopt the new harmonised scheme in full and considered cases where a faculty has asked to negotiate some deviation from the scheme. The deviations negotiated between the faculties and CBE were formally approved by AC in October 2004. We believe that thoughtful negotiation has paid real dividends in this project – not only in developing the new scheme but also in promoting greater awareness and receptivity across UCL of the perceived benefits of harmonisation. AC, in accepting the various faculty variations from the norm, has applied the scheme flexibly and has thus kept faith with faculties by demonstrating that harmony in this context need not mean uniformity.

17 In other words, there is no doubt that negotiating these variations between subject areas has been a time-consuming process for both the faculties involved and the committee responsible for implementing the process (and whose membership is predominantly academic). There is no doubt either that the flexible arrangements agreed and implemented are regarded by those concerned as preferable to the imposition of a non-negotiable standardised requirement across all faculties.

18 Does UCL regard membership of QA-related committees etc and participation in activities such as IQR as an appropriate and worthwhile activity for academic staff? Is activity of this kind to be regarded as evidence of ‘enabling’ that can carry weight in the senior academic staff promotions process? The academic staff involved in the review scoping meeting in July 2005 agreed unanimously that UCL’s approach to QA should continue to be academic-led; and QSRG has addressed specific QA structures and processes accordingly.

It should nevertheless be recognised that if the answer to either or both the above questions is no, it may not be feasible for UCL to continue to implement, as well as profess, an academic-led approach to QA.

Programme Institution Questionnaire

19 As noted above [paragraph 6], the PIQ has attracted particular criticism during the last year – ie criticism of what is seen as the excessive length of the PIQ and the amount of time required in order to complete a PIQ for consideration by the relevant committees at departmental, faculty and institutional levels of UCL. As also noted in the November 2004 DAPs application (and in subsequent UCL discussions of the PIQ):

UCL is in the process of developing a unified programme record, stored in a database and accessible online through a secure system, to replace the paper-based questionnaires used hitherto for programme institution, amendment, review and withdrawal.

20 The development of the UPR is being overseen by a steering group of the Programme Development Executive Sub-Committee, chaired by Professor Vince Emery. A note on progress to date is at Annexe 1. A test version of the UPR is due to be provided to the steering group, following programming work by EISD officers, by the end of December 2005. While the online PIQ will ask no fewer questions than the current Word document version, UPRSG believes that there are significant potential efficiencies to be gained by the implementation of the UPR – but these will largely depend on the acceptance by the Faculty Tutors and/or Faculty Graduate Tutors of increased responsibility in their enhanced role as ‘gatekeepers’ of the process. It has been noted at successive UPRSG meetings that this new role for Faculty Tutors and/or Faculty Graduate Tutors will require some cultural change if the UPR is to be implemented effectively and the benefits of the potential efficiency gains that it offers are to be realised.


21 In other words, QSRG sees an element of reciprocity between the enhanced responsibilities required of Faculty Tutors and/or Faculty Graduate Tutors and the increased efficiency of online PIQs. In view of the particular concerns that have been expressed about the existing PIQ, QSRG believes that the implementation of the UPR should be carefully monitored, that UPRSG should therefore remain in place after the formal launch of the UPR, and that feedback should be invited from faculties and departments throughout the first year of the operation of the UPR so that UPRSG can assess the extent to which potential efficiency gains are actually being achieved.

Internal Quality Review

22 The history and purpose of IQR were summarised in UCL’s 2004 DAPs application as follows:

Internal Quality Review is at the heart of UCL's academic quality operations. It is a crucial source of evidence that our internal QME processes are robust and dynamic. IQR is a rolling programme, which operates on an approximately five-yearly cycle and includes all academic departments of UCL (and a small number of interdepartmental degree programmes). IQR derives from the Internal Quality Audit process, introduced in 1992.

The original IQA methodology succeeded in enhancing quality to the extent that departments mostly implemented the recommendations for improvement suggested in IQA reports, and attracted favourable comment in a number of TQA and Subject Review reports on UCL departments. Nevertheless, we became increasingly concerned that internal criticisms of IQA - as an arid, largely compliance-testing exercise - were justified, and increasingly convinced that a more creative process was needed. … [A] proposed revised methodology … was piloted in one (large and complex) academic department in early 2000 and implemented for the full IQA programme with effect from the academic year 2000-01. The new IQA process was further developed and gradually refined with the experience of the reviews which took place then and in the following session. The process was renamed IQR, to reflect its revised scope and character, from the start of session 2002-03.

23 QSRG endorses the view that IQR is central and crucial to robust QA operations at UCL. The various incarnations of the process have been commended in a succession of external reviews of both institutional and subject-level teaching and learning quality operations at UCL. As noted above, IQR reflects the academic-led QA strategy to which UCL is at present committed. It needs to be recognised in particular that IQR has become more extensive as a direct result of the deliberate reshaping of the process from one that was seen as ‘arid and largely compliance-testing’ to one that aims to be ‘more creative’. To revert to an IQR that is purely compliance-testing could well yield savings in time and effort for both departments and review teams but would reinstate a process previously rejected as too mechanical. It could also run the risk of depriving UCL of compelling evidence for its ability independently both to manage and to enhance quality.

24 QSRG has nevertheless identified four (partly interlinked) areas in which to achieve staff time and resource savings and/or efficiencies in relation to IQR:

- redefining the IQR agenda
- the scope for expanding the role of the IQR administrative secretary in scrutinising briefing documentation on behalf of the team
- the scope for moving to increasing reliance by IQR teams on online materials - rather than hard copy documents supplied by departments - in the pre-visit briefing process
- the duration of IQR visits to departments.

25 The current IQR guidelines set out a list of 24 areas to be covered by IQR. QSRG officers have read the IQA/IQR reports produced for sessions 2002-03 and 2003-04 in order to confirm which of these areas were actually covered by teams. The summary at Annexe 2 confirms that some areas for investigation are much more likely than others to be addressed in IQR reports. The fact that only seven of the 24 items set out in the IQR guidelines are known to have been covered in all IQRs conducted in 2002-04 is evidence to suggest that the current list of ‘mandatory’ items is overambitious. An IQR team’s ability to cover the full list of areas for investigation set out in the guidelines may at present depend to some extent on the duration of its visit to the department but, as noted below [see paragraphs 30-31], there is scope for reducing the length (and increasing the consistency of length) of visits. In short: QSRG is convinced that there is a clear and immediate need to identify a smaller number of core areas for investigation in IQR.

26 Paperwork recently received for an IQR visit to take place during the current Autumn Term comprises: the department’s self-evaluative statement (24 pages); and supporting material comprising 19 sets of documents, which fill a lever arch file. This volume of material, describing the activities of a relatively small department, is not unusual. The briefing material is normally supplied by the department to each of five members of the review team (ie the four reviewers and the administrative secretary); in theory, each member of the team is expected to read the SES and the supporting material in its entirety (although it is unlikely that this always happens and usually unnecessary – given that teams often divide up between them the various parts of the IQR agenda). Once the IQR team has held its planning meeting, it may request additional briefing material from the department. In short: the current arrangements make excessive demands both on departments, to generate documentation, and on review team members, to read it.

27 QSRG proposes the following model for rationalising this process:

- department submits SES in electronic form to IQR team – appending a list of supporting material and supplying web links to departmental website for each item of supporting material
- administrative secretary downloads all supporting material – to be available to reviewers as required
- administrative secretary reads all supporting material relating to core areas for investigation by IQR team and prepares report for reviewers drawing attention to issues arising from this material
- reviewers read the SES in conjunction with the administrative secretary’s report – and decide at their planning meeting whether, in the case of the department concerned, there are areas in addition to the core areas which merit exploration and, if so, how the team will organise its scrutiny of relevant briefing material.

28 While this arrangement would significantly increase the responsibilities of the administrative secretary in Academic Services, it offers the prospect of a reduction in time/resources spent by (i) departments in producing supporting material and (ii) IQR reviewers in reading this material. There is a quid pro quo involved here: the arrangement clearly depends on whether the supporting briefing material required for IQR purposes is actually maintained by departments online.

29 The IQR guidelines currently provide as follows:

The IQR team should visit the Department for not less than one and a half working days (normally consecutive days). The visit will not normally exceed two working days, except in the case of particularly large and/or complex provision. …

30 In the early years of IQA, the visit to the department sometimes lasted only half a working day. Before the introduction of a revised IQA process in session 2000-01, the visit normally lasted no more than one working day. The increased duration of the visit in recent years reflects the evolution of IQA/IQR from a purely compliance-testing exercise to a more ambitious, developmental process, an essential principle of which is to encourage dialogue between the IQR team and members of the department being reviewed. Nevertheless, it is even now not unusual for an IQR visit to be completed in one working day - because of the limited availability of IQR team members and/or key personnel in the department. The fact that IQR visits are of varying length – and that there is not usually any correlation between the size and/or complexity of the department being reviewed and the length of the visit – could be seen as a failing of IQR in that it reflects an unjustified inconsistency of practice.

31 Reduction of the prescribed length of the IQR visit to ‘normally one working day’ (perhaps with occasional exceptions in the case of large and/or complex departments) would therefore eliminate inconsistency, as well as reduce the amount of staff time demanded by participation in IQR.

Quinquennial Programme Review

32 The history and development of QPR are summarised in UCL’s 2004 DAPs application as follows:

The priority assignment for the new Programme Review Steering Group, when it started work at the start of the 2003-04 session, was to review and revise the programme review system so as to ensure, in particular, (i) a more robust and proactive approach, particularly at institutional level, and (ii) increased articulation of the timetables for programme review and IQR. The revised programme review policy and procedure devised by PRSG, and approved by PDExSCo in June 2004, have been implemented from the start of academic year 2004-05. PRSG will henceforth both monitor the implementation of the new process and determine whether individual programme reviews, completed by DTCs and submitted to PRSG via FTCs, are satisfactory and/or raise either specific or generic issues requiring further attention. A five-year programme review timetable to 2008-09 - devised with the IQR timetable for the coming years in mind - has also been put in place. In order to achieve greater articulation of the two timetables, there will need to be in the next few years a degree of flexibility in defining intervals between a department’s latest and next programme reviews. However, it is intended that a department’s programmes will normally in future be reviewed in the session preceding its next IQR - so that the IQR team can receive, through the availability of recent programme review documentation, reasonably current information on the department’s programmes (as part of the team’s briefing material for IQR).


The revised programme review process retains many of the key features of the previous process, notably the involvement of an external scrutineer. As well as introducing (through the role of PRSG) a stronger engagement with programme review at institutional level, the revised programme review policy statement situates quinquennial programme review within UCL’s overall QME framework and clarifies the relationship of the process not only to IQR but also to the annual monitoring of quality and standards by academic departments which QMEC is now in the process of codifying … .

33 Although it has formally been a part of UCL’s QA processes since 1995, QPR has been implemented much less fully and consistently than IQR. This tradition continued during session 2004-05 when, in spite of the intentions outlined in the DAPs application, the implementation of the revised QPR system proved to be very limited.

34 Many HEIs conduct annual programme review. In QAA Subject Review visits between 1995 and 2001, UCL departments and officers were repeatedly required to justify the operation of a quinquennial rather than an annual system. The justification was, essentially, that QPR was part of a network of related review and monitoring processes – also including approximately quinquennial IQA/IQR and various annual monitoring activities in academic departments, although not in the systematised form now envisaged. However, the fact that QPR has been implemented so patchily over the past decade makes for particular difficulties in implementing the process more fully in future: hitherto QPR has not, for significant numbers of departments, actually contributed to QA-related workload.

35 QPR, as redesigned in 2004, remains essentially a paper-based process which lacks the developmental aspirations of IQR. In summary, QSRG: recognises that QPR has operated largely ineffectively at UCL for the past decade; sees little evidence that it is now operating more effectively; and thinks it likely that the redesigned QPR process, if it is pursued, will be perceived by academic departments as an additional, and largely administrative, burden [see also paragraphs 56-59 below].

Annual monitoring

36 UCL’s plans for introducing systematised Annual Monitoring Reports were summarised in the 2004 DAPs application as follows:

Annual monitoring by departments can take a number of forms – by reference to, eg: external examiners’ reports; student evaluation questionnaire returns; the faculty and/or departmental learning and teaching strategy; and a range of student statistics relating to, eg, qualifications on admission, progression and achievement, and degree classification. The findings of IQR teams over a number of years have produced evidence which confirms that self-evaluative monitoring of this kind does take place in departments.

However, QMEC is now in the process of defining and promulgating minimum requirements for annual monitoring by academic departments with a view to:

- promoting consistent quality monitoring processes across UCL
- clarifying what may be in some cases unclear arrangements for quality management at the departmental level
- confirming the context of UCL programme review within this overall framework for self-review …
- considering whether and, if so, how the results of departmental annual monitoring should be reported at faculty level and transmitted to QMEC.

37 The one ‘advisable’ action for UCL emerging from the QAA’s Institutional Audit report is to:

… complete the regularisation of annual monitoring as expeditiously as possible, ensuring that it is implemented in a systematic and consistent way, and that procedures are in place to identify and act upon any consistent themes which emerge.

38 The UCL AMR is to be based on the model used since 1996-97 by the Department of Biochemistry and Molecular Biology (see http://www.biochem.ucl.ac.uk/teachingresources/annual-monitoring-report/eds-report-03-04.htm). QMEC is now in the process of consulting with faculties on the Committee’s AMR proposals, which envisage a phased introduction of the system with pilots during 2005-06 and UCL-wide implementation in 2006-07. Initial responses to the proposals received from faculties have been positive. This consultation process is planned to include a series of presentations to FTCs by Professors Carey and Saggerson (a good example of the academic-led QA approach in action).

39 QSRG believes that the introduction of AMRs has the potential to generate significant efficiency gains without compromising UCL's commitment to robust QA processes – although how soon and how fully these gains might be realised will depend, to a large extent, on how purposefully departments engage with the new AMR system. Here too QSRG recognises that, because UCL has not previously operated an explicit and consistent annual monitoring process, the introduction of one naturally runs the risk of being seen as an additional QA burden. QMEC is convinced, however, that a more systematic approach could soon yield savings in the time currently spent on annual monitoring activity – which is often diffuse and uncoordinated. Much of the information required for an AMR already exists in departments. The additional information requirements can be met by the Registry's Management and Information Services unit.

40 A number of further and specific efficiency gains should be possible through AMR: eg once the AMR system is implemented, there may be scope for further rationalising the supporting material required for IQR – ie by making available the AMRs produced by the department since its previous IQR. The advent of AMR also provides an important opportunity to address the chronic ineffectiveness of Quinquennial Programme Review [see also paragraphs 56-59 below].

Double marking

41 QSRG learned that UCL's current policy on double marking of examinations and other assessed work was already being reviewed by AC's Working Group on Examinations and Assessment Policy. Although there seems to be a widespread belief in UCL that blind double marking is standard practice across the institution, the actual arrangements allow for considerable variation (and reflect the outcomes of an extensive process of consultation with individual departments that took place in the early 1990s). A paper to be submitted to the Working Group's next meeting both (i) highlights these variations and (ii) takes the view that marking is sufficiently robust, with a low rate of variation between markers, to mean that a standard policy of blind double marking should be unnecessary at UCL. The paper also suggests alternatives, such as a single mark with a summary of key points, in order that the moderator and external examiner can see the justification for the mark awarded. The ACWGEAP is also looking at the corresponding processes of a number of other Russell Group universities which operate schemes other than double marking.

42 QSRG, without wishing to preempt the outcome of ACWGEAP's review of this area of QA operations, feels that there may be potential for efficiency gains in this area, even if these entail some additional work for moderators. QSRG understands that ACWGEAP plans to submit a report and recommendations for consideration by AC at its meeting in March 2006.

Student feedback processes

43 The QAA Institutional Audit report of 2005 includes among the ‘desirable’ actions recommended to UCL:

… ensure that its student representative system and feedback systems operate effectively throughout the institution.

44 In fact, the DAPs application of November 2004 noted that UCL had already set up a working group, operating under the auspices of QMEC, to review student feedback systems generally – including, inter alia, the operation of (i) student evaluation questionnaires and (ii) staff-student consultative committees.

45 QMEC accepted the report and recommendations of the Working Group on Student Feedback at its meeting in October 2005. These recommendations – listed at Annexe 3 – largely consist in the confirmation of existing policies, with some strengthening of these where necessary. QSRG considers that the recommendations will for the most part be likely to deliver efficiency gains if they are consistently implemented (eg the recommendation that UCL explore the introduction of UCL-wide learning contracts with a requirement that students use their UCL e-mail accounts). Certain recommendations, such as those concerning online SEQs, can be expected to deliver efficiencies only in the medium term and in the context of wider institutional change.

Peer observation of teaching

46 UCL's policy and procedures on peer observation of teaching were the subject of a comprehensive review by a working group of QMEC during the academic year 2003-04. A revised policy was implemented with effect from the start of session 2004-05. The 2003-04 review recognised that PoT, potentially at least, could be operated both as a significant QA mechanism and as an aid to staff development. The QMEC working group was in no doubt that it was important to develop PoT in a way which minimised its contribution to QA workload and maximised its staff development potential. QSRG is satisfied that the 2003-04 review achieved this objective and that there is minimal scope for producing further efficiency gains in relation to PoT – although the introduction of AMR should mean that it is no longer necessary for departments to make a separate report to faculties on the operation of PoT in the preceding 12 months (since this information could be subsumed in the AMR).

47 All in all, QSRG endorses the summary of the 2003-04 PoT review that appeared in UCL's DAPs application:

UCL’s arrangements for peer observation of teaching, which have operated formally since the mid-1990s, have always aimed to enhance teaching quality by encouraging reflection on practice and supporting personal development – and been based on a view that the peer observation process will only work in an atmosphere of trust and mutual esteem.

Peer observation is expected to benefit the observer as well as the colleague observed.

We see it as crucial that there should be no direct links between this process and any formal assessment or appraisal procedures; and that any feedback given through the peer observation process should be formative, not summative – and confidential to the two colleagues involved. Peer observation is intended to be a continuing process and one that, in time, becomes an integral part of departmental life. We nevertheless see it as important to combine the developmental philosophy which informs peer observation with effective measures to monitor its operation and enable good practice in this area to be disseminated. Our QMEC took the view that – even though the IQR process looks at departments’ arrangements for peer observation – a more comprehensive collection of current information on how it actually operates across UCL was needed, in order to develop more informative statements of policy and good practice.

The QMEC working group which conducted a review of peer observation arrangements in session 2003-04 issued a questionnaire to departments which invited them: to describe their procedures and express any concerns about these; to suggest ways in which the current arrangements for peer observation at UCL might be improved; and to supply examples of teaching practices which, although they could not be defined as peer observation in the normally understood sense of the term, were in effect serving a similar purpose. A number of departments, in addition to responding to the questionnaire, volunteered more detailed information on their approaches to peer observation and similar practices. The responses from departments left the working group in no doubt that a light touch approach to peer observation – and its confidential, developmental, non-judgmental aspects – had to be preserved. The group's report, approved by QMEC in June 2004, unequivocally confirmed these principles of peer observation of teaching, which will underpin the revised policy statement which has been formally implemented from the start of academic session 2004-05. We see this working group's approach as a further illustration of UCL's QME philosophy. On the one hand, it strikes a balance between strong quality management to enhance the student experience, and sensitive attention to staff concerns and development needs. On the other hand, it sets a policy which embodies core expectations, to be met by all departments but which are flexible enough to respect real differences between subject areas.

Learning and teaching strategies

48 UCL's Institutional Learning and Teaching Strategy sees it as essential that there are also in place discipline-based strategies, that these latter strategies are continuously developed and regularly reviewed, and that there is an increasingly dynamic relationship between the ILTS and the discipline-based LTSs, so that they inform each other's development. Like a number of the other processes considered by QSRG, LTSs can be seen to have both a QA and a developmental aspect. The definition of 'discipline-based' strategies reflects an extensive process of consultation between faculties and officers during 2003: some faculties regarded their departments as so different from one another that the concept of a faculty LTS was meaningless; others saw their departments and degree programmes as sufficiently homogeneous to justify a faculty-level strategy.

49 QSRG suspects that at present the LTS is not (as PoT was claimed to be in the DAPs application) 'a continuing process and … an integral part of departmental life' – rather, that many faculties/departments see the LTS merely as a document which they were required to produce on the instruction of an institution-level committee (AC). Now that they have followed instructions, faculties/departments may not be clear as to what happens next with the LTS. QSRG also noted some differences of opinion, even within its own membership, as to whether a faculty-level LTS can be useful.

50 QSRG notes that a working group of ESCILTA has now drawn up guidelines and a proforma on developing departmental LTSs. These were endorsed by ESCILTA in October 2005 and, subject to the agreement of the Vice-Provost (Academic and International), will now be issued to faculties for comment, before being resubmitted to ESCILTA in January 2006 and to AC in March 2006. QSRG feels that the guidelines could help to yield real improvements in teaching and learning. But, charged as it is to review and try to identify efficiencies in QA operations, QSRG wishes to stress that this process should not be too unwieldy. Though it has been suggested eg that discipline-level LTSs should be reviewed annually by the FTC or DTC concerned and that there needs to be central monitoring of this annual review process, QSRG thinks it should be sufficient for quality monitoring purposes for a department's self-evaluative statement for IQR to include a reference to the way in which the department has developed its LTS (or contributed to the development of a faculty-level LTS) in the five years since the previous IQR.

Committee structures

51 UCL operates a pyramidal structure of committees to assure quality and standards, which reflects its academic-led and devolved approach to QA – and, as indicated above [see paragraph 14], requires the engagement of academic staff at all three levels of the structure. The senior UCL committee with responsibility for teaching, learning and quality matters is the Academic Committee, chaired by the Vice-Provost (Academic and International) and whose constitution includes, among others, the Provost, the Deans (or their nominees) and the Faculty Tutors. Each faculty has at least one Faculty Teaching Committee (in some faculties, the undergraduate and graduate FTCs meet separately), chaired by the Faculty Dean or by the Dean's nominee. The FTC normally comprises departmental representatives for all the programmes for which the FTC has responsibility. At least one Departmental Teaching Committee operates within each academic department. The FTCs submit the minutes of their meetings to AC officers. The latter also produce an annual report for AC which summarises the main themes emerging from the preceding session's FTC minutes. That report, once it is approved by AC, is disseminated by Deans to their FTC to promote awareness of the extent of common ground being covered (and the differences in the range of issues being addressed) by FTCs.

52 AC has developed core terms of reference and constitutions for both FTCs (Annexe 4) and DTCs (Annexe 5). This pyramidal committee structure, which is designed to facilitate continuing dialogue within and between departments, faculties and the institution, is replicated in respect of library and examination committees (in the latter case, the 'base level' committee is defined at programme rather than departmental level).

53 Against this background, QSRG reviewed the terms of reference for FTCs and DTCs in order to consider whether they were appropriately constructed and fit for purpose. QSRG considered particularly the reporting and information requirements placed on FTCs and DTCs by their terms of reference; and scrutinised AC and QMEC minutes for evidence in relation to this. QSRG concludes that:

 subject to some minor adjustments and updating, the DTC and FTC terms of reference are fit for purpose and sufficiently distinct from each other

 certain parts of the FTC terms of reference – eg the requirement that FTCs receive annual analyses from DTCs of the results of student evaluation questionnaires – are expected to be subsumed within the proposed new AMRs, whereby departments will report in a more streamlined and aggregated way to faculties

 otherwise, there is little evidence to suggest that reports or submission of other kinds of information within the pyramidal committee structure is excessive, but AC and/or QMEC should invite FTCs and DTCs to identify any specific areas where they consider that current arrangements are excessive and in which it may be possible to achieve efficiencies.

Streamlining and rationalisation of existing processes

54 Progress was made in 2004 towards articulating the IQR and QPR timetables but there appears to be little scope for further integration of these two processes. It has occasionally been suggested that IQA/IQR might be reconfigured as a programme-based rather than a department-based process – in effect a fusion of IQR and QPR. This suggestion has never been seriously pursued – essentially because a process which retained the fundamental features of both IQR (a visit to the department and interviews with staff and students) and QPR (the involvement of a subject expert as ‘external scrutineer’) would seem likely to require significantly more resource than is required by the existing combination of IQR and QPR.

55 While the introduction of AMRs should result in further rationalisation of the documentation required for IQR [see paragraph 40 above] , QSRG sees clear and important differences between the IQR and the AM processes. The latter are primarily designed to enable a department to review its teaching and learning operations. IQR involves an institution-level perspective, takes a broader view and concerns itself with an investigation of a department’s structures and processes within a developmental agenda.

56 Other specific examples are given above of the ways in which AMRs should provide opportunities for efficiencies in the operation of QA processes [see paragraphs 46 and 53]. Most significantly, however, QSRG believes that the introduction of AMRs would enable UCL to abolish the quinquennial review programme. The WPOYO assumes the implementation of AMRs at UCL, noting that 'from 2005-06 Departments will be required to report on [their teaching programmes] through the formalised annual monitoring process'. In fact, the AMR model which it is planned to adopt focuses on programme components rather than whole programmes, but QSRG believes that the planned AMR model could be adapted without difficulty to include a sufficient but economical programme review dimension. Programme Directors could be asked to consider and report briefly in the AMR on any implications for overall programme structure which they see as arising from the annual review of programme components.


57 In exploring this possibility, QSRG noted the relevant section of the QAA Code of Practice. It has been made repeatedly clear to the QAA that UCL's position in relation to the Agency's Code is one of intelligent engagement rather than automatic compliance with its precepts (and the QAA evidently accepts this position). Nevertheless, QSRG sees it as important to note the distinction between programme review and programme monitoring drawn by the QAA Code:

Institutions should consider the appropriate balance between regular monitoring and periodic review of programmes. Monitoring should consider the effectiveness of the programme in achieving its stated aims, and the success of students in attaining the intended learning outcomes. Periodically, the continuing validity of those aims and outcomes themselves should be reviewed. In general, monitoring is an activity likely to be undertaken within the providing department. Review will normally be an institutional process, often involving external participants of high calibre and academic/professional credibility.

58 With this in mind, QSRG believes that the programme review element in AMR would need both (i) to retain the involvement of an external scrutineer and (ii) to be clearly addressed in the IQR process. QSRG suggests that, in the year preceding a department’s IQR, the AMR should be augmented by requesting an external scrutineer to consider the information in the report on the constituent parts of a programme and provide a commentary on the programme as a whole in the light of this information – and that the following session’s IQR should consider the department’s AMRs for the preceding five years. QSRG is satisfied that, if AMR and IQR were adjusted in this way, the existing QPR process would become entirely redundant. Assuming that the AMR methodology can be revised accordingly, QSRG recommends that QPR should be suspended immediately with a view to being formally discontinued with effect from the start of academic year 2006-07.

59 The abolition of QPR and its absorption in a modified and economical form within AMRs would not only create a significant efficiency in UCL’s QA operations; it would also provide a clear illustration of UCL’s ability to be self-critical about its processes and, where these are revealed to be ineffective and inefficient, to take ameliorative action.

Et alia

60 In conducting its review, QSRG has identified a number of other possible changes to existing arrangements which the Group believes could enhance UCL’s QA operations – without entailing additional workload for academic departments or faculties. Since these changes are not envisaged as efficiencies in themselves, they are not detailed in this report but are listed in Annexe 6. AC and QMEC are invited to agree to these arrangements being implemented.


Proposals

61 That UCL continue to profess and implement an academic-led approach to quality assurance strategy. [paragraph 18]

62 That the implementation of the Unified Programme Record be carefully monitored, that the Unified Programme Record Steering Group therefore remain in place after the formal launch of UPR, and that feedback be invited from faculties and departments throughout the first year of the operation of UPR so that the Steering Group can assess the extent to which potential efficiency gains are actually being achieved. [paragraph 21]

63 That QMEC's IQR Panel review the current list of areas to be covered by IQR teams with a view to defining, for IQRs taking place with effect from the start of the academic year 2006-07, a smaller set of core areas for investigation. [paragraph 25]

64 That the arrangements for production and reading of briefing material for IQR teams be rationalised as follows, with effect from the start of the academic year 2006-07, in order to reduce the time/resources spent by (i) departments in producing briefing material and (ii) IQR teams in reading such material [paragraph 27]:

 department submits SES in electronic form to IQR team – appending a list of supporting material and supplying web links to departmental website for each item of supporting material

 administrative secretary downloads all supporting material – to be available to reviewers as required

 administrative secretary reads all supporting material relating to core areas for investigation by IQR team and prepares report for reviewers drawing attention to issues arising from this material

 reviewers read the SES in conjunction with the administrative secretary's report – and decide at their planning meeting whether, in the case of the department concerned, there are areas in addition to the core areas which merit exploration and, if so, how the team will organise its scrutiny of relevant briefing material;

and that departments be encouraged, in order to benefit from these arrangements, to arrange for the relevant materials to be posted and maintained on departmental web pages as soon as possible. [paragraph 28]

65 That, with effect from the start of academic year 2006-07, the normal length of an IQR visit to a department be reduced to one working day. [paragraph 31]

66 That, subject to the adaptation of the planned Annual Monitoring Report model to incorporate essential elements, Quinquennial Programme Review be suspended immediately with a view to being formally discontinued with effect from the start of the academic year 2006-07. [paragraph 58]

67 That QMEC continue to explore the scope for QA efficiency gains to be achieved through the introduction of a UCL-wide system of Annual Monitoring Reports. [paragraph 40]


68 That Academic Committee, in receiving a review report in March 2006 on UCL policy for double marking of examinations and other assessed work from the Committee's Working Group on Examinations and Assessment Policy, be asked to pay particular attention to the potential for any efficiency gains identified by the Working Group. [paragraph 42]

69 That QMEC keep under review the potential for and achievement of efficiency gains through the implementation of the recommendations contained in the recent report of its Working Group on Student Feedback, approved by the Committee in October 2005. [paragraph 45]

70 That, in relation to the development and review of discipline-based Learning and Teaching Strategies, it be sufficient for quality monitoring purposes for a department's self-evaluative statement for IQR to summarise the way in which the department has developed its LTS (or contributed to the development of a faculty-level LTS) in the years since the previous IQR. [paragraph 50]

71 That Academic Committee and/or QMEC invite Faculty Teaching Committees and Departmental Teaching Committees to identify any specific areas where they consider that current arrangements are excessive and in which it may be possible to achieve efficiencies. [paragraph 53]

72 That Academic Committee, on the advice of QMEC, approve the arrangements for further changes in existing QA operations at UCL as outlined at Annexe 6 to this report. [paragraph 60]

