
University of Greenwich
JUNE 2004
Institutional audit
Preface
The Quality Assurance Agency for Higher Education (the Agency) exists to safeguard the public interest in sound standards
of higher education (HE) qualifications and to encourage continuous improvement in the management of the quality of HE.
To do this the Agency carries out reviews of individual HE institutions (universities and colleges of HE). In England and
Northern Ireland this process is known as institutional audit. The Agency operates similar but separate processes in
Scotland and Wales.
The purpose of institutional audit
The aims of institutional audit are to meet the public interest in knowing that universities and colleges are:
- providing HE, awards and qualifications of an acceptable quality and an appropriate academic standard; and
- exercising their legal powers to award degrees in a proper manner.
Judgements
Institutional audit results in judgements about the institutions being reviewed. Judgements are made about:
- the confidence that can reasonably be placed in the soundness of the institution's present and likely future management of the quality of its programmes and the academic standards of its awards;
- the reliance that can reasonably be placed on the accuracy, integrity, completeness and frankness of the information that the institution publishes, and about the quality of its programmes and the standards of its awards.
These judgements are expressed as either broad confidence, limited confidence or no confidence and are
accompanied by examples of good practice and recommendations for improvement.
Nationally agreed standards
Institutional audit uses a set of nationally agreed reference points, known as the 'Academic Infrastructure', to consider an
institution's standards and quality. These are published by the Agency and consist of:
- The framework for higher education qualifications in England, Wales and Northern Ireland (FHEQ), which includes descriptions of different HE qualifications;
- The Code of practice for the assurance of academic quality and standards in higher education;
- subject benchmark statements, which describe the characteristics of degrees in different subjects;
- guidelines for preparing programme specifications, which are descriptions of what is on offer to students in individual programmes of study. They outline the intended knowledge, skills, understanding and attributes of a student completing that programme. They also give details of teaching and assessment methods and link the programme to the FHEQ.
The audit process
Institutional audits are carried out by teams of academics who review the way in which institutions oversee their
academic quality and standards. Because they are evaluating their equals, the process is called 'peer review'.
The main elements of institutional audit are:
- a preliminary visit by the Agency to the institution nine months before the audit visit;
- a self-evaluation document submitted by the institution four months before the audit visit;
- a written submission by the student representative body, if they have chosen to do so, four months before the audit visit;
- a detailed briefing visit to the institution by the audit team five weeks before the audit visit;
- the audit visit, which lasts five days;
- the publication of a report on the audit team's judgements and findings 20 weeks after the audit visit.
The evidence for the audit
In order to obtain the evidence for its judgement, the audit team carries out a number of activities, including:
- reviewing the institution's own internal procedures and documents, such as regulations, policy statements, codes of practice, recruitment publications and minutes of relevant meetings, as well as the self-evaluation document itself;
- reviewing the written submission from students;
- asking questions of relevant staff;
- talking to students about their experiences;
- exploring how the institution uses the Academic Infrastructure.
The audit team also gathers evidence by focusing on examples of the institution's internal quality assurance processes at
work using 'audit trails'. These trails may focus on a particular programme or programmes offered at that institution,
when they are known as a 'discipline audit trail'. In addition, the audit team may focus on a particular theme that runs
throughout the institution's management of its standards and quality. This is known as a 'thematic enquiry'.
From 2004, institutions will be required to publish information about the quality and standards of their programmes and
awards in a format recommended in document 02/15 Information on quality and standards in higher education published by
the Higher Education Funding Council for England. The audit team reviews progress towards meeting this requirement.
Published by
Quality Assurance Agency for Higher Education
Southgate House
Southgate Street
Gloucester GL1 1UB
Tel: 01452 557000
Fax: 01452 557070
Email: comms@qaa.ac.uk
Web: www.qaa.ac.uk
© Quality Assurance Agency for Higher Education 2004
ISBN 1 84482 207 9
All the Agency's publications are available on our web site www.qaa.ac.uk
Printed copies are available from:
Linney Direct
Adamsway
Mansfield
Nottinghamshire NG18 4FN
Tel: 01623 450788
Fax: 01623 450629
Email: qaa@linneydirect.com
Contents
Summary  1
  Introduction  1
  Outcome of the audit  1
  Features of good practice  1
  Recommendations for action  1
  Discipline audit trails  2
  National reference points  2
Main report  4
Section 1: Introduction: the University of Greenwich  4
  The institution and its mission  4
  Collaborative provision  4
  Background information  5
  The audit process  5
  Developments since the previous academic quality audit  6
Section 2: The audit investigations: institutional processes  7
  The institution's view as expressed in the SED  7
  The institution's framework for managing quality and standards, including collaborative provision  7
  The institution's intentions for the enhancement of quality and standards  9
  Internal approval, monitoring and review processes  10
  External participation in internal review processes  13
  External examiners and their reports  13
  External reference points  15
  Programme-level review and accreditation by external agencies  16
  Student representation at operational and institutional level  17
  Feedback from students, graduates and employers  18
  Progression and completion statistics  19
  Assurance of the quality of teaching staff, appointment, appraisal and reward  19
  Assurance of the quality of teaching through staff support and development  20
  Assurance of the quality of teaching delivered through distributed and distance methods  21
  Learning support resources  22
  Academic guidance, support and supervision  23
  Personal support and guidance  24
Section 3: The audit investigations: discipline audit trails  25
  Discipline audit trails  25
Section 4: The audit investigations: published information  32
  The students' experience of published information and other information available to them  32
  Reliability, accuracy and completeness of published information  33
Findings  36
  The effectiveness of institutional procedures for assuring the quality of programmes  36
  The effectiveness of institutional procedures for securing the standards of awards  38
  The effectiveness of institutional procedures for supporting learning  40
  Outcomes of discipline audit trails  42
  The use made by the institution of the Academic Infrastructure  43
  The utility of the SED as an illustration of the institution's capacity to reflect upon its own strengths and limitations, and to act on these to enhance quality and standards  44
  Commentary on the institution's intentions for the enhancement of quality and standards  44
  Reliability of information  45
  Features of good practice  45
  Recommendations for action  45
Appendix  47
  The University of Greenwich's response to the audit report  47
Summary
Introduction
A team of auditors from the Quality Assurance
Agency for Higher Education visited the University of
Greenwich (the University) from 14 to 18 June 2004
to carry out an institutional audit. The purpose of
the audit was to provide public information on the
quality of the University's programmes of study and
on the academic standards of its awards.
To arrive at its conclusions, the audit team spoke to members of staff throughout the University and to current students, and read a wide range of documents relating to the way the University manages the academic aspects of its provision.
The words 'academic standards' are used to describe the level of achievement that a student has to reach to gain an academic award (for example, a degree). It should be at a similar level across the UK.
'Academic quality' is a way of describing how well the learning opportunities available to students help them to achieve their award. It is about making sure that appropriate teaching, support, assessment and learning resources are provided for them.
In institutional audit, both academic standards and academic quality are reviewed.
Outcome of the audit
As a result of its investigations, the audit team's view
of the University is that:
- broad confidence can be placed in the soundness of the University's current and likely future management of the quality of its academic programmes and the academic standards of its awards.
Features of good practice
The audit team identified the following areas as
being good practice:
- the holistic approach to reporting and planning through the Annual Reporting and Planning Document (ARPD), combining in a single process and a single document both the academic quality and standards, and human and financial resources aspects of schools' activities, thus providing the University with a valuable instrument for managing its current and future portfolio;
- the comprehensive Student Satisfaction Survey, the thorough consideration of its findings and the well-publicised and timely feedback of its results to both students and staff; and
- the Student Experience Initiative, stage one of which has been successfully implemented university-wide to strengthen the personal tutor system.
Recommendations for action
The audit team also recommends that the University should consider further action in a number of areas to ensure that the academic quality and standards of the awards it offers are maintained.
Recommendations for action that is advisable:
- to provide schools with more explicit guidance on the expectations for reporting on matters relating to the quality assurance of provision through the ARPD, in order to improve consistency and comprehensiveness and thereby to make the ARPD a more effective channel for institutional oversight within the University's framework for managing quality and standards;
- in the interests of improving transparency in the information provided to students, to expedite the process of determining those aspects of assessment policy that should be universally applicable and either incorporated in the Academic Regulations (for taught awards), or standardised across schools' assessment policies;
- to strengthen arrangements for ensuring parity of treatment for combined honours students whose programmes cross schools with those whose programmes operate within a single school, given the scope for variation in the content of school policies and the format of documentation given to students, together with the system of allocating personal tutor support solely on the basis of the school responsible for the first-named subject of a combined programme of study; and
- to review the provision of skills training for research students and, in particular, to establish a training requirement for those students involved in teaching or demonstrating activities, together with a mechanism for subsequent monitoring and support.
Recommendations for action that is desirable:
- to make explicit the University's approach to maintaining consistency of its procedures with the Code of practice for the assurance of academic quality and standards in higher education (Code of practice), published by the Agency, including how central and local responsibilities are to be distributed;
- given the importance placed by the University on accreditation by professional or statutory review bodies (PSRBs) in providing externality to the monitoring of its standards, to ensure that PSRB reports are routinely considered centrally for the purpose of identifying generic issues, emerging themes or good practice;
- to give greater priority to promoting the involvement of students in quality management, including working more cooperatively with the Students' Union to reinstate training for student representatives and encouraging all schools to adhere to regular meeting schedules;
- to take the necessary steps to ensure full implementation of teaching staff appraisal, particularly given its linkage through staff development to delivering both the Human Resources and Learning and Teaching institutional strategies; and
- to provide more systematic training and continuing staff development for research supervisors.
Discipline audit trails
The audit team also looked in some detail at several
individual programmes in the four discipline areas of
chemical sciences, mathematics and statistics, law,
and marketing to find out how well the University's
systems and procedures were working at
programme level. The University provided the team
with documents, including student work, and
members of the team spoke to staff and students
from each discipline area. As well as its findings
supporting the overall confidence statements given
above, the team was able to state that the standard
of student achievement in the programmes was
appropriate to the titles of the awards and their
place within The framework for higher education
qualifications in England, Wales and Northern Ireland
(FHEQ), published by the Agency. The team was
also able to state that the quality of learning
opportunities available to students was suitable for
the programmes of study leading to the awards.
National reference points
To provide further evidence to support its findings,
the audit team also investigated the use made by
the University of the Academic Infrastructure which
the Agency has developed on behalf of the whole of
UK higher education. The Academic Infrastructure is
a set of nationally agreed reference points that help
to define both good practice and academic
standards. The findings of the audit suggest that the
University has responded appropriately to the Code
of practice, the FHEQ, subject benchmark statements
and programme specifications.
From 2004, the audit process will include a check
on the reliability of information about academic
standards and quality published by institutions in a
standard format, in line with the Higher Education
Funding Council for England's document 03/51,
Information on quality and standards in higher
education: Final guidance. At the time of the audit,
the University was making progress towards fulfilling
its responsibilities in this area. The information it was
publishing about the quality of its programmes and
the standards of its awards was found to be reliable.
Main report
1 An institutional audit of the University of
Greenwich (the University) was undertaken from 14
to 18 June 2004. The purpose of the audit was to
provide public information on the quality of the
University's programmes of study and on the
academic standards of its awards.
2 The audit was carried out using a process
developed by the Quality Assurance Agency for Higher
Education (the Agency) in partnership with the Higher
Education Funding Council for England (HEFCE), the
Standing Conference of Principals (SCOP) and
Universities UK (UUK), and has been endorsed by the
Department for Education and Skills. For institutions in
England it replaces the previous processes of
continuation audit, undertaken by the Agency at the
request of UUK and SCOP, and universal subject
review, undertaken by the Agency on behalf of HEFCE
as part of the latter's statutory responsibility for
assessing the quality of education that it funds.
3 The audit checked the effectiveness of the
University's procedures for establishing and
maintaining the standards of its academic awards,
for reviewing and enhancing the quality of the
programmes of study leading to those awards and
for publishing reliable information. As part of the
audit process, according to protocols agreed with
HEFCE, SCOP and UUK, the audit included
consideration of an example of institutional
processes at work at the level of the programme
through four discipline audit trails (DATs), together
with examples of those processes operating at the
level of the institution as a whole. The scope of the
audit encompassed all of the University's 'in-house'
provision leading to its awards. The provision
offered under collaborative arrangements will be the
subject of a future audit.
Section 1: Introduction: the University
of Greenwich
The institution and its mission
4 The University was created in 1992 from
Thames Polytechnic and traces its roots back to the
foundation of Woolwich Polytechnic in 1890. Its
regional focus is on south-east London and Kent.
5 Since the previous audit in 1999, the University
has rationalised and consolidated the majority of its
provision onto three campuses (reduced from five):
Avery Hill, Maritime Greenwich and Medway.
Following major restructuring that began in June
2001, the University is now organised into eight
schools: Architecture and Construction; Business;
Computing and Mathematical Sciences (CMS);
Education and Training; Engineering; Health and
Social Care; Humanities; and Science, together with
the Division of External and Combined Studies
(DECS). In addition, there are two institutes: the
Greenwich Maritime Institute (GMI) and the Natural
Resources Institute (NRI), these being respectively
linked to the schools of Humanities and Science for
the purposes of quality management.
6 Statistics for 2002-03 show that there were
some 18,000 university-based students, compared
with 14,500 at the time of the 1999 audit, and a
further 2,100 UK externally-based students. The
increase at undergraduate level in the proportion of
UK students from minority ethnic backgrounds,
from 34 per cent in 2000-01 to 42 per cent in
2002-03, is evidence of the University's commitment
to widening participation. Of the total student
population, 73 per cent are studying on
undergraduate programmes and 27 per cent on
postgraduate programmes; 64 per cent of students
are studying on full-time or sandwich programmes.
7 The University offers a range of provision from
sub-degree awards, through undergraduate and
postgraduate taught degrees to research degrees,
including a professional doctorate (EdD). It also
offers extended degree programmes, including
courses allowing entry to its degree programmes.
8 The University's mission statement is that it
'nurtures excellence in learning and teaching,
research, consultancy and advanced professional
practice serving a range of international, national
and regional communities'.
9 The University has refocused its approach to
learning, teaching and quality under the umbrella of
'nurturing excellence'. In particular, through the
Student Experience Initiative (SEI), it aims to improve
the nurturing of its students and embed excellence
in both its programmes and in student achievement.
In view of its centrality to both the mission and the
institutional Learning and Teaching Strategy (2002-03 to 2004-05), the audit team explored the SEI in
some detail. The University has also recognised that
its approach would out of necessity require the
adoption of a similar stance to staffing and staff
development, research, consultancy and advanced
professional practice. Therefore, the team also looked
closely at the mechanisms for staff development.
Collaborative provision
10 The University has substantial collaborative
provision offered in partnership with other overseas
and UK institutions which includes both validated
and franchised programmes. In 2002-03,
collaborative partnerships accounted for 1,558 full-time equivalent (FTE) UK-based students and 876
FTE students taught wholly overseas. In its SED,
which included a full list of its collaborative
provision, the University described three main
categories of partnership:
- the Partner College Network, which comprises eight further education colleges in the south-east region managed by the Partnership Unit within DECS and offering a range of HNC/Ds, Foundation Degrees, and initial/foundation courses of extended degree programmes;
- the national Post-Compulsory Education and Training Network of link colleges, which includes some 20 organisations offering post-compulsory education programmes in partnership with the University's Department of Post-Compulsory Education and Training;
- a range of other overseas and UK partners, including private institutions and public colleges and universities, each of which is managed by the appropriate school. The overseas links are in 11 countries with over 20 separate partners.
11 In view of the size of the University's
collaborative provision, it will be the subject of a
separate audit in the future, so does not form part
of the present institutional audit.
Background information
12 The published information available for this audit included:
- the information on the University's web site;
- the report of the previous quality audit of the University by the Agency, undertaken in November 1999;
- the report of the audit of the partnership between the University and Systematic Education Group, Petaling Jaya, Malaysia, undertaken in February 2001;
- the reports of HEFCE and Agency reviews of provision at subject level.
13 The University provided the Agency with the following documents:
- the self-evaluation document (SED);
- discipline self-evaluation documents (DSEDs) for the four areas selected for DATs.
14 In addition, the University provided the following documents on CD-ROM:
- the University's Restructuring Strategy: Proposals for the Structure and Strategy (March 2001); Implementation of the New Strategy and Structures (June 2001); A Framework for the Strategy and Structure (December 2001);
- school implementation plans: eight documents outlining how each school would implement the Restructuring Strategy at local level;
- committee minutes as follows: Academic Council (October 2001 to January 2004); Learning and Quality Committee (LQC) (February 2002 to January 2004); Quality Assurance Sub-Committee (January 2003 to January 2004); Quality Enhancement Sub-Committee (QESC) (January 2003 to January 2004); Academic Collaboration Committee (October 2001 to October 2003); Combined Honours Sub-Committee (November 2001 to November 2003); Portfolio Working Group (May 2002 to January 2004);
- the Quality Assurance Handbook (spring 2003, revised January 2004);
- Student Satisfaction Surveys (SSSs) (2000 to 2002);
- Academic Planning and Management Statistics: Data on the Student Population (2000 to 2002);
- School Annual Reporting and Planning Documents (ARPDs) (2003-04) and their statistical appendices.
15 During the briefing and audit visits, the audit team was given ready access to the University's internal documents in hardcopy or on the University's web site or intranet and to a range of documentation relating to the selected DATs, the latter including examples of student work.
The audit process
16 Following the preliminary meeting at the University in October 2003, the Agency confirmed that four DATs would be conducted during the audit visit. The Agency received the SED in February 2004 and the DSEDs in April 2004. The audit team's selection of DATs was chemical sciences, mathematics and statistics, law, and marketing. The DSEDs were specifically written for the audit, with the exception of chemical sciences which was a compilation of programme review documentation.
17 The audit team visited the University on 11 and 12 May 2004 for the purpose of exploring with the Vice-Chancellor, senior members of staff and student representatives matters relating to the management of quality and standards raised by the SED or other documentation provided for the team. During this briefing visit, the team signalled a number of themes for the audit and developed a programme of
meetings for the audit visit, which was agreed with
the University.
18 At the preliminary meeting, the students of the
University were invited, through the Students' Union
(SUUG), to submit a separate document expressing
views on the student experience at the University,
identifying any matters of concern or
commendation with respect to the quality of
programmes and the standards of awards. They
were also invited to give their views on the level of
representation afforded to them and on the extent
to which their views were taken into account.
19 In February 2004, SUUG submitted to the
Agency a students' written submission (SWS)
prepared by its Quality Audit Team on the basis of a
series of consultative meetings, separate analyses of
the University SSS over the period 2001 to 2003,
and a special SUUG Quality Assurance Questionnaire
designed for the purpose of obtaining information
for the SWS. SUUG indicated that the document
had been shared with appropriate University staff.
There were no matters that the audit team was
required to treat with any level of confidentiality
greater than that normally applying to the audit
process. The team is grateful to the students for
preparing this document to support the audit.
20 The audit visit took place from 14 to 18 June
2004 and involved further meetings with staff and
students of the University, both at institutional level
and in relation to the selected DATs. The audit team
comprised Ms N Channon (audit visit), Professor R
Davis, Dr P Easy (briefing visit), Dr H Fletcher, Dr S
Hargreaves, Mr J Napier, auditors, and Ms L Rowand,
audit secretary. The audit was coordinated for the
Agency by Ms J Holt, Assistant Director, Reviews Group.
Developments since the previous academic
quality audit
21 The University was subject to an academic
quality audit by the Agency in November 1999. The
report (published in May 2000) commended several
aspects of the University's practice: its effective
approach to the management of change; progress
in the development of quality audit and assurance
procedures, including the effective arrangements for
devolution of quality assurance responsibilities to
campus and faculty level; the practical
implementation of its mission through, for example,
its partner college network; the utility of the
documents underlying the Academic Framework in
providing central mechanisms for the regulation of
devolved quality assurance responsibilities; the
policy of ensuring that research and advanced
professional practice supported teaching at
advanced levels; its support of, and communications
with, students; and the establishment of local
communications strategies.
22 The report also invited the University to find
ways of maintaining the balance between corporate
authority and local autonomy in respect of quality
and standards; to ensure that the procedures used
in the regulation of the programmes of combined
studies students were consistent with the University's
overall regulatory requirements; to continue to
develop its capacity to provide robust management
data; to continue its investigations into noncompletion rates for first-year students and to
implement methods for reducing them; and to
examine diversity of practice in administrative
procedures and documentation across the faculties.
23 In its SED, the University described the progress
it had made against each of the recommendations
of the previous audit report. In particular, it had
undertaken an internal audit of the work of the then
Combined Studies Progression and Award Board
(PAB) and employed measures in 2001-02 to ensure
that its decisions and those of its successor, the
Combined Honours PAB, were consistent with
University-approved assessment regulations. The
University had also implemented and developed a
new management information system and claimed
that it now had access to significantly improved
data. In respect of first-year non-completion rates,
the University had implemented the SEI and
reorganised its direct services to students into a
single Office of Student Affairs. The University's
approach to the issue of diversity of practice
between faculties had, in some respects, been
overtaken by the organisational restructuring which
began in June 2001. This restructuring had also
been informed by the University's response to the
more substantial issue of maintaining the balance
between corporate authority and local autonomy.
24 At the time of the previous audit, the University
had five campuses, four faculties and 13 schools.
Following the full implementation of restructuring, it
has now consolidated onto three campuses and has
introduced an academic structure of eight schools
together with DECS. In its SED, the University stated
that the decision to remove faculties 'was informed
by a desire to establish a closer and more effective
relationship between the corporate centre and the
primary academic structure' and, more specifically,
to establish the schools as the 'academic drivers of
the institution'. As such, rather than seeking to
maintain a balance between corporate authority and
local autonomy, the University claims that it has
sought to develop a new, and different, relationship
between the schools and the corporate centre.
25 In respect of the University's arrangements for
quality assurance, the restructuring has also seen the
introduction of a central Learning and Quality Office
(LQO), with institutional responsibilities for
development, implementation and central oversight
of policy in the areas of learning and teaching and
quality assurance; the appointment within each
school of a Director of Learning and Quality (SDLQ);
the establishment of school-based quality assurance
officers (SQAOs); the creation of a new university-level LQC, which has absorbed the remit of the
former Quality Audit Committee; and the
introduction of a new process, the ARPD, designed
to link schools more effectively to the centre.
26 Since the 1999 audit, the University has
participated in nine Agency subject reviews and
two developmental engagements (DEs). In the SED,
the University stated that if issues were raised in
subject review reports with sufficient regularity to
suggest a 'system-wide problem', these were
addressed at institutional level, with the issues
relating to external examiners providing an example
(see paragraph 67 below).
27 The present audit team considered that the
University had responded to the previous audit report
in an effective and timely manner. For instance, the
University had given attention to the operation of
PABs, enabling it to introduce revised arrangements
for combined honours (see paragraphs 68 and 72
below). The team saw evidence of significant activity
and progress being made in developing both the
University's management information system (see
paragraphs 98 to 102 below) and the SEI (see
paragraphs 127 to 131 below). The devolution of
quality management to schools was premised on
management information being available centrally,
while the initial focus of the SEI was on first-year
students and improving their retention.
28 However, the audit team also considered that
the scale of the restructuring reduced both the
purpose and the means of following up on specific
actions relating to the previous audit. In the context
of the new and developing relationship between
centre and schools, the team was able to identify
areas where clarity from the centre about minimum
requirements expected of schools might be
improved (see paragraph 39 below for signposting).
Specifically, it would highlight a number of
complexities facing combined honours students,
which suggested to the team that cross-school
oversight should be strengthened (see paragraphs
72, 78 and 130 below). The University's effective
approach to the management of change, previously
identified for commendation, was apparent to the
team from the general enthusiasm it encountered
from staff for the new devolved organisation.
Section 2: The audit investigations:
institutional processes
The institution's view as expressed in the SED
29 The University described its approach to the
management of quality assurance as being
characterised by:
- devolved quality assurance and enhancement within a central regulatory framework;
- externality and internal cross-representation as good practice;
- scrutiny proportional to risk;
- wide staff participation in quality assurance and enhancement.
30 In the 2001 restructuring, schools were given
responsibility for both the delivery of academic
quality (through their constituent departments) and
for the management of quality assurance, with these
responsibilities being specifically vested in the
position of head of school. According to the SED,
'the specific benefit aimed for was to create a quality
assurance system which actually assures quality and
does so by virtue of a system which is genuinely
used by operational managers to ensure that in
those operations for which they are responsible
quality is being achieved'.
31 Central oversight remains a key component of
the revised quality assurance system, this being
maintained through a central committee (LQC),
office (LQO) and regulatory framework. The ARPD
was seen by the University to be the main
innovation, providing a more focused information
flow from the schools to the centre.
The institution's framework for managing
quality and standards, including
collaborative provision
32 Academic Council, the senior deliberative
committee, has ultimate responsibility for the quality
and standards of the University's academic provision.
Reporting to Academic Council is a range of
committees integral to the quality assurance system.
School boards are responsible for the oversight of
within-school provision, which they exercise through
their learning and quality (or equivalent)
subcommittees (school LQCs), while the DECS Board
and the Research Degrees Committee (RDC)
respectively provide cross-school oversight of the
management of combined honours and research
degree programmes. The central LQC is concerned
with university-wide procedural consistency and
rigour, and the enhancement of learning and the
student experience, while its subcommittees with
remits for quality assurance (QASC) and quality
enhancement (QESC), deal with operational issues and
undertake the groundwork for policy development.
Also reporting to Academic Council on the
authorisation of new programme proposals is the
Academic Planning Sub-committee (APSC) of the
Executive Committee. The latter is the University's
senior managerial committee, although both it and its
subcommittees must work within the policy and
strategy set by Academic Council. There is, in addition,
the Vice-Chancellor's Group (VCG), comprising the
University's most senior managers, which acts in an
advisory capacity to the Vice-Chancellor.
33 The chairing and membership arrangements for
these committees reflect relevant executive
responsibilities. The Vice-Chancellor chairs both the
Executive Committee and Academic Council; heads
of school chair school boards; the Pro-Vice-Chancellor (PVC) (Academic Planning), who also has supervisory responsibility for the assurance of quality and standards, chairs both APSC and LQC; and the University Director of Learning and Quality (UDLQ) chairs QASC and QESC. Heads of school
each report to one of three PVCs and, through their
membership of the Executive Committee (in
addition to Academic Council), they provide an
important linkage between both deliberative policy
and management and between the schools and the
centre. SDLQs are members of LQC and they or
their head of school are members of APSC. They
have responsibility for both the quality assurance of
their school's provision and for general compliance
with the University's quality assurance procedures
and, as such, have a functional link to the UDLQ
who heads the LQO. A parallel administrative link
exists between the school-based SQAOs and the
quality managers in the LQO.
34 The quality assurance remit devolved to schools
includes approval, monitoring and review of their
academic provision; production of the ARPD;
management of the assessment cycle; appointment
of external examiners and response to their reports;
response to the university-wide SSS and other
student feedback; preparation for school and
departmental reviews; and maintenance of links with
professional and statutory review bodies (PSRBs). To
enable schools to carry out these functions in
respect of taught programmes, they are required by
the University to have the posts of head of school,
SDLQ and SQAO; a school board plus one or more
committees to oversee learning and teaching, and
quality assurance; and the clustering of cognate
disciplines within departments to facilitate
operational and quality management. Schools are
otherwise free to develop their own arrangements to
discharge their quality assurance responsibilities and
to demonstrate their accountability to the
University's managerial and deliberative structures.
35 The University produces a range of documents
(termed framework documentation) that contributes
to the central regulatory framework of academic
provision (see paragraph 45 below) within which
schools must normally operate. One of the key
documents is the Quality Assurance Handbook which
deals with procedures for the approval, monitoring
and review of the University's taught awards.
Maintained by the LQO, it additionally provides
guidance for schools on such topics as risk
assessment, including the application of the
University-devised risk assessment tool, and the use of
The framework for higher education qualifications in
England, Wales and Northern Ireland (FHEQ), subject
benchmark statements and the Code of practice for the
assurance of academic quality and standards in higher
education (Code of practice), published by the Agency.
36 Also part of the framework documentation are the
Academic Regulations of which there are three sets,
covering respectively undergraduate taught, graduate
and postgraduate taught, and research awards. These
govern assessment arrangements and external
moderation of academic standards. Any changes to
these regulations, or exemptions from them, require
approval by Academic Council which regularly utilises
an Academic Regulations Working Party. The Academic
Regulations (for taught awards) define the terms of
reference for subject assessment panels (SAPs), which
have primary responsibility for addressing academic
quality and standards, and for PABs, which deal with
individual student profiles, making decisions about
students' progression and award.
37 Programme-specific regulations, which also
include relevant PSRB requirements, are confirmed
as part of the programme approval process and
published in student handbooks. Schools have their
own assessment policies, although these are being
further developed to explicate their consistency with
the Code of practice. There is in addition an
internet-based document, Assessment Information
for Candidates, maintained by the Regulations,
Standards and Examinations Office, whose
responsibilities also include the management of
examinations and administrative oversight of those
aspects of the external examiner system relating to
Institutional Audit Report: main report
the central regulatory framework (see paragraphs 65
to 67 below).
38 The SED pointed to several aspects of the
management of assessment that were liable to
change, largely in response to the phased
implementation from 2004-05 of a three-term
(rather than two-semester) academic year. A revised
set of Academic Regulations (for undergraduate
taught awards) was due by the end of the current
session, to simplify and eliminate inconsistencies in
existing regulations, and to take account of a shift in
the balance between 15 and 30-credit courses, with
the latter becoming the standard. The scheduling of
assessment-related activities was also being adjusted
to accommodate a single assessment period at the
end of the academic year (rather than separate
periods following each semester), while retaining
the September reassessment period. Following an
LQO 'spot audit' of the consonance of schools'
assessment policies with the Code of practice (see
paragraph 73 below), the development of a single
policy for some aspects of assessment was under
consideration, with certain items having been
referred to the Academic Regulations Working Party
for incorporation in the Academic Regulations.
39 It was made plain to the audit team at the
briefing visit that the University considered that it had
passed through the implementation stage of its new
framework for managing quality and standards, and
that the devolved system, although relatively recent,
was sufficiently well-embedded to withstand external
quality audit. From discussions with staff, their
enthusiasm for the devolved system was not in doubt
and it was clear to the team that the strengthened
internal managerial framework in schools was leading
to a stronger sense of ownership of quality assurance.
In particular, SDLQs were operating as a coherent
group and their membership of LQC was seen by staff
to be raising the Committee's profile in influencing
University policy. However, the team also became
aware that the University was still adjusting its
procedures to achieve the optimum relationship
between central oversight and school responsibility,
and was able to point to areas where clarity from the
centre about minimum requirements expected of
schools might be improved (see paragraphs 55-56,
64, 77 and 111 below).
40 The extent to which the devolved system
reduces the University's capacity to respond quickly
to generic issues is difficult to discern, but the audit
team saw evidence of delay, such as the protracted
discussions over assessment policy (see paragraphs
77-78 below) and the limited priority being given to
promoting student involvement in quality
management (see paragraphs 89-90 below). In this
context the team also considered that there was
further scope for central committees to facilitate the
adoption of best practice across schools (see
paragraph 44 below for signposting). In general,
however, the team was satisfied that the University
had in place appropriate structures for managing
quality and standards and was using these effectively
to develop significant innovations such as the ARPD
(see paragraph 54 below), the SEI (see paragraph
131 below) and the SSS (see paragraph 95 below).
The institution's intentions for the
enhancement of quality and standards
41 In the SED, the University explained that it was
increasingly acting as a facilitator for school-level
staff rather than attempting to advance a central
enhancement agenda. It discussed enhancement in
the context of having developed mechanisms to
resource and manage change effectively, with QESC
and school LQCs acting as the respective focal
points at central and local levels for the increased
targeting of resources to school-based staff. The
University also claimed that, since restructuring,
schools had a better sense of one another's
activities, improving the sharing of good practice.
42 In terms of institutional change, the SED
highlighted the adoption of 30-credit courses
emphasising depth rather than just choice as having
positive implications for the enhancement agenda.
Anticipated benefits to the student experience
included promoting cohort identity, reducing
assessment loads and increasing opportunity for
informal feedback on performance. These were
judged by the University to outweigh disadvantages,
such as less flexibility for students over elective
courses and fewer opportunities for formal oversight
of their progression. Stage one of the SEI,
addressing personal tutoring and student support
services, had been implemented and the University
was now poised to extend it to the next stage, with
student self-development as the overarching theme
(see paragraph 131 below).
43 In discussions with staff, the audit team
explored the mechanisms for sharing good practice.
It learned how networking was facilitated by cross-representation on school-based committees; by the
developing functional relationships between SDLQs
and SQAOs and their counterparts in the LQO; and
through participation in LQC and QESC. The team
suggested that principal lecturers promoted on the
basis of teaching excellence (PLTs) appeared not to
have a designated forum for networking. Staff
acknowledged that the PLT position had tended to
become 'individualised', and that the University was
seeking greater commitment to shared practice
through clarification and strengthening of the role
(see paragraph 111 below). The team thus
concluded that PLTs were not currently functioning
as a corporate resource, driving forward the
institutional Learning and Teaching Strategy.
44 It appeared to the audit team that the approach
to sharing good practice across schools was on
occasions somewhat reactive. In the case of
assessment policies, for example (see paragraph 77
below), schools were independently formulating
policies and sharing their respective practice
afterwards. The team was also able to identify
practice it considered worthy of wider dissemination
and, in this respect, highlights the creation of an SEI
working group in the School of Engineering and the
student intranet developed by CMS (see paragraphs
160-161 below). The team therefore encourages the
University to take a more proactive approach to
promoting cooperation and the sharing of ideas
between schools at an earlier stage in the
development of policies that might have university-wide application or impinge on key strategic areas
(see also paragraphs 83 and 133 below). However,
the team recognises the considerable achievement,
within a relatively short time, of establishing
mechanisms within schools for ensuring the effective
utilisation of resources for enhancement, as illustrated
by the disbursement of HEFCE funding for Widening
Participation and Teaching Quality Enhancement.
Internal approval, monitoring and review
processes
Programme approval
45 The University's taught awards are designed to
fit within a single credit-based framework (the
Academic Framework), which is available at pre-degree, degree and postgraduate levels. Within the
Academic Framework, schools operate a collection
of courses organised as programmes or, in the case
of combined honours, as packages, comprising a
discrete group of courses designed to form a
proportion of a named award. The University's
processes for approving new courses, programmes
and packages (approval) or substantial alterations to
existing provision (review) are set out in the Quality
Assurance Handbook. This also incorporates pro
formas for preparing the requisite documentation,
including programme specifications, together with
guidance notes for SQAOs and others on organising
associated events, although schools have discretion
to decide on the most appropriate method of
approval or review. Minor changes to courses or
programmes are approved by the school LQC and
notified to the LQO.
46 Approval first requires authorisation of new
programmes or packages by APSC to establish the
academic and business case for their addition to the
University's academic portfolio, to ensure that
adequate resources will be available in the host
school, and to trigger preparations for marketing
and recruitment. According to the authorisation
process, which is detailed in the framework
documentation on Academic Planning Procedures,
full proposals for new programmes are submitted to
APSC using standard forms and are also outlined in
the portfolio planning section of school ARPDs.
Programmes to be discontinued are similarly notified
to APSC for authorisation. The normal cycle of
portfolio planning is premised on an 18-month lead
time from authorisation to first intake, although
there are 'fast-track' procedures to allow a rapid
response to changes in market demand. In the
context of programme approval, distance delivery
via e-learning, without external partner involvement,
is treated as internal provision.
47 Following authorisation, programmes, packages
and their constituent courses are developed in detail
and subsequently submitted for approval on the basis
of the schools' chosen arrangements, as determined
by SDLQs working with school LQCs. The outcomes
of the process, which may involve conditions or
recommendations to be met before approval of the
definitive programme document is confirmed, are
reported to the relevant school LQC and monitored
centrally by the LQO, any concerns being reported
via QASC to LQC. Outcomes are also reported
upwards through the committee system to school
boards and, where applicable, to the Combined
Honours Committee. The process of review is
essentially the same, except that it must include a
critical appraisal of the programme, informed by
current and former students' views. All approvals and
reviews must involve someone external to the school
(see paragraphs 62 to 64 below).
48 In its SED, the University expressed the view that
the strengthening of the authorisation process had
been 'an important element in the development of the
University's procedures for quality and standards',
although it acknowledged that in the 2002-03 session
it became concerned about the high volume of
authorisations requested by fast-tracking which was
being monitored by APSC. The University also stated
that generally the approval process was seen to be
transparent and effective, in that it produced clear and
accessible reports, providing the basis for monitoring
compliance with procedures and for identifying any
Institutional Audit Report: main report
concerns. With respect to the increased flexibility now
open to them, the SED indicated that so far schools
had mostly chosen to retain the traditional event-driven approval process, although they were
increasingly tuning events to the nature of the
approval and the extent of perceived risk.
49 Senior staff were acutely aware of the
drawbacks to fast-tracking programme
authorisation, explaining that the University had
taken steps to expedite decision-making by
streamlining the regular process. It had also
protected combined honours from fast-tracking, in
recognition of the added logistical complexity of
linking new and existing packages to produce
coherent programmes of study. In general, staff
confirmed that the procedures for approval and
review were thorough and essentially a continuation
of the comprehensive 'old' process. The audit team
saw evidence of continual monitoring of programme
approval by QASC to identify any action required at
university level, latterly based on the extraction of
salient points from the full reports by the LQO. The
team concluded that the procedures for programme
approval and review were rigorous and were being
effectively managed and operated within schools
with appropriate institutional oversight.
Annual monitoring
50 In 2001-02, the University introduced the ARPD
as the principal vehicle for monitoring academic
provision. The ARPD, which has been subject to
refinement over its three years of operation, is
produced according to a template under the
specified headings of: school strategy and targets;
quality and standards; school finances; staffing;
research; portfolio planning; resource requests and
collaborative provision. Feeding into the quality and
standards section are departmental reports and
programme annual monitoring reports (AMRs),
although there is no requirement for every
programme to be monitored each year. AMRs, for
which there is a University pro forma, are based on
analysis of statistical data, student feedback from
surveys and committee minutes, reports from SAPs
and PABs, external examiner reports and
programme committee minutes.
51 Schools have considerable discretion over the
frequency and timing of monitoring individual
courses and programmes, although formal
monitoring reports are normally required for the first
year of delivery of a new programme; for any
programmes that are deemed to be 'at risk', or
where external examiners or students have raised
serious concerns; or programmes showing poor
progression and completion rates. Schools also have
discretion over the process and mechanisms for
producing their ARPDs, but these have to be agreed
by the relevant school boards before they are
presented to the University by heads of school.
52 Schools submit their ARPDs to the VCG in
January and the PVCs undertake a preliminary
review of the ARPDs for the schools for which they
are responsible, providing initial feedback to heads
of school. Each ARPD is then separated into its
various sections which are distributed to the
appropriate committees of Academic Council and
the Executive Committee. For instance, the quality
and standards section is considered by LQC,
following scrutiny of the separate school submissions
by a panel to identify common themes. The parent
committees subsequently receive reports from their
respective subcommittees. Academic Council
considers summarised outcomes of the sections on
quality and standards, research and collaborative
provision, while the Executive Committee considers
summarised outcomes of the sections on school
strategies and targets, school finances, staffing,
resource requests and portfolio planning. School
boards receive feedback from both Academic
Council and the Executive Committee. This iterative
reporting process through the committee system
takes place during February and March resulting in
formal feedback to schools in March and April.
53 The SED explained that the ARPD was
conceived as a concise, highly structured document
that would minimise unnecessary workload and
facilitate widespread use. However, it had evolved
into a lengthier submission, schools being unable to
achieve the brevity of reporting envisaged.
According to the SED, the ARPD 'provides strong
and sometimes detailed insights for the University
into how schools function and stimulates dialogue
between schools and the centre'. Refinement of the
process has led to reports that more effectively meet
institutional requirements.
54 It became apparent to the audit team that the
University had responded to the expansion of the
ARPD by schools by developing an efficient
mechanism for sharing out the evaluation of the
information among a series of central committees,
thus allowing it still to give feedback to schools
within a relatively short timescale. In the team's
view, this process of disaggregating the whole and
quickly putting it back together again preserved for
schools the coherence of progress reporting and
action planning and contributed, as stated in the
SED, to ensuring a 'dynamic quality system within
schools'. The team also agreed with the University's
own assessment of the portfolio planning section as
a 'significant tool'. There is good practice in the
holistic approach to reporting and planning through
the ARPD, combining in a single process and a
single document both the academic quality and
standards, and human and financial resources
aspects of schools' activities and providing the
University with a valuable instrument for managing
its current and future portfolio.
55 In meetings with the audit team, staff
consistently expressed the view that the ARPD was
the primary means by which the University was kept
informed about the implementation and impact of
its strategies, policies and management decisions at
the programme level. Therefore, the team paid
special attention to the relevant sections dealing with
quality and standards and staff development. The
team doubted whether these sections could provide
sufficiently detailed and comparable information
because of the premise that, in producing its
overview for the ARPD, a school had considerable
discretion over the format, content and means of
compiling the underlying information. For instance,
there were fairly fundamental differences between
schools in the scale of their annual course and
programme monitoring and in the mechanisms they
used for approving AMRs; there was also ambiguity
as to which previous session was being reported
upon in relation to responses to external examiners'
comments. The variable emphasis given to staff
development within the staffing section of the ARPD
(see paragraph 111 below) was acknowledged by
senior staff in their meeting with the team.
56 The audit team considered that LQC's approach
to the quality and standards section, which was one
of iteratively refining the template to help tease out
common themes, was rather slow and time-consuming and, given that the ARPD was now in its third
year of operation, that a less tentative and more
direct approach might now be needed. The team
therefore considers it advisable for the University to
provide schools with more explicit guidance on the
expectations for reporting on matters relating to
quality assurance through the ARPD, in order to
improve consistency and comprehensiveness and
thereby to make the ARPD a more effective channel
for institutional oversight within the University's
framework for managing quality and standards.
Periodic review
57 Complementing the ARPD, the University has
recently introduced school reviews which provide a
regular check on the operations of a school across a
range of programmes within the broader context of
its management and resources. These are conducted
by a team established by the LQO that includes
both student and external reviewers and are chaired
by a senior University manager. The process involves
consideration of all activities within an individual
school, utilising audit trails to focus enquiries and is
based on a short SED prepared by the school,
supplemented by a set of data akin to that normally
used in the ARPD. Review reports are submitted to
the Executive Committee and LQC, together with an
agreed set of actions and timescales for their
completion. LQC establishes the overall programme
for the review cycle and all eight schools plus DECS
have been scheduled for review over the three-year
period from 2003 to 2006.
58 In addition to regular school reviews, the
University operates occasional reviews at the level of
both the department and the programme. The
process for departmental review is similar to that for
school review, although tailored according to the
reasons for instigating the review, which may be
triggered by either the centre or the school. Possible
reasons for departmental review include concern
over the viability of a department following a risk
assessment by the school; a restructuring exercise
having a significant departmental impact; major
changes to a department's portfolio; preparation for
review by an external agency or PSRB; or as a formal
part of the schools' own quality assurance processes.
59 At programme level, the quinquennial review
system has effectively been superseded by the review
processes previously outlined. However, there is a
mechanism, through the LQO database, for
identifying programmes that reach five years after
their initial approval without otherwise having been
reviewed by any formal process, including PSRB
review. Schools are required to conduct a risk
assessment for the programme as the basis of deciding
whether or not to conduct a review and, if so, what its
scope should be; thus schools are able to make a case
for a longer interval between reviews than five years.
Reviews undertaken via this route follow the same
process as that for programmes involving a substantial
revision (see paragraph 47 above).
60 The SED concentrated on review exercises in
the School of Humanities (2001-02) and in the
Department of Law (2002-03) which came to be
adopted as pilots for the school and departmental
review processes. The School of Humanities review,
comprising a series of developmental programme-level reviews involving external assessors, was cited
in the SED as 'a useful example of how schools can
effectively manage their quality assurance agenda in
terms of their needs'. The Department of Law
review, instigated as a consequence of concern
about retention at stage 1, was recognised as having
embraced the principles subsequently introduced as
requirements for departmental review, and also
demonstrated the follow-up process whereby the
resultant changes to the structure and management
of law programmes are monitored through
successive ARPDs. The SED explained that the
interaction between the various review processes
was being monitored by the LQO to provide the
relevant committees with the means of evaluating
the effectiveness of these processes and their
underlying principle of scrutiny proportional to risk.
61 The audit team was unable to comment in
detail on the effectiveness of the new procedures for
periodic review since, at the time of its visit, there
had been only one departmental and three school
reviews, with two of these conducted so recently
that the final reports were not available. The team
learned from documentation and discussion with
staff that both processes were regarded by the
University as developmental during this session and
that there was ongoing debate about whether the
frequency of school review should be three or five
years. Staff were mostly insistent that all
programmes would be reviewed within five years,
and many were seemingly unaware of the weight
now being placed by the University on the use of
the risk assessment tool for determining the necessity
or scope of programme review, still seeing it as
applicable mainly to collaborative provision. The
University may wish to
bear this in mind when evaluating the interaction of
its various review procedures.
External participation in internal review
procedures
62 External involvement is an integral part of the
University's periodic review procedures, with the
requirement for there to be a member of the review
team external to the University for both school and
departmental review. However, external involvement
in programme approval and review is defined in
terms of externality to the school: the external member
may be someone from outside the University, someone
from another school, or sometimes both. The Quality
Assurance Handbook sets out how this requirement
may be satisfied in a number of different scenarios,
with advisers external to the University (excepting
external examiners) being necessary for major
developments where greater scrutiny is appropriate.
In the case of combined honours packages, there is
the additional specification that these have to be
sufficiently self-contained to be combined with any
other approved packages without further external
involvement, provided the University's 'zonal'
timetable can accommodate their delivery.
63 As stated in the SED, the University regarded
externality as 'a fundamental principle on which
to base its confidence in its academic quality and
standards'. In the context of programme approval
it gave examples illustrating its view of the
appropriateness of the discretion being applied by
schools in deciding the level of externality required
for restructuring existing provision, as compared
with that for a new programme entailing significant
curriculum development. It also indicated that there
was increasing use of SDLQs as panel members for
other schools. In addition, the University described
external accreditation as providing 'a valuable
complement' to its internal processes of monitoring
and review, and gave examples of regular and
successful engagements with PSRBs including, in the
School of Engineering, the joint development of
programmes with the Institute of Incorporated
Engineers, leading to their subsequent accreditation.
64 It was clear to the audit team from its reading
of documentation and discussions with staff that
they valued contributions to review processes from
outside the University, because subject peers could
comment in detail on both curricula and
programme standards, while professional peers
added the dimension of fitness for purpose in the
workplace. It also became apparent to the team that
new programme approval invariably utilised advisers
external to the University, while input from other
schools was invited more to prevent a school from
becoming inward-looking and to facilitate the
sharing of good practice. However, the team noted
the tendency for the University to link together the
concepts of externality and cross-representation and,
while appreciating that this gives emphasis to the
school as the primary focus in the academic
structure, it also, in the team's view, clouds the
important distinction between these concepts in
cases where independent input from sources
external to the University is paramount.
External examiners and their reports
65 The external examiner system operates within the
framework prescribed by the Academic Regulations
(for taught awards), which are supplemented by the
notes of guidance on external examiners, published in
the Quality Assurance Handbook. The latter have
recently been developed to assist schools and contain,
as an annex, the precepts of the Code of practice,
Section 4: External examining, published by the Agency.
Heads of department are responsible for nominating
external examiners, while school boards approve their
appointment. Central oversight of the appointment of
external examiners is maintained by the Regulations,
Standards and Examinations Office which sends out
appointment letters on behalf of the University, and
also offers an induction programme for new external
examiners and a forum for existing ones.
66 The appointment letter encloses details of the role
of the external examiner, the arrangements for
presenting reports, and the rules and regulations for
assessment laid down by the regulatory framework.
Schools follow up this letter with a range of contextual
information, including their assessment policies and a
statement of the attendance requirements for external
examiners at SAPs and PABs. External examiners are
each required to produce an annual report according
to a standard template which, as a consequence of a
review of the external examiner system undertaken by
a QASC working group, has recently been revised to
allow for the publication of summaries on the
Teaching Quality Information (TQI) web site (see
paragraph 201 below).
67 Reports from external examiners are received by
the Regulations, Standards and Examinations Office,
which circulates them to the PVC (Academic
Planning), the UDLQ, the appropriate head of
school, SDLQ and SQAO. Schools are required to
provide summaries of action taken in response to
last year's external examiner reports in their ARPDs
and also to evaluate their processes for dealing with
external examiner input. The QASC working group
has recommended a timetable to be used by schools
for monitoring external examiner reports and
responses to them through programme AMRs and
departmental reports, finally feeding into the ARPD.
Complementing the annual monitoring cycle within
schools, the LQO has started to conduct an annual
analysis of external examiner reports. The first of
these pertains to the 2002-03 session, when a
procedure was also introduced, requiring a formal
response from a school to the LQO in relation to
any report suggesting a threat to standards. In
addition, the Regulations, Standards and
Examinations Office produces an annual report for
Academic Council on those aspects of the reports
relating to the central regulatory framework.
68 In the SED, the University described its external
examiners as 'a central component in maintaining
and enhancing its standards' and pointed to
changes in their institutional remit to reflect the
greater emphasis nationally on the use of outcome-based benchmarks for elucidating standards. The
University went on to illustrate how it had addressed
external examiners' concerns in areas such as the
adequacy of data supplied to SAPs and PABs and the
introduction, in 2001-02, of a revised 'discounting
system' for degree classification. It also
expressed the belief that it was meeting the Code of
practice in relation to external examining, although
it acknowledged that four schools had identified
action to improve their response to external
examiners in their 2002-03 ARPDs.
69 The audit team explored the institutional
process for dealing with external examiners'
comments in the context of the revised 'discounting
system'. It concluded that the University had made
every effort to appreciate the balance of external
examiner opinion, but had seemingly been less
attentive to providing external examiners with an
institution-level response to their comments.
Nevertheless, the University did make modifications
to the 'discounting system' based on the advice
received from external examiners, and continues to
monitor both the impact of the system on award
classifications and the views of a minority of external
examiners who remain critical of its application.
70 Through the DATs the audit team was able to
verify that external examiner reports were taken
seriously and fed through programme AMRs and
departmental reports into the ARPD process. While
there were few direct comments about the
appropriateness of standards in relation to external
benchmarks, assessment strategy was clearly a
feature of subject-level discussions with external
examiners. The team acknowledges that the new
report template specifically asks for this information.
As previously mentioned (see paragraph 55 above),
there were some differences in interpretation over
what constituted 'last year' in reporting action on
external examiner reports in the ARPD.
71 The audit team considered the LQO analysis of
external examiner reports to be a useful innovation,
identifying strengths and weaknesses under a
number of headings, highlighting areas where
action was required at senior level and tracking
reports outstanding. Although there was evidence of
widespread discussion of some of the issues raised in
the analysis, for example, the functioning of PABs,
the team was of the view that its full benefit might
not be realised because of the length of time
between initial presentation to LQC in October and
consideration by Academic Council the following
June. Notwithstanding this observation, the team
concluded that the University was making effective
use of external examiners in its summative
assessment procedures.
72 On a separate point, the audit team noted that
there were three different PAB arrangements for
combined honours and that these had been
approved following extensive debate about the
parity of treatment of combined honours students.
The team recognises that the University has paid
considerable attention to standardising the
operation of PABs but, nevertheless, encourages it to
keep under review the complexity of arrangements
faced by its combined honours students in relation
to progression and support.
External reference points
73 The SED outlined the University's approach to the
Code of practice as having been to amend
progressively its own procedures, where deemed
appropriate, as each section of the Code was released.
One of the precursor committees of LQC took the lead
by undertaking a mapping exercise for those sections
relating to mainstream quality assurance, including
Section 7: Programme approval, monitoring and review,
Section 6: Assessment of students, and Section 4:
External examining. Sections of the Code of practice,
relating more directly to the student experience, for
example, Section 8: Career education, information and
guidance, Section 10: Recruitment and admissions, and
Section 3: Students with disabilities were handled by
appropriate administrative offices. QASC has since
taken on the mainstream quality assurance brief and
has, this session, organised 'spot audits', through the
LQO, to ensure continued alignment with the Code. In
addition to its main remit, the QASC working group
on the external examiner system has been considering
the proposed revision of the Code on external
examining, during the consultation phase. In the SED,
the University stated that as a result of the
considerable work it had undertaken to ensure
alignment, it was able to confirm the consonance of
its procedures with the precepts of the Code.
74 The University took a similar approach to the
FHEQ. In the March 2002 revision of the Academic
Regulations (for undergraduate taught awards), it
adopted the qualification descriptors of the FHEQ
following a review comparing these with the level
descriptors it had previously been using in the
design of programmes. The SED referred to subject
benchmark statements as 'a useful tool' against
which to check curriculum coverage, although it
also made the point that programme teams found
some statements too generic to be of much
assistance. As part of the approval process, external
advisers are asked to comment on curriculum
content against subject benchmark statements and
the revised report form for external examiners asks
for confirmation of the appropriateness of standards
in relation to national reference points.
75 The approach taken to the implementation of
programme specifications has been to integrate
their production from September 2002 with
programme approval and review procedures.
However, the University also imposed a deadline
of January 2004 for the production of programme
specifications applicable to all courses, including
those not due for review. The SED indicated that the
University had historically described its provision in
terms of aims and learning outcomes and that the
main developments had been a more explicit
mapping of skills acquisition within the curriculum
and a stronger identification of the relationship
between learning outcomes and teaching, learning
and assessment practices. The SED also alluded to
the 'varying forms of presentation' of programme
specifications, despite the existence of a standard
template, nevertheless claiming 'consistency of
coverage rather than uniformity of style'.
76 The audit team was satisfied that initial alignment
with the Code of practice had been achieved and that
the University had put in place a suitable mechanism
through QASC for dealing with revisions, although this
had yet to be fully tested in practice. However, the
team was unable to identify any systematic process for
monitoring consistency over time, and was concerned
that within a highly devolved structure where there
was considerable flexibility over the form of processes
and the format of documentation, 'spot audits' would
not be sufficient to inhibit any drift in practices from
their aligned position. Skills training for postgraduate
research students was a specific area that the team
considered would benefit from review (see paragraphs
112 and 133 below). The team therefore considers it
desirable for the University to make explicit its
approach to maintaining consistency of its procedures
with the Code of practice, including how central and
local responsibilities are to be distributed.
77 In the context of the Code of practice, the
development of schools' assessment policies was
offered by staff as an example of the devolved
processes at work, but the audit team found it also to
be illustrative of the difficulties that can arise when
processes are devolved, and the importance of
requirements being communicated clearly. For
instance, schools were expected to develop policies
in a situation where assessment issues were addressed
in many different documents and media, and
within a regulatory framework that was itself having
to keep pace with significant change. The exercise
was now becoming protracted, with draft policies
needing to be revised by schools to include
'signposting' to other source documents, thereby
delaying key decisions about which aspects of policy
could be divergent and which should be the same. In
the interests of improving transparency in the
information provided to students, the team considers
it advisable for the University to expedite the process
of determining those aspects of assessment policy
that should be universally applicable and either
incorporated in the Academic Regulations (for taught
awards), or standardised across schools' assessment
policies. This would allow the policies to be published
in their final form.
78 The audit team saw assessment policies as
another area of complexity facing combined honours
students; where their programmes cross schools
these students have to deal with two assessment
policies. On the basis of discussion with staff and
students, the team formed the view that the
emphasis on the relationship with the host school of
the first-named subject for combined honours
students could work to the detriment of the students'
ability to resolve difficulties that related to the second
school or crossed school boundaries. The team
considers it advisable for the University to strengthen
arrangements for ensuring parity of treatment for
combined honours students whose programmes
cross schools with those whose programmes operate
within a single school, given the scope for variation
in the content of school policies and the format of
documentation given to students, together with the
system of allocating personal tutor support solely on
the basis of the school responsible for the first-named
subject of a combined programme of study.
79 The above observations notwithstanding, the
audit team considered that the University's approach
to reference points demonstrated that it appreciated
their purposes and was reflecting on its own
practices in relevant areas. The team was able to
verify through the DATs that subject staff were
considering programme outcomes in terms of
subject benchmark statements and level/qualification
descriptors. Through committee minutes, it was able
to track the progress made by schools in meeting the
deadline for complete publication of programme
specifications and the firm line taken by LQC to keep
schools on target. The team also noted the efforts of
QASC and the LQO in encouraging approval panels
to pay closer attention to programme specifications
and in reducing the variability in format. In the
team's view the University's approach to programme
specifications provided a good example of schools
and the centre successfully working together within
clear areas of responsibility.
Programme-level review and accreditation
by external agencies
80 All nine subject reviews conducted since the
1999 audit resulted in approval of the quality of
education in the relevant subjects. Overall, the
strongest aspect was student support and guidance,
and the weakest was quality management and
enhancement. As the SED explained, the University
essentially attributed the local shortcomings in quality
management to a lack of ownership of institutional
procedures at subject level which it had tackled
through academic restructuring. It had also identified
those areas of its practice attracting regular attention
in subject review and taken action at institutional
level, with the external examiner system providing an
example of how local monitoring through the
departmental AMR and school ARPD processes had
been supplemented by central monitoring through
the LQO (see paragraph 67 above).
81 In the few cases where aspects of particular
subject reviews were judged to be making only a
satisfactory contribution to the attainment of stated
aims and objectives, schools had been required to
provide a detailed response directly to a central
committee (since subsumed by LQC). However, the
University now relied on the ARPD as the primary
mechanism to avoid such occurrences in the future.
The SED relayed the generally encouraging findings of
the two DEs that occurred in 2002-03, outlining
responses from the respective schools to some of the
points raised. It also gave a number of examples of the
outcomes of PSRB accreditation exercises, presenting a
positive picture across a wide range of subject areas.
82 Through the DATs, the audit team was able to
verify the mechanisms within schools for considering
PSRB reports; for instance, in the Business School the
Teaching and Learning Committee (the school LQC
equivalent) provided the forum. In general discussion
with staff, it also learned that PSRB reports were sent
to the LQO and that any critical comments would
immediately be brought to the attention of LQC.
However, staff confirmed that the ARPD quality and
standards section was the main channel for upward
reporting on accreditation exercises and that there
was no other mechanism for providing an
institutional overview of PSRB reports.
83 In the light of its views about the sufficiency of
detailed and comparable information in the quality and
standards sections of school ARPDs (see paragraph 55
above), the audit team saw a risk that PSRB reports
might receive only cursory attention,
meaning that issues of broader significance might not
be picked up by LQC at the next stage of the process.
The team considers it desirable, given the importance
the University places on accreditation in providing
externality to the monitoring of its standards, for it to
ensure that PSRB reports are routinely considered
centrally for the purpose of identifying generic issues,
emerging themes or good practice.
84 In addition, the audit team regarded the SEI as
a particularly apt illustration of how the University
was building on its strengths in student support and
guidance, recognised by subject review, to tackle
first-year student retention and the development of
students' critical, analytic and research skills, which
had been identified as areas for improvement by
subject review.
Student representation at operational and
institutional level
85 The SED stated that the University provided a
series of formal routes by which students were able to
comment on their experience, comprising
representation via SUUG; membership of University
committees at all levels, including school boards; and
staff-student committees. The University's Guide for
New Students includes a brief summary of the
student representation system, which states that the
University has guaranteed that student representatives
would have adequate facilities to carry out the role
and that training would be provided for newly
elected representatives by SUUG. It also draws
students' attention to the existence of user groups,
such as those covering media and library resources.
86 The SUUG constitution provides for the election
of student representatives from each school to its
Student Representative Committee. The sabbatical
and non-sabbatical officers are elected by annual
cross-campus ballot, open to all full members of
SUUG, and represent students on the main
University committees, including Academic Council.
According to the SWS, sabbatical officers
additionally hold regular meetings with senior
university managers. The constitution of school
boards allows for between two and six student
members, and the SWS clarified that each school
selected student representatives to sit on the school
board and also programme representatives for each
programme of study.
87 In the SED, the University acknowledged that 'at
local level the system of student representation
[was] somewhat diffuse' and that this, exacerbated
by institutional geography, might be leading to
insufficient coordination and transparency. However,
it added that 'membership of students on school
boards [did] provide for a degree of consistency
across the institution'.
88 Students, in both the SWS and in discussion with
the audit team, expressed general satisfaction with
the extent of their representation at institutional level
through SUUG and at programme level, where they
felt able to express their views and to participate in
quality management for their programmes. However,
at school boards, where SUUG officers had no
involvement, they felt less able to influence decision-making and resource allocation. Students also
considered that there was a lack of coordination and
integration between the different levels of student
representation, notably between SUUG
representatives and school board representatives.
89 The audit team noted that, while there was a
full complement of student representatives on
Academic Council for the current year, there were
vacancies on its subcommittees at the time of the
audit visit, including LQC and the Diversity and
Equal Opportunities Committee, and on school
boards. The ARPDs revealed that the nature and
degree of student representation at local level varied
between schools, ranging, for instance, from
participation in staff-student liaison committees to
representation on programme committees. The
team appreciated that differences in nomenclature
might account for some of the diversity, but it also
concurred with the University's own concern about
insufficient transparency in local arrangements. This
was, in the team's view, compounded by the
variation in selection procedures for committee
representatives which became apparent through the
DATs. The team also learned that programme
committees did not always observe the agreed
frequency of meetings and departmental
committees did not generally allow student
representation, thus limiting the opportunities for
students to have a direct input to the full scope of
quality management activity.
90 The audit team learned that the induction and
training for student representatives previously
provided by SUUG had lapsed, and concluded that
this, as with the issue of coordination between
SUUG and school board representatives, was
indicative that aspects of the representation system
had still to catch up with restructuring. The team
accepted the staff view that it was difficult to
engage students in the higher-level school
committees because these were of less immediate
relevance to them. It regarded the planned
production of a student representatives' handbook,
led by SUUG, but with support from LQO, as a
practical step in demonstrating that the University
placed value on students having their voices heard.
The team also noted that the new internal review
processes, such as school review, incorporated an
active role for students in quality assurance through
cross-representation on review teams. However, to
date this had not been achieved in practice,
although the role in one review had been filled by a
SUUG officer who was from another school. Overall,
the team considers it desirable for the University to
give greater priority to promoting the involvement
of students in quality management, including
working more cooperatively with SUUG to reinstate
training for student representatives and encouraging
all schools to adhere to regular meeting schedules.
Feedback from students, graduates and
employers
Student feedback
91 The SSS, administered by the Office of Student
Affairs, is the main method for collecting feedback
from students across the University, although they
additionally give specific feedback on individual
courses through locally conducted questionnaire
surveys. The SSS takes as its basis the 'satisfaction
approach' (developed by the Centre for Research into
Quality at the University of Central England in
Birmingham), although the University has adapted the
methodology since introducing it in 1999. Following
an adjustment in 2003 to the way the sample was
drawn, 50 per cent of students on each campus are
now canvassed every year by means of a postal survey,
for their views on all aspects of their academic life. The
facilities provided by administrative departments are
included in the survey on a two-year cycle.
92 The SSS questionnaire comprises student-determined questions which are derived using focus
group methodology and yield both satisfaction and
importance ratings. The SSS results are analysed at
the level of campus, school and department and in
terms of aspects such as mode and level of study,
gender, age and ethnicity. They are communicated
to students and staff through articles published in
both the SUUG newspaper and the University's in-house newsletter. The full report and an 'intended
action report' are made available on the web site,
with staff and students notified by personal email,
while a newsletter outlining actions effected or in
train is produced at the end of each session and
also published on the web site.
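For illustration only, the sketch below indicates how satisfaction and importance ratings of the kind described above might be aggregated by campus or by mode of study; the records, field names and rating scales are hypothetical and are not drawn from the University's SSS.

from statistics import mean
from collections import defaultdict

# Hypothetical survey responses: each pairs a satisfaction and an importance rating.
responses = [
    {"campus": "Greenwich", "school": "Humanities", "mode": "FT",
     "satisfaction": 5, "importance": 6},
    {"campus": "Avery Hill", "school": "Health", "mode": "PT",
     "satisfaction": 4, "importance": 7},
    {"campus": "Medway", "school": "Engineering", "mode": "FT",
     "satisfaction": 6, "importance": 5},
]

def ratings_by(records, key):
    """Average satisfaction and importance for each value of the chosen breakdown."""
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record)
    return {value: {"satisfaction": round(mean(r["satisfaction"] for r in group), 2),
                    "importance": round(mean(r["importance"] for r in group), 2)}
            for value, group in groups.items()}

print(ratings_by(responses, "campus"))  # campus-level analysis
print(ratings_by(responses, "mode"))    # breakdown by mode of study

The same grouping could, in principle, be repeated for school, department, level of study, gender, age or ethnicity, as the report describes.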
93 The SED drew the link between the SSS and the
formulation of University policy, putting forward the
SEI as an illustration, and outlined the widespread
attention given to the SSS, through consideration of
a summary report at senior committees, the
identification of resultant action in school ARPDs,
and the mid-session progress chasing of schools'
action by LQC. It quoted examples of where the
University had responded to 'priority areas' identified
through the SSS, resulting in improvements to
computing facilities, library book stock and its
proprietary virtual learning environment (VLE). It
also pointed to future development of an on-line
questionnaire to link with other internet-based
projects (see paragraph 122 below).
94 The SSS was used as a key source in the production
of the SWS, which also illustrated how SUUG was
using SSS output to monitor ratings of its service
provision. In meetings with the audit team, students
commented on the ways the University encouraged
a good response rate as well as on the wide
publicity it gave to the results and consequent
action. They suggested that the length of the
questionnaire, running to nearly 200 items about their
experience, was off-putting, but recognised the
difficulties of achieving both breadth of coverage
and ease of completion in a single questionnaire.
95 Through documentation, the audit team was
able to verify the processes described in the SED for
giving detailed consideration to SSS results; for
instance, there were specific action points in school
ARPDs regarding lower ratings. It was also able to
establish that the University was completing
'feedback loops' to students and staff by providing
summaries and analyses in addition to the full
report. There is good practice in the comprehensive
SSS, the thorough consideration of its findings and
the well-publicised and timely feedback of its results
to both staff and students. Through the DATs, the
team noted the regular use of course questionnaires,
but also learned that there was no similar systematic
collection of student feedback at programme level,
although students did have representation on
programme committees.
Graduate and employer feedback
96 The University receives feedback from graduates
in its annual first-destination survey. This information is
circulated to schools which report on the outcomes
through their ARPDs. In addition, there is an alumni
office which administers the alumni web site and
survey. While there was no claim that this was a
systematic mechanism for gaining feedback from
graduates, the SED indicated that their views were
sometimes sought as part of internal approval and
review processes. Neither is there a standard approach
to obtaining feedback from employers, with methods
ranging from employer advisory boards and associated
forums to less formal or ad hoc arrangements.
97 The audit team considered the Centre for
Entrepreneurship, initiated by the Business School,
to provide a good example of how the
establishment of links with local business could
subsequently form the basis of programme
development. There were other examples of
employer involvement being used to promote
students' skills, for instance in project presentation
coupled with the securing of industrial sponsorship. The
team also noted the work in progress to compile a
complete listing of the University's links with
employers for publication on the TQI web site.
Progression and completion statistics
98 The SED explained that it was an institutional
objective to work from a single centrally-driven
dataset held within the Planning and Statistics Unit
(PSU) and used by schools to inform the ARPD
process. Therefore, for the past three sessions PSU
had produced an annual statistical digest, each one
containing data covering the previous three sessions,
to allow for comparison over time. The SED gave
the SEI as an example of an institutional policy that
utilised the digest through the information provided
on the student profile. It also suggested that schools
were more routinely using the digest in developing
their academic plans as they gained familiarity with
the content of the dataset.
99 Much of the digest comprises descriptive
student statistics covering admissions, student
characteristics, progression, achievement and
destinations. For the past two years it has been made
available electronically to allow schools to examine
the underlying data at programme level.
Complementing the digest is the Annual Progression
Data Analysis document introduced in February 2002
to provide a means of establishing possible causal
links between non-progression and relevant aspects
of the student experience, such as stage of study or
location. The feeding of such statistical analysis back
into institutional admissions policy was demonstrated
in the restructuring of the four-year extended degree
programme. This produced a smaller number of
more generic programmes feeding into designated
cognate areas, with the intention of fostering cohort
identity to encourage student progression.
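For illustration only, the following sketch shows how cohort progression between stages (years) of study might be calculated from student records of the kind held in such a dataset; the records and field names are hypothetical and are not taken from the PSU digest.

# Hypothetical student records: the stage (year) of study reached in successive sessions;
# None indicates that the student did not enrol in that session.
records = [
    {"student": "A", "2001-02": 1, "2002-03": 2},
    {"student": "B", "2001-02": 1, "2002-03": 1},     # repeating stage 1
    {"student": "C", "2001-02": 1, "2002-03": None},  # did not return
    {"student": "D", "2001-02": 2, "2002-03": 3},
]

def progression_rate(records, from_session, to_session, stage):
    """Proportion of a stage cohort progressing to the next stage in the following session."""
    cohort = [r for r in records if r.get(from_session) == stage]
    progressed = [r for r in cohort if r.get(to_session) == stage + 1]
    return len(progressed) / len(cohort) if cohort else None

# Stage 1 cohort progression between the 2001-02 and 2002-03 sessions.
print(progression_rate(records, "2001-02", "2002-03", 1))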
100 The SED presented a positive view of the
management information system which, 'teething
troubles' aside, had enabled the University to
establish wider use of a common dataset for more
sophisticated statistical analysis and action planning.
However, the SED also acknowledged that the
production of the dataset was driven by external
timescales which did not coincide with those of the
internal annual monitoring cycle. This had been
addressed by introducing an additional stage into
the 2003 ARPD process, whereby schools considered
issues arising from student progression and
achievement once the dataset became available,
producing a separate commentary on identified
trends after having submitted their main ARPDs.
101 The audit team had access to the school
commentaries on statistical data for 2003 ARPDs,
including those on progression and completion
statistics. While accepting that the ARPD process
allowed schools discretion over their reporting, the
team found considerable variation in the detail of analysis.
With few exceptions, aggregation of the data meant
it was difficult to gain a clear impression of cohort
progression between the stages (years) of study, but,
through the DATs, the team was able to see that such
stage-by-stage analysis was being adopted by some, but not all,
departments for individual programmes. However,
the team learned from its reading of documentation
and its discussions with staff that the inclusion of
student stage in the dataset was being pursued with
PSU and that the data available to school staff were
improving. The team noted that previous concerns
raised by external examiners about the adequacy of
data supplied to SAPs and PABs over the 2000 to
2002 period had seemingly been resolved (see
paragraph 68 above).
102 Overall, the audit team concluded that the
University's capacity to provide the necessary tools
for comprehensive lower-level analysis was catching
up with its ambition for statistical management
information to support strategic decision-making at
all levels. It was clear to the team that the University
had itself identified areas for improvement to which
it was giving high priority.
Assurance of the quality of teaching staff,
appointment, appraisal and reward
103 The Regulations for the Appointment and
Promotion of Staff and the accompanying staff
Recruitment and Selection Procedures set out the
University's processes for staff appointment,
appraisal and reward. Accordingly, all established
teaching posts have both a job specification and a
person specification, produced to a standard format;
appointments are made by selection panels whose
members must have undertaken training in
recruitment and selection; and newly-appointed
teaching staff undergo a 12-month probationary
period, entailing observation of their teaching, with
reports sent to the central Personnel Office after six
and 12 months' service. For new staff without a
recognised qualification or three years' experience in
teaching, successful completion of the probationary
period is also dependent on making satisfactory
progress on either a Postgraduate Certificate in
Education (PGCE) in Post-Compulsory Education and
Training (PCET) or a Postgraduate Diploma (PgDip)
in Higher Education.
104 The 2002 policy statement, Nurturing Staff,
emphasised the importance of recruiting and
retaining younger staff to achieve 'a balanced
diversity of ages amongst [the University's] staff',
noting that the University was located in areas of
expensive housing where competition for staff was
severe. In this context, market premium payments
were introduced within the Human Resources (HR)
Strategy (2001-02 to 2003-04) for staffing areas
where recruitment and retention are difficult. There
is also provision within the University's Management
Guidelines on the Exercise of Pay Flexibilities to
award merit-based increments, which the SED
explained had been used particularly to reward
junior members of staff who obtained postgraduate
qualifications in the early years of their employment.
In addition, the University explicitly links teaching
quality to reward through its scheme for special
promotions to PLT posts.
105 The SED outlined the mechanisms by which the
University monitored the effectiveness of its staffing
policies. Schools reported specifically on recruitment
and retention issues, appraisal and PLT posts through
the staffing section of their ARPDs. These sections
were reviewed centrally by the Executive Committee,
which also received regular progress reports from the
Director of Personnel against the intended outcomes
of the HR Strategy, including those relating to
recruitment and retention. The November 2003
report (appended to the SED), showed there were
measurable improvements in staff recruitment and
retention, although the proportion of younger
teaching staff had remained static over recent years.
The ARPD staffing reports were additionally
considered by the Staff Development Focus Group
(see paragraph 111 below). Regarding appraisal, the
SED acknowledged that the annual participation rate
was currently 10 per cent behind target, giving as an
explanation the lower levels of participation among
manual staff, coupled with the impact of
restructuring. Relevant aspects of staffing policy were
also monitored through the Diversity and Equal
Opportunities Committee and the Equal
Opportunities Employment Group.
106 In meetings with the audit team, newly-appointed teaching staff and senior school staff
responsible for approving probation separately
confirmed that procedures were being followed, with the
former commenting on the generally high level of
support they received during the probationary
period. From its study of school ARPDs, the team
was able to verify that the market premium
payments scheme was delivering results in some
schools, notably CMS, although not all schools had
seen a benefit. The ARPDs also revealed variability
across schools in relation to appraisal reporting,
appraisal practice and the impact of appraisal on
staff development planning. In particular, there was
no consistent recording of the percentage of staff
covered by appraisal, but some schools were
reporting appraisal rates for teaching staff
significantly below target. The team considers it
desirable for the University to take the necessary
steps to ensure full implementation of teaching staff
appraisal, particularly given its linkage through staff
development to delivering both the HR and
Learning and Teaching institutional strategies. The
team recognises that the Staff Development Focus
Group is keeping a watching brief on the appraisal
participation rate and, in general, considers that the
University is maintaining an appropriate institutional
overview of its procedures for the appointment,
appraisal and reward of its teaching staff.
Assurance of the quality of teaching through
staff support and development
107 The University's strategic framework for staff
support and development is constructed around a
range of interrelated strategies and policies. At its
core lie the institutional Staff Development Policy
and the staff development aspects of the HR and
Learning and Teaching institutional strategies. This
framework is supported by a number of associated
institutional policies, including Nurturing Staff and
Mentoring of New Staff, which are supplemented by
various institutional guidance, procedures and
schemes, such as those relating to induction of new
staff, probationary procedures and staff appraisal.
108 At institution level, the mechanisms for
monitoring implementation of this framework
include: progress reporting against the HR Strategy
(see paragraph 105 above); the review by the
Executive Committee of school staff development
plans and expenditure, as part of the ARPD process
(see paragraphs 52 to 54 above); the exploration of
common themes by the Staff Development Focus
Group; and the sharing of experience and expertise
through QESC, whose remit includes oversight of
the institutional Learning and Teaching Strategy. The
Staff Development Focus Group, which is not part of
the formal committee structure, is chaired by the
Director of Personnel and has among its members a
number of heads of school.
109 The intended outcome of the HR Strategy
relating to staff development, is to 'foster an
integrated approach…which emphasises the
development of leadership, the pursuit of individual
self-development and support for the University's
learning and teaching strategies'. According to the
SED, priorities up to 2002 were orientated towards
training and included: developing schemes relating
to the PGCE/PgDip for new, inexperienced staff;
establishing a University policy towards the Institute
for Learning and Teaching in Higher Education (now
part of the Higher Education Academy), resulting in
the availability of financial support through the
LQO; and training staff in new technologies, most
recently through the University Certificate of
Professional Development in e-Learning, Teaching
and Training (CeLTT), also with LQO support.
110 While the above priorities have continued, the
emphasis in the institutional Learning and Teaching
Strategy on schools building up their own internal
staff development expertise has led to an increased
focus on expanding the PLT role within schools into
a corporate resource for promoting innovation in
learning and teaching. Priority has also been given
to other aspects of the HR Strategy, including
enhancement of leadership and management skills
through structured management development
programmes; and strengthening the research
capability of staff through school-based staff
development. As general illustrations of school-based activity, there were the three-day conference
organised by the School of Humanities, covering a
broad range of issues, and a proposal by the
School to introduce a voluntary scheme for peer
observation of teaching (distinct from the
arrangements for probationary staff), as a
replacement for ad hoc arrangements.
111 From its analysis of the ARPDs and its discussions
with staff, the audit team was unable to gain any
strong impression of the linkage between
institutional strategies and staff development at
school level. Nonetheless, the Staff Development
Focus Group had identified areas for further
development in relation to institutional priorities,
notably making better use of the PLT resource. It
appeared to the team that, with one exception,
schools were not capitalising on this resource and
PLTs were not perceived by staff as a coherent group.
However, the team learned that the PLTs themselves
made efforts to share expertise through informal
cross-school meetings and that QESC was overseeing
the development of a dedicated web site for the
same purpose. As with the impact of appraisal on
staff development planning (see paragraph 106
above), the team found the level of detail on budget
allocation and expenditure for staff development to
vary between school ARPDs. However, this point had
also been identified by the Staff Development Focus
Group which was suggesting inclusion of a 'needs
analysis' and a 'funding per capita' parameter.
112 With regard to supporting staff to develop their
full potential, a strong theme of Nurturing Staff,
most school ARPDs contained examples of staff
development undertaken on an individual basis. In
general, the audit team heard positive comments
from full and part-time staff about the development
opportunities available to them, both internally and
externally. Most ARPDs indicated that induction and
mentoring arrangements were in operation and this
was confirmed by staff. There was, however, little
reference made to the training and development of
research student supervisors and it was not clear to
the team where in the ARPD this would be expected
to feature. Staff gave some examples of training and
support mechanisms for inexperienced supervisors
available at a local level, and also of university-wide
supervisor training days for specific developments,
such as the research student logbook (see paragraph
127 below), but the team nevertheless considers it
desirable for the University to provide more
systematic training and continuing staff development
for research supervisors. On a separate point, the
team noted that a recent initiative to organise cross-school training sessions for postgraduate research
students with teaching or demonstrating duties had
attracted little interest (see paragraph 133 below).
113 In spite of the above recommendation for
further consideration by the University, the audit
team is supportive of the University's commitment
to achieve greater integration in its approach to staff
development, and acknowledges the progress made
towards adjusting the balance between the
respective responsibilities of school and centre.
Assurance of the quality of teaching delivered
through distributed and distance methods
114 Outside of its collaborative provision, the
University adopts two strategies for the delivery of
flexible and distributed learning (FDL): conventional
distance learning from structured learning materials,
with direct on-line support from University staff; and
e-learning within an interactive VLE. However, the
conventional distance-learning programmes more
typically involve tutorial support from a collaborative
partner, so would normally be subject to the
University's procedures for collaborative provision.
These are supplemented by specific guidelines for
the quality assurance of structured learning materials
(set out in the Quality Assurance Handbook), which
include establishing an editorial board responsible
for the oversight of both the production process and
subsequent updating of materials.
115 E-learning programmes (without external
partner involvement) are regarded by the University
as internal provision, so are subject to the same
procedures for approval, monitoring and review as
provision delivered in-house (see paragraphs 46-47,
50-51 and 59 above). However, the approval
procedure (set out in the Quality Assurance
Handbook) allows for LQO involvement, which the
SED indicated meant 'additional stringency in the
quality assurance arrangements', akin to that applied
to structured learning materials.
116 The University has established an e-learning
office, located within Information and Library
Services (ILS) (see paragraph 120 below), and
supported by the LQO, which also funds a range of
small e-projects being undertaken by staff across the
University, as well as the CeLTT training programme
and an annual conference for the dissemination of
good practice. In addition, ILS has created a unit
specifically to coordinate library and computing
support for students and staff operating off-campus.
QESC has responsibility for oversight of the
development of e-learning, in the context of the
institutional Learning and Teaching Strategy.
117 In the SED, the University expressed its
commitment to enhancing its expertise in FDL, one
illustration being the growth of 'blended learning'
within the School of Health's portfolio of provision.
It currently estimated that 50 per cent of campus-based students had direct access to, or the
opportunity for, e-learning and that this proportion
was likely to increase.
118 From its study of the ARPD documentation and
discussions with staff, the audit team concluded that
the arrangements for approval, monitoring and review
of FDL programmes seemed to be operating as
outlined in the SED; the team heard examples from
different schools of how the peer review of materials
was organised through editorial boards or the
equivalent. It was also apparent to the team that
schools heavily engaged in distance delivery were
providing staff development in the pedagogical as well
as the technical aspects of FDL. The general increase in
use of internet-based applications was evident.
119 However, the audit team found it more difficult
to establish the extent to which the expansion of the
VLE was occurring in a managed way. For example, it
learned that schools were moving at their own pace
in implementing e-learning and were also developing
their own systems independently from the University's
main platform. While recognising that the blended
learning approach accommodates discipline diversity
by varying the mix of distance and face-to-face
delivery, the team considered it important for there to
be a strong infrastructure providing a common
interface for the user. The University may wish to give
this factor greater priority in its ongoing evaluation of
e-learning provision.
Learning support resources
120 The University's principal learning support
resources are the responsibility of ILS, formed in
August 2003 to bring together library, computing
and classroom services. Much of this provision is
organised at campus level, with PCs being located in
open-access laboratories within or adjacent to
campus libraries. In addition, there are school-based
machines reserved for the specialist applications
required by students of the host school. User support,
while locally-delivered, is now centrally-managed and
includes an off-campus help-desk facility. ILS also
supports the University VLE.
121 The overall level of purchasing for both library
materials and PC laboratories is determined by the
University's regular budgeting procedures. Proposals
for major capital expenditure on the infrastructure
for information and communications technology
(ICT) are also submitted annually, for consideration
by the Director of Finance and the PVC (Resources),
although ILS operates a five-year replacement cycle
for open-access PCs. The SSS is the primary means
of gathering student feedback on a whole range
of matters relating to the delivery of library and
computing provision, with the most persistent issue
arising from the SSS being a perceived shortage of
core texts.
122 The SED stated that planning for the future of
learning support services had 'inevitably been
dominated' by the restructuring over the past three
years. This had brought about a transition to a more
electronic service model, with PC facilities, for
example, superseding certain activities formerly
carried out by technical staff. The SED also stated
that operational aims were related to those of the
Learning and Teaching Strategy, and in this context,
ILS was working on the development of the VLE,
notably the implementation of Campus Pipeline
which would bring together academic and
administrative interactions with students. However,
the University recognised that further advances in
use of the VLE platform were dependent on the
availability of additional resources, and also that its
ability to effect significant improvement in library
purchases had been limited.
123 The SUUG survey in the SWS indicated that the
majority of students were 'particularly satisfied' with
the library service, but that opinion 'appeared
divided' over provision of computing services, and
suggested variability between campuses as a
possible explanation for the differing views. The
SWS also stated that at local level student
representatives found staff open to discussion about
resources issues, but that follow-up action was
limited, leading them to conclude that there should
be more student representation and discussion of
student feedback in relation to resources at decisionmaking levels. In meetings with the audit team,
students identified variability in learning resources
between campuses, but attributed some of this to
provision not quite keeping pace with campus
consolidation and school relocation.
124 Staff commented to the audit team that
extensive use of the fast-track procedure for
programme approval (see paragraph 46 above)
worked against a systematic approach to central
resource planning, although the team was aware
that the University was taking steps to cut back on
the number of programmes authorised by this
route. Staff also explained the 'mixed economy' for
learning resources provision, whereby schools had
the capacity to fund lesser requirements from their
own budgets; the development of 'customised' intranet systems in the schools of CMS and Engineering, utilising the infrastructure supported by ILS, provided examples.
125 In respect of library provision, the audit team
considered that there were appropriate mechanisms
for collaboration between schools and ILS, as
demonstrated by their working together to achieve
more accurate targeting of purchases, while also
managing student expectations about the availability
of texts through guidance on reading and buying
priorities. In respect of computing provision, the
team was told that benchmarks were used to
evaluate University provision against comparator
institutions. Staff also expressed the view that ILS
planning would benefit from use of a parallel process
to the ARPD, to give it a longer-term time horizon.
Senior staff acknowledged that there was currently
variability between campuses which the University
was planning to address by 'levelling-up' provision,
and that there would need to be continued
monitoring to guard against future variability. While
it was clearly too soon to reach a firm view about the
new ILS structure, given that it had been so recently
established, the team was supportive of the moves
towards stronger institutional overview and the
levelling-up of provision.
Academic guidance, support and supervision
126 The University has a range of mechanisms for
the academic guidance of students, starting from
induction, then through the duration of their
studies, to preparation for employment. Once
students have started their programmes, primary
responsibility for academic guidance rests with
schools and departments, with programme leaders,
course coordinators, personal tutors and
project/dissertation or postgraduate research
supervisors fulfilling the major role. These
arrangements are complemented by the support
services provided through the central Office of
Student Affairs (see paragraph 134 below), which
include assistance with generic study skills to
supplement skills training embedded in curricula or
provided through school-based courses. The Office
of Student Affairs now also incorporates a new
Research Student Administrative Office set up to
work closely with schools that have responsibility for
research supervision arrangements.
127 The SED referred to the SEI as 'overarching the
individual provision made by schools'. Envisaged as a
series of stages to be applied gradually in tandem
with students' academic development, the SEI was
implemented in 2002-03, in accordance with the
University Learning and Teaching Strategy. It initially
focused on first-year personal tutoring (stage 1),
stipulating particular forms of entitlement for students
which were translated into explicit standards against
which schools had to align their practices. Student
support, including specific initiatives linked to the SEI,
is regularly monitored through the quality and
standards sections of school ARPDs, with LQC
maintaining the overview and QESC having
responsibility for more detailed evaluation.
Supervision arrangements for postgraduate research
students are approved by RDC which also has
responsibility for monitoring the quality of their
supervision. From 2003-04, taking account of HEFCE
and research council requirements, a log has been
introduced for new research students with the
purpose of maintaining a record of progress review
meetings, a key skills audit and a record of activities.
128 The SED described the impact of the SEI as
'significant', in that all schools, even those with a
good track record in the area of academic guidance,
had to review their procedures to ensure they were
meeting its requirements and could provide an
evidential base that they were delivering the
standards. A case in point was the 'Springboard for
Learning' course introduced by the School of Health
and Social Care to support skills development within
group tutorials. The SED acknowledged that some
schools were facing difficulties in meeting SEI
standards, an instance given to the team being
issues of timetabling and space for tutor sessions in
larger schools. In addition, the SED highlighted
mechanisms for communicating with students, such
as web sites and student handbooks, as offering
supplementary means of academic guidance.
129 The SSS was cited in both the SED and the SWS
as providing evidence of high student satisfaction
with personal tutoring arrangements, although
qualified in the SWS by the view that there was
variability between schools. Students who met with
the audit team made the same point about
variability, but some of the differences in their perceptions were, in the team's view, dependent on
their year of study. First-year students appeared
typically to be having group tutorial sessions on a
fortnightly basis, indicating to the team that SEI
standards were being achieved. The team also
learned of effective student mentoring schemes,
whereby second or third-year students provided
support for first-year students.
130 However, combined honours students reported
particular difficulties, mostly relating to the second
subject of their combined award for which they had
no formal personal tutor support. Generally, they were
unaware of the existence of DECS which staff clarified
for the audit team as having a 'behind scenes'
administrative rather than a 'student-facing' support
function. Staff also explained that within each school
there were combined honours advisers who would
make contact on behalf of students with their
counterparts in other schools, to resolve issues relating
to the second subject. However, the team considered
student support arrangements to provide a further
example of how combined honours students were
potentially disadvantaged by having only an indirect
route into the second school and no independent
channel for resolving cross-school difficulties.
131 The above point notwithstanding, there is good
practice in the SEI, stage 1 of which has been
successfully implemented university-wide to
strengthen the personal tutor system. The audit team
could see first hand the extensive supporting
documentation within the Pastoral and Skills
Handbook (PASH), provided by the Office of Student
Affairs in readily accessible formats, which the team
considered to be a valuable resource for personal
tutors acting in an academic, professional or pastoral
capacity. It heard from staff about improved means for
interrogating the student database to obtain on-line
access to grade profiles as a means of identifying
students who were academically 'at risk'. The team
could also follow reporting of action and monitoring
of progress through school ARPDs, and then through
LQO and QESC. It learned that stages 2 and 3 (for students at levels 2 and 3) were to be implemented from 2004-05, although combined into a single stage covering careers support and employability, development of autonomous learning skills, and the implementation of personal development portfolios.
132 In their meeting with the audit team, postgraduate students were generally positive about their supervision arrangements which involved two supervisors, both of whom were readily available. However, it was apparent to the team that the new log had not yet been fully established as a monitoring system. It was also apparent that there was no training requirement for students with teaching or demonstrating duties or formal arrangements for monitoring their teaching. However, the team recognised that the responsibility for ensuring research students were adequately prepared for their teaching roles was clearly assigned to heads of school, and learned of one case where permission to teach had been refused on the grounds that the research student had not undertaken training.
133 The audit team subsequently explored with staff
the broader question of skills training for research
students and heard examples from several schools,
including frequent staff-student seminars, availability
of appropriate masters courses, and portfolio
development, which was an integral part of the
EdD. There were also university-wide events, such as
the research ethics conference and the recent efforts
to establish staff development sessions for research
students as teachers (see paragraph 112 above).
However, in the team's view this did not represent a systematic approach; the team therefore considers it advisable for the University to review the provision
of skills training for research students and, in
particular, to establish a training requirement for
those students involved in teaching or
demonstrating activities, together with a mechanism
for subsequent monitoring and support.
Nonetheless, the team concluded that, overall, the
University was building on its recognised strengths
in the area of student academic guidance and
maintaining an institutional overview of the
devolved arrangements within schools.
Personal support and guidance
134 Personal support and guidance for students is
provided, under the banner of 'pastoral support and
student success', by the Office of Student Affairs
which was formed in August 2003 from a merger of
student services and student administrative
functions. Although centrally managed, it delivers
services locally to students through campus-based
student centres. Within these centres, the same core
services are provided through 'one-stop-shops':
chaplaincy; counselling; disability and dyslexia
support; financial advice, including disbursement of
hardship funds; careers; the 'jobshop', aimed at
finding students paid work compatible with their
studies; international student advice; and various
initiatives relating to student diversity. In addition,
there are a campus medical centre and a nursery,
both based at Avery Hill.
135 The SED highlighted the 'cultural diversity' of
the student population, linking this to the
University's longstanding commitment to widening
access. It referred to the University Code of Practice
for Students with Disabilities, and stressed the
proactive approach of student services' teams which
worked with named contacts in schools to provide
appropriate support. It pointed to initiatives at the
interface between academic and pastoral support,
such as PASH, which provided detailed guidance to
personal tutors on the referral of students to the
various support services available. It also gave a
number of illustrative examples from subject review
and PSRB reports, commending the University's
arrangements for student support and guidance in a
variety of contexts, as well as comments from its
internal feedback mechanisms.
136 The SWS likewise referred to the 'different
demographics' of the student population and also to
general satisfaction with the ways in which these were
taken into account in the services provided by the
Office of Student Affairs, as indicated by both the
SSS and the SUUG survey. The students who met
with the audit team, although seemingly not
conversant with the full range of services on offer,
were aware of the campus centres as a means of
accessing support. The 'Listening Ears' confidential
advice service, provided by designated staff attached
to schools or student areas, was clearly regarded as
a good initiative, although individually the students
had not necessarily felt the need to use it. The
students were also aware of their responsibilities for
notifying extenuating circumstances in the context
of assessment and of the existence of appeals and
complaints procedures.
137 The University's responsiveness to various issues
relating to student pastoral support was well
documented and reinforced for the team in
discussion with staff. The exchange between centre
and schools of information on disabled and dyslexic
students, and the mentoring scheme aimed at
improving the chances of ethnic minority students
obtaining permanent jobs, provided examples of
how the University was using student management
information to tailor student support. In relation to
its retention policy, the University was evidently
giving priority to both monitoring student
progression and to early detection of students likely
to require support. The team considered the
arrangements for 'pastoral support and student
success' to be responsive to diverse student needs,
and consonant with University policies and strategies.
Section 3: The audit investigations:
discipline audit trails
Discipline audit trails
138 In each of the selected DATs, appropriate
members of the audit team met staff and students
to discuss the programmes and also studied a
sample of assessed student work, annual course and
programme monitoring reports, including external
examiner reports, and annual and periodic review
documentation relating to the programmes. Their
findings are as follows.
Chemical sciences
139 The scope of the DAT comprised provision in
the Department of Chemical Sciences, within the
School of Sciences, leading to the following awards:
- BSc (Hons) Chemistry;
- BSc (Hons) Analytical Chemistry;
- BSc (Hons) Pharmaceutical Chemistry;
- MSc Pharmaceutical Sciences;
- MSc Industrial Pharmaceutical Sciences;
- HNC Chemistry.
The Department also offers combined honours
packages as part of the university-wide combined
honours programme being developed at the
Medway campus, to which the Department
transferred in summer 2002.
140 The basis of the DAT was a DSED comprising
documentation and reports relating to periodic
programme reviews conducted over the period July
2002 to April 2004.
141 Programme specifications were provided in
each case and the audit team found these to be
clearly matched to the FHEQ, with course
descriptions within undergraduate programmes
aligned to the relevant subject benchmark
statements. The Department is fully involved
in the School of Science Consultative Committee
whose external industrial and commercial members
contribute to the development of industrially
relevant programmes. The team noted that in the
last two years attendance of industrialists had not
been particularly regular, but was assured by staff
that written views were solicited from those who
could not attend.
142 Progression and completion data indicated that
most students, at both undergraduate and
postgraduate levels, were progressing normally to
achieve their award. However, the data also showed
a 'tail' of poorly-achieving undergraduates that was
leading to rates of progression, retention and
completion which were giving concern to the
Department, the University and external examiners.
The audit team shared this concern, but noted that
the Department had adopted a range of strategies
to address the problem. These included proactivity
on the part of personal tutors and course teams to
identify students at risk, remedial action for those
with weaker skills (for example, in mathematics) and
close support of students in laboratory classes. The
team considered that it was too early to judge the
success of these initiatives in improving progression
and retention rates, but it noted the appreciation
expressed by students of the efforts being made by
staff to support their learning.
143 The audit team found the procedures used for
the approval and review of courses and programmes
to be sound and in line with University
requirements, with the quinquennial approach to
review very much in evidence. Use is made of
external input from both academia and industry as
well as from staff in other schools. The team
considered that this ensured a rigorous scrutiny of
the standards and quality of programmes and their
relevance to students and employers.
144 The audit team studied external examiner
reports, all of which expressed satisfaction with the
standards being achieved by the better students, but
also concern about the limited attainment of students
at the weaker end of the spectrum. The reports are
considered by the Department as part of its annual
monitoring exercise and by the School in preparing
its ARPD. The team was able to confirm that external
examiners' comments received timely responses and
that action was taken where appropriate.
145 The School has devised its own comprehensive code of practice for assessment, taking account of the need for consistency with the Code of practice. The Department operates assessment procedures in accordance with the School's code, except in regard to second-marking of coursework, which is still being implemented.
146 The audit team reviewed a range of assessed
work by both undergraduate and postgraduate
students and was satisfied that the nature of the
tasks and the standard of student achievement were
appropriate to the titles of the awards and their
location within the FHEQ. Assessment methods were
varied and appropriate to the discipline. The School
encourages departments to return marked
coursework to students within three weeks, although
the Department aims for a turnaround time of two
weeks. Students confirmed to the team that
feedback from staff was both helpful and timely.
147 The audit team found student handbooks to be
comprehensive and informative. This was confirmed
by the students who praised all aspects of support
and guidance, both academic and pastoral.
148 From the viewpoint of the audit team, learning
resources appeared to be managed effectively at
both school and departmental levels. Students spoke
highly of the laboratory, library and computing
resources available to them on the Medway
Campus. They were also appreciative of the
University's efforts to improve the supporting
infrastructure, such as the local bus services. Staff
resources are supplemented by research students
who take on supporting roles as teachers and
demonstrators in laboratory classes. However, the
team was of the view that a more proactive
approach should be taken to ensuring the provision
of effective training for these research students
before they commenced their teaching duties.
149 Student feedback is sought by means of
questionnaires for each course and programme.
These mechanisms were valued by students who
were also appreciative of the university-wide SSS.
Students, in turn, receive feedback from the
Department on the outcomes of issues raised, as well
as electronic and paper-based feedback on the SSS.
The whole student feedback process was particularly
important during the transfer to the Medway Campus
and used effectively to deal with student concerns.
150 There is a departmental staff-student liaison
committee which meets regularly. A particularly
noteworthy feature is its consideration of a draft of the
School's ARPD, thus allowing students to have direct
input to quality management by contributing to this
important annual review of the School's provision.
151 Overall, the audit team was satisfied that:
- the standard of student achievement in the programmes covered by the DAT is appropriate to the titles of the awards and their location within the FHEQ;
- the quality of the learning opportunities is suitable for the programmes of study in Chemical Sciences leading to the named awards.
Mathematics and statistics
152 The scope of the DAT comprised provision in
the Department of Mathematical Sciences, within
the School of CMS, leading to the following awards:
- BSc (Hons) Mathematics;
- BSc (Hons) Statistics;
- BSc (Hons) Mathematics, Statistics and Computing;
- BSc (Hons) Decision Science.
The Department also offers combined honours
packages as part of the university-wide combined
honours programme.
153 The basis of the DAT was a DSED written
specifically for the audit which also contained a
report of the periodic review of programmes in the
Department of Mathematical Sciences conducted in
June 2003. Programme specifications for each of the
award programmes were appended to the DSED.
154 Programme specifications included links to the
FHEQ and explicit reference to the Subject
benchmark statement for mathematics, statistics and
operational research. In 2003, the School updated
its procedures on industrial placements to bring
them into alignment with the Code of practice and
the audit team learned from staff that other schools
had since been recommended by QASC to adopt
these as good practice.
155 Progression and completion data are commented
on in programme AMRs, feeding into the School's
ARPD, which contains several statistical tables of
progression and completion data. These are
considered at all relevant school committees, leading
to recommendations for change where necessary.
Data are also compared with those relating to other
programmes in the University and, through the group
of heads of the Department and external examiners,
with data available from other universities.
156 The 2003 review of the Mathematical Sciences
portfolio provided an illustration of programme
review as part of the School's ongoing quality
assurance arrangements. It combined the review of
programmes reaching five years after their initial
approval with the introduction of new programme
structures to facilitate transfer between awards. The
review team included both an academic and a
practitioner external to the University. Minor
amendments to programmes are subject to a lower level of external scrutiny than full programme review,
but the school LQC ratifies all amendments.
157 External examiner reports are received annually
and discussed during the AMR process, feeding into
the School's ARPD. The Head of Department responds
directly to the external examiner, while the school
LQC is responsible for ensuring that any actions are
carried out. This process has normally been achieved
in a timely manner, apart from one year when delays
were caused by the University reorganisation.
158 The School has a detailed assessment policy
which appeared to be in line with the Code of practice
and the University's Academic Regulations. The policy
has been reviewed and formalised during the last year
as part of the LQC university-wide initiative on the
development of assessment policies.
159 The audit team saw examples of assessed work
drawn from across all the programmes covered by
the DAT. The assignment briefs were clear and
accompanied by detailed marking criteria and there
was evidence of both internal moderation of briefs
and sample second-marking of scripts. The standard
of work met with the expectations of programme
specifications and external examiners. Overall, the
team found the standard of student achievement to
be appropriate to the titles of the awards and their
location within the FHEQ.
160 The Student Handbook provides clear information
for students on all aspects of their programmes,
including what is required of them in terms of learning
and assessment. The School also provides a wealth of
additional information through its own VLE.
161 The provision of learning resources is monitored
through the regular stage, programme, department
and school committees and through the ARPD.
Students meeting with the audit team reported the
availability of books, journals and electronic
information resources to be satisfactory, but were
particularly pleased with the PCs provided for them
and the facilities available through the School's VLE.
They were also complimentary about the assistance
given by support staff.
162 The first stage of the SEI has been fully
implemented in the School and students identified
a resulting improvement in personal tutoring
arrangements. Personal tutor sessions are now
regular and skills training has been built into courses.
Students were also aware of the extensive support
facilities available through the Office of Student
Affairs, in particular, the 'Listening Ears' helpline
service. In addition, there was internet-based
information providing sources of assistance covering
all aspects of student life. In respect of careers
information, lecturers regularly invite guest speakers
to talk about their roles.
163 The School conducts its own student survey,
which is now internet-based, and collects
information on various aspects of its courses
including tutorial support and personal tutoring.
Resulting action by the School is also posted on the
web. In addition, there is the more general University
SSS from which resulting action is publicised by
email and newsletter.
164 Elected student representatives are active
participants in the stage and programme
committees. Improvements to computing and
School VLE facilities provided a good example of
how student influence through these committees
had led to a positive outcome. Most of the stage
committee's business is to do with detailed and
specific issues raised by students in relation to the
courses they are taking, which are either dealt with
directly or forwarded to the next programme
committee for action. There was also evidence of
issues being routed from these committees to the
ARPD process. The departmental meeting does not
have student representation because of
confidentiality issues, but there were two student
representatives (although not from the Department)
on the School Board, which regularly deals with a
range of quality management matters.
165 Overall, the audit team was satisfied that:
- the standard of student achievement in the programmes covered by the DAT is appropriate to the titles of the awards and their location within the FHEQ;
- the quality of the learning opportunities is suitable for the programmes of study in mathematics and statistics leading to the named awards.
Law
166 The scope of the DAT comprised provision in
the Department of Law, within the School of
Humanities, leading to the following awards:
- LLB (Hons);
- BA (Hons) Law (Senior Status) (to be discontinued from September 2004);
- BA (Hons) Legal Studies (to be discontinued from September 2004);
- LLM research (the remit of the DAT being supervision and support).
The Department, which transferred into the School
of Humanities in summer 2003, has as its main focus
the delivery of the LLB (Hons) programme. This
covers the 'foundation' subjects required by the Law
Society and Bar for completion of the academic
stage of training for entry to the legal professions.
The Department also offers combined honours
packages as part of the university-wide combined
honours programme and, although the LLB package
is to be discontinued from September 2004, the
Law package, which comprises 50 per cent of the
award, will continue.
167 The basis of the DAT was a DSED, based on the
departmental review, conducted in April 2003.
Documentation appended to the DSED included the
main report on the departmental review (May 2003)
and the subsequent progress report (January 2004).
168 Programme specifications for LLB (Hons) and for
BA (Hons) Law (combined honours package) were
provided. Within these, links to the FHEQ honours
(H) level were clear, although not explicit, while
learning outcomes were explicitly mapped against
the Subject benchmark statement for law at threshold
level. The programme specifications also set out the
teaching, learning and assessment methods used to
enable students to achieve and demonstrate
learning outcomes.
169 Progression and completion data for the past
three sessions were made available to the audit team.
In the team's view the usefulness of these data was
limited since, due to aggregation, it was not possible
to distinguish separate progression rates at the end of
stage 1 and stage 2, or to derive overall completion
rates. Data produced for the departmental review
were similarly aggregated. However, staff who met
with the team stated that data were available in a
form that enabled accurate and meaningful
judgements to be reached in the monitoring of quality
and standards and, supporting this view, the team
noted that stage 1 progression rates were tabulated by
department in the School's most recent ARPD.
170 Internal monitoring and review is undertaken at
course, programme and departmental levels. The
audit team saw examples of course monitoring
reports, which now utilise the School pro forma, and
are informed by feedback from students and external
examiners. It also saw examples of programme
AMRs, produced according to the University pro
forma, and providing an assessment of academic
standards, curriculum development and learning
opportunities. The DSED stated that both course-level and programme-level reports were used as
input to the departmental report which was
considered by a scrutiny group of the school LQC.
However, it became evident to the team that there
had in the past been gaps in reporting by the
Department: staff stated that no departmental report
had been produced for 2002-03; and there was also
no record of AMRs for 2001-02, at any level, being passed to the scrutiny group.
171 The April 2003 departmental review was
implemented by the University specifically to
address the issue of poor student retention rates
which had been the subject of external examiner
comment for several years. The Department will
have its routine review, as part of the School's
quality assurance procedures, during the 2004-05
session. The audit team noted that externality was a
feature of the 2003 review, with two of the review
team members being academic lawyers from other
higher education institutions. The report of the
review panel identified a range of interrelated factors
leading to low progression and retention rates,
particularly for stage 1. Various countermeasures
have been introduced from the 2003-04 session,
including seminar streaming, student attendance
monitoring and the introduction of a personal
development portfolio for some students. Certain
courses with low student numbers are to be
discontinued (see paragraph 166 above) and
foundation courses within a new extended degree
programme are to be introduced from September
2004 to provide an access route into law
programmes. In addition, admissions criteria are
being reviewed, although the review team had not
considered these to be a prime factor influencing
progression and retention rates. It was too early for
the audit team to reach any conclusions about the
success of the approach adopted by the
Department, but staff expressed their confidence
that it would lead to improved retention.
172 External examiner reports are considered at
course, programme and departmental levels as part
of annual monitoring. Issues raised are summarised
in programme and departmental AMRs and
incorporated into the School's ARPD. Following
consultation with the SDLQ, the Head of Department
responds to individual external examiners by letter.
The audit team was able to verify this process, but
considered that the timeliness of responses to
external examiners could be improved upon,
although follow-up action had been taken to address
the comments they made.
173 At the date of the audit visit, the Humanities
School Board had recently approved a revised
Assessment Code of Practice for undergraduate and
postgraduate taught awards and the audit team
noted that this observed the precepts of the Code of
practice. In the DSED, reference was also made to an
assessment policy for law, developed through a staff
workshop in June 2003 (prior to law joining the
School of Humanities). However, it was not clear to
the team whether this assessment policy had been
incorporated into a definitive document, and no copy
of it was made available. The team also noted that
work was ongoing to harmonise assessment practices
across the School, in addition to the university-wide
review of assessment policies aimed at achieving
greater consistency across the institution.
174 From its study of assessed student work, the
audit team found it to match the expectations of
the programme specifications and the views of
external examiners, and the standard of student
achievement to be appropriate to the titles of the
awards and their location within the FHEQ.
175 Students receive the School Handbook which
contains general information on a wide variety of
matters including assessment. Details of the aims
and intended learning outcomes of each course are
set out in relevant course booklets, which also give
students full information on the form and respective
weightings of assessments. Students are informed of
available options choices through an options booklet
and at an options fair. Students told the audit team
that they were clear about expectations of them in
terms of learning and assessment, and aware of how
they could obtain help and advice in case of
complaints or appeals. The team heard from both
staff and students that while there was currently no
programme handbook, the introduction of one was
under serious consideration.
176 While the audit team noted the students'
comments on the need for additional library books,
it also saw that pressure on paper-based materials
was being alleviated by the library's short loan
system, which students commended, and by
increasing emphasis on electronic resources.
Students are offered training in basic research skills
and use of the more advanced law databases
through timetabled sessions, run by the law librarian
in consultation with academic staff. The law librarian
also works closely with the Department on library
resource planning. In response to student feedback
about difficulties with computer access, given the
1700-hours closure of the computing labs, some
open-access machines had now been provided. The
team also noted the use of an electronic self-study
package to support learning on some courses.
177 Since the 2003 departmental review,
enhancements to the Department's student support
system have focused on first-year students. A two-week induction programme aims to help students
settle and, during the early weeks, any special
learning needs are identified through a compulsory
diagnostic test. Students have a personal tutor and
the same tutor is retained throughout the
programme. In addition, all first-year students are
peer-mentored by a second or third-year student.
Students reported favourably on all these
arrangements and commented generally on the
helpfulness and approachability of staff. They also
commended the provision of optional extracurricular activities designed to support and enhance
their learning, including presentations from outside
speakers, 'mooting' and interview skills sessions.
178 Student feedback on individual courses is
gathered systematically at course level, through
course evaluation sheets and through an on-line
survey at programme level. Student views are also
heard through staff-student liaison committees
which operate at school and departmental levels,
while the student Law Society acts as a further
informal channel. Issues raised recently have
included examination timetabling, rules on the late
submission of coursework and access to computers.
Students meeting with the audit team confirmed
that student concerns were listened to and that
action was taken where possible. Students have a
broad involvement in quality management through
representation on the School Board, although the
Department was not necessarily represented
specifically. There was no student representation at
departmental meetings, although issues raised at
staff-student liaison committees were tabled.
179 Overall, the audit team was satisfied that:
- the standard of student achievement in the programmes covered by the DAT is appropriate to the titles of the awards and their location within the FHEQ;
- the quality of the learning opportunities is suitable for the programmes of study in law, leading to the named awards.
Marketing
180 The scope of the DAT comprised provision in
the Department of Marketing and Operations
Management, within the Business School, leading to
the following awards:
- BA (Hons) Marketing;
- BA (Hons) International Marketing;
- BA (Hons) Marketing Communications;
- BA (Hons) Tourism Management;
- BA (Hons) Arts, Gallery and Heritage Management;
- BA (Hons) Events Management.
The Department also offers combined honours
packages as part of the university-wide combined
honours programme.
181 The basis of the DAT was a DSED written
specifically for the audit which also contained
illustrative documentation, including that pertaining
to the review of the Department's programmes as
part of a wider Business School restructuring of its
undergraduate provision and the development of
the BA (Hons) Events Management.
182 Programme specifications for the Marketing,
International Marketing, Marketing
Communications, and Tourism Management
programmes were appended to the DSED. Although
these did not make specific links to the FHEQ or to
subject benchmark statements, the DSED separately
stated that learning outcomes had been mapped
against these reference points as well as PSRB
requirements. The DSED also explained how
assessment and placement procedures were
consistent with relevant sections of the Code of
practice. The audit team gained confirmation of the
approach to the use of external reference points in
its meetings with departmental staff.
183 The Department uses the dataset supplied by
PSU to measure progression and completion rates.
Achievement is monitored through SAP reports and
PABs, which are used to inform the action plans
included in programme AMRs, subsequently feeding
into the School's ARPD. The latter showed the audit
team that data were being used to support
decision-making and, as an example, a retention
problem had been identified, which was being
addressed within the framework of the SEI.
184 Internal monitoring and review is undertaken at
course, programme and departmental levels. As an
illustration, the audit team was provided with one
programme AMR, a departmental report and the
School's 2003 ARPD. The programme AMR included
a five-point action plan for the coming year and a
description of the outcomes resulting from the action
plan of the previous year. The departmental report
was informed by student evaluation and external
examiners' comments and had been scrutinised by
the School Teaching and Learning Committee. All
programmes in the Business School are subject to
review after a minimum of five years, but may be
reviewed earlier on the basis of a risk assessment
exercise. The most recent review of undergraduate
programmes was completed in 2002 as part of a
school-wide restructuring of provision and this
resulted in a number of changes to improve the
student learning experience, student support and
retention. On the basis of the evidence available, the
audit team was satisfied that monitoring and review
processes were working as intended.
185 The Department produces action plans in
response to reports from external examiners which
the Head of Department subsequently confirms in
writing to the related examiners. Recent examples of
external examiner reports and response letters were
seen by the audit team and these indicated that
most key issues were being addressed satisfactorily.
Departmental staff, in their discussion with the
team, stressed that they maintained a dialogue with
their external examiners and emphasised that this
was a crucial element in the process of responding
to their comments. The departmental report also
showed a comprehensive listing of responses to
issues raised by external examiners. Comments had mostly been complimentary about teaching
quality and the overall standard of student work.
However, a continuing problem of lower
performance students producing work showing
inadequate analytical and critical content was being
addressed in the Department by additional student
support measures. In general, the team was satisfied
that appropriate and timely responses were being
made to external examiner reports.
186 The Business School assessment policy, along
with all school assessment policies, is currently being
reviewed by LQC for consistency with the Code of
practice. The audit team noted that local assessment
regulations were embedded within the policy which
it considered might cause confusion about their
status. Nevertheless, students who met with the
team confirmed that, on the whole, they were clear
about the assessment requirements and that the
feedback they received was generally helpful and
timely, although they also drew attention to some
inconsistent practice in the granting of extensions to
submission deadlines.
187 The audit team reviewed a range of assessed
student work from all levels of the programmes,
including both examination scripts and coursework.
Assignment briefs provided clear assessment and
grading criteria, although they did not always
indicate which learning outcomes were being
assessed. There was evidence of internal moderation
and double-marking being undertaken. Feedback to
students was good on the whole, although there
were examples of sparse as well as comprehensive
feedback in the range examined by the team.
Overall, the team found the standard of student
achievement to be appropriate to the titles of the
awards and their location within the FHEQ.
188 The audit team was provided with a selection of
programme and course handbooks. The programme
handbooks were produced according to a standard
format and, in most respects, provided students with
the range of information needed to understand the
requirements of the programme, although there was
no reference to the existence of the programme
specification. There were, however, helpful sections on
study skills, plagiarism, placement and the availability
of support. Reproduced in each programme
handbook was an extract from the Business School
Assessment Regulations giving the general
requirements for grading, failure and reassessment,
although there was no reference made to the
Academic Regulations (for taught courses), which is
the overarching source of regulatory information.
Course handbooks were found to be comprehensive,
with learning and assessment expectations and
student responsibilities clearly set out.
189 The Department had experienced some problems
in filling staff vacancies, leading to a worsening of the
student/staff ratio and increased reliance on visiting
lecturers and electronic methods of course delivery. In
relation to other learning resources, students told the
audit team of problems encountered with computer
availability, although departmental staff assured the
team that this difficulty had now been overcome. A
recent SSS had also raised concerns over book
provision, although the team found no evidence that
this was a significant problem.
190 Student support mechanisms have been modified
in response to the SEI, with the introduction of a two-week induction period, new courses in business skills
and extra study skills tuition, in addition to general
improvements to the personal tutor system. The DSED
indicated that these changes had led to a positive
impact on student performance and retention.
Students who met with the audit team confirmed that
the changes to the personal tutor system had been
well received. They also highlighted the useful
information they obtained from the Department
about placements and from guest speakers about
career opportunities, although they were less positive
about information given on PSRB exemptions which
staff acknowledged could be better promoted in
programme handbooks.
191 Students reported that they were able to provide
feedback to staff through the SSS and through
special feedback sheets at the end of each course.
Although some students said they did not hear the
outcomes from this feedback, others confirmed that
they had been told of resultant changes.
192 There are staff-student committees operating at
the levels of school, programme, programme cluster
and stage, and, in addition, a campus committee.
Among the issues raised recently have been
computer and printer availability problems, access to
student counsellors, accommodation and
assessment. There had also been positive feedback
about course delivery and personal tutors. However,
it was clear to the audit team from minutes and
comments from student representatives that
programme meetings were not always being held as
frequently as intended, thus limiting the
opportunities for students to be involved in the
quality management of their programmes.
193 Overall, the audit team was satisfied that:
- the standard of student achievement in the programmes covered by the DAT is appropriate to the titles of the awards and their location within the FHEQ;
- the quality of the learning opportunities is suitable for the programmes of study in marketing leading to the named awards.
Section 4: The audit investigations:
published information
The students' experience of published
information and other information available
to them
194 The University publishes a range of core
publicity and information material in support of
student recruitment, marketing and induction:
prospectuses, covering undergraduate, postgraduate
and part-time programmes, as well as a specific
School of Health and Social Care prospectus;
promotional literature targeted at particular student
groups; various guides; and 'joining instructions' for
new students. These are the responsibility of the
publications and web teams within the Marketing
Office, which additionally deals with the University's
advertising, in conjunction with an external agency.
195 Published information also falls within the remit
of the Office of Student Affairs whose publications
produced in printed, CDROM and internet-based
formats, are primarily aimed at current students, and
include the Guide for New Students; the Skills for
Learning Handbook; Rules, Regulations and Policies:
a Student Guide; as well as the University Calendar.
Supplementing this material is the information
produced by schools about their courses and
programmes in the form of individual course
documents or bound within programme handbooks.
196 The SED stated that core publications were under
constant review in terms of content, design and value
to the end user, explaining that the Marketing Office
used focus group techniques, involving current and
potential students, as well as internal users, for
evaluating promotional literature. Specific checks
were made for compliance with equal opportunities
requirements and expectations as to the use of 'plain
English', while responsibility for accuracy of content
was shared with the relevant schools and central
services. The SED also referred to the strict editing
and corporate style guidelines that bound core
publications, but acknowledged 'a need to further
integrate the publications produced by schools and
their use of multi-media to maintain consistency of
brand, style and image', adding that this was being
addressed by issuing templates to assist schools and
support services in developing literature.
197 The SWS stated that the prospectus and web site
information was, in general, comprehensive, accurate
and reliable, although not completely free of
academic jargon. It also indicated that some students
found the web site to be overloaded with information
and sometimes challenging to navigate. Overseas and
mature students pointed to some deficiencies in the
joining instructions they had received, but their
general experience was that discussions with staff
resolved most issues. Students who met with the
audit team expressed similar views.
198 To an extent, the audit team identified with
students' perceptions of the web site, the Guide for
New Students (2004-05) providing a case in point.
While accepting that the instructions on how to use this
Guide make clear that its content can be read on a
'need to know' basis, the team considered that its
compilation was somewhat counter-intuitive if read
section by section. For example, the link to the topic
'Getting Your Voice Heard' begins with information on
studying and working abroad; this is followed by a
brief half-page summary of the student representation
system, and further into the document by much
useful information for different student groups on
obtaining relevant individual advice, although not on
how to access representative or interest groups as a
means of 'getting their voice heard'. Similarly the
topic 'About Being a Student' opens with information
on combined honours, with no equivalent for
students on other programmes or links to such
information. The team recognised that the Guide was located on a central web site, and also that the University was addressing the issue of linking central and school-produced information (see paragraph 122 above); the University might therefore wish to consider these observations in that wider context.
SUUG, with LQO support, was intending to produce
a student representatives' handbook (see paragraph
90 above) which would expand on the summary
information currently available.
199 Through the DATs the audit team was able to
verify that students generally perceived the
information they received at course and programme
levels to be comprehensive and helpful, although
there were variations between schools, which the
SWS had highlighted, and, to a lesser extent, between
programmes within schools. The team has already
indicated its concerns that such variations are to the
potential disadvantage of combined honours
students, particularly those whose programmes cross
schools (see paragraph 78 above). However, the team
found that internal monitoring systems were effective
in providing feedback on students' experience of the
published information made available to them, and
that the University was accordingly reviewing its
provision of information, notably the web site, to
improve both layout and content.
Reliability, accuracy and completeness of
published information
200 The SED outlined how the University intended
to comply with the requirements of HEFCE's
document, Information on quality and standards in
higher education: Final guidance (HEFCE 03/51). In
respect of the qualitative summaries for publication
on the TQI web site, the QASC working group on
the external examiner system (see paragraph 66
above) had revised the external examiner report
form to enable direct submission of the necessary
information, while the LQO, in conjunction with
CMS, was developing a mechanism for receiving the
reports electronically that also permitted analysis
and uploading of summaries onto the TQI web site.
Similarly, the University had devised, for use by
SQAOs, a template and guidance for producing
summary output relating to programme approval
and review, to meet the requirements of HEFCE
03/51 for periodic review summaries. The LQO had
been given the task of listing employer links, and
together with the other aspects of the qualitative
information set, would monitor, maintain and
annually update the University's entry on TQI. The
quantitative information set was to be supplied by
the Higher Education Statistics Agency (HESA). On
this basis, the University expressed confidence that it
would fully meet its obligations under HEFCE 03/51.
201 From its study of documentation and its discussions with staff, the audit team was able to confirm that the University was working within the framework and timescale set out in HEFCE 03/51 for publishing information on the TQI web site. The team considered that existing and newly devised approaches appeared to provide suitable vehicles for reporting the requisite summaries, including the summary of the University Learning and Teaching Strategy. Overall, the team concluded that reliance could reasonably be placed on the accuracy and completeness of the University's published information.
Findings
202 An institutional audit of the University was
undertaken by a team of auditors from the Agency
during the week 14 to 18 June 2004. The purpose
of the audit was to provide public information on
the quality of the University's programmes of study
and on the academic standards of its awards. As
part of the audit process, according to protocols
agreed with HEFCE, SCOP and UUK, four DATs were
selected for scrutiny. This section of the report
summarises the findings of the audit. It concludes by
identifying features of good practice that emerged
from the audit, and by making recommendations to
the University for enhancing current practice.
The effectiveness of institutional procedures
for assuring the quality of programmes
The quality assurance framework
203 In the 2001 academic restructuring, schools
were given responsibility for both the delivery of
academic quality and for the management of quality
assurance. These responsibilities are specifically
vested in heads of school who through their
membership of Academic Council and the Executive
Committee (respectively the senior deliberative and
managerial committees) provide the linkage between
deliberative policy and management, and between
school and centre. The quality assurance remit
devolved to schools includes programme approval,
monitoring and review; the production of the ARPD;
management of the assessment cycle, including
arrangements for SAPs and PABs; appointment of
external examiners and responses to their reports;
preparation for school and departmental reviews;
and the maintenance of links with PSRBs.
204 Central oversight of the quality assurance
system is maintained through LQC, the LQO and a
regulatory framework within which schools must
operate. A range of documents contributes to this
framework, including the Quality Assurance
Handbook which contains procedures and guidance
for approval, monitoring and review. The PVC
(Academic Planning), who has central supervisory
responsibility for quality and standards, chairs LQC.
Central administrative support for the University's
quality assurance system is provided primarily
through the LQO, headed by the UDLQ, who also
chairs both QASC and QESC.
Programme approval
205 The approval of new programmes or combined
honours packages requires their prior authorisation to
establish the academic and business case, to ensure
that adequate resources will be available in the host
school and to trigger preparations for marketing and
recruitment. The normal cycle is based on an 18-month lead time from authorisation to first intake,
although there are fast-track procedures to allow a
rapid response to changes in market demand.
Following authorisation, programmes, packages and
their constituent courses are developed in detail and
submitted for approval on the basis of the school's
chosen arrangements, although all programme
approvals must involve someone external to the
school. Outcomes are reported upwards through the
committee system to school boards and, where
applicable, to the Combined Honours Committee.
The process for substantial revision to existing
programmes, known as review, is essentially the same
as for approval, except that it includes a critical
appraisal of the programme, informed by current and
former students' views. In the context of programme
approval/review, distance delivery via e-learning
without external partner involvement is treated as
internal provision.
206 In the SED, the University acknowledged the
high volume of authorisations being requested via fast-tracking, which it was monitoring. It viewed the
approval process as being transparent and effective,
producing clear and accessible reports that provided
the basis for monitoring compliance with procedures
and for identifying any concerns. It also indicated that
so far schools had mostly chosen to retain the
traditional event-driven process, although they were
increasingly tuning events to the nature of the approval
and the extent of perceived risk. In addition, the SED
pointed to the rising proportion of campus-based
students having direct access to, or the opportunity for, e-learning as an illustration of its commitment to
FDL through a 'blended learning' approach.
207 The audit team considered that the University
was taking suitable steps to limit the incidence of
fast-tracking programme authorisation. In general, it
found that the procedures for programme approval
and review being operated by schools were
essentially a continuation of the comprehensive 'old'
process. It concluded that they were thorough and
effectively managed, with the LQO and QASC
having appropriate institutional oversight. The team
further noted that the approval procedure for e-learning programmes allowed for LQO involvement,
which in practice meant additional stringency in the
quality assurance arrangements because of the
higher risk factor attached to 'innovative' provision.
Annual monitoring
208 The principal vehicle for monitoring academic
provision is the ARPD which is produced in a series
of sections under specified headings. Feeding into
the quality and standards section are departmental
reports and programme AMRs, which are prepared
during the autumn and based on analysis of
statistical data, student feedback, reports from SAPs
and PABs, and external examiner reports. Schools
have considerable discretion over the frequency and
timing of monitoring for individual courses and
programmes; there is no requirement for every
programme to be monitored each year.
209 Schools submit their ARPDs to the VCG in
January and, subsequently, these are separated into
their component sections which are distributed to
the appropriate committees of Academic Council
and the Executive Committee, with the quality and
standards section being considered by LQC. The
parent committees later receive summary reports
from their respective subcommittees. This iterative
reporting process takes place in February and
March, resulting in formal feedback to school boards
in March and April.
210 In the SED, the University referred to the ARPD
as the main innovation of the revised quality
assurance system, implemented to provide a more
focused information flow from schools to the centre.
It added that schools had initially been unable to
achieve the brevity of reporting envisaged, although
refinement of the process had led to reports that
more effectively met institutional requirements.
211 It became apparent to the audit team that the
University had responded to the expansion of the
ARPD by schools by developing an efficient
mechanism for evaluating the information and giving
feedback within a relatively short timescale. This, in
the team's view, had preserved for schools the
coherence of progress reporting and action planning.
Like the University, it saw the portfolio planning
section as a significant tool. There is good practice in
the holistic approach to reporting and planning
through the ARPD, combining in a single process and
a single document both the academic quality and
standards and human and financial resources aspects
of schools' activities, thus providing the University with
a valuable instrument for managing its current and
future portfolio (see 270 i below).
212 However, the audit team doubted whether
some of the sections of the ARPD could provide
sufficiently detailed and comparable information
about the management of quality assurance by
schools because, in producing its overview for the
ARPD, a school had considerable discretion over the
format, content and means of compiling the
underlying information. Specifically in relation to the
quality and standards section, the team considered
that a less tentative and more direct approach from
LQC might be needed for the identification of
common themes (see 271 i below).
Periodic review
213 Complementing the ARPD, the University has
recently introduced school review to provide a
regular check on the operations of a school across a
range of programmes within the broader context of
its management and resources. School review is
conducted by a team established by the LQO that
includes both student and external reviewers, with
external, in this case, being external to the
University (as opposed to just the school). Review
reports are submitted to the Executive Committee
and to LQC, together with an agreed set of actions
and timescales for their completion. LQC establishes
the overall programme for the review cycle and all
eight schools plus DECS have been scheduled for
review over the three-year period 2003 to 2006.
214 In addition to school review, the University
operates occasional reviews at both the level of the
department and the programme. Departmental
review may be triggered by either school or centre
and the process is similar to that for school review.
Programme review, in this context, is primarily a
failsafe for programmes that reach five years after
their initial approval without otherwise having been
formally reviewed, and the process is the same as
that for programmes involving a substantial revision.
Schools are first required to conduct a risk
assessment as the basis of deciding whether to
conduct a programme review and, if so, what its
scope should be.
215 The SED explained that the interaction of the
various review processes, including their relationship
to the ARPD and PSRB review, was being monitored
by the LQO to provide the relevant committees with
the means of evaluating the effectiveness of these
processes and their underlying principle of scrutiny
proportional to risk.
216 The audit team was unable to comment in
detail on the effectiveness of the periodic review
processes since, at the time of the visit, so few
school or departmental reviews had been
completed. With regard to programme review, staff
were mostly insistent that all programmes would be
reviewed within five years, although many were
seemingly unaware of the weight now being placed
by the University on the use of the risk assessment
tool for determining necessity or scope. Also relating
to the application of scrutiny proportional to risk,
the team suggests that the tendency for the
University to link together the concepts of
externality and cross-representation, as in the advice
on external involvement in programme approval,
clouds the important distinction between these
concepts for cases where independent sources
external to the University are paramount. In making
this point, the team acknowledges that external
involvement in periodic review is stipulated as
external to the University.
Feedback from students, graduates and employers
217 Among the inputs to the ARPD and review
processes is feedback from students on the quality
of their programmes. The main method of collecting
student feedback across the University is the SSS,
although students additionally give feedback on
individual courses through locally conducted
questionnaire surveys. The SSS comprises student-determined questions and its results are analysed at
the level of campus, school and department, and in
terms of aspects such as mode and level of study,
gender, age and ethnicity. The full report and an
intended action report are widely disseminated to
students and staff. Other formal routes for students
to comment on their experience are committees
having student representation, such as staff-student
committees, programme committees, school boards
and relevant central committees. In addition, there
is an alumni office which can be used as a channel
for obtaining input from graduates to approval and
review processes, while employers provide feedback
through advisory boards and associated forums, as
well as through less formal or ad hoc arrangements.
218 The SED drew the link between the SSS and the
formulation of University policy, emphasising the
widespread attention given to its output at senior
committees and through the identification of resultant
action in school ARPDs. It acknowledged that at local
level, the system of student representation was
somewhat diffuse and that this might be leading to
insufficient coordination and transparency, although it
added that membership of school boards provided for
a degree of consistency across the institution. The SWS
pointed to a lack of coordination and integration
between the different levels of student representation,
notably between SUUG representatives and school
board representatives. In relation to feedback from
graduates and employers, the SED gave a number of
illustrative examples.
219 In the audit team's view, there is good practice
in the comprehensive SSS, the thorough
consideration of its findings and the well-publicised
and timely feedback of its results to both students
and staff (see 270 ii below). The team saw much
evidence of specific action points in school ARPDs
relating to lower survey ratings, while University
administrative departments and the SUUG were also
using SSS output to monitor their service provision.
220 However, the audit team found that the nature
and degree of student representation at local level
varied between schools, although it recognised that
differences in nomenclature of staff-student
committees might account for some of the diversity.
The team was concerned that students' input into
quality management should not be limited by
factors such as programme committees not
observing the agreed frequency of meetings,
particularly as there was no systematic collection of
survey feedback at programme level. The team
noted there were student vacancies on school
boards and on some institutional committees and,
while accepting that it was difficult to engage
students in higher-level committees because these
were of less immediate relevance to them, it also
viewed the lapse of induction and training for
student representatives, previously provided by
SUUG, as having a likely bearing on the position
(see 272 iii below). In this context the team
regarded the planned production of a student
representatives' handbook, led by SUUG but with
support from LQO, as a practical step in
demonstrating that the University placed value on
students having their voices heard.
221 Notwithstanding the recommendations for
action by the University, the findings of the audit
confirm that broad confidence can be placed in the
University's present and likely future management of
the quality of its programmes.
The effectiveness of institutional procedures
for securing the standards of awards
Assessment policies
222 The Academic Regulations govern assessment
arrangements and external moderation of academic
standards, and any changes to them or exemptions
from them require approval by Academic Council. In
respect of taught awards, the Academic Regulations
define the terms of reference for SAPs, which have
primary responsibility for addressing academic
quality and standards, and for PABs, which deal with
individual student profiles, making decisions about
students' progression and award. Programme-specific
regulations, which also include relevant PSRB
requirements, are published in student handbooks
and schools have their own assessment policies.
223 The SED pointed to several aspects of the
management of assessment that were likely to
change, largely in response to the implementation
of a three-term academic year (rather than two
semesters), and the associated move to 30-credit
(rather than 15-credit) courses as the norm, and to a
single main assessment period at the end of the
academic year (rather than separate periods
following each semester). School assessment policies
were being further developed to explicate their
consistency with the Code of practice and, as part of
this exercise, the harmonisation of some aspects of
assessment across schools was under consideration.
224 The audit team found the development of
schools' assessment policies to be illustrative of the
difficulties that can arise when processes are
devolved and the importance of requirements being
communicated clearly. For instance, schools were
expected to develop policies in a situation where
assessment issues were addressed in several different
documents and within a regulatory framework that
was itself having to keep pace with significant
change. The exercise was now becoming protracted,
with draft policies needing to be revised by schools
to include signposting to other source documents,
delaying key decisions about which aspects of policy
should be divergent and which should be the same
(see 271 ii below).
225 The audit team saw assessment policy as an
area of complexity facing combined honours
students; where their programmes crossed schools
these students had to deal with two assessment
policies. In the team's view, the emphasis on the
relationship with the first-named subject for
combined honours students could work to the
detriment of the students' ability to resolve difficulties
that related to the second school or crossed school
boundaries. In addition, the team noted that there
were three separate PAB arrangements for combined
honours students and that these had been approved
following extensive debate about the parity of
treatment for combined honours students. The team
recognises that the University has paid considerable
attention to standardising the operation of PABs, but
nevertheless encourages it to keep under review the
complexity of arrangements for combined honours
in relation to assessment, progression and support
(see 271 iii below).
Use of statistical information
226 With regard to the use of statistical information
relating to student progression, completion and
achievement, the SED explained that it was an
institutional objective to work from a centrally-driven dataset used by schools to inform the ARPD
process. However, it also acknowledged that the
production of the dataset was driven by external
timescales which did not coincide with those of the
internal annual monitoring cycle. This had been
addressed by schools producing a separate
commentary on identified trends, after having
submitted their main ARPDs.
227 The audit team found that there was
considerable variation in the detail of statistical
analysis between school commentaries and that, with few exceptions, aggregation of the data made it difficult to gain a clear impression of cohort
progression between stages (years) of study.
However, the team also learned that the inclusion of
'student stage' in the dataset was being pursued
centrally and that the data available to school staff
were improving. Therefore, it concluded that the
University was making progress in providing the
necessary tools for comprehensive lower-level
analysis and had itself identified areas for
improvement to which it was giving high priority.
External examiners and their reports
228 The external examiner system operates within
the framework prescribed by the Academic
Regulations, which are supplemented by the notes
of guidance on external examiners, published within
the Quality Assurance Handbook. School boards
approve the appointment of external examiners,
with central oversight being maintained by the
Regulations, Standards and Examinations Office
which sends out appointment letters on behalf of
the University and also offers an induction
programme for new external examiners and a forum
for existing ones. The appointment letter encloses
details of the role of the external examiner, the
arrangements for presenting reports and the general
regulations for assessment. Schools follow up this
letter with a statement of the attendance
requirements for external examiners at SAPs and
PABs and a range of contextual information.
229 Reports from external examiners, which are
produced to a standard template, are received by the
Regulations, Standards and Examinations Office
which circulates them centrally for information and
to schools for response. Schools are required to
provide summaries of action taken in response to
last year's external examiner reports in their ARPDs.
Complementing the annual monitoring cycle within
schools, the LQO has started to conduct an annual
analysis of external examiner reports. The first of
these pertains to the 2002-03 session, when a
procedure was also introduced, requiring a formal
response from a school to the LQO in relation to any
report suggesting a threat to standards. In addition,
the Regulations, Standards and Examinations Office
produces an annual report for Academic Council on
those aspects of external examiner reports relating to
the central regulatory framework.
230 In the SED, the University described its external
examiners as 'a central component in maintaining
and enhancing its standards', and expressed the
belief that it was meeting the Code of practice. It
pointed to changes in the institutional remit of
external examiners to reflect the greater emphasis
nationally on the use of outcome-based benchmarks
to elucidate standards. In addition, the SED gave
examples of how the University had addressed
external examiners' concerns over regulatory matters.
231 In the audit team's view, the University was
making effective use of external examiners in its
summative assessment procedures. The team was
able to verify that external examiner reports were fed
through programme AMRs and departmental reports
into the ARPD process, although there was some
ambiguity over what constituted 'last year' in
reporting action. The revised 'discounting system'
aptly illustrated how the University had made every
effort to appreciate the balance of external examiner
opinion on a central regulatory matter and then
made modifications based on their advice. The team
considered the LQO analysis of external examiner
reports to be a particularly useful innovation, but was
concerned that its full benefit might not be realised
because of the length of time between its initial
presentation to LQC in October and its consideration
by Academic Council the following June.
232 In relation to accreditation as a means of
providing externality for the monitoring of
standards, the audit team noted that the main
channel for upward reporting on PSRB reports was
the quality and standards section of the ARPD and
that there was no other institutional mechanism for
providing an overview. Given the team's views about
the sufficiency of detailed and comparable
information in this section of school ARPDs (see
paragraph 212 above), it saw the risk of rather
cursory attention being given to the subject of PSRB
reports, meaning that issues of broader significance
might not be picked up by LQC at the next stage of
the process (see 272 ii below). In making this point,
the team acknowledges that PSRB reports are sent
to the LQO, which would highlight critical
comments for the attention of LQC.
233 Notwithstanding the recommendations for
action by the University, the findings of the audit
confirm that broad confidence can be placed in the
University's present and likely future management of
the academic standards of its awards.
The effectiveness of institutional procedures
for supporting learning
Learning support resources
234 The University's principal learning support
resources are the responsibility of ILS, formed in
August 2003 to bring together library, computing and
classroom services, as well as support of the University
VLE. The overall level of purchasing for both library
materials and ICT is determined by the University's
annual budgeting procedures. However, there is a
'mixed economy' whereby schools are able to fund
lesser requirements from their own budgets, while the
main infrastructure is provided by ILS. The SSS is the
primary means of determining student feedback on
the delivery of library and computing provision, but in
the latter case benchmarks are also used to evaluate
provision against comparator institutions.
235 The SED stated that planning for the future of
learning support services had 'inevitably been
dominated' by the restructuring over the past three
years. It also stated that operational aims were
related to those of the Learning and Teaching
Strategy. However, the University recognised that
further advances in the use of the VLE platform were
dependent on the availability of additional
resources, and also that its ability to effect significant
improvement in library purchases had been limited.
236 In respect of library provision, the audit team
considered that there were appropriate mechanisms
for collaboration between schools and ILS, as
demonstrated by their working together to achieve
more accurate targeting of purchases. While it was
clearly too soon to reach a firm view about the new
ILS structure, given that it had been so recently
established, the team was supportive of the moves
towards stronger institutional overview and the
'levelling-up' of provision to guard against variability
of provision between campuses.
Academic guidance and personal support
237 The University has a range of mechanisms for
the academic guidance and personal support of
students, starting from induction, then through the
duration of their studies, to preparation for
employment. Once students begin their
programmes, primary responsibility for academic
guidance rests with schools and departments. These
arrangements are complemented by the support
services provided through the central Office of
Student Affairs, which manages a range of welfare
and advisory services delivered locally to students
through campus-based student centres. The Office
also provides assistance with generic study skills to
supplement skills training embedded in the curricula
or provided by schools, and has recently
incorporated a new Research Student Administrative
Office, working closely with schools which have
responsibility for research supervision arrangements.
238 The university-wide SEI was implemented in
2002-03. Envisaged as a series of stages to be applied
gradually in tandem with students' academic
development, it has initially focused on first-year
personal tutoring, stipulating particular forms of
entitlement for students, translated into explicit
standards against which schools have to align their
practices. Student support, including specific
initiatives linked to the SEI, is regularly monitored
through the quality and standards sections of school
ARPDs, with LQC maintaining the overview and QESC
having responsibility for more detailed evaluation.
Supervision arrangements for research students are
approved by RDC, which also has responsibility for
monitoring the quality of their supervision.
239 The SED described the impact of the SEI as
'significant', in that all schools, even those with a good
track record in academic guidance, had to review their
procedures to ensure they were meeting its
requirements. While the SED acknowledged that some
schools were facing difficulties in meeting SEI
standards, it cited the SSS as providing evidence of
high student satisfaction with personal tutoring
arrangements. In relation to pastoral support, the SED
gave a number of illustrative examples from subject
review and PSRB reports, commending the University's
arrangements in a variety of contexts, as well as
comments from its internal feedback mechanisms.
240 The audit team found that first-year students typically had group tutorial sessions on a fortnightly basis, indicating that SEI standards were being
achieved. However, combined honours students
reported difficulties, mostly relating to the second
subject of their combined award for which they had
no formal personal tutor support. The team considered
this to be an example of how such students were
potentially disadvantaged by having only an indirect
route into the second school (via the combined
honours adviser in their first school) and no
independent channel for resolving cross-school
difficulties. The team also considered the PASH,
provided by the Office of Student Affairs, to be a
valuable resource for personal tutors, acting in an
academic, professional or pastoral capacity. There is
good practice in the SEI, stage one of which has been
successfully implemented university-wide to strengthen
the personal tutor system (see 270 iii below).
241 The audit team found postgraduate students to
be generally positive about their supervision
arrangements, but it became apparent that there
was no training requirement for students with
teaching or demonstrating duties, nor any formal arrangements for monitoring their teaching.
Although it heard several examples from staff of
skills training for research students in different
schools, in the team's view this did not represent a
systematic approach (see 271 iv below).
242 The audit team concluded that, overall, the
University was building on its recognised strengths
in the area of student support and guidance and
maintaining an institutional overview of the
devolved arrangements within schools. The team
considered arrangements for pastoral support to be
responsive to diverse student needs, and consonant
with the University's policies and strategies.
Teaching staff appointment, appraisal and reward
243 The Regulations for the Appointment and
Promotion of Staff and the accompanying staff
Recruitment and Selection Procedures set out the
University's processes for staff appointment, appraisal
and reward. Accordingly, all established teaching posts
have both a job specification and a person
specification, produced to a standard format;
appointments are made by selection panels whose
members must have undertaken training in
recruitment and selection; and newly-appointed
teaching staff undergo a 12-month probationary
period, entailing observation of their teaching, with
reports sent to the central Personnel Office after six and
12 months' service. For new staff without a recognised
qualification or three years' experience in teaching,
successful completion of the probationary period is also
dependent on making satisfactory progress on either a
PGCE PCET or a PgDip Higher Education.
244 There are University guidelines for the award of
merit-based increments which the SED explained
had been used particularly to reward junior
members of staff who obtained postgraduate
qualifications in the early years of their employment.
The University also explicitly links teaching quality to
reward through its scheme for special promotions to
PLT posts. To deal with the fact that the University is
located in areas of expensive housing where
competition for staff is severe, market premium
payments have been introduced within the scope of
the HR Strategy. Schools report specifically on
recruitment and retention issues, appraisal and PLT
posts through the staffing section of their ARPDs,
which are reviewed centrally by both the Executive
Committee and the Staff Development Focus Group.
245 The SED indicated, through a 2003 progress
report from the Director of Personnel, that there
were measurable improvements in staff recruitment
and retention, although the proportion of younger
teaching staff had remained static over recent years.
Regarding appraisal, the SED acknowledged that the
annual participation rate was currently 10 per cent
behind target, giving as an explanation the lower
levels of participation among manual staff, coupled
with the impact of restructuring.
246 From its study of school ARPDs, the audit team
was able to verify that the market premium
payments scheme was delivering results in some
schools, but also to see the variability across schools
in relation to appraisal reporting, appraisal practice
and the impact of appraisal on staff development
planning. In particular, there was no consistent
reporting of the percentage of staff covered by
appraisal, but some schools were reporting appraisal
rates for teaching staff significantly below target (see
272 iv below). However, the team recognised that
the Staff Development Focus Group was keeping a
watching brief on the appraisal participation rate
and, in general, considered the University to be
maintaining an appropriate institutional overview of
its procedures for the appointment, appraisal and
reward of teaching staff.
Staff support and development
247 The core of the University's framework for staff
support and development is the institutional Staff
Development Policy and the staff development
aspects of the HR and Learning and Teaching
Strategies. This framework is supported by a number
of associated institutional policies, including
Nurturing Staff and Mentoring of New Staff. At
institutional level, the mechanisms for monitoring
implementation of this framework include progress-reporting against the HR Strategy, review by the
Executive Committee through the ARPD process,
and the exploration of common themes by the Staff
Development Focus Group.
248 According to the SED, priorities up to 2002
were orientated towards training; they included
developing schemes relating to the PgCE/PgDip for
new, inexperienced staff, and training staff in new
technologies. More recently, the emphasis in the
institutional Learning and Teaching Strategy on
schools building up their own internal staff
development expertise has led to an increased focus
on expanding the PLT role within schools into a
corporate resource for promoting innovation in
learning and teaching.
249 The audit team was unable to gain any strong
impression of the linkage between institutional
strategies and staff development at school level; in
particular, it appeared to the team that most schools
were not capitalising on the PLT resource and that
PLTs were not perceived by staff as a coherent
group. The team found the level of detail on budget
allocation and expenditure for staff development to
vary between school ARPDs, although the same
point had been identified by the Staff Development
Focus Group. The team also found both full and
part-time staff to be positive about the development
opportunities available to them both internally and
externally, and learned of many examples of staff
development undertaken on an individual basis.
Induction and mentoring arrangements for new staff
were also in operation. However, little reference was
made in ARPDs to the training and development of
research student supervisors although staff provided
some individual examples of training and support
mechanisms for inexperienced supervisors available
at a local level (see 272 v below). Overall, the team
is supportive of the University's commitment to
achieve greater integration in its approach to staff
development and acknowledges the progress made
towards adjusting the balance between the
respective responsibilities of school and centre.
Outcomes of discipline audit trails
Chemical sciences
250 The scope of the DAT included provision in the
Department of Chemical Sciences, within the School
of Sciences, focusing on the BSc (Hons) programmes
in Chemistry; Analytical Chemistry; Pharmaceutical
Chemistry; the MSc programmes in Pharmaceutical
Sciences; and Industrial Pharmaceutical Sciences; and
the HNC Chemistry. Programme specifications set out
appropriate learning outcomes, linking these clearly
to teaching, learning and assessment, with reference
made to relevant subject benchmark statements. The
Department is concerned over the progression and
completion rates of some undergraduates and has
adopted a range of strategies to address the problem.
From its study of the students' assessed work and
from its discussions with staff and students, the audit
team found the standard of student achievement to
be appropriate to the titles of the awards and their
location within the FHEQ.
251 Student evaluation of the provision was largely
positive, and students were satisfied with both the
nature and extent of support they received from
staff and the learning resources placed at their
disposal. Students confirmed that feedback from
staff was both helpful and timely and praised all
aspects of support and guidance, both academic
and pastoral. The audit team found the quality of
learning opportunities available to students to be
suitable for the programmes of study leading to the
awards covered by the DAT.
Mathematics and statistics
252 The scope of the DAT included provision in the Department of Mathematical Sciences, within the School of CMS, focusing on the BSc (Hons) programmes in Mathematics; Statistics; Mathematics, Statistics and Computing; and Decision Science. Programme specifications set out appropriate learning outcomes and link these clearly to teaching, learning and assessment, with reference made to the Subject benchmark statement for mathematics, statistics and operational research. From its study of the students' assessed work and from its discussions with staff and students, the audit team found the standard of student achievement to be appropriate to the titles of the awards and their location within the FHEQ.
253 Student evaluation of the provision was largely positive and students were satisfied with both the nature and extent of support they received from staff and the learning resources placed at their disposal. Students particularly praised the on-line resources available to them. The audit team found the quality of learning opportunities available to students to be suitable for the programmes of study leading to the awards covered by the DAT.
Law
254 The scope of the DAT included provision in the Department of Law, within the School of Humanities, focusing mainly on the LLB (Hons) programme. Programme specifications set out appropriate learning outcomes and link these clearly to teaching, learning and assessment, with reference made to the Subject benchmark statement for law. The 2003 departmental review identified a range of interrelated factors leading to low progression and retention rates, particularly among first-year students, and various countermeasures have been introduced from the 2003-04 session. From its study of the students' assessed work and from its discussions with staff and students, the audit team found the standard of student achievement to be appropriate to the titles of the awards and their location within the FHEQ.
255 Student evaluation of the provision was largely positive and students were generally satisfied with both the nature and extent of support they received from staff and the learning resources placed at their disposal. Students were particularly complimentary about the helpfulness and approachability of staff and the optional extra-curricular activity provided to support and enhance their learning. The audit team found the quality of learning opportunities available to students to be suitable for the programmes of study leading to the awards covered by the DAT.
Marketing
256 The scope of the DAT included provision in the Department of Marketing and Operations Management within the Business School, focusing on the BA (Hons) programmes in Marketing; International Marketing; Marketing Communications; Tourism Management; Arts, Gallery and Heritage Management; and Events Management. Programme specifications set out appropriate learning outcomes and link these clearly to teaching, learning and assessment, with reference made to the Subject benchmark statement for general business and management. Assessment techniques were varied, related to the vocational areas and used clearly articulated criteria. From its study of the students' assessed work and from its discussions with staff and students, the audit team found the standard of student achievement to be appropriate to the titles of the awards and their location within the FHEQ.
257 Student evaluation of the provision was largely positive and students were generally satisfied with both the nature and extent of support they received from staff and the learning resources placed at their disposal. Students confirmed that changes to the personal tutor system, as part of the SEI, had been well received and were also complimentary about course delivery. The audit team found the quality of learning opportunities available to students to be suitable for the programmes of study leading to the awards covered by the DAT.
The use made by the institution of the
Academic Infrastructure
258 The SED outlined the University's approach to
the Code of practice as having been to amend its
own procedures, where deemed appropriate, as
each section of the Code was released. The sections
relating to mainstream quality assurance were
initially dealt with by a central quality assurance
committee, while those relating more directly to the
student experience were dealt with by the
appropriate administrative offices. QASC has since
taken on the mainstream quality assurance brief and
has this session organised 'spot audits', through the
LQO, to ensure continued alignment with the Code.
259 The University took a similar approach to the
FHEQ, formally adopting the qualification
descriptors in March 2002, following a review
comparing these with the level descriptors it had
previously been using in the design of programmes.
The SED referred to subject benchmark statements
as 'a useful tool' for programme teams when
checking curriculum coverage, adding that as part
of the approval process, external advisers were asked
to comment on curriculum content against subject
benchmark statements. The recently revised report
form asks external examiners for confirmation of the
appropriateness of standards in relation to national
reference points.
260 Since September 2002, the production of
programme specifications has been integrated with
programme approval and review procedures,
although the University also imposed a deadline of
January 2004 for the implementation of programme
specifications, applicable to all courses, including
those not due for review. The SED indicated that the
University had historically used outcomes
terminology in describing its provision and that the
main developments had been a more explicit
mapping of skills acquisition within the curriculum
and a stronger identification of the relationship
between learning outcomes and teaching, learning
and assessment practices.
261 The audit team was satisfied that initial
alignment with the Code of practice had been
achieved and that the University had in place a
suitable mechanism through QASC for dealing with
revisions. However, the team was unable to identify
any systematic process for monitoring consistency
over time and was concerned that, within a highly
devolved structure where there was considerable
flexibility over the form of processes and the format
of documentation, 'spot audits' would not be
sufficient to inhibit any drift in practices from their
aligned position (see 272 i below).
262 The above observations notwithstanding, the
audit team considered that the University's approach
to reference points demonstrated that it appreciated
their purposes and was reflecting on its own
practices in relevant areas. The team was able to
verify that subject staff were considering programme
outcomes in terms of subject benchmark statements
and level/qualification descriptors. It was able to
track the progress made by schools in meeting the
deadline for complete publication of programme
specifications and the firm line taken by LQC to keep
schools on target. The team also noted the efforts of
QASC and the LQO in encouraging approval panels
to pay closer attention to programme specifications
and in reducing the variability in format. In the
team's view the University's approach to programme
specifications provided a good example of schools
and the centre successfully working together within
clear areas of responsibility.
The utility of the SED as an illustration of the
institution's capacity to reflect upon its own
strengths and limitations, and to act on
these to enhance quality and standards
263 The SED provided a comprehensive account of
the quality assurance developments since the 1999
audit and a description of the framework for
managing quality and standards, following the 2001
academic restructuring, which gave schools dual
responsibility for the delivery of academic quality
and the management of quality assurance.
264 An important theme running through the SED
was the means by which the strengthened internal
managerial framework in schools was leading to a
stronger sense of ownership of quality assurance.
The SED identified key posts within schools that
were instrumental through membership on central
committees in assuring university-wide procedural
consistency and rigour, and in enhancing teaching
and learning and the student experience. However,
these links between school and centre are still
evolving and the audit team spent much time
elucidating how relevant committees and
administrative offices carried out their responsibilities
for central oversight of quality management within a
highly devolved quality assurance system.
265 Overall, the audit team found the SED to be a
clear and well-structured submission demonstrating
the University's capacity to reflect on its strengths
and limitations in relation to significant aspects of its
provision, drawing attention to good practice as
well as areas for improvement. The linkages drawn
between University strategies and policies helped
the team to gain ready insight into the University's
ongoing achievements and the challenges it faces,
and also to ask appropriate questions to test
whether institutional procedures were gaining wider
ownership and working in practice.
Commentary on the institution's intentions
for the enhancement of quality and standards
266 The University's current Learning and Teaching
Strategy (2002-03 to 2004-05) is still in the course
of implementation and its plans for future
enhancement beyond 2005 are therefore necessarily
limited. In terms of institutional change having
implications for enhancement, the SED highlighted
the adoption of 30-credit courses, emphasising
depth rather than just choice, and the benefits of
the SEI, which was about to be extended to the next
stage, with student self-development as the
overarching theme. However, in the SED, the
University explained that it was increasingly acting
as a facilitator for school-level staff rather than
attempting to advance a central enhancement
agenda. It discussed enhancement in the context of
having developed mechanisms to resource and
manage change effectively, with QESC and school
LQCs acting as the respective focal points at central
and local levels for the increased targeting of
resources to school-based staff, and for sharing good
practice across schools and departments.
267 The audit team considered that, while cross-representation on committees and the functional
relationships between central and school-based roles
were providing mechanisms for exchanging good
practice, there were cases where the approach
appeared to be somewhat reactive in that schools
were independently formulating policies and sharing
their respective practice afterwards. Seemingly, the
PLT role provided an example where there was no
designated forum for networking and the team
concluded that PLTs were not currently functioning
as a corporate resource, driving forward the
institutional Learning and Teaching Strategy. These
observations, notwithstanding, the team recognises
the University's considerable achievement, within a
relatively short time, of establishing mechanisms
within schools for the effective utilisation of
resources for enhancement.
Reliability of information
268 The SED outlined how the University intended to
comply with the requirements of HEFCE 03/51 for the
publication of qualitative and quantitative information
sets on the TQI web site, and the audit team found
that the University was working towards meeting its
responsibilities in this area within the proposed
timescale. The external examiner report form had
been revised to enable direct submission of the
necessary information, and a template plus guidance had been devised for school use in producing summary output
relating to programme approval and review. The
listing of employer links was envisaged to be
prepared without difficulty, while the quantitative
information set would be supplied by HESA.
269 The audit team was able to review a variety of
University and school publications as well as the
University web site, and discuss their effectiveness
with different student groups. In general, students
stated that the information they received was
accurate, reliable, comprehensive and helpful,
although they indicated that there were variations
between schools and that the web site was
sometimes difficult to navigate. To an extent the
team identified with students' perceptions of the
web site, but recognised that the University was
addressing the integration of central and school-produced information. The team found that internal
monitoring systems were effective in providing
feedback on students' experience of the published
information made available to them and that the
University was accordingly reviewing its provision of
information, notably the web site, to improve both
layout and content.
Features of good practice
270 The following features of good practice were
noted:
i the holistic approach to reporting and planning
through the ARPD, combining in a single
process and a single document both the
academic quality and standards, and human and
financial resources aspects of schools' activities,
thus providing the University with a valuable
instrument for managing its current and future
portfolio (paragraph 54);
ii the comprehensive SSS, the thorough consideration of its findings and the well-publicised and timely feedback of its results to
both students and staff (paragraph 95);
iii the SEI, stage 1 of which has been successfully
implemented university-wide to strengthen the
personal tutor system (paragraph 131).
Recommendations for action
271 Recommendations for action that is advisable:
i to provide schools with more explicit guidance
on the expectations for reporting on matters
relating to the quality assurance of provision
through the ARPD, in order to improve
consistency and comprehensiveness and thereby
to make the ARPD a more effective channel for
institutional oversight within the University's
framework for managing quality and standards
(paragraph 56);
ii in the interests of improving transparency in the
information provided to students, to expedite
the process of determining those aspects of
assessment policy that should be universally
applicable and either incorporated in the
Academic Regulations (for taught awards), or
standardised across schools' assessment policies
(paragraph 77);
iii to strengthen arrangements for ensuring parity
of treatment for combined honours students
whose programmes cross schools with those
whose programmes operate within a single
school, given the scope for variation in the
content of school policies and the format of
documentation given to students, together with
the system of allocating personal tutor support
solely on the basis of the school responsible for
the first-named subject of a combined
programme of study (paragraph 78);
iv to review the provision of skills-training for
research students and, in particular, to establish
a training requirement for those students
involved in teaching or demonstrating activities,
together with a mechanism for subsequent
monitoring and support (paragraph 133).
272 Recommendations for action that is desirable:
i to make explicit the University's approach to
maintaining consistency of its procedures with
the Code of practice, including how central and
local responsibilities are to be distributed
(paragraph 76);
ii given the importance placed by the University
on accreditation by PSRBs in providing
externality to the monitoring of its standards,
to ensure that PSRB reports are routinely
considered centrally for the purpose of
identifying generic issues, emerging themes or
good practice (paragraph 83);
iii to give greater priority to promoting the
involvement of students in quality management,
including working more cooperatively with the
SUUG to reinstate training for student
representatives and encouraging all schools to
adhere to regular meeting schedules
(paragraph 90);
iv to take the necessary steps to ensure full
implementation of teaching staff appraisal,
particularly given its linkage through staff
development to delivering both the HR and
Learning and Teaching institutional strategies
(paragraph 106);
v to provide more systematic training and
continuing staff development for research
supervisors (paragraph 112).
Appendix
The University of Greenwich's response to the audit report
The University welcomes the outcome of the institutional audit. It confirms that there is broad confidence in
the soundness of the University's management, particularly in relation to our academic programmes, and in
the standards of our awards.
In particular, the University welcomes the audit team's praise of the holistic approach we take to reporting
and planning, through the Annual Reporting and Planning Report produced by each School, the
comprehensive Student Satisfaction Survey and the Student Experience Initiative. We also note the team's
positive comments on our devolved quality assurance structure and the numerous references in the report to
aspects of good practice at institution, School and programme level.
Where the audit team have identified the need for further improvement, the University will consider these
carefully and take appropriate action.