Quality Assurance within Higher
Education Institutions
Professor Phil Cardew
Pro Vice Chancellor (Academic)
London South Bank University
Objectives of the Day
• To establish the concepts of ‘standards’ and ‘quality’
and their place within a higher education institution.
• To consider core processes of benchmarking, reporting
and review needed to manage quality and standards.
• To discuss the relationship between management
processes and resourcing and quality assurance
systems.
• To consider the interrelationship between internal and
external assurance systems.
• To consider reporting mechanisms, and risk assessment
and management from a management perspective.
Agenda
0930: Welcome, introductions, discussion of core concepts.
1000: Quality assurance at programme level: establishing and assessing to a standard.
1100: Coffee Break
1130: Validation, monitoring and review: process and reporting.
1230: Lunch
1300: Student engagement: feedback and representation.
1400: Using reports, risk assessment and management.
1430: Plenary Discussion
Quality assurance at programme
level: establishing and assessing to a
standard.
Standards and Quality
• What is ‘a standard’?
– Threshold attainment
– Levels of achievement
– Benchmarking and equity
• What is ‘quality’?
– Customer service models
– Enhancement
The Building Blocks of degree awards
– working with institutional variation:
• ‘One size fits all’ approaches.
• Establishing ‘labels’ – understanding
structures.
• Common ‘labels’:
– Programmes and courses
– Frameworks and Pathways
– Modules and Units
• Variability of approach
• Modification
Structures of delivery
• Full-time and part-time
• Single honours and combined honours
• Distance, blended and distributed delivery
• Delivery by partner institutions
• Multi-site delivery
– Comparability
– Assessment of standards
• Research degrees and learning contracts
• Relationship to academic regulations
How do we establish a standard at
programme level?
• National Qualifications Frameworks
• Subject benchmark statements
• Professional body requirements
• Employer requirements
• External examiners
– Academic
– Professional
Standards, levels, awards
• Award outcomes – graduate attributes
• Levels within an award
• Exit qualifications
• Assessment
– Type
– Variation
– Loading
• Embedded, dual and articulated awards.
Employability
• ‘Professional’ and ‘Academic’ qualifications
• Subject knowledge, technical ability, specialist
skills, core skills
• Currency of knowledge
• Work-based learning
Conclusions
• No ‘one size fits all’ or ‘standard’ model
• However – especially in the early stages – a consistent approach pays dividends
• Important to establish an ‘outcomes based’
approach
• Clear understanding of overall learning outcomes
• Clear understanding of level and progression
• Clear assessment strategy
Coffee Break
Validation, monitoring and
review: process and reporting.
Basic Questions
• What are we trying to do? – PURPOSES
• Why are we doing it? – REASON
• How are we going to do it? – METHOD
• Why is this the best way to do it? – OPTIMISATION
• How will we know it works? – EFFECTIVENESS
• How can it be improved? – ENHANCEMENT
The Building Blocks
• Validation – programme approval
• Annual monitoring:
– Action planning
– Relationship to other processes
• Periodic review:
– Cycle of operation
– ‘End of cycle’ and ‘mid-cycle’
Working with collaborative partners
• Types of relationship
– ‘Flying faculty’
– Part-franchise
– Franchise
– Validation
– Accredited Partner
• Approval of delivery
• Periodic review
Validation
• Initial approval in principle:
– Strategic ‘fit’ within overall academic portfolio
– Clarity of award title
– Market
– Desirability for professional and/or employment market
• Validation event:
– Programme specification
– External involvement
• Academic
• Professional
• Employer
Annual Monitoring
• Cyclical action planning
• Responding to data:
– External Examiners’ report(s)
– Progression and Award Statistics
– Module Evaluation Questionnaire results
– National Student Survey
– Employment Statistics
• ‘Sign off’ of minor modifications
Periodic Review
• Relationship between review, validation,
monitoring and minor modifications:
– Incremental change and re-validation
– Stability of award title and learning outcomes
– (Advantage of frameworks and pathways)
• Gives experience of programme over a longer
time-scale
• Allows for ‘major’ changes
• MUST include appropriate externality
Action Planning and Reporting
• Identifies short and medium-term actions
• Includes responsibility
• Identifies activities to be undertaken
• Includes review point
• Establishes benefits of activity
• Reports on:
– Conclusion of activity
– Results of action
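
By way of illustration, an action plan entry of the kind described above could be held as a simple structured record. This is a minimal Python sketch; the field names and the example entry are assumptions for illustration, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Action:
    """One action plan entry, mirroring the bullets above.
    Field names are illustrative assumptions, not a prescribed format."""
    activity: str          # activity to be undertaken
    responsible: str       # who carries responsibility
    review_date: date      # agreed review point
    expected_benefit: str  # established benefit of the activity
    concluded: bool = False
    result: str = ""       # reported result once the action is concluded

# Hypothetical example entry.
plan = [
    Action("Revise Level 4 assessment briefs", "Programme Director",
           date(2026, 1, 15), "Clearer expectations, better feedback"),
]
print(plan[0])
```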
Conclusions
• Nested activities – not separate processes
• Should establish continuum of evidenced
action planning
• Can work in clusters of programmes as well as
individual programmes
• Should lead to clear, concise reports
• MUST include externality in all aspects and at
all points.
Lunch
Student engagement: feedback and
representation.
• Why engage students with quality processes?
– Identify strengths and weaknesses of delivery
from a student perspective
– Engage with aspects of delivery outside teaching:
• Classroom and lecture space
• Library
• IT
• Specialist equipment
– Engage with assessment, marking, moderation
and feedback to students
Basic Methods
• Module Evaluation
• Annual Surveys
• Course Boards
• Student Meetings
• Student involvement in Periodic Review:
– In meetings
– As Reviewers
• Senior engagement with the Students’ Union
Module Evaluation
• Standard questions
• Scoring
• Similar timescales of delivery
• Anonymous completion
• Comments as well as scores
• Standard reports:
– Module
– Programme
– Department
– Faculty
• Focus on under-performing modules
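
To illustrate how standard reports might roll module scores up to programme, department and faculty level, and how under-performing modules could be flagged, here is a minimal Python sketch. The data, the 1–5 scoring scale and the 3.0 threshold are hypothetical assumptions, not part of the original material.

```python
from statistics import mean

# Hypothetical module evaluation results on an assumed 1-5 scale.
# Each record: (faculty, department, programme, module, mean score).
results = [
    ("Business", "Marketing", "BA Marketing", "MKT101", 4.2),
    ("Business", "Marketing", "BA Marketing", "MKT102", 2.7),
    ("Engineering", "Civil", "BEng Civil", "CIV201", 3.9),
    ("Engineering", "Civil", "BEng Civil", "CIV202", 2.9),
]

THRESHOLD = 3.0  # assumed cut-off below which a module is 'under-performing'

def report(level):
    """Aggregate mean scores at a given reporting level:
    'module', 'programme', 'department' or 'faculty'."""
    index = {"faculty": 0, "department": 1, "programme": 2, "module": 3}[level]
    groups = {}
    for row in results:
        groups.setdefault(row[index], []).append(row[4])
    return {name: round(mean(scores), 2) for name, scores in groups.items()}

# Standard reports at each level, mirroring the slide's four report types.
for level in ("module", "programme", "department", "faculty"):
    print(level, report(level))

# Focus attention on under-performing modules only.
flagged = [row[3] for row in results if row[4] < THRESHOLD]
print("Modules needing attention:", flagged)
```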
Annual Surveys
• New entrants
• International Students
• National Student Survey
• Postgraduate Surveys:
– Taught programmes
– Research Students
• ‘Pulse’ surveys
Course Boards and Student Meetings
• Elected representatives
• Training
• Timescales for meetings
• Standard Agendas
• Gathering information
• Feedback
• Relationship to other processes:
– External examining
– Annual monitoring
Students within validation and review
processes
• Student meetings
– Engagement with new proposals
– Feedback on existing courses
– Recent graduates reflecting on employability
– Engagement with department – responsiveness to feedback, etc.
• Students as reviewers
– Experience on QAA reviews
– Training
– Limits of process
Conclusions
• Student input adds value to processes.
• Needs to happen in collaboration with Students’ Union
(or a Student Society).
• Representatives need training.
• Need to establish clear understanding of goals of
engagement.
• Needs careful handling so as not to patronise or antagonise.
• Need to reassure staff that they are in control of their
programmes – but that student input is valuable!
Using reports, risk assessment and
management.
What are the aims of quality assurance
processes?
• Confirmation of standards
• Reassurance that processes have been
completed
• Reflection on performance (data monitoring)
• Enhancement of future delivery (programme
structure and quality of
delivery/environment).
What should processes focus on?
• Specialist understanding of the academic
discipline
• Statistical data – progression and
achievement
• Feedback from external examiners
• Feedback from students
• Employment statistics
• Resources
What should we avoid?
• Long and tedious reports with nothing to say.
• Repetition from previous years.
• Narrative description with no analysis.
• ‘Open-ended’ action planning.
What should we promote?
• Focused reports.
• Clear analysis of data.
• Action plans which show monitoring and
completion of actions.
• Forward planning related to analysis.
• Responsiveness to feedback.
• Development, not stagnation.
Becoming ‘risk aware’
• Can we focus only on key areas (programmes)
of risk?
• What should lead to investigation?
– Threats to standards
– Poor progression and/or achievement
– Negative feedback (from examiners or students)
– Poor satisfaction
– Poor employability
– Lack of action
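
As a sketch of what a risk-aware filter might look like in practice, the following Python fragment flags only the programmes that trip one of the investigation triggers above. All field names and threshold values are hypothetical assumptions, not institutional policy.

```python
# Each trigger below mirrors one bullet on the slide; the cut-off
# figures are illustrative assumptions only.

def risk_triggers(p):
    """Return the list of triggers that would warrant investigation."""
    triggers = []
    if p["progression_rate"] < 0.80:   # poor progression and/or achievement
        triggers.append("progression")
    if p["satisfaction"] < 0.75:       # poor satisfaction
        triggers.append("satisfaction")
    if p["employment_rate"] < 0.70:    # poor employability
        triggers.append("employability")
    if p["negative_feedback"]:         # examiner or student concerns
        triggers.append("feedback")
    if p["actions_outstanding"] > 0:   # lack of action on past plans
        triggers.append("inaction")
    return triggers

# Hypothetical programme-level data.
programmes = [
    {"name": "BA History", "progression_rate": 0.91, "satisfaction": 0.85,
     "employment_rate": 0.78, "negative_feedback": False, "actions_outstanding": 0},
    {"name": "BSc Computing", "progression_rate": 0.72, "satisfaction": 0.70,
     "employment_rate": 0.65, "negative_feedback": True, "actions_outstanding": 2},
]

# Only programmes with at least one trigger are investigated further.
for p in programmes:
    t = risk_triggers(p)
    if t:
        print(p["name"], "-> investigate:", ", ".join(t))
```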
Reporting as part of a cycle
• Reflection on previous year’s report (and
actions).
• Analysis of data – including comparison with
past performance (what is ‘direction of
travel’?)
• What action is needed as a result?
• Who will do it – by when?
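
A ‘direction of travel’ check is simply a comparison of this year’s figures with past performance. A minimal Python sketch, with hypothetical metric names and values:

```python
# Hypothetical year-on-year figures for one programme.
last_year = {"progression": 0.78, "satisfaction": 0.81, "employment": 0.70}
this_year = {"progression": 0.83, "satisfaction": 0.79, "employment": 0.74}

# Report each metric with its direction of travel against last year.
for metric, now in this_year.items():
    change = now - last_year[metric]
    direction = ("improving" if change > 0 else
                 "declining" if change < 0 else "stable")
    print(f"{metric}: {now:.2f} ({direction}, {change:+.2f} on last year)")
```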
Conclusions
• Reporting need not be a huge burden (either
to the author or the reader).
• Must have clear outcomes and be useful.
• Must be used.
• Must have a place in future activity and reflection.
• Poor performance must be dealt with (both in
terms of activity and reporting).
Basic Questions
• What are we trying to do? – PURPOSES
• Why are we doing it? – REASON
• How are we going to do it? – METHOD
• Why is this the best way to do it? – OPTIMISATION
• How will we know it works? – EFFECTIVENESS
• How can it be improved? – ENHANCEMENT
Plenary Discussion