Surveying the experience of postgraduates

Student surveys and quality enhancement
Dr. Paul Bennett, Head of Surveys, Higher Education Academy
The HEA’s surveys role
• Deliver national postgraduate surveys (PRES and PTES)
• NSS further analysis and best practice sharing
• Support and consultancy around using survey data for enhancement
• Annual Surveys for Enhancement conference
• Survey related research and policy advice
www.heacademy.ac.uk/surveys
Outline
1. The HEA’s surveys role
2. Student surveys
3. The NSS and quality enhancement
- Communication
- Staff attitudes
- Analysis and exploration
- Staff-student partnerships
4. Surveying student engagement
5. Surveying the experience of postgraduates
Student surveys
Love? Efficient, Democratic, Digestible, Comparable, Reliable, Useful, Motivating
Hate? Overused, Consumerist, Shallow, Misleading, Invalid, Dangerous, Demotivating
Student surveys
• Research method providing a
partial – but often useful –
representation of experience
• Need to be triangulated with
other information – especially
qualitative
• Comparing results useful but
comes with health warnings
• Surveys can be a good starting
point for student engagement –
not the end point
The NSS and quality enhancement
• NSS intended to inform student choice and quality assurance
• HE White Paper (England 2011): “Well informed students driving teaching excellence”
• Publication of NSS results has increased focus on learning and teaching in many institutions and is a powerful lever for change…
• …though questions remain over the focus of the survey and the validity of making comparisons
• NSS becoming better used in stimulating enhancement
HEA’s NSS Institutional Strategy Working Group: ‘Making it Count: Reflecting on the National Student Survey in the process of enhancement’ (October 2012)
Available via: www.heacademy.ac.uk/nss
The NSS and quality enhancement
Progress in communication of NSS results:
- ‘Top-down’ dissemination to staff
- Dissemination of results to students’ unions
- ‘You said, we did’
- Staff-student discussions in formal structures
- ‘You said, we didn’t’ and wider dialogue
- Staff-student partnerships at all stages – promotion,
analysis, dissemination, further research, discussions,
decision-making
The NSS and quality enhancement
Staff attitudes
• Important to mitigate perception that the NSS is ‘a stick to
beat us with’
• ‘Deficit model’ focusing on poor performance, statistical
rankings and ‘red lights’ – can be counter-productive with
academic staff
• More effective use focuses on sharing best practice, case studies, mutual support and the use of qualitative information (including free-text comments)
• Being open about the strengths and limitations of the
survey
The NSS and quality enhancement
Analysis and exploration
• NSS limitations include its level of detail, its focus and its reliance on a single method
• Vital to ‘triangulate’ with other information, e.g.:
- more detailed internal and module surveys
- information about other (non-satisfaction) aspects of experience – e.g. engagement
- information beyond ‘experience’, including grades, employability, retention, examiners’ reports
- non-survey methods, including qualitative information for depth
The NSS and quality enhancement
Staff-student partnerships
• Student analyses can add interpretations – and find solutions – that staff committees can’t reach on their own
• Student unions, reps and other students need support in
analysing, understanding and using NSS data
• Vital for effective discussion of the NSS and enhancement within the system of student representation
• Some institutions have used NSS results as a springboard
for staff-student workshops and conferences
Using NSS data
Partnership approaches
The NSS and quality enhancement
Impact of learning experience on overall satisfaction
Multiple regression of National Student Survey 2011 dataset

Scale                           Rank   Beta
The teaching on my course        1     0.326
Personal development            =2     0.211
Organisation and management     =2     0.209
Academic support                 4     0.156
Assessment and feedback          5     0.082
Learning resources               6     0.027
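To make the method behind these beta weights concrete, here is a minimal sketch of a comparable multiple regression in Python. It is illustrative only: the file name, column names and data layout are hypothetical placeholders, not the HEA's actual NSS 2011 dataset or analysis code.

```python
# Hypothetical sketch: regress standardised NSS scale scores on overall
# satisfaction to obtain comparable beta weights, as in the table above.
# The CSV file and column names below are assumptions for illustration.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("nss_2011_extract.csv")  # hypothetical extract of NSS responses

scales = [
    "teaching_on_my_course",
    "assessment_and_feedback",
    "academic_support",
    "organisation_and_management",
    "learning_resources",
    "personal_development",
]

# Standardise predictors and outcome so the coefficients are beta weights.
X = (df[scales] - df[scales].mean()) / df[scales].std()
y = (df["overall_satisfaction"] - df["overall_satisfaction"].mean()) / df["overall_satisfaction"].std()

model = sm.OLS(y, sm.add_constant(X), missing="drop").fit()

# Rank the scales by the size of their standardised coefficients.
print(model.params.drop("const").sort_values(ascending=False))
```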
The NSS and quality enhancement
Are the questions valid measures of quality (and does that
matter)?
Graham Gibbs (2010) Dimensions of Quality
• Valid, comparable measures of educational quality in
institutions should relate to educational gain
• Measures of effective practice (e.g. student engagement,
intellectual challenge, deep learning) are good predictors
of educational gain
• Such measures can be found in the US National Survey of
Student Engagement (NSSE) but not, on the whole, in the
UK’s NSS
The NSS and quality enhancement
• Course Experience Questionnaire (CEQ) research suggested some items indicate deep vs. surface learning:
- A perception that workload is too high is strongly related to a surface approach
- A perception that assessment is focused on memorisation and reproduction is strongly related to a surface approach
- A perception that teachers are enthusiastic, give good feedback, make the subject interesting and communicate well is partly related to a deep approach
• Questions in the NSS ‘optional bank’ are relevant for
enhancement
Surveying student engagement
• Amongst other things, engagement surveys like NSSE and AUSSE ask about:
- learning interaction with peers (in and outside class)
- learning interaction with staff (in and outside class)
- engagement in own study – self-directed learning
and effort
- course challenge / depth of learning
• Research suggests these ‘predict’ educational gain,
especially ‘depth of learning’
• Less clear to what extent they predict NSS results and less
researched in a UK context
Surveying student engagement
HEA’s UK Engagement Survey Pilot
• 14 items from NSSE adapted for UK use in internal surveys
in 9 institutions (plus 2 more running variants of the
whole NSSE/AUSSE)
• HEA analysing pooled data, providing national aggregate for participants, testing reliability (see the sketch below) and commissioning cognitive testing
• Institutions will produce case-studies on use of results for
enhancement
• Year 2 of pilot will expand the number of institutions
involved and may test relationship with NSS items
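One step mentioned above, testing the reliability of the pooled pilot data, can be illustrated with a short sketch that computes Cronbach's alpha for a set of engagement items. The file name and column prefix are assumptions for illustration, not the HEA's actual pilot data or code.

```python
# Hypothetical sketch: internal-consistency check (Cronbach's alpha) for a
# pooled set of engagement items; file and column names are assumptions.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a table of item responses (one column per item)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

pooled = pd.read_csv("ukes_pilot_pooled.csv")        # hypothetical pooled pilot data
engagement_items = [c for c in pooled.columns if c.startswith("item_")]
print(f"Cronbach's alpha: {cronbach_alpha(pooled[engagement_items].dropna()):.2f}")
```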
Surveying the experience of postgraduates
• Postgraduate Research Experience Survey (PRES)
- launched 2007, runs every two years
• Postgraduate Taught Experience Survey (PTES)
- launched 2009, runs annually
• Aim to inform enhancements to learning and teaching and advise national policy
• Not intended to inform student choice
• Results are confidential = no league tables
• Benchmarking clubs facilitate comparisons
• Allow for inclusion of locally specific questions
Surveying the experience of postgraduates
Reason for using PRES                                                                    %
Identify specific areas for enhancement                                                 94
Assess perceptions of quality of degree programmes                                      67
Assess the equality of experience and/or opportunities                                  58
Benchmark experience nationally or with comparator institutions                         56
Evaluate consistency of experience across disciplines/departments                       52
Help engage relevant staff groups in enhancement                                        29
Demonstrate to potential PGR students the quality of the research training environment  17
Demonstrate to funders the institution's commitment to PGR support                       8
Other                                                                                    8
Surveying the experience of postgraduates
• Significant redevelopment of PRES for 2013
• Focus on enhancement priorities in Researcher Development Framework and Quality Code (B11)
• Greater emphasis on research skills and professional development – including supervisors’ role
• Also importance of ‘Research community’ for PGRs
• More detailed than NSS, but same principles apply in use for enhancement
• PTES being redeveloped for 2014
PDF of questionnaire available online via www.heacademy.ac.uk/pres
Vitae study for HEA: ‘Using PRES to enhance the experience of postgraduate researchers’ (September 2012)
Available via: www.heacademy.ac.uk/pres