Faculty Responses to the Postgraduate Research Experience Survey 2013
Science and Engineering
Introduction
This paper outlines the Doctoral College's response to Queen Mary’s results in the
Postgraduate Research Experience Survey (PRES) for 2013 provided by the Higher
Education Academy.
An initial analysis of the PRES results, comparing Queen Mary to the sector as a whole and
to other Russell Group Universities and broken down by Faculty, was prepared by Planning
and provided to the Doctoral College Management Group in September 2013. The results
were discussed by the Doctoral College Management Group at its September meeting, and more detailed results, collated by School and including free-text comments, were circulated by the Deputy Deans to the Directors of Graduate Studies in each School for consideration. Schools were asked to produce an Action Plan detailing their responses to any issues where satisfaction levels fell below the Russell Group average, for return to their Faculty Deputy Dean for Research by the first week of November, for consideration and summary by the Doctoral College Management Group.
Whilst the Doctoral College values PRES, it recognises that on its own PRES does not always provide the fine-grained analysis that is most useful in assessing the student experience. As a result, from September 2013 the Doctoral College has circulated a more detailed Exit Survey to all Queen Mary students on completion of their PhD. Together with existing forms of feedback (individual student progress reports, and reports from student representatives on School Graduate Studies Committees and on the Doctoral College Management Group) and PRES, this survey will provide a much fuller picture of the Queen Mary PGR experience.
Science and Engineering
As for HSS, the inability to analyse PRES results at the level of individual programmes makes interpretation difficult for some Schools in S and E, where very different cultures often exist between research groups. Although response rates increased significantly compared with 2011 (33% in 2013 against 17% in 2011) and were in line with the sector average, they are still relatively low. The response rate across the Faculty was generally consistent, although in the School of Engineering and Materials Science it was rather lower, with only 31 students (21%) responding. This is likely to be due, in part, to a lack of awareness of PRES amongst academics, which it is hoped will improve in future years. All Schools have noted
that this makes it difficult to interpret results from PRES and to develop appropriate and
proportionate responses to issues of concern that the results appear to show. Nonetheless,
the Faculty welcomes PRES, which provides useful additional feedback by which the Faculty
and individual Schools can assess the experience and satisfaction of their PGR students.
Every School in S and E has responded with specific action plans to address areas of
concern raised in the survey, which will be followed up with a “you said, we did”
correspondence with the PGR students.
Taken as a whole, S and E scored well in PRES, with overall satisfaction levels of 78%, in line with Queen Mary's average. Scores in a number of Schools were very high (SMS 89% and SEMS 87%), with all Schools, with the exception of EECS, recording scores over 80%.
Unfortunately, the aggregate score for S and E was lowered by the poorer overall satisfaction score of 78% in EECS, which has by far the largest number of PhD students in the College, at ca. 15% of the entire population. The School has taken this result very seriously and, although it is almost certain that it is being affected by poor-quality space over which the School has no control, all staff have been informed of the main issues arising from PRES and these will be covered in ongoing supervisor update training in the School.
Within the overall College responses, students in S and E reported especially high levels of
satisfaction with Library facilities and with induction. Within the general Faculty responses there is significant variation between Schools, and it is at this level that individual Schools have identified areas of concern raised by their students and have drawn up action plans, which, in addition to those identified by the Doctoral College and Faculty, are outlined below.
These action plans should also result in wider knowledge of PRES within the Faculty.
Supervision
All Schools in S and E generally scored highly on PRES questions relating to the quality of
supervision. When asked whether their supervisors had the skills and subject knowledge to
support students’ research, all Schools scored a student satisfaction level of over 79% with
the School of Physics and Astronomy scoring 93%. All Schools also scored highly with
regard to the frequency of supervisory contact. Feedback scores were also excellent with all
Schools polling over 81% and SMS particularly impressive at 89%. The poorest responses for some Schools came when students were asked whether their supervisors helped them to identify their training and development needs as researchers. This is an area of some confusion within the Faculty, and one which the Deputy Dean and Directors of Graduate Studies are actively working on.
Actions: All Schools in S and E will continue to offer a high level of supervision to all
postgraduate research students. Schools will also ensure that all new supervisors receive
supervisor training run by the Doctoral College and refresher courses organised locally and
delivered by the Directors of Graduate Studies and the Deputy Dean for Research.
Doctoral College to review points-based researcher development programme and Deputy
Dean for Research to disseminate process to Schools via Directors of Graduate Studies.
Resources
Students’ satisfaction rates across the Faculty varied quite significantly in response to
questions regarding the provision of resources. Most Schools received very high satisfaction
rates in relation to space (>80%) but there was specific criticism of some space in SBCS and
more generally in EECS (69%), which was the subject of a number of comments in the free
text field and is clearly adversely affecting this School’s general performance in PRES.
Given the size of the PhD cohort in EECS, this is something that also affects the overall College performance and is outwith the control of the School. This pattern in responses
was also reflected in levels of student satisfaction with regard to computing facilities and
library provision with both SBCS and EECS scoring noticeably lower satisfaction than the
Faculty’s other Schools. The lowest level of satisfaction related to the provision of specialist
resources. Given the diversity of research undertaken in the Faculty it is impossible to satisfy
the needs of all researchers, but the return of 88% for SMS was a highlight, which may reflect the fact that mathematicians do not require the expensive technical resources needed by other disciplines in the Faculty.
Actions: There is little EECS can do with regard to space. SBCS has major projects
ongoing to improve space in both Joseph Priestley and Fogg buildings. The SBCS PGR
manual has been modified to include information on space available to PGR students and
the School induction has been modified to include more information on Library resources.
All Directors of Graduate Studies are afforded the opportunity to share best practice and
discuss issues through monthly meetings with the Deputy Dean for Research.
Research Culture
PRES asked students four main questions to gauge satisfaction with regard to their
experience of a broader research culture. Students were asked about their School’s seminar
programme, opportunities to discuss their research with other research students, the
research ambience in their School, and whether they felt there were opportunities for them to
become involved in the wider research community, beyond their own School. The responses
from Schools in most categories were above the sector average of 64% and frequently
above the Russell Group average of 67%. Seminar programmes were well-received, with
SMS scoring an impressive 89% satisfaction score, although SPA scored rather lower in this
category. Opportunities to discuss research with other students produced a rather uneven
response and it is clear that individual Schools, the wider Faculty and the Doctoral College
can all work to improve this aspect of the PGR experience. Both SBCS and EECS scored below the Faculty average for research ambience. The former is almost certainly a result of the restructuring of the School, which adversely affected morale, whilst the latter is again a reflection of the building, about which the School can do nothing. Opportunities to engage
with the wider research community were above the sector average for all Schools, with SMS again polling an impressive 89%. Schools are already working hard to engage the PGR
students and to develop strong research environments. Fostering further such engagement
requires action at a College and Faculty level and is being led by the Doctoral College with
twice yearly cohort days designed to bring Queen Mary PGR students together and foster
inter-disciplinary exchange, and – building on existing initiatives (Café Scientifique and the 3
Minute Thesis Competition) – new social events to bring Queen Mary’s PGR community
together (the Doctoral College Annual Debate and Curry Night). The Doctoral College has
also implemented the Postgraduate Research Initiative Fund to allow students to run
collaborative cross-School and cross-Faculty activities. In addition, the Doctoral College
events calendar (that also lists School seminar and reading groups) should greatly increase
awareness of the wide range of research events across Queen Mary.
Actions: SPA to develop their seminar programme. SBCS are also running a re-focussed
seminar programme this year aligned more closely with the research divisions of the School.
Improved communication of events to PGR students through the Doctoral College web pages.
Induction, Progression, Arrangements and Assessment
PRES monitors students' understanding of, and satisfaction with, the ways in which their
progress is monitored, the standards required for the award of a PhD, knowledge of the
examination procedures, and student induction processes. There were some very high
levels of satisfaction across the Faculty in all areas, although responses in some sections of
the survey were rather mixed. Induction is an area where the Faculty can share best practice
more effectively, as three Schools scored highly (SEMS, SPA and SMS). Students across the Faculty recorded high levels of satisfaction with the process of monitoring progress, with all Schools polling over 81% satisfaction. The responses to questions relating to standards and examination procedures were the most inconsistent between Schools, ranging from 63% to 80% for the former and from 68% to 83% for the latter. Given that all Schools ensure that all students and supervisors are aware of these aspects of the PhD process, and that they are covered in inductions and PGR manuals, this is rather surprising.
Actions: All Schools to ensure that all students and supervisors are aware of what is
required for the progression and assessment processes.
All students and supervisors advised of content on Doctoral College web pages.
Some Schools to survey students to ascertain what aspects of induction could be improved.
Doctoral College to consider running training sessions on PhD standards and the
examination process.
Responsibilities
Queen Mary values feedback from all students and staff. Postgraduate Research students are represented on the Doctoral College Management Group by a standing member from the Students' Union and the President of the newly formed Doctoral Society. At a Faculty
level, student feedback is provided to the Faculty and the Doctoral College via the Directors of Graduate Studies from each School, who meet monthly. In general, there were high levels of satisfaction expressed across the Faculty in this section. The responses relating to student responsibilities were good across the Faculty, with all Schools scoring over the sector and Russell Group averages. Responses relating to supervisor responsibilities were also good, although the SMS score of 72% was somewhat lower than those of other Schools. The response regarding awareness of whom to contact about any aspect of the degree was rather lower for most Schools, although SPA received a 90% satisfaction score. Again, this is rather surprising, as this information is covered extensively online, in induction programmes and in School manuals. The most disappointing aspect of this section of the survey comes in response to how valued PGR student feedback is by Schools, which is below 75% for all Schools. This is reflected in the responses across the College, but it is clearly an area on which attention needs to be focussed.
Actions: At the Faculty level, Schools will exchange ideas for best practice to increase opportunities for students to provide feedback.
Individual Schools are improving internal communications and providing appropriate training for supervisors and students.
School and Doctoral College initiatives to mirror the NSS "you said, we did" campaign.
Research skills and development
There were high levels of student satisfaction in the section relating to research skills and
development. Satisfaction rates were recorded at over 82% in regard to the development of
research methodologies, tools and techniques during research. Critical analysis was more
uneven but all Schools scored over 71% with both SBCS and SPA receiving satisfaction
scores over 90%. The responses relating to students’ confidence to be creative or
innovative generally received a slightly lower score, although SPA again scored over 90%.
Ethics and integrity scored greater than 70% in all Schools. Given that ethics is now
embedded in the Faculty Postgraduate induction events, this is a pleasing outcome.
Actions: All Schools in S and E provide opportunities for their students to engage in
professional development. In addition, all students have access to the professional
development courses offered through Queen Mary’s Centre for Academic and Professional
Development (CAPD). Individual Schools, Institutes and Centres to offer increased provision of appropriate researcher development training (also advertised on the Doctoral College calendar so that this is available to students in other Schools). Schools to share best practice
via DGS forum.
The Doctoral College reviewed all CAPD courses against student feedback in July 2013 to
identify the areas of strength and concern, and CAPD and School/Institute provision is now
augmented with twice yearly cohort days organised by the Doctoral College focusing on key
areas of professional development (for example, academic networking and impact). In
addition, in September 2013 the Doctoral College introduced a new funding scheme, named
the Postgraduate Research Initiative Fund, to extend the financial support available to
Queen Mary PGR students through the Postgraduate Research Fund, for students who wish
to develop innovative training opportunities of their own. In Spring 2014 the Doctoral College
will carry out a review of the College’s training points system (with a particular focus on the
availability of training opportunities across different RCUK research training domains, and
the weighting of training requirements across those domains and each year of study), and
will introduce a new, more user-friendly interface to the training database – now hosted on
the Doctoral College website.
Doctoral College Management Group to periodically audit researcher development provision.
Professional Development
Questions in this section cover students’ ability to manage projects, effective communication,
networking and personal professional development. As with the other sections of PRES, there are some excellent responses, but results across the Faculty are again a little patchy, both between Schools and within the responses of individual Schools. A significant amount of
effort has been directed to these areas of researcher development by Schools and the
Doctoral College in the last twelve months and this is an area where we should expect future
improvements. We ran a Year 1 cohort event on increasing networking effectiveness for the
first time in 2013, which was very well received, and it is anticipated that this will have a
positive impact on future surveys in this section. Schools continually review the provision of
developmental opportunities for their students. For example SPA is launching GradNet in
collaboration with six Physics departments to provide comprehensive and appropriate
opportunities for its students. In response to this survey, SMS is seeking advice from CAPD regarding Project Management training, an area in which the School scored less well.
Actions: Continue to run cohort training on networking as well as cohort days for each
developmental year. Individual Schools to continually review provision of researcher
development opportunities. Schools to share best practice through DGS meetings.
Doctoral College to review the points-based researcher development system.