Transition to Online Student Evaluation of Teaching
Interim Report – January 2013
ACIE Best Practices Committee
Executive Summary
The University of Pittsburgh is well on the way to transitioning to online student
assessments of teaching. Results of our pilot project, conducted in AY 2011-12, were
consistent with what other schools that have already implemented online surveys have
seen: format (online vs. paper) had no effect on mean teacher ratings for any survey
item, and response rates were lower for online surveys.
As expected, student response rates were lower in the online format than in the in-class
format for Fall Term 2012. As we move to a fully online mode in Fall 2013, we will
continue to work toward increasing participation by students. Email reminders appear to
be the most effective university-level intervention. Graduate course instructors should
remind students to complete the survey once or twice, and undergraduate course
instructors should discuss the importance of student evaluations of teaching. All
instructors should be sent an email when the online evaluation period begins.
Did many faculty shift to online surveys?
In Fall 2012, of the 4,634 courses at the University of Pittsburgh that conducted student
evaluations of teaching through the Office of Measurement and Evaluation of Teaching,
71% (3,290) were requested online and 29% were requested in class. While there was
some variation by department and school, Arts & Sciences and the various Professional
Schools overall showed a similar level of shift.
What were student response rates in the online format?
One common concern about the online response format involves student response rates.
Overall, the mean student response rate across all classes was 54%, and it was above
50% for both the Oakland campus and the regional campuses, as well as for both Arts &
Sciences and the Professional Schools. This rate is lower than the typical response rate
for in-class surveys (approximately 75%), but it is close to what was expected for a
first-semester transition to online.
Class type does influence response rates. The primary driver appears to be student level
rather than class size. For example, within Oakland Arts & Sciences, graduate classes
showed approximately 70% response rates and undergraduate courses just under 50%, with
little effect of class size. In the graph below, Small is defined as 30 students or
fewer, Medium as 31 to 100 students, and Large as more than 100 students.
[Figure: Final student response rate (0% to 100%) by course level (Doctoral, Masters, UG-Upper, UG-Lower) for Small, Medium, and Large classes.]
There is considerable variability in response rates across classes, reflecting chance
variation. Most problematic, without instructor intervention there is a significant
chance that a small class will receive a low response rate: approximately 28% of small
lower-division undergraduate courses and 25% of small upper-division undergraduate
courses had a student participation rate below 40%. It is recommended that instructors
with response rates below 30% by the 8th day (the half-way point) be contacted with a
suggestion to discuss the importance of the survey and to remind students to complete
it; a minimal sketch of this check follows.
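
As an illustration only, here is a minimal sketch of that mid-survey check, assuming hypothetical class records and field names (the report does not describe the actual tracking system):

```python
# Minimal sketch of the recommended mid-survey check.
# Class records and field names are illustrative assumptions;
# the report does not describe the actual tracking system.
MIDPOINT_DAY = 8   # the 8th day, the half-way point of the evaluation window
THRESHOLD = 0.30   # contact instructors whose classes fall below 30% by then

classes = [
    {"instructor": "Smith", "enrolled": 24, "responses_by_day_8": 5},
    {"instructor": "Jones", "enrolled": 120, "responses_by_day_8": 72},
]

for c in classes:
    rate = c["responses_by_day_8"] / c["enrolled"]
    if rate < THRESHOLD:
        # Suggest discussing the importance of the survey and reminding students.
        print(f"Contact {c['instructor']}: {rate:.0%} response rate by day {MIDPOINT_DAY}")
```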
What can be done to increase student response rates?
Three kinds of university-level interventions were examined: 1) email reminders, 2) an
advertisement placed in the Pitt News, and 3) notices placed on the student portal
(my.pitt.edu). Response rates were tracked daily. The figure below shows the percentage
of remaining students completing the survey each day, defined as: of the students who
had not yet completed the survey, how many completed it in the given 24-hour period.
For example, suppose that in a class of 100, 40 students completed the survey on the
first day and another 20 completed it on the second day. This translates into completion
rates of 40% (40/100) for day 1 and 33% (20/60, because only 60 students remained who
could have completed the survey) for day 2.
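
To make this definition concrete, the following minimal sketch (a hypothetical helper, not part of the committee's tooling) reproduces the worked example:

```python
def daily_completion_rates(enrolled, completions_per_day):
    """Of the students who had not yet completed the survey, the fraction
    who completed it in each 24-hour period."""
    rates = []
    remaining = enrolled
    for completed in completions_per_day:
        rates.append(completed / remaining if remaining else 0.0)
        remaining -= completed
    return rates

# The worked example above: a class of 100, with 40 completions on day 1
# and 20 on day 2.
print(daily_completion_rates(100, [40, 20]))  # [0.4, 0.333...] -> 40% and 33%
```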
From this analysis, it appears that email prompts are the most effective university-level
intervention. There are diminishing returns (even after adjusting for the smaller pool of
students remaining to complete the survey), but it appears that a fourth reminder would
likely have garnered another 5% to 7% of remaining students (3% to 4% of total students).
Notices in the Pitt portal and the Pitt News appear to have had little effect.
What can instructors do to increase student participation rates?
An electronic survey was sent to all faculty who had active courses in Courseweb. For
faculty with more than one course, we selected one course at random. The brief survey
asked them to note how often (if at all) they took each of the following actions:
1. Talked in class about the importance of the survey
2. Talked about the shift to online surveys this semester
3. Gave time in class for students to complete the online survey (via iPads, laptops, or lab computers)
4. Gave their own survey earlier in the semester seeking student input on the course
5. Reminded students to complete the online survey
6. Dismissed class early so students could complete the online survey elsewhere
Approximately 600 survey responses were obtained. The chart below presents the frequency
of each action. Many instructors used none of these strategies, but discussing the
surveys and reminding students at least once were common.
For all survey respondents, student participation rates in the identified class were
obtained, and statistical analyses were conducted on the relationship between instructor
actions and student participation rates. Overall, only discussing the importance of the
survey and reminding students to complete it were found to be effective instructor
interventions. The figures below present the mean student response rates for these two
key factors, after statistically controlling for the effects of the other factors.
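
The report does not specify the statistical method used; as one plausible illustration only, the sketch below uses ordinary least squares regression to estimate the effect of each action on response rates while adjusting for the others (the data file and every column name are hypothetical):

```python
# One plausible analysis, not necessarily the committee's: OLS regression of
# student response rate on the surveyed instructor actions plus course level.
# The data file and all column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("instructor_survey_fall2012.csv")

model = smf.ols(
    "response_rate ~ C(talked_importance) + C(reminded_students) "
    "+ gave_class_time + gave_own_survey + dismissed_early + C(course_level)",
    data=df,
).fit()
print(model.summary())  # coefficients for each action, adjusted for the others
```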
Graduate classes appear to benefit somewhat from discussing the importance of the
surveys, but the larger effect comes from in-class reminders. Note, however, that more
than two reminders appear to be associated with negative outcomes.

Undergraduate classes appear to benefit most from discussions of the importance of the
surveys, with little effect of reminders.
[Figure: Mean student response rate (40% to 80%) by how often the instructor talked about the importance of surveys of teaching (Never, Once, Twice, Many), shown separately for graduate and undergraduate classes.]

[Figure: Mean student response rate (40% to 80%) by how often the instructor reminded students to complete the surveys (Never, Once, Twice, Many), shown separately for graduate and undergraduate classes.]
Many instructors reported being unaware of when the student evaluation of teaching
period was underway. It is recommended that all instructors receive a reminder email
when the online evaluation period begins.