Student Evaluation of Faculty: Policies and Processes

Kay Turpin
Alison Joseph
Western Carolina University
SAIR 2010 New Orleans, LA
Please…
 turn off cell phones and other devices
 if you must leave the session early – leave quietly
 avoid side conversations
This presentation is available at:
http://www.wcu.edu/27729.asp
Western Carolina University
Location: Cullowhee, NC
Member of UNC System
Fall 2010 enrollment: 9,407
 1,440 FTF
 7,503 undergraduates
 1,904 graduates
History of Evaluations at WCU
 No university-wide policy on administration, instruments, courses covered, or use of results
 Scanned, paper forms administered during class time
 Long wait for results
 No flexibility in types of reports
 Security dismal – especially for comments
IR office:
 Purdue University’s Cafeteria Evaluation System
 7-12 departments
 Select undergraduate courses
 Inefficient, time-consuming process
Planning for University-Wide Evaluations
2002 through 2006:
 Faculty Senate established a ‘teaching evaluation committee’
 Provost mandated online evaluations (PACE initiative)
 Developed SAI instruments based on course types
 Developed guidelines for results usage
 Initial discussion on administrative evaluations and accessing results
 Researched available options for administering evaluations and selected Academic Management Systems’ CoursEval
Faculty “own” the evaluation process!
Course Evaluation Pilot – Spring 2007
• 5 departments (Mkt/BLaw, Chem/Phys, CJ, PoliSci, Psych)
• 328 courses (5689 surveys)
• 45% response rate
• Problems:
 login & email issues
 inexperienced staff oversaw administration
• End Result: CoursEval online system worked fine
BUT….
Full Implementation – Fall 2007
Began Fall 2007 without established policies and processes in place for administering the survey and interpreting results:
 late start
 inappropriate default forms
 no scheduling guidelines
 login issues from Spring pilot not addressed
 key people not involved
 questions added, doubling instrument length
 minimal error checking
 lack of advertising and incentives
 44% response rate
These issues would directly affect acceptance of online evaluations
by both faculty and students!
Establishing Policies & Processes
Dec. 2007 through Summer 2008
Faculty Senate and Implementation Committee:
 scheduling guidelines
 advertising/incentives
 low-enrolled courses
 faculty training
 student incentives
 summer parts-of-term
 refined SAI instruments
 instrument defaults
 planned validity study
 student email policy
 login issues
Implementation Review
Spring 2007: 45% response rate (pilot)
Fall 2007: 44% response rate
Were these major or minor issues?
Spring 2008: 33% response rate (a drop of 11 percentage points!)
Was our effort worth it?
Fall 2008: 46% response rate
Lessons Learned
Lesson 1: Involve everyone with a stake in the evaluation procedures and results, and everyone with experience in processing and administering evaluations
Lesson 2: Consider all issues before implementing
Lesson 3: Build a strong communication system
between faculty and survey administrators
Overview of Current Policies
 all course sections, all terms
 open period set by course length
 university-wide incentives allowed
 low-enrolled courses identified
 crosslisted courses
 periodic response rates & student reminders
 faculty evaluation (not course content)
 results availability
 use in faculty AFE/TPR process
 IR office responsible for administration
IR Responsibilities
EVERYTHING!
 Set-up & administer each semester’s
evaluations
 “Help desk” for faculty and students
 Pursue new technologies & procedures to
improve process, reporting, and response rates
 Communicate to the Faculty Senate any issues that arise affecting Senate policies and survey administration
Evaluation Set-up (each term) – Back End
 Set evaluation schedule for all courses
 Request SAI instrument modifications
Default instrument examples:
1) For instructional method ‘OA = online activity’, the “Online Form” instrument is the default.
2) For instructional method ‘F = face-to-face’ and course type ‘LAB’, the “Lab Form” instrument is the default.
(A small code sketch of this defaulting rule follows the example table below.)
TERM  YR    COURSE   SECT  CRSTYPE  INSTMETH  XLISTCODE  FACULTY          DEFAULTINSTR (Preferred Survey Form)
FALL  2010  ATTR221  01    LEC      F                    Scifers, James   1. Standard Lecture Form
FALL  2010  ATTR221  75    LAB      F                    Scifers, James   2. Lab Form
FALL  2010  ATTR323  01    LEC      F                    Scifers, James   1. Standard Lecture Form
FALL  2010  ATTR323  30    LAB      F                    Scifers, James   2. Lab Form
FALL  2010  EDM322   59    LEC      OA                   Berry, Robert    3. Online Form
FALL  2010  EMC445   50    LEC      OA        0Z         Hubble, Michael  3. Online Form
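The defaulting rule shown in the table can be stated compactly. Here is a minimal Python sketch, assuming only the CRSTYPE and INSTMETH codes that appear above; it illustrates the rule itself and is not CoursEval's actual configuration interface.

# Illustrative only: encodes the default-instrument rule described on this slide;
# the codes ("OA", "F", "LAB") come from the example table above.
def default_instrument(crstype: str, instmeth: str) -> str:
    """Return the default SAI instrument for a course section."""
    if instmeth == "OA":                       # online activity
        return "3. Online Form"
    if instmeth == "F" and crstype == "LAB":   # face-to-face lab section
        return "2. Lab Form"
    return "1. Standard Lecture Form"          # all other sections

# Matches the ATTR221-75 and EDM322-59 rows above:
print(default_instrument("LAB", "F"))   # 2. Lab Form
print(default_instrument("LEC", "OA"))  # 3. Online Form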
Evaluation Set-up (each term)
 Set evaluation schedule for all courses
 Request SAI instrument modifications
 Data extracts from Banner (withdrawals & grades)
 Error checks in Access database (a rough sketch of this kind of check follows this list)
 Upload files to CoursEval website:
 faculty records
 students records
 course information (includes primary instructor only)
 secondary faculty (not included in course info.)
 enrollment
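As a rough illustration of the pre-upload error checking done in the Access database, here is a minimal Python sketch; the file layout and column names (COURSE, SECT, FACULTY_ID) are assumptions made for the example, not WCU's actual Banner extract format.

import csv

# Hypothetical extract layout; the real Banner extracts and Access checks differ.
def check_extract(course_file, enrollment_file):
    """Flag rows that would break the CoursEval upload."""
    errors, sections = [], set()
    with open(course_file, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["COURSE"], row["SECT"])
            sections.add(key)
            if not row.get("FACULTY_ID"):        # course info carries the primary instructor only
                errors.append(f"{key}: missing primary instructor")
    with open(enrollment_file, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["COURSE"], row["SECT"])
            if key not in sections:              # enrollment must match an uploaded section
                errors.append(f"{key}: enrollment for unknown section")
    return errors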
Evaluation Set-up (each term) – Front End
 Assign instrument to courses
 Establish open/close dates and times – automated (open period set by course length; see the sketch below)
 Set-up group emails for students/faculty – automated
 Set release date for survey results - automated
For the main evaluation period each term, total front-end set-up time is approximately 4-5 hours.
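For illustration, a minimal sketch of how an open/close window might be computed from course dates. The thresholds and window lengths below are placeholders, since the actual open periods are set by Faculty Senate guidelines ("open period set by course length"), and the function name is hypothetical.

from datetime import date, timedelta

# Placeholder thresholds only; the real window lengths come from Senate policy.
def evaluation_window(course_start: date, course_end: date):
    """Return (open_date, close_date) for a section's evaluation period."""
    weeks = (course_end - course_start).days // 7
    open_days = 14 if weeks >= 10 else 7       # longer courses get a longer window
    return course_end - timedelta(days=open_days), course_end

# Example: a full-semester Fall 2010 course would open two weeks before it ends.
print(evaluation_window(date(2010, 8, 23), date(2010, 12, 10)))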
Features of CoursEval
 Course Designator – helps us match instruments to courses
 Students and faculty – find all associated courses and surveys
 Consolidation of people
 Active Directory login
 Automated emails, open/close, results release
 Response rates – real time, easy to locate and email
Features We Would Like
 Better reporting – more flexibility
 Advance notice of outages and problems
 Ability to access individual responses
 Ability to change the faculty member attached to a course (currently it cannot be changed)
 Single sign-on with other systems (Blackboard)
 Fewer surveys to manage – separate surveys for each instrument and time period add to the workload (84 surveys for Summer 2010)
End Results
 Consistent open periods
 No lost class time
 Single administrator; increased efficiency
 All faculty/courses under same policies
 100% class coverage
 Results available quickly, better reports, survey history
 Costs: $40,000 vs $84,000+ annually
 Lower response rate; more thoughtful comments
 High level of security
 Environmentally friendly
Future Plans and Ongoing Issues
 Improve response rates
 Implement Blackboard plug-in
 Improve available reports
 Improve process to change SAI instruments:
 Set up SharePoint site for department heads
 Use Banner and Blackboard to automate changes
WCU’s Course Evaluation website:
http://www.wcu.edu/8356.asp
“Guidelines & Procedures for Administration &
Oversight of Student Assessment of Instruction”
Institutional Planning & Effectiveness:
http://www.wcu.edu/12829.asp 828-227-7239
Kay Turpin
turpin@email.wcu.edu 828-227-3041
Alison Joseph
ajoseph@email.wcu.edu 828-227-3042
AMS website:
http://www.academicmanagement.com/