Increasing Responses in Online Course Evaluation
Presented by Jennifer L. Hannigan, MPhil.
Research Analyst, Institutional Research & Effectiveness
Qualtrics: Online Survey Platform
Campus‐wide Faculty, Staff, & Student use
Users
• 28 Faculty Members
• 26 Staff
• 14 Students
Qualtrics Usage
Staff
• Point of Service Surveys
  ‐ Writing Center
  ‐ Business Office
• Community Engagement
  ‐ Grant Applications
• Job Placement
  ‐ At graduation
  ‐ Alumni Follow‐up
• Graduate Schools
  ‐ Entrance & Exit Surveys
• Academic Advising
• Needs Assessments
• Course Evaluation
Faculty
• Rubrics
  ‐ History Dept. graduate student capstone (MJA)
• Assessments of Student Learning
  ‐ Education Dept.
• Pre & Post Tests
  ‐ Education Dept.
• IRB Review/Approval of Research/Studies
  ‐ Psychology
  ‐ Athletic Training
Course Evaluation
April 21‐May 9
Course Evaluation
• The Cause for Transition
• Online Evaluation Advantages
• ‘Spreading the word’
• Initial Launch
• Response Rates
• Quality vs. Quantity
• Improving Response Rates
The Cause for Transition
• Paper evaluations were too costly
‐Cost of paper & printing
‐Hiring temporary help
‐Additional student workers
• Reduction in human error
• Paper processing was very time consuming = slow turnaround time
• Several advantages to online evaluation
Online Course Evaluation
• Faster turnaround for reports to Faculty (3 months sooner than paper in Fall ’13)
• Real‐time response rates (see the sketch after this list)
• Customized evaluation
• Meaningful & useful comments
• Absentee student feedback
• Saved $13,000 in budget
• Security, greater insights, advanced logic, and higher quality results
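The ‘real‐time response rates’ advantage above can be monitored from a simple response export. Below is a minimal sketch, assuming hypothetical responses.csv and enrollments.csv files with made‐up column names (CourseSection, Enrolled), not the deck’s actual data layout:

```python
import csv
from collections import Counter

# Count completed evaluations per course section from a (hypothetical)
# survey-response export with a "CourseSection" column.
with open("responses.csv", newline="") as f:
    completed = Counter(row["CourseSection"] for row in csv.DictReader(f))

# Enrollment per section from a (hypothetical) registrar export
# with "CourseSection" and "Enrolled" columns.
with open("enrollments.csv", newline="") as f:
    enrolled = {row["CourseSection"]: int(row["Enrolled"]) for row in csv.DictReader(f)}

# Print a live snapshot, lowest rates first, so low-responding sections
# can be targeted with reminders during the evaluation window.
rates = {sec: completed.get(sec, 0) / n for sec, n in enrolled.items() if n}
for sec, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{sec}: {completed.get(sec, 0)}/{enrolled[sec]} = {rate:.0%}")
```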
Evangelism
• “Spread the word” through email messages
• Blackboard & Student portal banner messages
• Course Evaluation stations in high‐traffic areas
• Posted QR code squares on bulletins
• Signage at campus entrance and exit
• Sidewalk chalk in foot‐traffic areas
• Article in “SmallTalk,” the student‐run newspaper
• Course Evaluation power hours
• Emails to faculty to encourage student participation
Initial Campus‐wide Launch
Accessibility
• Fall term 2013: an authenticator was created to enable an anonymous distribution link
• QR code squares (see the sketch below)
  ‐ Posted on bulletins
  ‐ Emailed to faculty
• Student portal
• Blackboard
• Email
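A QR code square like the ones posted on bulletins and emailed to faculty can be generated straight from the anonymous distribution link. This is a minimal sketch using the third‐party qrcode package, with a placeholder URL standing in for the actual Qualtrics link:

```python
# Requires the third-party "qrcode" package (pip install qrcode[pil]).
import qrcode

# Placeholder URL; substitute the survey's actual anonymous distribution link.
ANON_LINK = "https://example.qualtrics.com/jfe/form/SV_exampleid"

# Render the link as a QR code image suitable for printing on bulletins
# or embedding in faculty emails.
img = qrcode.make(ANON_LINK)
img.save("course_eval_qr.png")
```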
Fall 2013: Student View
Fall 2013 Online Course Evaluation
Response Rates
• Overall: 31%, compared to a paper response rate of 70%
• No significant difference between mean online and paper answers (similar to research by Ardalan et al. 2007; Anderson, Cain, and Bird 2005)
Quality vs. Quantity
Qualitative Feedback Word Counts
• Online (Fall ‘13)
  ‐ Q29 (Promoted learning): 40,077 words
  ‐ Q30 (Specific, practical changes): 43,416 words
• Paper (Fall ‘12)
  ‐ Q29: 50,042 words
  ‐ Q30: 42,314 words
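A word‐count comparison like the one above can be reproduced from the comment exports. This is a minimal sketch, assuming hypothetical online_fall13.csv and paper_fall12.csv files in which the open‐ended responses sit in columns named Q29 and Q30:

```python
import csv

def total_words(path, columns=("Q29", "Q30")):
    """Sum word counts per open-ended question column in a comment export."""
    totals = dict.fromkeys(columns, 0)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for col in columns:
                totals[col] += len((row.get(col) or "").split())
    return totals

# Hypothetical file names for the online (Fall '13) and paper (Fall '12) exports.
for label, path in [("Online Fall '13", "online_fall13.csv"),
                    ("Paper Fall '12", "paper_fall12.csv")]:
    counts = total_words(path)
    print(label, {q: f"{n:,} words" for q, n in counts.items()})
```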
Improving Response Rates
• Facilitate a culture of assessment through data transparency, positive attitudes, effective communication, and shared goals
• Created a survey policy to reduce survey redundancy: Awaiting approval for campus‐wide distribution
• Survey calendar: Soon to be published on the web
• Faculty communication with students
• Incentives
• Streamlined evaluation process for student access
Increasing Future Responses
• Embedded data directly in the email invitation, with a customized ‘pre‐programmed’ course list for each student (Spring ’14 Evening I: 49% response rate); see the sketch after this list
• Encourage student participation by showing how feedback is used and through possible incentives (percentage points or an early grade preview)
• Communicating openly about response rates
• A shortened evaluation with skip and display logic
• Single sign‐on with Qualtrics
• Improved course evaluation (fewer questions, useful information)
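One way to prepare the embedded‐data mailing described in the first bullet is to build a contact list in which each student’s course list is carried as an embedded‐data field that the email invitation (and the survey’s display logic) can reference. This is a minimal sketch, assuming a hypothetical registration.csv export and made‐up field names (Email, FirstName, CourseSection, CourseList), not Qualtrics’ actual contact‐list schema:

```python
import csv
from collections import defaultdict

# Collect each student's course sections from a (hypothetical) registration
# export with "Email", "FirstName", and "CourseSection" columns.
courses = defaultdict(list)
names = {}
with open("registration.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        courses[row["Email"]].append(row["CourseSection"])
        names[row["Email"]] = row["FirstName"]

# Write a contact list with the course list in an embedded-data column
# ("CourseList" is an assumed field name) so each invitation can show the
# student's own courses.
with open("contact_list.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Email", "FirstName", "CourseList"])
    writer.writeheader()
    for email, sections in courses.items():
        writer.writerow({"Email": email,
                         "FirstName": names[email],
                         "CourseList": "; ".join(sorted(sections))})
```

Joining the sections into a single delimited field keeps one row per student, which is the usual shape for a personalized invitation upload.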
Please encourage your students…
• Share how you use course evaluations to improve student learning outcomes
• Let them know you value their time & insights
• Give reminders during Eval time (April 21‐May 9)
• Give specific requests for feedback (usefulness of textbook, presentations, assignments)
• Note the course code (e.g., ENG 1100 001) on the board
• Option: Offer extra credit, early grade preview, etc.
Office of Institutional Research & Effectiveness
jhannigan@methodist.edu
@MU_Insights
Jennifer Hannigan, MPhil. Research Analyst, Office of Institutional Research & Effectiveness
References
Burton, W., Civitano, A., & Steiner‐Grossman, P. (2012). Online versus paper evaluations: Differences in both quantitative and qualitative data. Journal of Computing in Higher Education, 24(1), 58‐69.
Heath, N. M., Lawyer, S. R., & Rasmussen, E. B. (2007). A comparison of web‐based versus pencil‐and‐paper course evaluations. Teaching of Psychology, 34, 259‐261.
Collings, D., & Ballantyne, C. (2004). Online student survey comments: A qualitative improvement? Paper presented at the 2004 Evaluation Forum, Melbourne, Australia.
Hmieleski, K., & Champagne, M. V. (2000). Plugging in to course evaluation. The Technology Source Archives, Sept./Oct.