Summary of the Marquette On-Line Course Evaluation System (MOCES): Fall Semester 2008

Gary Levy, Ph.D.
Associate Vice Provost of Institutional Research & Assessment
Office of Institutional Research & Assessment (OIRA)
Marquette University

Marquette On-Line Course Evaluation System (MOCES): Fall Semester 2008
• Overview of Fall 2008 MOCES Metrics
• Overview of Concerns about MOCES
• Questions about MOCES

Overview of Fall 2008 MOCES Metrics
• Numbers of Classes Evaluated (table 1)
• Response Rates (table 2)

Overview of Fall 2008 MOCES Metrics: Descriptive Statistics for the Four Core Questions

IAS Core Questions                                                | MOCES Core Questions
The course as a whole was:                                        | How was this class as a whole?
The course content was:                                           | How was the content of this class?
The instructor's contribution to the course was:                  | How was the instructor's contribution to this class?
The instructor's effectiveness in teaching the subject matter was:| How effective was the instructor in this class?

* Both the IAS and the MOCES used the following six-point scale for these four core questions: Excellent, Very Good, Good, Fair, Poor, Very Poor.

Descriptive Statistics for the Four Core Questions
• The Core Questions Combined Score (table 3)
• The Individual Four Core Questions (tables 4-7)
• Factor Structure & Internal Reliability of MOCES (table 8)

Overview of Concerns about MOCES
• Will response rates be lower using MOCES? (table 9)
• Will students' responses be more negative using MOCES? (chart 1)
• Did the point in time during the two-week period that evaluations were live influence students' evaluations using MOCES?
(charts 2 & 3)

Overview of Concerns about MOCES
• "When class time is allotted for the evaluations and students have the evaluation right in front of them, students will typically fill them out."
• "… students who take the time to fill out evaluations online are those who strongly dislike or really love their experience."
• "… for evaluations to be useful, they should poll every student, or at least a great majority of students, participating in the course."
• "If the university changed online evaluations … to address these issues, we would be happy to support the effort."
• "Flooding students' e-mail inboxes with reminders about online evaluations eats up space and, frankly, gets irritating."

Conclusions (The Trib's concerns):
• "When class time is allotted for the evaluations and students have the evaluation right in front of them, students will typically fill them out."
  – Not true, or at least not substantially.
• "… for evaluations to be useful, they should poll every student, or at least a great majority of students, participating in the course."
  – True, and this is what MOCES does and in-class paper methods do not. Response rates on MOCES were more than adequate, statistically speaking.
• "… students who take the time to fill out evaluations online are those who strongly dislike or really love their experience."
  – Marginally true, but a focus on response rate often overlooks issues of data validity and the statistical concept of sampling.
• "Flooding students' e-mail inboxes with reminders about online evaluations eats up space and, frankly, gets irritating."
  – True, and this did not affect students who completed all their course evaluations with MOCES.

Conclusions:
Compared to the three previous fall semesters (in-class, paper & pencil):
• More classes were evaluated using MOCES.
• Response rates were slightly lower, although they remained statistically "powerful" enough at the macro level using MOCES.
• Evaluations on the four core questions, and the combined core question score, were similar if not identical at the macro level using MOCES.
• Students' responses were not more negative at the macro level using MOCES.
• Evaluations on the four core questions, and the combined core question score, did not vary substantially at the macro level using MOCES as a function of the point in time during the two-week period that students completed their evaluations.
• Evaluations submitted earlier were not more negative than those submitted later.

Conclusions:
• Responses on the MOCES form a reliable and intuitive factor structure.
• Internal reliabilities (Cronbach's alphas) for the four core questions indicate an extremely high level of internal reliability / consistency.

Marquette On-Line Course Evaluation System (MOCES): Fall Semester 2008 Metrics
Thank you! Questions?
www.marquette.edu/oira/ceval

Questions or concerns?
Gary Levy
202 O'Hara Hall
gary.levy@marquette.edu
288-7906
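Appendix note: the internal-consistency conclusion above rests on Cronbach's alpha. For readers unfamiliar with the statistic, the computation can be sketched as follows; this is a minimal illustration using hypothetical ratings on the six-point scale (1 = Very Poor … 6 = Excellent), not the actual MOCES data, and the function name and sample matrix are assumptions for the example only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of scale scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of each respondent's total
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses from five students to four core questions
ratings = np.array([
    [6, 6, 5, 6],
    [4, 4, 4, 5],
    [5, 5, 5, 5],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
])
print(round(cronbach_alpha(ratings), 3))  # → 0.969
```

Values near 1.0, as reported for the four core questions, indicate that the items are measuring a single underlying construct consistently.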