CALIFORNIA LUTHERAN UNIVERSITY
ONLINE COURSE EVALUATION TEAM
FACULTY MEMBERS
Date and Location: Thursday, January 22, 2009, SBET 127
Time: 10:00 am – noon; lunch provided
Attendees:
Julius Bianchi, Carol Coman, Bruce Gillies, Herb Gooch, Veronica Guerrero, Ed Julius,
Halyna Kornuta, Henri Mondschein, Karissa Oien, Melinda Wright
Regrets:
Jim Bond, Deb Erickson, Paul Hanson, Myungsook Klassen, Paul Williams, Paul Witman
MINUTES
It is important to collect data. It is more important to review data. The ultimate goal is to use data to make data-informed decisions.
A. Members were welcomed and introductions were offered
a. It was suggested that Kris Butcher, Mike McCambridge, and Dru Pagliassotti also be invited to the next meeting.
B. Reviewed online course evaluations to date via the link on the Assessment Website
a. http://www.callutheran.edu/assessment/resources/CourseEvaluations.php
C. Discussed and responded to questions and issues raised by faculty (see below)
a. Some information came from Sam Thomas’ minutes from the December Faculty meeting
D. Report to faculty at the February Faculty meeting
QUESTIONS AND ISSUES
1. Response Rate
a. What are the response rates?
b. What is a valid response rate?
c. Investigate response strategies used by faculty
d. What is the minimum number of responses needed for the data to be usable by ART?
e. What is the profile of low response rate (RR) classes?
RESPONSES / ACTIONS
a. See Chart – Suggested changes made in blue
1. To keep people informed of the current response rates, place the thermometer on
the CLU home page and the CLU Portal – Matt Ward owns the web page
b. Review of Literature Handout – Henri Mondschein
1. Melinda will add the handout links to the Assessment Website
c. January 2009 Survey to 11 week fall session and 15 week fall semester faculty to learn
about their strategies to inform / encourage students to complete forms
1. The survey was passed out for members to comment on and offer suggestions.
2. Melinda will make the suggested changes to the survey.
d. This is a judgment call that ART has to make.
QUESTIONS AND ISSUES
2. Administration Time
a. When are the final two weeks of classes? The week of finals or the week before?
The Faculty Handbook states, "Student course evaluations are conducted during the final two weeks of classes" (Section Two, p. 26).
Question: Does the "final two weeks of classes" include finals week or not? The Registrar's office defines the last week of classes to include finals.
Concern: If CoursEvals are distributed during a window that includes finals week, the final exam may influence students' comments on CoursEval; for example, comments and scores may reflect how difficult the final was. Some faculty return finals before evaluations are completed.
b. When students drop, are they included in CoursEval?
3. Course Evaluation items
a. How does ART use the data received?
b. Review of items? Examples:
1) "uses highly effective teaching methods"
2) "The syllabus was…" [Disagree (1) Agree (5)]
3) "Compared to other courses, this course was…" [Excellent (1) Very Poor (5)]
4) Concern: the current procedure has too many variables and may not be statistically valid.
5) Rewrite items to better suit courses
c. Currently, classes with fewer than 5 students do not have a voice (for example, music instruction). Should they be included in some way?
RESPONSES / ACTIONS
Item 2:
a. Based on the memo that was distributed with the paper and pencil evaluations, the majority of those in attendance were under the impression that the final two weeks of classes do not include finals week.
1. Possible recommendation to take to the Provost: as a pilot study, distribute the Winter 09 evaluations during the two weeks before finals (weeks 9 & 10).
b. Class lists are checked up until the day prior to administration.
Item 3:
b. Bruce demonstrated a PowerPoint chart showing the ADEP means for the first 16 items on the course evaluations.
1. "Uses highly effective teaching methods" is consistently low and is the only item that uses the word "highly."
2. The assumption is that students do not understand the meaning of the question.
3. Possible actions to take to the T&L Committee:
a. Remove the word "highly" from the question
b. Remove the whole question
QUESTIONS AND ISSUES
4. Student Reminders
a. How many email reminders are sent?
b. Can there be a change in the bookmark text to
indicate responses are used for promotion and
tenure? What about adding the purpose of the
evaluations?
RESPONSES / ACTIONS
a. Announcement/Invitation/4-5 Reminders
b. Suggestions for changes to bookmark:
1. Simple, easy steps on the front. Add important, detailed information to the back
2. Instead of saying that the evaluations help with promotion, say that they are used to evaluate faculty.
3. Instead of the picture of the syllabus being “changed,” have it say “syllabus
improved”
4. Karissa is going to type a draft of the new bookmark and send to the members to
review
c. The key to getting students to buy into the new system lies with the instructors.
5. Student Comments
a. What does a text analysis of responses indicate? Example: ratio of positive to negative comments
b. Is there a difference between UG and G responses? If so, what is it?
RESPONSES / ACTIONS
a. Karissa is going to organize a focus group with undergraduate students. Herb mentioned doing the same for graduate students.
b. Idea Ed plans to try with his students: have students anonymously answer the question of whether or not they completed their Fall 08 course evaluations. If not, why?
6. CoursEval comparative data
a. How do the results received during the pilot compare to results received from paper and pencil administration (review comparative classes with faculty permission)?
b. Is there a relationship between grades received and evaluations given?
c. http://www.callutheran.edu/assessment/resources/septdec2008.php
d. Should the charts continue to be part of the public CLU website?
RESPONSES / ACTIONS
a. Communication 231
d. Idea for another comparison chart: compare adjunct responses to faculty responses. Bruce gave permission to use ADEP as the pilot.
7. CoursEval Faculty Report Data
a. Legend defining abbreviations needed
b. Terms not clearly defined (e.g., 1 = Disagree, 5 = Agree; 2, 3, 4 not defined)
c. What does "E" (excellent) really mean? How is it different from "good"?
RESPONSES / ACTIONS
a. Exists – highlight in faculty video
Augsburg Online Administration Dates

Date            Student Response Rate    # Students Surveyed    # Students Responded
Summer 2007     50%                      1416                   715
Fall 2007-UG    48%                      8322                   3966
Fall 2007-GR    54%                      3255                   1771
Spring 2008     41%                      7350                   3013
Summer 2008     50%                      2152                   1084

CLU Administration Dates (RR = response rate)

Administration    Date          Courses    Student RR    # Students Surveyed    # Students Responded    Class RR    # Classes Surveyed    # Classes Responded
Paper & Pencil    Summer 2007   11 week    62.9%         1072                   674                     73.7%       76                    56
Paper & Pencil    Fall 2007     11 week    77.7%         1496                   1163                    83.2%       95                    79
Paper & Pencil    Fall 2007     15 week    70%           10434                  7353                    83.2%       643                   535
Online            Summer 2008   11 week    50%           1182                   587                     100%        85                    85
Online            Fall 2008     11 week    54%           1426                   764                     100%        97                    97
Online            Fall 2008     15 week    62%           10266                  6348                    98.9%       590                   584
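The percentages in the CLU table are simply responded counts divided by surveyed counts, with some figures rounded to whole percentages. The following is a minimal illustrative sketch (not part of the original minutes) that recomputes the student and class response rates from the counts above, assuming rate = responded / surveyed:

# Illustrative sketch only: recompute CLU response rates from the raw counts,
# assuming rate = responded / surveyed. Row data is copied from the table above.
clu_rows = [
    # (administration, date, courses,
    #  students_surveyed, students_responded, classes_surveyed, classes_responded)
    ("Paper & Pencil", "Summer 2007", "11 week", 1072, 674, 76, 56),
    ("Paper & Pencil", "Fall 2007", "11 week", 1496, 1163, 95, 79),
    ("Paper & Pencil", "Fall 2007", "15 week", 10434, 7353, 643, 535),
    ("Online", "Summer 2008", "11 week", 1182, 587, 85, 85),
    ("Online", "Fall 2008", "11 week", 1426, 764, 97, 97),
    ("Online", "Fall 2008", "15 week", 10266, 6348, 590, 584),
]

for admin, date, courses, s_surveyed, s_responded, c_surveyed, c_responded in clu_rows:
    student_rr = 100 * s_responded / s_surveyed   # student response rate, percent
    class_rr = 100 * c_responded / c_surveyed     # class response rate, percent
    print(f"{admin:14s} {date:11s} {courses:7s} "
          f"student RR {student_rr:5.1f}%  class RR {class_rr:5.1f}%")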
15 Week Course Evaluation completions by day:

Prior to Finals Week:  9, 912, 381, 286, 576, 708, 559, 374, 531, 181, 126   (total: 4643)
Finals Week:           188, 444, 307, 361, 352, 53   (total: 1705)
11 Week Course Evaluation completions by day:

Prior to Finals Week:  24, 49, 46, 55, 53, 49, 38, 78, 25, 22   (total: 439)
Finals Week:           40, 31, 53, 46, 54, 20, 14, 61, 6   (total: 325)