CALIFORNIA LUTHERAN UNIVERSITY
ONLINE COURSE EVALUATION TEAM
FACULTY MEMBERS

Date and Location: Thursday, January 22, 2009 – SBET127; Friday, January 30, 2009 – HUM244
Time: 10:00am – 11:00am

Attendees:
1/22/09: Julius Bianchi, Carol Coman, Bruce Gillies, Herb Gooch, Veronica Guerrero, Ed Julius, Halyna Kornuta, Henri Mondschein, Karissa Oien, Melinda Wright
1/30/09: Bruce Gillies, Paul Hanson, Halyna Kornuta, Karissa Oien, Dru Pagliassotti, Paul Williams, Paul Witman, Melinda Wright

Regrets:
1/22/09: Jim Bond, Deb Erickson, Paul Hanson, Myungsook Klassen, Paul Williams, Paul Witman
1/30/09: Jim Bond, Kris Butcher, Deb Erickson, Myungsook Klassen, Mike McCambridge
It is important to collect data. It is more important to review data. The ultimate goal is to use data to make data-informed decisions.
AGENDA & MINUTES – THURSDAY, JAN 22 / FRIDAY, JAN 30
A. Members were welcomed and introductions were offered.
   a. It was suggested that Kris Butcher, Mike McCambridge, and Dru Pagliassotti also be invited to the next meeting.
B. Reviewed online course evaluations to date by providing a link to the Assessment Website.
   a. http://www.callutheran.edu/assessment/resources/CourseEvaluations.php
C. Discussed and responded to questions and issues raised by faculty (see below).
   a. Some information came from Sam Thomas' minutes from the December Faculty meeting.
D. Report to faculty at the February Faculty meeting.
QUESTIONS AND ISSUES
1. Response Rate
   a. What are the response rates?
   b. What is a valid response rate?
   c. Investigate response strategies used by faculty.
   d. What is the minimum number of responses needed for the data to be usable by ART?
   e. What is the profile of low response rate (RR) classes?
RESPONSES / ACTIONS
   a. See Chart – suggested changes made in blue.
      1. To keep people informed of the current response rates, place the thermometer on the CLU home page and the CLU Portal – Matt Ward owns the web page.
   b. Review of Literature handout – Henri Mondschein.
      1. Melinda will add the handout links to the Assessment Website.
   c. January 2009 survey to 11-week fall session and 15-week fall semester faculty to learn about their strategies to inform/encourage students to complete the forms.
      1. The survey was passed out for members to comment on and offer suggestions.
      2. Melinda will make the suggested changes to the survey.
   d. This is a judgment call that ART makes and communicates to faculty.

QUESTIONS AND ISSUES
2. Administration Time
   a. When are the final two weeks of classes – the week of finals or the week before?
      The Faculty Handbook states, "Student course evaluations are conducted during the final two weeks of classes" (Section Two, p. 26).
      Question: Does the "final two weeks of classes" include finals week or not? The Registrar's office defines the last week of classes to include finals.
      Concern: If CoursEvals are distributed to include finals week, the final may influence students' comments on CoursEval; for example, comments/scores will reflect how difficult the final may have been. Some faculty return finals before evaluations are completed.
   b. When students drop, are they included in CoursEval?

RESPONSES / ACTIONS
   a. Based on the memo that was distributed with the paper-and-pencil evaluations, the majority of the people in attendance were under the impression that the final two weeks of classes do not include finals week.
      1. Possible recommendation to take to the Provost: as a pilot study, distribute the Winter 09 evaluations during the two weeks before finals (weeks 9 & 10).
      2. Leanne reviewed this recommendation and is allowing the 11-week evaluations to be administered during weeks 9 & 10.
      3. Chart at the end of this agenda: Karissa will track how many evaluations are completed each day.
      4. Bruce: a line chart should be used to show the tracking of evaluations completed by day.
      5. The Faculty Handbook needs to be adjusted to reflect practice: the final two weeks do not include finals week.
   b. Class lists are checked up until the day prior to administration.

QUESTIONS AND ISSUES
3. Course Evaluation Items
   a. How does ART use the data received?
   b. Review of items? Examples:
      1) "Uses highly effective teaching methods"
      2) "The syllabus was…" [Disagree (1) – Agree (5)]
      3) "Compared to other courses, this course was…" [Excellent (1) – Very Poor (5)]
      4) Concern: the current procedure has too many variables and may not be statistically valid.
      5) Rewrite items to better suit courses.
   c. Currently, classes with < 5 students do not have a voice, for example, music instruction. Should they be included in some way?

RESPONSES / ACTIONS
   b. Bruce demonstrated a PowerPoint presentation chart that showed the ADEP means for the first 16 items on the course evaluations. Means included all data from 2005 to 2008.
      1. "Uses highly effective teaching methods" is consistently low and is the only question to use the word "highly."
      2. The assumption is that students do not understand the meaning of the question.
      3. Possible actions to take to the T&L Committee (Paul Williams):
         a. Remove the word "highly" from the question
         b. Remove the whole question
         c. Evaluate all of the questions
      4. Similar questions: "Experience a high degree…" and "Clearly explains the course material." Possible study: compare these questions for courses taught by the same instructor.
      5. This is the same issue with traditional undergrad. Paul H. said the students tend to base it on entertainment value. We need to be looking for clarity.
      6. There are no questions that ask "Did I learn something?" or "Did the course accomplish the learning outcomes/goals?" (tie it in to the learning outcomes that should be on the syllabus).
      7. There is a lot of subjectivity in the questions, e.g., "…foster active student…" Need clarity! Could take out "active."
      8. Dru questioned the comparison questions (#17 & #18).
      9. Melinda showed a mean chart that demonstrated the Fall 08 11-week and 15-week means for the first 16 items on the course evaluations.

QUESTIONS AND ISSUES
10. Student Reminders
   a. How many email reminders are sent?
   b. Can there be a change in the bookmark text to indicate responses are used for promotion and tenure? What about adding the purpose of the evaluations?

RESPONSES / ACTIONS
   a. Announcement / invitation / 4-5 reminders.
   b. Suggestions for changes to the bookmark:
      1. Simple, easy steps on the front; add important, detailed information to the back.
      2. Instead of saying that the evaluations help with promotion, say that they are used to evaluate faculty.
      3. Instead of the picture of the syllabus being "changed," have it say "syllabus improved."
      4. Karissa is going to type a draft of the new bookmark and send it to the members to review.
   c. The key to getting the students to buy in to the new system lies with the instructors.

QUESTIONS AND ISSUES
11. Student Comments
   a. What does a text analysis of responses indicate? Example: ratio of positive to negative.
   b. Is there a difference between UG and G responses? If so, what is it?

RESPONSES / ACTIONS
   a. Karissa is going to organize a focus group with undergraduate students. Herb mentioned doing the same for graduate students.
   b. Idea Ed plans to do with his students: have students anonymously answer the question of whether or not they completed their Fall 08 course evaluations, and if not, why.
   c. Action item: committee members will conduct an "informal" survey on a piece of paper to ask their students if they completed the online course evaluations last semester and, if not, why.

QUESTIONS AND ISSUES
12. CoursEval Comparative Data
   a. How do the results received during the pilot compare to results received from paper-and-pencil administration (review comparative classes with faculty permission)?
   b. Is there a relationship between grades received and evaluations given?
   c. http://www.callutheran.edu/assessment/resources/septdec2008.php
   d. Should the charts continue to be part of the public CLU website?

RESPONSES / ACTIONS
   a. Communication 231: Dru offered her opinions on her chart. Motivated students are the ones who will complete the evaluations; the assumption is that the apathy crowd did not do the evaluations. Paul H.: we are going to have course evaluation inflation with online evaluations. The graph is not really representative.
      Other courses to create comparison charts: Chemistry 151, History 121, EDCG526.
   d. Idea for another comparison chart: compare adjunct responses to faculty. Bruce gave permission to use ADEP as a pilot.

QUESTIONS AND ISSUES
13. CoursEval Faculty Report Data
   a. A legend defining abbreviations is needed.
   b. Terms are not clearly defined (e.g., 1 = Disagree, 5 = Agree; 2, 3, and 4 are not defined).
   c. What does "E" (excellent) really mean? How is it different from "good"?

RESPONSES / ACTIONS
   a. Exists – highlight in the faculty video.

Augsburg Online Administration Dates

Date           Student Response Rate   # Students Surveyed   # Students Responded
Summer 2007    50%                     1416                  715
Fall 2007-UG   48%                     8322                  3966
Fall 2007-GR   54%                     3255                  1771
Spring 2008    41%                     7350                  3013
Summer 2008    50%                     2152                  1084

CLU Administration

Date          Courses                   Student Response Rate   # Students Surveyed   # Students Responded   Class Response Rate   # Classes Surveyed   # Classes Responded
Summer 2007   Paper & Pencil, 11 week   62.9%                   1072                  674                    73.7%                 76                   56
Fall 2007     Paper & Pencil, 11 week   77.7%                   1496                  1163                   83.2%                 95                   79
Fall 2007     Paper & Pencil, 15 week   70%                     10434                 7353                   83.2%                 643                  535
Summer 2008   Online, 11 week           50%                     1182                  587                    100%                  85                   85
Fall 2008     Online, 11 week           54%                     1426                  764                    100%                  97                   97
Fall 2008     Online, 15 week           62%                     10266                 6348                   98.9%                 590                  584
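
For reference only (not part of the original minutes), each response rate above is simply the number of students or classes that responded divided by the number surveyed. A minimal Python sketch recomputing the Fall 2008 student response rates from the CLU table; the variable names are illustrative.

# Recompute the Fall 2008 student response rates from the CLU table above.
fall_2008 = {
    "11 week": {"surveyed": 1426, "responded": 764},
    "15 week": {"surveyed": 10266, "responded": 6348},
}

for session, counts in fall_2008.items():
    rate = 100 * counts["responded"] / counts["surveyed"]
    # Prints 53.6% and 61.8%, which the table rounds to 54% and 62%.
    print(f"Fall 2008 {session}: {rate:.1f}% student response rate")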
15 Week Course Evaluation completions by day:
Prior to finals week (daily counts): 9, 912, 381, 286, 576, 708, 559, 374, 531, 181, 126 – total 4643
Finals week (daily counts): 188, 444, 307, 361, 352, 53 – total 1705
11 Week Course Evaluation completions by day:
Prior to finals week (daily counts): 24, 49, 46, 55, 53, 49, 38, 78, 25, 22 – total 439
Finals week (daily counts): 40, 31, 53, 46, 54, 20, 14, 61, 6 – total 325
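
As a small arithmetic check (not part of the original minutes), the last figure in each list above is the total of the daily counts. A minimal Python sketch, assuming the counts as listed:

# Confirm that each listed total equals the sum of the daily completion counts.
completions = {
    "15 week, prior to finals week": ([9, 912, 381, 286, 576, 708, 559, 374, 531, 181, 126], 4643),
    "15 week, finals week": ([188, 444, 307, 361, 352, 53], 1705),
    "11 week, prior to finals week": ([24, 49, 46, 55, 53, 49, 38, 78, 25, 22], 439),
    "11 week, finals week": ([40, 31, 53, 46, 54, 20, 14, 61, 6], 325),
}

for label, (daily, total) in completions.items():
    assert sum(daily) == total, label
    print(f"{label}: {total} completions over {len(daily)} days")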