Richard Dean's Presentation

ICE Evaluations
Some Suggestions for Improvement
Outline
• Background information and assumptions
• Content of evaluation forms
• Logistical problems with processing ICE information
Background Information
• Exchange of e-mails by professors last summer
• Arts and Sciences “Task Team” currently looking
at various ways of evaluating teaching
• My points here are mostly compatible with both
Background Assumptions
• Student Evaluations will continue to be used
• They will be used for two purposes:
– Instructors’ own improvement of courses and teaching
– Assessment of teachers by administrators
• We should make ICE evaluations as effective as
possible for both purposes
Suggestions About Content of ICE Forms
(go to evaluations file)
Remove the “One Number” Overall Average at Bottom of Page
• It gives less information, not more
• It is all people will look at if it’s available:
– Administrators assessing teachers
– Teachers planning future courses
• Not all the categories have to do with the instructor, so it’s unfair to assign these ratings to the instructor
• FAS Task Team unanimously agreed
Keep the Text of Individual Questions
• In some formats of ICE reports, the
questions are missing
• This encourages looking only at numbers
• So include the actual questions
Why Not Also Get Rid of the “Category” Average Numbers?
• All the same reasons apply
• But if this is too much, then really, please,
please get rid of the “one number” average
Some Specific Questions on ICE Form Need Revision
• #20 “The Material was not too difficult”
means that the highest rating is for
material that is far too easy
• Combine questions 18-20 to make
question “The difficulty and pace of the
course were appropriate”
• Question #10 “Demonstrated Favorable
Attitude toward students”
• Task Team recommendation: change to
“treated students with proper respect”
• Reason: the old wording favors teachers
who are lenient about, for example,
plagiarism, arriving to class late, talking
during class…
Other questions to revise
• #7 “Was readily available for consultation
outside of class”
• Question #12 “Evaluated Work Fairly”
Too Many Questions
• Researchers seem to agree with the common-sense idea that too many questions on an evaluation form lead students to give up
• Some ICE questions seem repetitive or unnecessary
How to include fewer questions
• Again, combine Questions 18-20 to make
question “The difficulty and pace of the
course were appropriate”
• Drop Questions #15 and #16 about stating and covering the objectives of the course, since #17 “Course organization was logical and adequate” covers these
“Additional Items” on ICE form
• After the university-wide questions, a
section of “additional items” is included
• Currently, each faculty (FAS, Engineering,
etc.) can choose from an “item bank” of
approved questions
• Instead, each department should choose
any questions they want, whether from
item bank or not
Why let Departments Choose?
• Departments are in the best position to
design questions that are appropriate for
their discipline
• For example, why think that the same
questions would be appropriate to a
chemistry course, an education course,
and an English literature course?
• Too much bureaucratic regulation is not
beneficial to a university
Logistical Problems with Processing ICE Information
Course evaluations are often “lost” or assigned to the wrong course
• Instructors have students fill out evaluation forms, then no ICE report appears for that course
• Has happened at least five times in
philosophy department in three years
• Other professors reported the same
problem in last summer’s e-mail exchange
The cause?
• If students fill in the wrong section number, department number, or course number, then all the evaluations are automatically assigned to the wrong course (or to no course)
The Solution
• Is not to assign blame (as in “Well, this is
the department’s fault, because the
graduate assistant who gave the
evaluations must have told students the
wrong numbers”)
• But instead is to try to redesign the system so that this mistake (which is easy to make) does not lead to corruption of data
The Solution (part II)
• A simple but less effective solution: tell all instructors to give the course information to students directly, e.g. by writing it on the board (this at least makes instructors responsible)
• A (slightly) more difficult but more effective solution: have some kind of “cover sheet” for each course, which the computer will read. If the individual ICE forms disagree with the information on the cover sheet, automatically assign them to the correct course (see the sketch below)
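A minimal sketch of how the cover-sheet idea could work. Everything here is hypothetical: the CoverSheet and StudentForm record formats, the field names, and the assumption that each scanned batch of forms arrives together with its course’s cover sheet are illustrative only, not part of the actual ICE scanning system.

```python
from dataclasses import dataclass, field

@dataclass
class CoverSheet:
    # Course identifiers verified by the instructor/department (hypothetical format)
    department: str
    course: str
    section: str

@dataclass
class StudentForm:
    # Identifiers as bubbled in by the student; may be wrong or blank
    department: str
    course: str
    section: str
    ratings: dict = field(default_factory=dict)

def assign_batch(cover: CoverSheet, forms: list[StudentForm]) -> list[StudentForm]:
    """Force every form in a scanned batch onto the course named on the
    cover sheet, regardless of what the student wrote."""
    cover_ids = (cover.department, cover.course, cover.section)
    for form in forms:
        if (form.department, form.course, form.section) != cover_ids:
            # Student mis-bubbled the identifiers: override with the cover sheet
            form.department, form.course, form.section = cover_ids
    return forms

# Example: one student wrote section "3" instead of "2"; the form still lands in section 2
cover = CoverSheet("PHIL", "201", "2")
forms = [StudentForm("PHIL", "201", "2"), StudentForm("PHIL", "201", "3")]
assert all(f.section == "2" for f in assign_batch(cover, forms))
```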
A More Widespread Problem
• When the evaluations for a course are
mysteriously absent, sometimes evaluations
from one or two (or more) students appear
anyway
• Or, when a teacher doesn’t administer
evaluations, she still gets results from one or two
students anyway
• And probably this “phantom evaluation” process
occurs, undetected, in MOST courses
Cause of Phantom Evaluations
• It’s the same cause as for the missing
evaluations for a whole course
• If one or two (or more) students write the wrong
course numbers, their evaluations will be
assigned to the wrong course (even if all the rest
of the student forms go to the right course)
• This probably happens VERY OFTEN
• So it’s all the more reason to fix the problem
How to Avoid the “Phantom Evaluation” Problem
• The same way as avoiding the larger-scale misassignment of evaluations to wrong courses
• Have some kind of “cover sheet” for each course, which the computer will read. If the individual ICE forms disagree with the information on the cover sheet, automatically assign them to the correct course (as in the cover-sheet sketch above)
Another Logistical Problem
• The ICE report includes a “response rate” indicating the percentage of enrolled students who filled out an evaluation form
• But for at least two of the last four semesters, these figures have been inaccurate
Why is the “Response Rate” Often Inaccurate?
• The response rate is, of course, meant to be an
indication of the percentage of students enrolled in the
course who actually fill out the ICE form
• But the total number of “enrolled students” is not
accurate
• The AUBsis site in fall 2003-2004 and fall 2004-2005
gave a total number of enrolled students at the
BEGINNING of the term, not at the end
• So any students who dropped the class were still
included in the “enrolled students” total
• So suppose 25 students were enrolled at the beginning of the term, but 5 dropped, and suppose 15 students filled out the ICE form. The official “response rate” would be 60%. But the real response rate, of students still enrolled, would be 75% (worked through in the sketch below).
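The arithmetic in that example, written out as a tiny sketch. The function and variable names are illustrative only, not anything from AUBsis or OIRA reporting.

```python
def response_rate(forms_returned: int, enrolled: int) -> float:
    """Percentage of enrolled students who returned an ICE form."""
    return 100.0 * forms_returned / enrolled

enrolled_at_start = 25   # enrollment at the BEGINNING of term (includes later drops)
dropped = 5
forms_returned = 15

official = response_rate(forms_returned, enrolled_at_start)            # 60.0
real = response_rate(forms_returned, enrolled_at_start - dropped)      # 75.0
print(f"official: {official:.0f}%  real: {real:.0f}%")  # official: 60%  real: 75%
```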
Solution to the “response rate” problem
• If OIRA uses the AUBsis information for
this, OIRA and the registrar should
coordinate the uses to which the data will
be put. So the “enrolled students” number
must reflect the number of students
enrolled at the end of the term, not the
beginning.
OIRA office responses to faculty
OIRA has not Responded to Faculty Correspondence About Problems
• A delicate issue
• Numerous examples
• Why it matters
• Solution? I admit I don’t know. Maybe a full-time office manager?
One final issue: Use of ICE Reports
• Literature on evaluations often mentions proper
use by administrators
• A quick glance is worse than no information at all
• Items to focus on: percentage of students
responding; type of course (graduate vs.
undergrad, introductory vs. advanced); particular
questions; distribution of answers (are one or
two terrible ratings dragging average down?)
• NOT ONE NUMBER