Instructor Course Evaluations (ICEs) at AUB
Development, Research, Faculty and Student Perceptions
Karma El Hassan, PhD, OIRA
Outline

 History of ICE Development at AUB
 Processing of ICEs
 Summary of Research on:
– Student Evaluations
– ICEs
 Student Perceptions: Survey Results
 Faculty Perspective:
– E-mail exchanges
– Survey results
I. Development of ICEs

 The ICE forms in use were developed four years ago in collaboration between faculty and OIRA.
 The ICE form includes items covering:
– instructor, course, learning outcomes (core),
– additional items.
 Based on a review of the literature, OIRA proposed a set of core items that were discussed, revised, and finally approved by the various faculties.
 As to additional items, faculties first decided on a list of categories relevant to their courses, then selected relevant items from an item bank provided by OIRA.
II. Processing of ICEs - 1

 Obtain initial course lists from Banner.
 As Banner is not updated with more recent information or changes to the schedule, we request the information from deans' offices.
 Get initial course/instructor/section lists from the dean's office of every faculty.
 Verify these with the respective departments, as departments sometimes make changes such as pooling/canceling sections, assigning different instructors, etc.
 Finalize our database of course/instructor/section numbers.
 Use this database to issue envelope labels (and later reports) for every evaluation.
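The label-issuing step above can be sketched in code. This is a minimal illustration only: the record fields and the sample data are hypothetical stand-ins for the actual database built from Banner and the deans' offices.

```python
# Minimal sketch of the envelope-label step described above.
# Field names and the sample record are hypothetical illustrations;
# the real database is compiled from Banner and deans' office lists.
from dataclasses import dataclass

@dataclass
class CourseSection:
    instructor: str
    course: str       # e.g. "AGRL201"
    section: str      # two-digit section code, e.g. "01"
    dept_code: str    # two-digit department code
    enrolled: int     # number of registered students

def envelope_label(cs: CourseSection) -> str:
    """Format one envelope label from a finalized database record."""
    return (f"{cs.instructor}\n"
            f"{cs.course} Sect.: {cs.section}\n"
            f"# of Students: {cs.enrolled}\n"
            f"Dept. Code: {cs.dept_code}")

sections = [CourseSection("Dr. Mustapha Haidar", "AGRL201", "01", "01", 39)]
for cs in sections:
    print(envelope_label(cs))
```

Keeping a single finalized record per course/instructor/section makes the later steps (labels, scanning, reports) consistent with one another.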
Sample Envelope Label / Answer Sheet

Dr. Mustapha Haidar
AGRL201 Sect.: 01
# of Students: 39
Dept. Code: 01
Category: Large Lecture Class

[The scannable answer sheet carries bubble grids for the general information: Faculty (FAFS = 1, FAS = 2, FEA = 3, FM = 4, FHS = 5, SB = 6, SNU = 7, Others = 8), a two-digit Department code, a three-digit Course number with an optional letter suffix (A–E), and a two-digit Section number.]
Instructions to Graduate Assistants - 1

I. Steps to be Followed in Instructor Course Evaluation (ICE) Administration
1. The Graduate Assistant should arrange the ICE administration with the course instructor. The questionnaires should be given in the last two weeks of classes.
2. The ICE should be administered in the absence of the course instructor.
3. Tell the class that the instructor will not get results until after grades have been submitted.
4. Students should use pencils to complete the forms.
5. The Graduate Assistant reads the instructions to students (following second page) and guides them while filling in the general information: faculty of the course (not of the student), department (the department code is printed on the envelope label), and course and section numbers. For example, Arabic 201, Section 1:
– Under faculty, A&S (#2) should be blackened.
– Under department, Arabic (#03) should be blackened.
– Under course, 201 should be blackened.
– Under section, #01 should be blackened.
It is important that students enter their own section number correctly. In the case of multi-instructor sections and large lecture courses, please check the section code given on the envelope label.
6. The Graduate Assistant answers any questions raised.
7. No discussion of ratings among students should be allowed.
8. Students should be given the time needed to complete the questionnaires.
9. After students have finished, collect the surveys and thank the students for their cooperation. Students should be told that the materials will be returned directly to the department secretary or chair and then to OIRA. The instructor will NOT get the results or forms until after grades have been submitted. Place everything in the envelope, seal it, and give it to the departmental secretary or chair to be forwarded to OIRA.
Instructions to Graduate Assistants - 2

II. Administration Instructions to be Read by Graduate Assistants:
1. These forms are used to provide information to the instructor and to the University on the instructor, the course, and student development in the course. They are intended to help instructors improve their own teaching, and to help the University in decisions regarding appointment and promotion. Your input is essential for the improvement of the teaching-learning process. Therefore, please read every question carefully and answer in the appropriate space on the computer-scannable sheet by blackening the corresponding circles.
2. First, fill in the general information, i.e. faculty of the course (not of the student), department, course #, and section # (enter your own section number correctly).
3. The Graduate Assistant should provide this information; for example, for Arabic 201, Section 1:
– Under faculty, blacken 2.
– Under department, blacken 03 (the department code is printed on the envelope; a list is also enclosed).
– Under course, blacken 201.
– Under section, blacken 01.
It is important that students enter their own section number correctly. In the case of multi-instructor sections and large lecture courses, please check the section code given on the envelope label.
4. Answer the questions starting with item #1 on the computer sheet, using pencils.
5. Provide your comments on the attached sheet.
6. There is no need to provide your name or ID.
7. Take the time you need to complete the questionnaire, and hand it in when you are ready.
8. Do not discuss ratings with your colleagues while filling out the questionnaire.
9. The instructor will not know the results of these ratings until after the semester is over and grades have been submitted.
Processing of ICEs - 2

 Provide GAs with explicit instructions on how to administer the ICE, especially with respect to the importance of proper coding by students.
 Envelopes are sent to departments 3-4 weeks before the end of term.
 Once received, sheets are checked for the accuracy of the filled-in course/section information.
 If discrepancies are found, they are corrected.
 Scan the forms and report the results.
Processing Problems

 Course sections taught by more than one instructor; we are informed only after reports are released.
 GAs pool sections together while administering, and codes get mixed up.
 Departments change instructors without informing us.
 Students attend sections other than their own and put their own section code on the evaluation. This results in response rates of more than 100%.
 GAs do not abide by the label information printed on the ICE envelope.
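Several of the problems above surface as a response rate over 100% for a section. A simple sanity check can flag such sections during processing; the enrollment and returned-form counts below are made-up illustration data.

```python
# Hypothetical sanity check for the coding problems listed above:
# flag sections whose returned-form count exceeds enrollment,
# since a response rate above 100% signals mis-coded sections.
def response_rate(returned: int, enrolled: int) -> float:
    """Response rate as a percentage of enrolled students."""
    return 100.0 * returned / enrolled

# (enrolled, returned) pairs are invented for illustration
sections = {"AGRL201/01": (39, 44), "AGRL201/02": (35, 30)}
for sec, (enrolled, returned) in sections.items():
    rate = response_rate(returned, enrolled)
    if rate > 100.0:
        print(f"{sec}: {rate:.0f}% -- check section coding")
```

Flagged sections can then be reconciled against the envelope-label database before reports are issued.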
III. Summary of Research on Student Evaluations (SE) - 1

 SE are used extensively on college campuses (Marsh, 1987; Seldin, 1993).
 Approximately 86% of liberal arts colleges and 100% of large research universities systematically collect SE (Seldin, 1999).
 Authors who have researched them agree that they are the single most valid source of data on teaching effectiveness; in fact, there is little evidence of the validity of any other source of data (McKeachie, 1997).
III. Summary of Research on Student Evaluations (SE) - 2

 The validity of student ratings has been sufficiently well established (Marsh, 1984; Arubayi, 1987).
 The focus of research has shifted more recently to the study of specific background characteristics (biasing variables) which might harm validity (Wachtel, 1998).
 Characteristics associated with:
– administration of student evaluations (Feldman, 1978; Chen & Hoshower, 1998; Wachtel)
– the course itself (Marsh & Dunkin, 1992; Braskamp & Ory, 1994; Anderson & Siegfried, 1997)
– the instructor (Anderson & Siegfried, 1997; Wachtel)
– students themselves (Tatro, 1995; Chen & Hoshower, 1998).
III. Summary of Research on Student Evaluations (SE) - 3

Variables with little/no effect on SE:
1. Instructor (age, sex, teaching experience)
2. Student (age, sex)
3. Course (time of day)

Variables affecting SE:
A. Course
1. University-required vs. elective course
2. Higher- vs. lower-level course
3. Class size
4. Discipline
B. Student grade expectation
Grade Expectancy and Ratings





The effect of a student’s expected grade on evaluation of his/her
teacher in that course.
Studies generally assert that there is a positive correlation between
expected grade and student ratings.
The mere existence of a correlation between a background variable and
rating scores does not necessarily constitute a bias or a threat to the
validity of SE.
It does not necessarily follow that an instructor can obtain higher
ratings merely by giving higher grades.
Alternative explanations include: (1) the leniency hypothesis
(instructors can ‘buy’ better evaluations by giving higher grades); (2)
the validity hypothesis (more effective instructors cause students to
work harder, thereby earning higher grades); and (3) the student
characteristic hypothesis (pre-existing student characteristics such as
prior subject interest affect both teaching effectiveness and student
ratings).
13
III. Summary of Research: Conclusion

 The literature supports that students can provide valuable information on teaching effectiveness, given that the evaluation is properly designed.
 There is broad consensus that students cannot judge all aspects of faculty performance.
 Students should not be asked 'high-inference' questions, such as judging whether the materials used in the course are up to date or how well the instructor knows the subject matter of the course.
III. Summary of Research: Conclusion - 2

 An important quote from McKeachie (1997):
'Classes differ. Effective teaching is not just a matter of finding a method that works well and using it consistently. Rather, teaching is an interactive process between the students and the teacher. Good teaching involves building bridges between what is in your head and what is in students' heads. What works for one student or for one class may not work for others.'
Summary of ICE Validity Research

 Reliability for all subscales was good (r = .90-.96).
 Content/construct validity evidence.
 Differences by:
1. Gender (F > M)
2. Class (HL > LL)
3. Course (elective > univ. required)
4. Subject (SS/HU > Eng./Sc.)
 Grade expectations were high: 70% expected ≥ 80.
 Grade expectation had a low negative correlation with rating (r between -.18 and -.22); students with high expectations gave lower ratings.
 The correlation between grade and rating was low (.18-.25).
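Correlations like those reported above are standard Pearson coefficients. As a minimal sketch, with invented illustration values rather than actual ICE data:

```python
# Minimal sketch of the Pearson correlation behind figures like the
# grade-expectation/rating correlations above. The data lists are
# invented illustration values, not actual ICE results.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

expected_grade = [85, 90, 75, 80, 95, 70]   # hypothetical expected grades
rating = [4.2, 3.8, 4.5, 4.0, 3.6, 4.4]     # hypothetical ICE ratings (1-5)
print(round(pearson(expected_grade, rating), 2))
```

A value near zero, as in the reported ranges, indicates only a weak association between expected grade and rating.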
IV. Faculty Perspective: Summary of Issues in E-mail Exchanges - 1

 'Grading leniency' and the need for a correcting factor for 'difficult courses'.
 Need to incorporate other measures of teaching effectiveness beyond ICEs, such as input from graduating students, alumni, etc.
 Flaws in collection, administration, and processing/reporting (especially in sections taught by more than one instructor).
 Students' attitude: they do not take evaluations seriously; a popularity contest.
 Need to take into account differences due to type of course (required/elective), subject matter, student level, and faculty workload.
IV. Faculty Perspective: Summary of Issues in E-mail Exchanges - 2

 Need to define exactly what the purpose of the evaluations is.
 Timing of the evaluation: it should be conducted earlier in the semester.
 Some items encourage 'academic tourism' and should be given less weight. 'High-inference' items requiring subjective judgment should be minimized.
 Reporting issues: averages are skewed high and may not adequately discriminate.
 Some suggested removing averages per section and for the whole ICE.
 Response rates sometimes exceeded 100%.
IV. Faculty Perspective: Survey Findings - 1

 The majority value input from ratings and make use of them to improve their courses.
 Few agree that they should be used for making personnel decisions regarding salary and promotion.
 More than 50% believe that faculty change their teaching to receive high ratings.
 Around 40% assert that what is addressed in class may be determined by the content of the ratings.
 Nearly 50% do not believe that the ratings result in negative consequences such as reduced faculty morale and job satisfaction.
 50% disagree that ratings are a 'meaningless activity', while a third agrees with that statement.
IV. Faculty Perspective: Survey Findings - 2

 The majority believe that demanding a lot from students will result in lower evaluations.
 Opinion was split with respect to 'Good instructors get high course evaluations'.
 As to faculty views of the process students use to fill out the ratings, the majority believe that students do not take evaluations seriously.
 Around half perceive that students do not have enough knowledge to judge the quality of instruction.
 Most faculty disagree with 'faculty members should not be evaluated by students'.
IV. Faculty Perspective: Survey Comments - 1

 Enhance communication about the system.
 Revisit the questionnaire: it was commented that i) the wording of some items is not clear, and ii) items are not tailored or contextualized to the particular needs of courses, so the information provided is not very useful.
 Expand the system to include other means of evaluating faculty performance. The current system should be one of many, alongside peer review and others.
 Revise the ICE administration/collection procedures now in use.
 Faculty value evaluations and make use of them to make adjustments, especially the comments.
IV. Faculty Perspective: Survey Comments - 2

 Trend analysis, recently reported over a number of semesters, provided good indices of teaching effectiveness.
 Faculty cited some biases associated with the system, e.g. 'ratings affected by mid-term exam grades and difficulty of course', 'had to water down content of courses to make students happy and have fun', 'students want easy way out', etc.
 Although faculty agreed with the need to be evaluated by students, they questioned the seriousness with which students responded.
Student Perceptions: Survey Findings - 1

 70% perceive the evaluations as a means for making suggestions for improvement.
 Around half believe that faculty value input from them and make improvements as a result of the weaknesses identified.
 Around two-thirds perceive them as an effective means of evaluating faculty and do not agree that the ratings are a meaningless activity.
 50% do not believe that faculty alter their teaching to get high evaluations.
 They believe that the content of the rating form affects what is addressed in class.
Student Perceptions: Survey Findings - 2

 A very high percentage:
1. perceives the rating process as allowing honest evaluations;
2. states that they give adequate thought and effort to it;
3. asserts that they are fair and accurate in their evaluations.
 However, when asked about their peers' attitude, opinion is equally divided between agree, disagree, and uncertain.
 The 5-point rating scale is well understood by students.
 Around half opt for a '3' when they are undecided or uninterested; however, another 40% disagree with their colleagues with regard to the use of '3'.
Student Perceptions: Survey Comments - 1

Recommendations were provided to:
1. revise the form, make it shorter, and differentiate it from one major to another;
2. administer ICEs earlier in the semester in order to give the faculty member a chance to adjust and students a chance to see the effect of the evaluation;
3. improve the administration by Graduate Assistants;
4. introduce electronic submission, as it better protects the identity of students, especially while writing comments;
5. improve communication regarding the evaluations, e.g. 'students must be informed about the importance of the surveys' and 'should be told how and why the results of the ICE will be used';
6. publicize the results of the evaluations. Students felt that 'it is better to register with a certain instructor because of numerical data rather than campus rumors', etc.
Student Perceptions: Survey Comments - 2

Students believe the ratings are important but raised the following concerns:
1. Evaluations are not made use of, and accordingly students do not take them seriously. Comments included:
– 'we need to feel that what we say matters',
– 'we students feel that evaluations are useless as not a single time there is a consideration of our preferences',
– 'I point out positive and negative points of instructor hoping that somebody listens to what I have to say however thinking that nobody really cares',
– 'I don't think it matters what I write about the professor',
– 'most students fill form with inappropriate choices because they don't believe that the surveys will do any good', and
– 'we never really know how ratings are used and if instructors themselves take them into consideration'.
2. Some students fill them in inaccurately. Several reasons were cited, such as 'lack of time', 'pressure from GAs', 'students rather finish in haste and get out of class sooner', 'some professors make light of evaluations which leads us students to think that they do in fact make nothing', and 'personal likes and dislikes affect ratings'.