ICE Summary Report Fall/Spring 2009-10 

Instructor Course Evaluations
Fall/Spring 2009-10 Report
The Instructor Course Evaluation System (ICES), prepared by the Office of Institutional
Research & Assessment (OIRA) and approved by the Senate, was administered in paper
form in fall and spring 2009-10 in all faculties, with the exception of the Faculty of
Medicine (FM).
The Instructor Course Evaluation Questionnaire (ICE)
The items used in the 2001-2 administrations were used this year with some major
revisions introduced in consultation with various faculties. The ICE includes the
following components:
1. Student background items covering gender, class, required/elective status, expected
grade in the course, and number of hours per week worked for the course.
2. Core items (17) included in all forms. These are generic items that can apply to all
courses irrespective of course design or size, and they can be used for normative
scores and for comparisons across courses and over time to show improvement. They
cover the instructor (8 items), the course (6), and student learning outcomes (3), in
addition to the global evaluation items.
3. Specific items (11-12) selected by the department/faculty from an item bank depending
on the type of course (lecture, seminar, lab, studio) and its size. The item bank includes
specific items for large lecture courses, for lab/studio/clinical teaching classes, and for
discussion classes. In addition, it includes extra items on instructional methodology,
student interaction and rapport, feedback and evaluation, assignments, and student
development. Items selected from the bank supplement the core questionnaire
depending on the type of course and the kind of information required.
4. Open-ended questions focusing on instructor and course strengths and weaknesses
and requesting suggestions for improvement.
ICE Administration
The ICE was administered in the last four weeks of the fall and spring semesters. Detailed
instructions for graduate assistants, outlining the steps of administration and the
instructions to be read to students, were sent with the departmental packages. Students
were assured of the confidentiality of their responses and prompted to take the
questionnaire seriously. The ICE was given in a total of 1522 and 1490 course sections in
fall and spring, respectively, and a total of 26,602 and 22,007 student evaluations were
filled out; the fall figure is higher than the usual 22,000 for fall terms. A breakdown of the
sample of students by class, reason for taking the course, and expected grade is reported
in Table 1. Table 2 provides the detailed breakdown of the surveyed population of courses
and the percentage of course sections with a response rate of at least 40% by faculty, and
also reports the mean response rate per faculty, while Table 3 provides the breakdown by
department. The percentage response rate is calculated as the share of course sections
with a response rate of 40% or higher. For the surveyed sample, this percentage was 97%
for fall and 92% for spring, with faculty rates ranging between 81% and 100% for both
terms. With respect to departmental response rates, the lowest were in Physics,
Chemistry, and Statistics (STAT) in the fall term, and in the same departments plus
Economics (ECON) in the spring term. Tables 2 and 3 also report the mean response rate
for all course sections by faculty and department. Mean response rates by faculty range
between 65% and 88%, with FHS and Nursing obtaining the highest response rates in the
spring and fall terms, respectively.
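As an illustration of how the summary figures in Tables 2 and 3 relate to the raw section
counts, the sketch below computes, for each faculty, the percentage of course sections
reaching the 40% response-rate cutoff and the mean response rate. It is a minimal
example only, not OIRA's actual procedure, and the record fields (faculty,
forms_returned, enrollment) are hypothetical.

# Minimal sketch: per-faculty response-rate summaries from per-section records.
from collections import defaultdict

def summarize(sections, threshold=0.40):
    """sections: iterable of dicts with 'faculty', 'forms_returned', 'enrollment'."""
    by_faculty = defaultdict(list)
    for s in sections:
        # response rate of a single course section
        rate = s["forms_returned"] / s["enrollment"]
        by_faculty[s["faculty"]].append(rate)

    summary = {}
    for faculty, rates in by_faculty.items():
        above = sum(1 for r in rates if r >= threshold)
        summary[faculty] = {
            "sections": len(rates),
            "sections_40_plus": above,
            "pct_40_plus": round(100 * above / len(rates)),        # e.g. 97 for fall
            "mean_response_rate": round(100 * sum(rates) / len(rates)),  # e.g. 76
        }
    return summary

# Example use (hypothetical data):
# summarize([{"faculty": "FAS", "forms_returned": 18, "enrollment": 30}, ...])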
Table 1: ICE (Fall and Spring 2009-10) Sample Description

Faculty   %F     %S      Class         %F     %S      Reason for taking course   %F     %S
FAFS      5.9    7.8     Freshman      7.9    7.2     Required from major        57.8   53.3
FAS       56.2   56.5    Sophomore     29.4   27.9    Elective from major        13.5   15.1
FEA       20.6   18.4    Junior        24.2   25.8    Elective outside major     13.2   15.7
FHS       5.9    4.4     Senior        22.6   24.7    Required outside major     9.3    10.5
OSB       9.6    10.5    4th Year      6.4    5.5     University required        5.2    4.5
SNU       1.7    2.3     5th Year      .5     .4
                         Graduate      7.0    6.4
                         Special       1.3    1.3
                         Prospective   .3     -

Expected Grade   %F     %S      Number of hours   %F     %S
>= 90            13     15.4    <= 3              38.7   36.9
85-89            24.5   26.9    4-6               38.6   41.6
80-84            31.3   31.7    7-10              13.1   12.8
70-79            25.5   21.5    > 10              7.7    6.6
< 70             4.6    3.4
Table 2: Surveyed Population of Courses & Response Rates by Faculty

                               Courses          Sections >= 40%    % >= 40%         Mean Resp. Rate
Faculty                        F       S        F       S          F       S        F       S
Agricultural & Food Sciences   82      93       82      91         100     98       78%     78%
Arts & Sciences                824     732      767     653        93      89       70%     68%
Business                       140     133      131     129        94      97       68%     70%
Engineering & Architecture     265     256      245     208        93      81       74%     69%
Health Sciences                71      49       71      48         100     98       86%     88%
Nursing                        19      31       19      27         100     87       80%     72%
AUB                            1401    1294     1315    1156       97      92       76%     74%
Table 3: Response Rates & Courses Surveyed by Department

                     Count of Courses    >= 0.4            % >= 0.4           Mean Resp. Rate
Faculty    Dept.     F        S          F        S        F        S         F        S
FAFS       AGSC      21       31         21       31       100%     100%      75%      82%
FAFS       ANSC      8        11         8        11       100%     100%      83%      81%
FAFS       LDEM      25       22         25       22       100%     100%      76%      75%
FAFS       NFSC      28       29         28       27       100%     93%       80%      74%
FAS        AMST      3        4          3        4        100%     100%      85%      -
FAS        ARAB      33       35         33       31       100%     89%       78%      68%
FAS        AROL      10       11         10       11       100%     100%      -        -
FAS        BIOL      57       34         53       32       93%      94%       -        -
FAS        CHEM      39       39         21       26       54%      67%       -        -
FAS        CHIN      1        -          1        -        100%     -         -        -
FAS        CMPS      38       34         36       28       95%      82%       65%      63%
FAS        CVSP      62       63         61       62       98%      98%       76%      70%
FAS        ECON      74       40         60       26       81%      65%       60%      52%
FAS        EDUC      37       33         37       30       100%     91%       82%      73%
FAS        ENGL      136      128        135      125      99%      98%       77%      74%
FAS        FAAH      34       37         34       37       100%     100%      72%      75%
FAS        FREN      2        2          2        2        100%     100%      87%      83%
FAS        GEOL      21       19         21       19       100%     100%      69%      74%
FAS        HIST      17       16         14       16       82%      100%      56%      78%
FAS        MATH      71       62         61       50       86%      81%       65%      56%
FAS        MEST      13       10         13       10       100%     100%      -        -
FAS        PHIL      26       21         26       21       100%     100%      -        -
FAS        PHYS      25       23         19       13       76%      57%       -        -
FAS        PSPA      44       40         43       38       98%      95%       73%      70%
FAS        PSYC      28       30         28       29       100%     97%       -        -
FAS        SOAN      25       28         24       -        96%      -         -        -
FAS        STAT      14       24         13       -        93%      -         -        -
FAS        UPEN      12       13         12       -        100%     -         -        -
FAS        UPMA      2        -          2        -        100%     -         -        -
FAS        UPSC      -        2          -        -        -        -         -        -
OSB        ACCT      24       19         21       18       88%      95%       65%      66%
OSB        BUSS      33       31         32       30       97%      97%       71%      71%
OSB        DCSN      13       18         12       18       92%      100%      61%      73%
OSB        ENTM      2        -          2        -        100%     -         70%      -
OSB        FINA      25       25         25       24       100%     96%       72%      66%
OSB        INFO      10       9          7        9        70%      100%      57%      70%
OSB        MKTG      19       20         18       19       95%      95%       63%      -
OSB        MNGT      14       11         14       11       100%     100%      81%      73%
FEA        ARCH      31       31         29       17       94%      55%       62%      49%
FEA        CHEN      3        2          2        2        67%      -         48%      -
FEA        CIVE      44       36         42       34       96%      94%       74%      78%
FEA        EECE      96       85         89       74       93%      87%       79%      81%
FEA        ENMG      17       17         17       13       100%     76%       71%      53%
FEA        ENSC      2        1          2        1        100%     100%      99%      50%
FEA        GRDS      23       22         22       15       96%      68%       77%      55%
FEA        MECH      49       58         41       49       84%      84%       69%      66%
FEA        URDS      1        1          1        1        100%     100%      63%      50%
FEA        URPL      2        2          2        2        100%     100%      81%      90%
FHS        ENHL      11       11         11       10       100%     91%       89%      90%
FHS        ENSC      2        -          2        -        100%     -         -        -
FHS        EPHD      18       5          18       5        100%     100%      83%      87%
FHS        HBED      12       -          12       -        100%     -         -        -
FHS        HMPD      10       -          10       -        100%     -         -        -
FHS        LABM      10       -          10       -        100%     -         -        -
FHS        MLTP      5        3          5        3        100%     100%      90%      96%
FHS        PBHL      3        -          3        -        100%     -         -        -
Nursing    NURS      19       31         19       27       100%     87%       -        -
Results
Results were reported electronically to each faculty member, department chair, and dean.
The written comments were sent in sealed envelopes to the respective deans’ offices.
In addition to item means, averages and percentiles were reported for the instructor, the
course, and student learning outcome development. Category, faculty, and university
percentiles/means were also reported for each item and for each subgroup. Percentiles
were computed using only course sections with response rates of 40% or more. Three
additional reports were provided to the deans: one summarizing institutional performance
on the 17 core items by faculty, another providing summary data for all departments
within their faculty, and a third providing a summary for each department in the faculty.
Department chairs also received a copy of their department summary.
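The percentile computation itself is straightforward. The sketch below shows one way it
could be done, restricted, as in the report, to course sections with a response rate of at
least 40%; this is an assumed illustration rather than OIRA's actual procedure, and the
field names (instructor_mean, response_rate) are hypothetical.

# Sketch: percentile rank of a section's subscale mean within a norm group
# built only from sections that meet the 40% response-rate cutoff.
def build_norms(sections, threshold=0.40):
    """Keep only sections meeting the response-rate cutoff used in the report."""
    return [s["instructor_mean"] for s in sections if s["response_rate"] >= threshold]

def percentile_rank(value, norm_values):
    """Percent of normative scores at or below `value`."""
    if not norm_values:
        raise ValueError("empty norm group")
    at_or_below = sum(1 for v in norm_values if v <= value)
    return 100.0 * at_or_below / len(norm_values)

# Example use (hypothetical data):
# percentile_rank(4.1, build_norms(university_sections))  -> e.g. 63.0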
Figures 1 and 2 present summary normative data for the ICE subscales for the University
and per faculty for fall and spring 2009-10, in comparison with 2008 and 2009 scores.
Only course sections with a response rate of 40% or higher were included in the
normative data, as they provide more reliable estimates. Table 5 presents scores on all
subscales by faculty for the fall and spring terms. Faculties with the highest means were
highlighted in blue. As in previous administrations, students’ evaluations of teachers (A)
were, in general, higher than their evaluations of courses (B) and of learning outcomes
(C). The fall ICE results show the drop in ratings usual for fall terms; ratings then rise
again in spring for the instructor (A, mean = 4.1), course (B, mean = 3.95), and additional
items (D, mean = 4.05) subscales, and are slightly lower for learning outcomes (C, mean =
3.9). The results, however, show stability of ICE ratings on the three subscales over time.
Figure 1. ICE Average by Subscale: university-wide average ratings on subscales A, B, C,
and D by term, 200810 (fall 2008) through 201020 (spring 2010).
With respect to instructor effectiveness by faculty, mean scores ranged between 3.9 and
4.1 in the fall and between 4.0 and 4.1 in the spring, with most faculties showing a drop
in fall, especially FEA, while FAS maintained its average and OSB showed improvement
in spring. The highest in instructor effectiveness for both terms were FAS and FHS, with
a mean of 4.1; the lowest were FEA, together with NU, with an average of 3.9 (fall) and
4.0 (spring).
Figure 2a. Instructor Effectiveness by Faculty: ICE averages for AG, AS, EA, HS, NU,
and SB by term, 200810 through 201020.
With respect to course evaluations, the same trend prevails, with a drop in the fall term
and a rise in spring. FAS, FAFS, and FHS had the highest scores in fall (3.9), and FAS,
FHS, and SNU had the highest in spring (4.0).
Figure 2b. Course Evaluation by Faculty: ICE averages for AG, AS, EA, HS, NU, and SB
by term, 200810 through 201020.
As to learning outcomes by faculty, scores ranged from 3.8 to 4.0 in fall and from 3.9 to
4.0 in spring, lower than in previous terms. FAFS had the highest mean rating for fall and
SNU for spring. The FEA ICE form does not include learning outcome items.
Figure 2c. Learning Outcome by Faculty: ICE averages for AG, AS, EA, HS, NU, and SB
by term, 200810 through 201020.
Means for the additional items ranged from 3.9 to 4.1, with the usual drop in the fall
term. FAS and FAFS had the highest scores for both terms and SNU the lowest. Similarly,
the FEA ICE form does not include additional items.
Figure 2d. Additional Items by Faculty: ICE averages for AG, AS, EA, HS, NU, and SB
by term, 200810 through 201020.
As to item 8, overall effectiveness of the instructor, it averaged 4.1 across faculties in
both terms, while item 14, overall course effectiveness, averaged 3.8 in fall and 3.9 in
spring. Overall instructor effectiveness is higher than in previous terms, while the overall
course rating shows the same trend. A breakdown of the averages of items 8 and 14 by
faculty is reported in Table 4.
Table 4: Average of Overall Items by Faculty

              N                 Item # 8           Item # 14
Faculty       F       S         F       S          F       S
AG            82      91        4.04    4.06       3.95    3.88
AS            767     653       4.08    4.13       3.87    3.94
EA            245     208       3.94    4.05       3.81    3.96
HS            71      48        4.04    4.12       3.85    3.97
NU            19      25        -       4.07       3.79    3.98
SB            131     129       4.01    4.15       3.76    3.92
AUB           1315    1154      4.1     4.10       3.8     3.94
With respect to item 8, FEA, FHS, and OSB improved in spring over fall, with the latter
obtaining the highest score. As for item 14, most faculties showed improvement, with the
exception of FAFS. Figure 3 presents the three-year trend of the overall items; it shows a
gradual increase.
Figure 3. Trend Analysis for Overall Items: overall instructor and overall course averages
by term, Fall 08 through Spring 10.
The Appendix presents item statistics by faculty and for the whole university. Table 5
presents subscale averages and their relevant quartiles per faculty and for the university.
Table 5: Subscale Averages & Quartiles per Faculty & for University

                                       N                Mean             Percentiles
                                                                         25th             50th             75th
Subscale / Faculty                     F       S        F       S        F       S        F       S        F       S

Additional Items / Clinical
  AG                                   82      91       4.03    4.05     3.7     3.80     4.2     4.20     4.5     4.40
  AS                                   -       653      3.98    4.05     3.7     3.80     4.0     4.10     4.3     4.40
  HS                                   71      48       3.93    4.05     -       -        -       -        -       -
  NU                                   21      -        3.88    3.97     -       -        -       -        -       -
  SB                                   131     129      4.01    3.99     -       -        -       -        -       -
  AUB                                  1051    921      4.02    4.04     -       -        -       -        -       -

Course Evaluation
  AG                                   82      91       3.94    3.90     3.6     3.60     4.1     4.10     4.3     4.30
  AS                                   767     652      3.90    3.96     3.6     3.70     3.9     4.00     4.2     4.20
  EA                                   245     208      3.78    3.91     3.5     3.63     3.9     4.00     4.1     4.20
  HS                                   71      48       3.93    4.00     3.6     3.80     4.0     4.00     4.3     4.20
  NU                                   19      25       3.78    3.98     3.5     3.50     3.8     4.10     4.2     4.40
  SB                                   131     129      3.82    3.94     3.6     3.80     3.9     4.00     4.1     4.20
  AUB                                  1315    1153     3.87    3.94     3.6     3.70     3.9     4.00     4.2     4.20

Instructor Teaching Effectiveness
  AG                                   82      91       4.05    4.05     3.8     3.80     4.3     4.20     4.4     4.40
  AS                                   767     653      4.08    4.12     3.8     3.80     4.2     4.20     4.4     4.40
  EA                                   245     208      3.91    4.03     3.6     3.70     4.0     4.10     4.3     4.40
  HS                                   71      48       4.05    4.11     3.8     3.90     4.1     4.20     4.4     4.48
  NU                                   19      26       4.03    4.02     3.7     3.58     4.0     4.20     4.5     4.50
  SB                                   131     129      4.01    4.14     3.8     3.90     4.1     4.20     4.4     4.40
  AUB                                  1315    1155     4.03    4.10     3.8     3.80     4.1     4.20     4.4     4.40

Learning Outcomes
  AG                                   82      91       3.96    3.94     3.6     3.70     4.1     4.10     4.4     4.30
  AS                                   767     653      3.77    3.86     3.4     3.50     3.8     3.90     4.2     4.30
  HS                                   71      48       3.83    3.89     3.5     3.50     3.8     4.00     4.3     4.28
  NU                                   19      27       3.82    4.01     3.4     3.70     3.8     4.10     4.3     4.30
  SB                                   131     129      3.75    3.88     3.4     3.60     3.8     3.90     4.1     4.25
  AUB                                  1070    948      3.79    3.88     3.4     3.50     3.8     3.90     4.2     4.30
Table 6 presents subscale means by category of courses in every faculty. Lowest (red
font) and highest (blue font) categories within each faculty were highlighted to facilitate
comparison for improvement.
Table 6: Subscale Means per Category per Faculty

                                              Count           Instructor        Course            Learning
                                                              Effectiveness     Effectiveness     Outcomes
Faculty   Category                            F       S       F       S         F       S         F       S
AG        AIII                                7       11      3.37    3.34      3.27    3.24      3.47    3.46
AG        Graduate Lecture                    14      17      4.39    4.32      4.24    4.05      4.24    3.99
AG        Lab Teaching                        4       10      4.08    3.92      3.90    3.83      3.93    3.88
AG        Large Lecture                       14      14      3.99    4.03      3.86    3.91      3.83    4.01
AG        Large Lecture & Lab                 23      19      4.06    4.09      3.92    3.99      3.99    3.95
AG        Seminar                             5       6       4.18    4.33      4.26    4.38      4.00    4.33
AG        Small Lecture                       11      18      4.29    4.01      4.21    3.79      4.19    3.92
AS        Education-Method                    7       10      4.16    4.34      4.00    4.19      4.14    4.21
AS        Education-NonMethod                 -       -       -       -         -       -         -       -
AS        Humanities                          346     334     4.13    4.15      3.93    3.98      3.80    3.89
AS        Sciences                            229     172     3.99    4.06      3.82    3.90      3.59    3.71
AS        Social Sciences                     155     117     4.07    4.08      3.91    3.92      3.91    3.94
EA        FEA                                 245     208     3.91    4.03      3.78    3.91      -       -
HS        Discussion Lecture                  -       -       -       -         -       -         -       -
HS        Discussion Lecture + Assignment     -       -       -       -         -       -         -       -
HS        Lecture                             -       -       -       -         -       -         -       -
HS        Lecture + Assignment                -       -       -       -         -       -         -       -
HS        Lecture + Lab                       5       4       4.52    4.18      4.38    3.95      4.06    3.95
NU        SNU Form A                          10      8       3.89    4.38      3.72    4.31      3.66    4.10
NU        SNU Form B                          9       19      4.18    3.86      3.86    3.83      3.99    3.97
SB        ACCT                                21      18      3.88    4.23      3.72    4.08      3.56    3.91
SB        BUSS                                32      30      4.02    4.15      3.79    3.89      3.62    3.74
SB        FINA                                25      24      4.14    4.20      3.96    4.05      4.07    4.13
SB        MKTG                                17      19      4.05    4.11      3.89    3.97      3.94    4.03
SB        MNGT                                17      11      4.21    4.36      3.96    4.19      3.94    4.24
SB        OPIM                                19      27      3.79    3.95      3.64    3.69      3.45    3.54
Conclusion: Accomplishments and Areas of Improvement
ICE results are showing stability, with slight improvement this spring as compared to last
fall for most faculties. Response rates are increasing, and the whole process is being taken
more seriously.
The fall and spring administrations went smoothly, as we have become more organized
and were able to anticipate problems ahead of time. Forms were sent early to give
departments ample time for the administration rather than leaving it to the last two weeks
of the term, when attendance is usually low. Before we prepared and sent the forms, we
made sure that course/instructor/section coordinates were accurate and reflected actual
offerings rather than what was supposed to be according to Banner. Proper coding was
given to large lectures, lab lectures, multi-instructor courses, etc. Before scanning the
filled-out forms, OIRA staff checked the course/section/department/faculty information
entered by students. These procedures decreased the problems encountered in data entry
and enabled the issuing of the results in final form within a two-week period instead of
the usual month. The reports generated followed the format adopted last fall, and faculty
members were provided with an interpretive guide. In addition, summary institutional,
faculty, and departmental reports were issued to deans and department chairs. These
summary reports were also published on the OIRA website for review by faculty and
students, a step that provides evidence that the evaluations are taken seriously by faculty
and by the administration.
Procedures to produce the ICE reports were improved by automating most stages of the
process and of report production. The build-up of an ICE database enables us to produce
trend reports by teacher, course, or department, and/or by item. These reports are now
available.
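For illustration, a trend query of the kind such a database makes possible might look like
the sketch below. The table name ice_results, the column names, and the use of SQLite
are assumptions made for the example, not the actual OIRA schema.

# Sketch only: overall-instructor trend for one teacher across terms.
import sqlite3

def instructor_trend(db_path, instructor_id):
    """Return (term, average overall-instructor rating) pairs, ordered by term."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            """
            SELECT term, AVG(item8) AS overall_instructor
            FROM ice_results
            WHERE instructor_id = ?
            GROUP BY term
            ORDER BY term
            """,
            (instructor_id,),
        ).fetchall()
    return rows

# Example use (hypothetical database and ID):
# instructor_trend("ice.db", "E12345")  -> [("200810", 4.02), ("200820", 4.10), ...]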
Despite the above accomplishments, several problems were encountered that we hope can
be overcome in future administrations:
1. Administration is still a major problem and should improve. Graduate assistants
were trained but still need more training on how to administer the ICE and how to
motivate students to answer. They should be given adequate time to conduct the
evaluations rather than leaving everything to the last week of the semester or
conducting them during final exams. They should ensure that students fill in the
right course/section information on the answer sheet. Envelopes need to be sealed
and sent back to OIRA promptly, not after a month, and in good order, not in a
mess (all mixed up, information not properly filled in, wrong coding, etc.).
2. The problem of getting up-to-date, accurate information regarding the
courses/sections offered and their enrollment has improved, though it still exists in
some faculties. We obtain the needed information directly from departments or
deans’ offices; however, these do not always have the most up-to-date information,
especially with regard to enrollment. In many cases we get course capacity rather
than actual enrollment, and this affects the response rates obtained.
3. Departments need to inform the dean’s office of changes they have made, and
these should be reflected in Banner. Similarly, deans’ offices should alert us, ahead
of the administration, to courses with labs and lectures taught by different
instructors, to courses taught by more than one instructor, and to sections they
would like combined, so that we can account for these variations.
Appendix: ICE Item Averages by Faculty
(Item-by-item mean ratings, items 1 through 17, for each faculty — FAFS, FAS, OSB,
FEA, FHS, SNU — and for AUB as a whole, fall and spring terms.)