ICE Summary Report Fall/Spring 2011-12

Instructor Course Evaluations
Fall/Spring 2011-12 Report
The Instructor Course Evaluation System (ICES), prepared by the Office of Institutional
Research & Assessment (OIRA) and approved by the Senate, was administered in paper form
in fall and spring 2011-12 in all faculties except the Faculty of Medicine (FM).
The Instructor Course Evaluation Questionnaire (ICE)
The items used in previous administrations were used again this year, with some revisions
introduced, particularly for blended learning courses and for course learning outcomes
(FAS). The ICE includes the following components:
1. Student background items covering gender, class, required/elective status, expected
grade in the course, and number of hours worked for the course per week.
2. Core items (17) included in all forms. These are generic items that apply to all
courses irrespective of course design or size, and they can be used for normative
scores and for comparisons across courses and over time to show improvement. They
cover the instructor (8), the course (6), and student learning outcomes (3), in addition
to the global evaluation items.
3. Specific items (11-12) selected by the department/faculty from an item bank, depending
on the type of course (lecture, seminar, lab, studio) and its size. The item bank
includes specific items for large lecture courses, for lab/studio/clinical teaching
classes, and for discussion classes. In addition, it includes extra items on
instructional methodology, student interaction and rapport, feedback and evaluation,
assignments, and student development. Items selected from the bank supplement the core
questionnaire depending on the type of course and the kind of information required
(see the sketch after this list).
4. Open-ended questions focusing on instructor and course strengths and weaknesses
and requesting suggestions for improvement.
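To make the selection rule in item 3 concrete, here is a minimal sketch of how a form could be assembled from the core items plus bank items chosen by course type; the item codes, category names, and counts are illustrative placeholders, not the actual ICE item bank.

```python
# Illustrative sketch only: item codes and bank structure are hypothetical,
# not the actual ICE item bank.

CORE_ITEMS = [f"core_{i:02d}" for i in range(1, 18)]   # 17 core items on every form

ITEM_BANK = {
    "large_lecture": ["LL01", "LL02", "LL03"],
    "lab_studio_clinical": ["LB01", "LB02", "LB03"],
    "discussion": ["DS01", "DS02", "DS03"],
    "instructional_methodology": ["IM01", "IM02"],
    "interaction_rapport": ["IR01", "IR02"],
    "feedback_evaluation": ["FE01", "FE02"],
    "assignments": ["AS01", "AS02"],
    "student_development": ["SD01", "SD02"],
}

def build_form(course_type: str, extra_categories: list[str]) -> list[str]:
    """Return the item list for one course section: all core items plus
    the bank items matching the course type and any requested categories."""
    selected = list(CORE_ITEMS)
    selected += ITEM_BANK.get(course_type, [])
    for category in extra_categories:
        selected += ITEM_BANK.get(category, [])
    return selected

# Example: a large lecture course whose department also wants feedback items.
form = build_form("large_lecture", ["feedback_evaluation"])
print(len(form), "items on this form")
```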
ICE Administration
The ICE was administered in the last four weeks of the fall and spring semesters. Detailed
instructions for graduate assistants, outlining the steps of administration and the
instructions to be read to students, were sent with the departmental packages. Students were
assured of the confidentiality of their responses and prompted to take the questionnaire
seriously. The ICE was given in a total of 1634 course sections in fall (1606 last fall) and
1416 in spring (1579 last spring), and a total of 24,240 and 22,585 student evaluations were
filled out, lower than last year. A breakdown of the sample of students by class, reason for
taking the courses, and expected grade is reported in Table 1. Demographics are quite similar
in both semesters; however, the spring sample includes lower percentages from FEA and FHS, a
higher percentage of students taking elective courses (33% vs. 28%), and higher grade
expectations, with 43% expecting a grade ≥ 85 vs. 41% in fall. Table 2 provides the detailed
breakdown of the surveyed population of courses and the percentage of course sections with a
response rate above 40% by faculty, and also reports the mean response rate per faculty,
while Table 3 provides the breakdown by department. The percentage response rate has been
calculated as the share of course sections with a response rate of at least 40%. The
percentage of course sections with a response rate higher than 40% was 93-94% of the
surveyed sample for both fall and spring, the same as last year, with faculty rates ranging
between 85% and 100% for both terms.
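As a minimal sketch of how these two figures can be derived from section-level counts (the record layout and field names below are assumptions for illustration, not OIRA's actual files):

```python
# Minimal sketch: compute, per faculty, the share of course sections with a
# response rate of at least 40% and the mean section response rate.
# The records below are illustrative; field names are assumptions.

from collections import defaultdict

sections = [
    # faculty, enrolled students, evaluations returned
    {"faculty": "FAS", "enrolled": 40, "returned": 31},
    {"faculty": "FAS", "enrolled": 25, "returned": 8},
    {"faculty": "FEA", "enrolled": 60, "returned": 45},
]

by_faculty = defaultdict(list)
for s in sections:
    rate = s["returned"] / s["enrolled"]          # section response rate
    by_faculty[s["faculty"]].append(rate)

for faculty, rates in by_faculty.items():
    pct_above_40 = 100 * sum(r >= 0.40 for r in rates) / len(rates)
    mean_rate = 100 * sum(rates) / len(rates)
    print(f"{faculty}: {pct_above_40:.0f}% of sections >= 40%, "
          f"mean response rate {mean_rate:.0f}%")
```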
Table 1: ICE (Fall and Spring 2011-12) Sample Description

Faculty (% F / % S):  FAFS 6/7, FAS 55/56, FEA 21/19, FHS 5/3, OSB 11/12, SNU 2/2
Class (% F / % S):  Freshman 6/6, Sophomore 27/27, Junior 26/27, Senior 24/25, 4th Year 8/7, 5th Year 1/1, Graduate 7/6, Special 1/1
Reason for taking course (% F / % S):  Required from major 57/53, Elective from major 14/16, Elective outside major 14/17, Required outside major 10/10, University required 5/4
Expected grade (% F / % S):  ≥ 90: 14/15, 85-89: 27/28, 80-84: 33/32.5, 70-79: 22/21.5, < 70: 4/3
Hours worked for the course per week (% F / % S):  ≤ 3: 36/36, 4-6: 42/43, 7-10: 14/14, > 10: 8/7
Table 2: Surveyed Population of Courses & Response Rates by Faculty

Faculty                         Courses (F / S)   ≥ 40% (F / S)   % ≥ 40% (F / S)   Mean Resp. Rate (F / S)
Agricultural & Food Sciences    101 / 105         101 / 105       100% / 100%       80% / 79%
Arts & Sciences                 853 / 716         764 / 716       90% / 91%         68% / 70%
Business                        155 / 154         154 / 154       99% / 97%         74% / 72%
Engineering & Architecture      304 / 249         277 / 249       91% / 91%         68% / 68%
Health Sciences                 52 / 46           52 / 46         100% / 100%       85% / 84%
Nursing                         22 / 21           19 / 21         86% / 75%         69% / 65%
AUB                             1487 / 1305       1367 / 1305     94% / 93%         74% / 73%
With respect to departmental response rates, the lowest were in Physics, Chemistry, and
Mathematics in both terms, quite similar to previous years. Tables 2 and 3 also report the
mean response rate for all course sections by faculty and by department. Faculty mean
response rates for the whole sample range between 65% and 85%, with FHS obtaining the
highest response rates in both the fall and spring terms.
Table 3: Response Rates & Courses Surveyed by Department

Faculty  Dept.   Courses (F / S)   ≥ 40% (F / S)   % ≥ 40% (F / S)   Mean Resp. Rate (F / S)
FAFS     AGSC    32 / 36           32 / 36         100% / 100%       77% / 81%
FAFS     AVSC    12 / 10           12 / 10         100% / 100%       83% / 81%
FAFS     LDEM    27 / 29           27 / 29         100% / 100%       80% / 78%
FAFS     NFSC    30 / 30           30 / 30         100% / 100%       81% / 79%
FAS      AMST    5 / 6             5 / 6           100% / 100%       82% / 67%
FAS      ARAB    - / -             - / -           96% / 97%         70% / 64%
FAS      AROL    9 / 9             - / -           100% / 88%        - / -
FAS      BIOL    - / -             - / -           - / -             - / -
FAS      CHEM    41 / 33           23 / 22         56% / 67%         45% / 50%
FAS      CHIN    3 / 4             3 / 4           100% / 100%       68% / 83%
FAS      CMPS    32 / 33           28 / 29         88% / 88%         61% / 64%
FAS      CVSP    56 / 61           55 / 61         98% / 100%        74% / 76%
FAS      ECON    77 / 74           66 / 61         86% / 82%         62% / 62%
FAS      EDUC    41 / 35           41 / 35         100% / 100%       82% / 82%
FAS      ENGL    138 / 122         134 / 118       97% / 97%         77% / 77%
FAS      FAAH    33 / 35           33 / 35         100% / 100%       73% / 78%
FAS      FREN    2 / 2             2 / 2           100% / 100%       60% / 95%
FAS      GEOL    20 / 18           19 / 17         95% / 94%         68% / 70%
FAS      HIST    21 / 15           20 / 14         95% / 93%         74% / 72%
FAS      MATH    74 / 71           58 / 55         78% / 77%         58% / 57%
FAS      MEST    10 / 9            9 / 9           90% / 100%        80% / -
FAS      PHIL    36 / 31           34 / 31         94% / -           70% / 68%
FAS      PHYS    38 / 30           20 / 16         53% / 53%         45% / 57%
FAS      PSPA    50 / 50           49 / 49         98% / 98%         73% / 73%
FAS      PSYC    32 / 34           30 / 34         94% / 100%        74% / 73%
FAS      SOAN    29 / 27           28 / 27         97% / 100%        72% / 77%
FAS      STAT    15 / 15           9 / 13          60% / 87%         55% / 60%
FAS      UPEN    7 / 4             7 / 4           100% / 100%       94% / 88%
FAS      UPMA    2 / 1             2 / 1           100% / 100%       100% / 63%
FAS      UPSC    2 / 1             2 / 1           100% / 100%       94% / 75%
OSB      ACCT    32 / 23           32 / 23         100% / 100%       73% / 67%
OSB      BUSS    33 / 36           33 / 36         100% / 100%       74% / 70%
OSB      DCSN    17 / 24           17 / 22         100% / 92%        65% / 54%
OSB      ENTM    4 / 2             3 / 2           75% / 100%        71% / 70%
OSB      FINA    19 / 28           19 / 28         100% / 100%       83% / 74%
OSB      INFO    12 / 11           12 / 10         100% / 91%        75% / -
OSB      MKTG    19 / 21           19 / 20         100% / 95%        71% / 72%
OSB      MNGT    19 / 14           19 / 13         100% / 93%        78% / 76%
FEA      ARCH    41 / 29           35 / 22         85% / 76%         61% / 57%
FEA      CHEN    8 / 10            8 / 10          100% / 100%       71% / 70%
FEA      CIVE    43 / 39           40 / 36         93% / 92%         67% / 73%
FEA      EECE    88 / 82           84 / 79         95% / 96%         77% / 72%
FEA      ENMG    18 / 18           17 / 18         94% / 100%        62% / 67%
FEA      GRDS    20 / 75           15 / 66         75% / 88%         57% / 65%
FEA      MECH    81 / 2            73 / 2          90% / 100%        67% / 90%
FEA      URDS    3 / 2             3 / 2           100% / 100%       72% / 86%
FEA      URPL    2 / 13            2 / 13          100% / 100%       100% / 75%
FHS      ENHL    10 / 10           10 / 10         100% / 100%       87% / 85%
FHS      ENSC    1 / 4             1 / 4           100% / 100%       100% / 89%
FHS      EPHD    8 / 7             8 / 7           100% / 100%       89% / 88%
FHS      HMPD    9 / 13            9 / 13          100% / 100%       83% / 84%
FHS      HPCH    7 / 8             7 / 8           100% / 100%       83% / 77%
FHS      LABM    10 / 3            10 / 3          100% / 100%       77% / 78%
FHS      MLSP    5 / 1             5 / 1           100% / 100%       88% / 100%
FHS      PBHL    2 / 1             2 / 1           100% / 100%       89% / 89%
HSON     NURS    22 / 28           19 / 21         86% / 75%         69% / 65%
HSON     HSON    17 / -            14 / -          82% / -           68% / -
Results
Results were reported electronically to each faculty member, department chair, and dean.
The written comments were sent in sealed envelopes to the respective deans' offices.
In addition to item means, averages and percentiles were reported for the instructor, the
course, and student learning outcome development. Category, faculty, and university
percentiles/means were also reported for each item and for each subgroup. Percentiles were
computed using only course sections with response rates equal to or above 40%. In addition,
three reports were provided to the deans: one summarizing institutional performance on the
17 core items by faculty, another providing summary data for all departments within their
faculty, and a third providing a summary for each department in the faculty. Department
chairs also received a copy of their department summary.
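The sketch below illustrates the kind of percentile computation described above, restricted to course sections with at least a 40% response rate; the section identifiers, scores, and data layout are hypothetical, for illustration only.

```python
# Sketch: compute a course section's percentile for one ICE item, using only
# sections with a response rate of at least 40% as the norm group.
# Field names and values are illustrative assumptions, not OIRA's data layout.

def percentile_rank(value: float, reference: list[float]) -> float:
    """Percent of reference scores falling at or below the given value."""
    return 100.0 * sum(v <= value for v in reference) / len(reference)

# (section_id, response_rate, mean score on item 8)
sections = [
    ("BIOL 200-1", 0.72, 4.3),
    ("BIOL 200-2", 0.35, 3.1),   # excluded: response rate below 40%
    ("CHEM 211-1", 0.55, 3.9),
    ("PHYS 210-1", 0.48, 4.1),
]

norm_group = [score for _, rate, score in sections if rate >= 0.40]

for section_id, rate, score in sections:
    if rate >= 0.40:
        print(f"{section_id}: item 8 mean {score:.2f}, "
              f"percentile {percentile_rank(score, norm_group):.0f}")
```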
Figures 1 and 2 present summary normative data for the ICE subscales for the University and
per faculty for fall and spring 2011-12, in comparison with 2008-11 scores. Only course
sections with a response rate equal to or higher than 40% were included in the normative
data, as they provide more reliable estimates.
As in previous administrations, students' evaluations of teachers (A) were, in general,
higher than their evaluations of courses (B) and of learning outcomes (C). ICE results show
a very slight drop in ratings for the instructor and learning outcomes subscales, while the
additional items and course ratings show stability. Averages for 2011-12 are: instructor
(A, mean = 4.05), course (B, mean = 3.90), additional items (D, mean = 4.00), and learning
outcomes (C, mean = 3.85). Four-year results, however, show stability of ICE ratings on the
subscales over time.
With respect to instructor effectiveness by faculty (Figure 2a), mean scores ranged between
3.9 and 4.2 in the fall and spring. FAFS and FHS showed a drop in 2011-12, FAS shows
stability over the years, and FEA and OSB fluctuate considerably, with FEA showing the
lowest averages.
With respect to course evaluations, ratings ranged between 3.75 and 4.1, with a drop for
FAFS and FEA. FAS and OSB showed stability, while FHS and SNU went down in spring 2012.
As to learning outcomes by faculty, scores ranged from 3.7 to 4.2. FAS, OSB, and SNU showed
stability, while FAFS scores fell from 4.2 to 4.0 and FHS fluctuated.
Additional items means ranged between 3.7 and 4.2 over the past four years. SNU witnessed a
significant drop in spring 2012 (from 4.2 to 3.7); similarly, FHS did not regain the higher
ratings it lost in spring 2011. Other faculties showed stability, except for FAS, which
dropped in the fall. FEA did not have additional items on its ICE form.
Table 4: Average of Overall Items by Faculty

Faculty   N (F / S)      Item # 8 (F / S)   Item # 14 (F / S)
AG        101 / 105      4.10 / 4.05        4.01 / 3.96
AS        764 / 716      4.09 / 4.13        3.91 / 3.92
EA        277 / 249      3.91 / 3.96        3.79 / 3.83
HS        52 / 46        4.08 / 4.03        3.96 / 3.84
NU        19 / 21        4.23 / 3.96        4.03 / 3.91
SB        154 / 154      4.05 / 4.09        3.87 / 3.90
AUB       1367 / 1305    4.08 / 4.08        3.93 / 3.90
Item # 8, overall effectiveness of the instructor, averaged 4.1 across faculties in both
terms, and item # 14, overall course effectiveness, averaged 3.9 in both fall and spring,
the same as last year. A breakdown of the averages of items 8 and 14 by faculty is reported
in Table 4. SNU showed a drop in spring on both items, while the other faculties maintained
their scores. Figure 3 presents the four-year trend of the overall items; it shows an
increase from 2008 and then stability over the last three years.
The Appendix presents item statistics by faculty and for the whole University.
Table 5 presents subscale averages and their relevant quartiles per faculty and for the
University; faculties with the highest subscale average are highlighted in blue.
Table 5: Subscale Averages & Quartiles per Faculty & for University
(N, mean, and 25th/50th/75th percentiles for fall and spring, by faculty and for the
University, for each subscale: Additional Items, Clinical Items, Course Evaluation,
Instructor Teaching Effectiveness, and Learning Outcomes.)
Table 6 presents subscale means by category of courses in every faculty. The lowest (red
font) and highest (blue font) categories within each faculty are highlighted to facilitate
comparison for improvement.
Table 6: Subscale Means per Category per Faculty
(Fall and spring section counts and means for Instructor Effectiveness, Course
Effectiveness, and Learning Outcomes by course category within each faculty: FAFS by
lecture/lab/seminar type, FAS by Blended Learning, Education, Humanities, Sciences, and
Social Sciences, FEA and FHS by lecture/discussion/lab type, HSON by form, and OSB by
department.)
Conclusion: Accomplishments and Areas for Improvement
ICE results show stability, with slight improvement this spring compared to last fall.
Response rates are increasing, and the whole process is being taken more seriously.
The fall and spring administrations went smoothly, as we have become more organized and
were able to anticipate problems ahead of time. Forms were sent early to give departments
ample time for the administration, so that it did not have to be left to the last two weeks
of the term, when attendance is usually low. Before we prepared and sent the forms, we made
sure that course/instructor/section coordinates were accurate and reflected the actual
offerings rather than what was supposed to be the case according to Banner. Proper coding
was given to large lectures, lab lectures, multi-instructor courses, etc. Before scanning
the filled-out forms, OIRA staff checked the course/section/department/faculty information
entered by students. These procedures decreased the problems encountered in data entry and
enabled the issuing of the results in final form within a two-week period instead of the
usual one month. The reports generated followed the format adopted last fall, and faculty
members were provided with an interpretive guide. In addition, summary institutional,
faculty, and departmental reports were issued to deans and department chairs. These summary
reports were also published on the OIRA website for review by faculty and students, and
this step provided evidence that the evaluations are taken seriously by faculty and by the
administration.
Procedures to produce the ICE reports were improved by automating most stages of the
process and of report production. The ICE database that has been built up enables us to
produce trend reports by teacher, course, or department and/or by item. These reports are
now available.
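As an illustration of the kind of trend report such a database makes possible, the sketch below averages item 8 by term for one instructor from a hypothetical ice_results table; the schema, names, and values are assumptions, not the actual OIRA database.

```python
# Sketch of a trend report drawn from an ICE results database.
# The table name, columns, and data are hypothetical, for illustration only.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ice_results (
        term TEXT, instructor TEXT, course TEXT, item INTEGER, mean REAL
    )
""")
conn.executemany(
    "INSERT INTO ice_results VALUES (?, ?, ?, ?, ?)",
    [
        ("Fall 2010-11",   "Instructor A", "CHEM 211", 8, 3.8),
        ("Fall 2011-12",   "Instructor A", "CHEM 211", 8, 4.0),
        ("Spring 2011-12", "Instructor A", "CHEM 211", 8, 4.1),
    ],
)

# Trend of item 8 (overall instructor effectiveness) for one instructor.
rows = conn.execute(
    """
    SELECT term, AVG(mean)
    FROM ice_results
    WHERE instructor = ? AND item = 8
    GROUP BY term
    ORDER BY term
    """,
    ("Instructor A",),
).fetchall()

for term, avg in rows:
    print(f"{term}: item 8 average {avg:.2f}")
```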
Despite the above accomplishments, several problems were encountered that we hope can
be overcome in future administrations:
1. Administration is still a major problem and should improve. Graduate assistants were
trained but still need more training on how to administer the ICE and how to motivate
students to answer. They should be given adequate time to conduct the evaluations rather
than leaving everything to the last week of the semester or conducting them during final
exams. They should ensure that students fill in the right course/section information on
the answer sheet. Envelopes need to be sealed and sent back to OIRA promptly, not after
a month, and in good order, not in a mess (all mixed up, information not properly filled
in, wrong coding, etc.).
2. The problem of getting up-to-date, accurate information regarding the courses/sections
offered and their enrollment has improved, though it still exists in some faculties. We
obtain the needed information from departments or deans' offices directly; however,
these also do not always have the most up-to-date information, especially with regard
to enrollment. In many cases we get course capacity rather than actual enrollment, and
this affects the response rates obtained.
3. Departments need to inform the dean's office of the changes they have made, and these
should be reflected in Banner. Similarly, deans' offices should alert us, ahead of the
administration, to courses with labs and lectures taught by different instructors, to
courses taught by more than one instructor, and to sections they would like combined,
so that we can account for these variations.
4. Administering the ICE online is being considered, though the main worry is an expected
drop in response rate. The current rate of around 75% increases the reliability of ICE
results.
Appendix: ICE Item Averages by Faculty
(Fall and spring averages for each of the 17 core ICE items, reported for FAFS, FAS, OSB,
FEA, FHS, HSON, and for AUB overall.)