Instructor Course Evaluations Fall/Spring 2013-14 Report

The Instructor Course Evaluation System (ICES), prepared by the Office of Institutional Research & Assessment (OIRA) and approved by the Senate, was administered on-line in fall and spring 2013-14 in all faculties, with the exception of the Faculty of Medicine (FM).

The Instructor Course Evaluation Questionnaire (ICE)

The items used in previous administrations were used again this year, with some revisions introduced mainly for blended learning courses and course learning outcomes (FAS). The ICE includes the following components:

1. Student background items covering gender, class, required/elective status, expected grade in the course, and number of hours worked for the course per week.
2. Core items (17) included in all forms. These are generic items that can apply to all courses irrespective of course design or size, and they can be used for normative scores and for comparisons across courses and over time to show improvement. They cover the instructor (8), the course (6), and student learning outcomes (3), in addition to global evaluation items.
3. Specific items (11-12) selected by the department/faculty from an item bank depending on the type of course (lecture, seminar, lab, studio) and its size. The item bank includes specific items for large lecture courses, for labs/studio/clinical teaching classes, and for discussion classes. In addition, it includes extra items on instructional methodology, student interaction and rapport, feedback and evaluation, assignments, and student development. Items selected from it supplement the core questionnaire depending on the type of course and the kind of information required.
4. Open-ended questions focusing on instructor and course strengths and weaknesses and requesting suggestions for improvement.

ICE Administration

The ICE was administered on-line in the last four weeks of the fall and spring semesters using the on-line system prepared last year in collaboration with IT Academic Support and the various faculties. The system uses the Banner survey module. Detailed instructions were provided to students on how to access the system, and periodic reminders were sent. To enhance the response rate, a new measure was taken this term: a 3-week hold was placed on the grades of students who do not complete their ICEs. Faculty were also provided with a facility to follow up on the response rates of their various course sections. Students were assured of the confidentiality of their responses and prompted to take the questionnaire seriously.

The ICE was given to a total of 2398 (2204 last fall) and 2397 (2055 last spring) course sections in fall and spring, respectively, and a total of 36,478 and 33,287 (14,997 and 11,031 last year) student evaluations were filled out, much higher than last year because of the increased response rate. The 40% response-rate cutoff was dropped, and all course sections were included in the evaluations. A breakdown of the sample of students by class, reason for taking the courses, and expected grade is reported in Table 1. Demographics are quite similar in both semesters; however, the spring sample includes a lower percentage of sophomores, a lower percentage of students taking required courses from outside the major and university requirements (15% vs. 18% and 9% vs. 11%, respectively), and, in general, higher grade expectations: 44% expect a grade >= 85 in spring vs. 42% in fall, slightly lower than last year's expectations (49% and 45%, respectively).
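To make the response-rate figures used throughout this report concrete (the per-section response rate, the 40% threshold reported in Tables 2 and 3, and mean response rates by faculty), the short Python sketch below shows one way such figures can be computed. The faculty abbreviations are real, but the section identifiers, enrollments, and submission counts are invented for illustration and are not ICE data.

```python
# Rough sketch with invented enrollments and submission counts (not ICE data):
# per-section response rate, the 40% flag, and mean response rate by faculty.
from collections import defaultdict

sections = [
    # (faculty, section, enrolled, evaluations_submitted) -- illustrative only
    ("FAS",  "ENGL 203 - Sec 1", 35, 30),
    ("FAS",  "MATH 201 - Sec 3", 60, 21),
    ("FEA",  "EECE 230 - Sec 2", 80, 52),
    ("HSON", "NURS 300 - Sec 1", 25, 24),
]

def response_rate(enrolled, submitted):
    """Response rate of one course section as a fraction between 0 and 1."""
    return submitted / enrolled if enrolled else 0.0

rates_by_faculty = defaultdict(list)
for faculty, section, enrolled, submitted in sections:
    rate = response_rate(enrolled, submitted)
    rates_by_faculty[faculty].append(rate)
    flag = "" if rate >= 0.40 else "  <- below the 40% threshold"
    print(f"{faculty:5s} {section:18s} {rate:6.1%}{flag}")

for faculty, rates in rates_by_faculty.items():
    mean_rate = sum(rates) / len(rates)
    share_over_40 = sum(r >= 0.40 for r in rates) / len(rates)
    print(f"{faculty}: mean response rate {mean_rate:.0%}, "
          f"{share_over_40:.0%} of sections at or above 40%")
```

The same two summaries per faculty (mean response rate and share of sections at or above 40%) are what Tables 2 and 3 report for the actual fall and spring administrations.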
Table 2 provides a detailed breakdown of the surveyed population of courses and the percentage of course sections by faculty, and also reports the mean response rate per faculty, while Table 3 provides the breakdown by department. Faculty response rates were high, ranging from 73-80% in the fall and 73-88% in spring, while departmental rates ranged from 31-94% in fall and 54-100% in spring.

Table 1: ICE (Fall and Spring 2013-14) Sample Description
Faculty (% F, % S) / Class (% F, % S) / Reason for taking course (% F, % S):
FAFS / Freshman / Required from major: 4 6 7 6
FAS / Sophomore 41 / Elective from major: 63 60 37
FEA / Junior / Elective outside major: 19 20 26 28
FHS / Senior / Required outside major: 3 2 21 23
OSB / 4th Year / University required: 9 10 5 6
SNU / 5th Year: 2 2 0 0
Graduate / Special
Expected grade (>= 90 / 85-89 / 80-84 / 70-79 / < 70): %F 19 23 24 20 9; %S 20 24 24 20 8
Number of hours (<= 3 / 4-6 / 7-10 / > 10): %F 30 15 14 %S 29 14 13 18 15 11 9 %F %S 27 30 24 14 27 30 24 12

With respect to departmental response rates, the lowest were in MHRM (OSB), URPL (FEA), and MEST (FAS). Tables 2 and 3 also report the mean response rate for all course sections by faculty and department. Mean response rates by faculty range between 73% and 88%, with FAS and FHS leading in the fall and HSON and OSB in the spring term.

Table 2: Surveyed Population of Courses & Response Rates by Faculty (F = fall, S = spring)
Faculty                        Courses F/S    >=40% F/S      % >=40% F/S    Mean Resp. Rate F/S
Agricultural & Food Sciences   83 / 112       83 / 111       100% / 99%     73% / 74%
Arts & Sciences                1212 / 1123    1191 / 1099    98% / 98%      80% / 79%
Business                       161 / 183      151 / 178      94% / 97%      76% / 81%
Engineering & Architecture     364 / 396      354 / 394      97% / 99%      78% / 79%
Health Sciences                76 / 61        74 / 57        97% / 93%      79% / 75%
Nursing                        121 / 175      117 / 170      97% / 97%      73% / 88%
AUB                            2017 / 2050    1970 / 2009    97% / 97%      77% / 79%

Table 3: Response Rates & Courses Surveyed by Department (F = fall, S = spring)
Faculty   Dept.   Courses F/S   >=0.4 F/S    % >=0.4 F/S    Mean Resp. Rate F/S
FAFS      AGSC    24 / 40       24 / 40      100% / 100%    64% / 66%
FAFS      AVSC    7 / 10        7 / 9        100% / 90%     63% / 69%
FAFS      LDEM    25 / 27       25 / 27      100% / 100%    78% / 78%
FAFS      NFSC    27 / 35       27 / 35      100% / 100%    79% / 83%
FAS       AMST    6 / 6         5 / 6        83% / 100%     69% / 73%
FAS       ARAB    38 / 38       37 / 38      97% / 100%     80% / 82%
FAS       AROL    10 / 6        10 / 6       100% / 100%    70% / 80%
FAS       BIOL    117 / 111     117 / 109    100% / 98%     85% / 83%
FAS       CHEM    112 / 114     111 / 114    99% / 100%     83% / 84%
FAS       CHIN    3 / 3         3 / 3        100% / 100%    77% / 72%
FAS       CMPS    76 / 69       73 / 69      96% / 100%     79% / 75%
FAS       CVSP    50 / 47       50 / 47      100% / 100%    86% / 80%
FAS       ECON    84 / 76       84 / 76      100% / 100%    81% / 81%
FAS       EDUC    46 / 33       45 / 32      98% / 97%      81% / 75%
FAS       ENGL    142 / 123     142 / 122    100% / 99%     84% / 83%
FAS       FAAH    40 / 37       40 / 37      100% / 100%    84% / 82%
FAS       FREN    2 / 2         2 / 2        100% / 100%    91% / 76%
FAS       GEOL    21 / 20       21 / 20      100% / 100%    80% / 79%
FAS       HIST    18 / 17       15 / 16      83% / 94%      70% / 68%
FAS       MATH    122 / 121     122 / 121    100% / 100%    79% / 81%
FAS       MCOM    15 / 16       14 / 15      93% / 94%      70% / 67%
FAS       MEST    15 / 13       13 / 7       87% / 54%      71% / 46%
FAS       PHIL    45 / 45       45 / 44      100% / 98%     77% / 77%
FAS       PHYS    87 / 78       86 / 77      99% / 99%      79% / 82%
FAS       PSPA    55 / 50       54 / 48      98% / 96%      73% / 74%
FAS       PSYC    33 / 33       33 / 33      100% / 100%    85% / 78%
FAS       SOAN    26 / 27       26 / 25      100% / 93%     83% / 73%
FAS       STAT    22 / 18       22 / 18      100% / 100%    83% / 79%
FAS       UPEN    15 / 11       11 / 7       73% / 64%      68% / 56%
FAS       UPHU    2 / 1         1 / 1        50% / 100%     57% / 67%
FAS       UPIT    3 / 3         3 / 2        100% / 67%     74% / 54%
FAS       UPMA    4 / 2         3 / 2        75% / 100%     65% / 75%
FAS       UPSC    3 / 3         3 / 2        100% / 67%     81% / 61%
OSB       ACCT    22 / 18       22 / 18      100% / 100%    82% / 77%
OSB       BUSS    47 / 65       47 / 65      100% / 100%    80% / 85%
OSB       DCSN    14 / 20       14 / 20      100% / 100%    78% / 83%
OSB       ENTM    2 / 2         2 / 2        100% / 100%    78% / 80%
OSB       FINA    14 / 18       14 / 18      100% / 100%    78% / 77%
OSB       INFO    11 / 7        11 / 7       100% / 100%    89% / 78%
OSB       INFP    5 / 4         5 / 4        100% / 100%    53% / 64%
OSB       MFIN    7 / 8         7 / 8        100% / 100%    78% / 91%
OSB       MHRM    10 / 15       1 / 10       10% / 67%      31% / 75%
OSB       MKTG    14 / 15       13 / 15      93% / 100%     75% / 81%
OSB       MNGT    15 / 11       15 / 11      100% / 100%    83% / 77%
FEA       ARCH    30 / 33       30 / 33      100% / 100%    83% / 70%
FEA       CHEN    20 / 26       20 / 26      100% / 100%    88% / 77%
FEA       CIVE    67 / 70       67 / 70      100% / 100%    72% / 79%
FEA       EECE    89 / 109      88 / 108     99% / 99%      84% / 86%
FEA       ENMG    20 / 21       19 / 21      95% / 100%     73% / 77%
FEA       ENSC    2 / 4         2 / -        100% / -       53% / -
FEA       GRDS    21 / 22       16 / 22      76% / 100%     66% / 76%
FEA       MECH    110 / 110     108 / 109    98% / 99%      77% / 76%
FEA       URDS    2 / 1         2 / 1        100% / 100%    63% / 100%
FEA       URPL    3 / 3         2 / 3        67% / 100%     42% / 72%
FHS       ENHL    13 / 13       12 / 12      92% / 92%      73% / 76%
FHS       ENSC    1 / 1         1 / 1        100% / 100%    100% / 50%
FHS       EPHD    17 / 8        16 / 8       94% / 100%     80% / 74%
FHS       HMPD    9 / 8         9 / 8        100% / 100%    87% / 85%
FHS       HPCH    11 / 15       11 / 15      100% / 100%    76% / 81%
FHS       LABM    9 / 13        9 / 10       100% / 77%     82% / 56%
FHS       MLSP    5 / 3         5 / 3        100% / 100%    94% / 76%
FHS       PBHL    11 / 11       - / -        100% / -       69% / 75%
HSON      NURS    121 / 175     117 / 170    97% / 97%      73% / 88%

Results

Results, including comments, were reported electronically to each faculty member, department chair, and dean. In addition to item means, averages and percentiles were reported for the instructor, the course, and student learning outcome development. Category, faculty, and university percentiles/means were also reported for each item and for each subgroup. Percentiles were computed using all course sections, not only those with response rates of 40% or more, to be comparable to the previous year's statistics. Three additional reports were provided to the deans: one summarizing institutional performance on the 17 core items by faculty, another providing summary data for all departments within their faculty, and a third providing a summary for each department in the faculty. Department chairs also received a copy of their department summary.

Figures 1 and 2 present summary normative data for the ICE subscales for the University and per faculty for fall and spring 2013-14 in comparison with 2009-13 scores. As in previous administrations, students' evaluations of teachers (A) were, in general, higher (4.0-4.1) than their evaluations of courses (B) (3.9) and of learning outcomes (C) (3.8-3.9).
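The subscale scores discussed here (instructor A, course B, learning outcomes C, and additional items D) are composites of item means, and the normative percentiles are computed over all course sections rather than only those with response rates of 40% or more. The Python sketch below illustrates that logic under assumed inputs: the item-to-subscale assignment follows the 8/6/3 split of the 17 core items described earlier but the specific item numbers are hypothetical, and all values are invented rather than taken from ICE data.

```python
# Minimal sketch (assumed item mapping and invented numbers, not OIRA's code)
# of subscale means and a simple empirical percentile rank.
from statistics import mean

# Assumed grouping of the 17 core items into the A/B/C subscales (8 + 6 + 3).
SUBSCALES = {
    "A_instructor": list(range(1, 9)),   # items 1-8 (hypothetical numbering)
    "B_course": list(range(9, 15)),      # items 9-14
    "C_outcomes": list(range(15, 18)),   # items 15-17
}

def subscale_means(item_means):
    """item_means: dict mapping core-item number -> mean rating (1-5 scale)."""
    return {name: round(mean(item_means[i] for i in items), 2)
            for name, items in SUBSCALES.items()}

def percentile_rank(value, all_section_values):
    """Percent of course-section scores at or below `value`."""
    below_or_equal = sum(v <= value for v in all_section_values)
    return 100.0 * below_or_equal / len(all_section_values)

# Invented inputs: one section's item means, and the instructor-subscale means
# of all sections university-wide.
section_items = {i: 4.0 for i in range(1, 9)}
section_items.update({i: 3.8 for i in range(9, 15)})
section_items.update({i: 3.7 for i in range(15, 18)})
university_instructor_means = [3.4, 3.7, 3.9, 4.0, 4.1, 4.3, 4.5]

scores = subscale_means(section_items)
print(scores)
print(percentile_rank(scores["A_instructor"], university_instructor_means))
```

Read this way, a section at the university median for a subscale lands near the 50th percentile, which is also how the 25th/50th/75th percentile columns reported in Table 5 can be interpreted.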
ICE results this year show a rise from last year and are comparable to previous performance. Averages for fall and spring 2013-14 are: instructor (A) 4.0 and 4.0, course (B) 3.9 and 4.0, additional items (D) 3.9 and 4.0, and learning outcomes (C) 3.8 and 3.9. Ratings in spring are, in general, slightly higher than in fall. Four-year results show stability of ICE ratings on the subscales over time, except for last year, which should be taken as an exception because of the lower response rate and the change of administration format. With respect to instructor effectiveness by faculty (Figure 2A), mean scores ranged between 3.8 and 4.3 in the fall and spring. FEA reports the lowest rating of 3.8, while SNU reports the highest in both terms, with OSB catching up in spring. Most faculties show stability over the years, with FEA dropping in the last two years. With respect to course evaluations, ratings ranged between 3.7 and 4.3. SNU is leading and FEA has the lowest averages. Most faculties improved in spring and over the last two years.

Figure 1. Subscale Averages 2009-13: AUB ICE averages per subscale (A-D), by term, 2009-10 through 2013-14.

Figure 2A. Instructor Effectiveness by Faculty: ICE averages by faculty and term.

Figure 2B. Course Evaluation by Faculty: ICE averages by faculty and term.

As to learning outcomes by faculty, scores ranged from 3.7 to 4.3. FAS and SNU showed improvement, while FAFS and OSB showed stability, and FHS fluctuated. Additional items means ranged from 3.8 to 4.0. SNU witnessed a significant drop in spring 2012 and then improved, so its scores are fluctuating. Other faculties showed stability, except for FAS, which showed improvement. FEA did not have additional items in its ICE form.

Figure 2C. Learning Outcomes by Faculty: ICE averages by faculty and term.

Figure 2D. Additional Items by Faculty: ICE averages by faculty and term.

Table 4: Average of Overall Items by Faculty (N = course sections; item #8 = overall effectiveness of instructor; item #14 = overall course effectiveness; F = fall, S = spring)
Faculty   N F/S          Item #8 F/S    Item #14 F/S
FAFS      82 / 112       4.0 / 4.0      3.9 / 3.9
FAS       1212 / 1123    4.0 / 4.0      3.9 / 4.0
FEA       363 / 395      3.8 / 3.8      3.7 / 3.8
FHS       76 / 60        4.1 / 3.8      3.7 / 3.9
HSON      121 / 183      4.3 / 4.2      4.3 / 4.0
OSB       161 / 175      4.0 / 4.2      3.8 / 4.1
AUB       2015 / 2048    4.0 / 4.0      3.9 / 4.0

As to item #8, overall effectiveness of instructor, it averaged 4.0 across faculties in both terms; item #14, overall course effectiveness, averaged 3.9 in fall and 4.0 in spring. A breakdown of items 8 and 14 averages by faculty is reported in Table 4. For item 8, SNU has the highest average in both terms, with OSB catching up in spring, and FEA reports the lowest average, with FHS catching up in fall. The same interpretations apply to item 14 for both terms. Figure 3 presents the 4-year trend of the overall items. It shows an increase from 2009, then stability, with a drop in 2013 and a rise in 2014.
Figure 3. Trend Analysis for Overall Items: ICE averages for Overall Instructor and Overall Course, Fall 09 through Spring 14.

The Appendix presents item statistics by faculty and for the whole university. Table 5 presents subscale averages and their relevant quartiles per faculty and for the university. Faculties with the highest subscale average are highlighted in blue.

Table 5: Subscale Averages & Quartiles per Faculty & for University
N Mean F 83 1201 S 110 1106 F 4.0 4.0 S 4.1 4.0 F 3.7 3.7 Percentiles 50 S F S 3.6 4.1 4.2 3.7 4 4.1 1 74 161 1520 48 1 55 183 1455 93 4.2 3.8 3.9 3.9 4.4 4.6 4.1 4.1 4.0 4.1 4.2 3.5 3.6 3.7 4.2 4.6 3.9 3.8 3.7 3.7 4.2 3.8 4 4 4.5 4.6 4.1 4.1 4.1 4.3 4.2 4.2 4.2 4.3 4.8 4.6 4.4 4.4 4.4 4.6
AG AS EA HS NU SB 83 1201 360 74 103 161 1982 110 1106 391 55 160 183 2005 3.9 3.9 3.7 3.8 4.3 3.8 3.9 4.0 4.0 3.8 4.0 4.2 4.0 4.0 3.6 3.6 3.4 3.6 4 3.6 3.6 3.6 3.7 3.4 3.8 3.8 3.8 3.7 4 4 3.8 3.8 4.3 3.9 3.9 4.1 4.0 3.8 4.1 4.2 4.0 4.0 4.3 4.3 4.1 4.1 4.6 4.2 4.2 4.4 4.3 4.1 4.4 4.6 4.3 4.3
AG AS EA HS NU SB 83 1201 360 74 103 161 1982 83 1201 1 74 103 161 1623 110 1106 391 56 160 183 2006 110 1106 1 55 160 183 1615 4.0 4.0 3.8 3.9 4.3 4.0 4.0 3.9 3.8 4.5 3.7 4.3 3.8 3.8 4.0 4.1 3.8 4.1 4.2 4.2 4.0 3.9 3.9 4.8 3.9 4.1 4.0 3.9 3.8 3.7 3.5 3.6 4.1 3.8 3.7 3.6 3.4 4.5 3.4 4 3.5 3.4 3.7 3.7 3.5 3.9 3.7 3.9 3.7 3.4 3.5 4.8 3.6 3.7 3.6 3.5 4.1 4.1 3.9 3.9 4.4 4.1 4.1 3.9 3.8 4.5 3.8 4.4 3.8 3.9 4.2 4.1 3.9 4.2 4.3 4.2 4.1 4.2 3.9 4.8 4.0 4.2 4.1 4.0 4.4 4.4 4.2 4.2 4.7 4.4 4.4 4.3 4.2 4.5 4.2 4.7 4.2 4.2 4.5 4.5 4.2 4.5 4.6 4.5 4.4 4.4 4.3 4.8 4.4 4.5 4.4 4.4
25 Additional Items AG AS EA HS SB AUB Clinical Items Course Evaluation AUB Instructor Evaluation AUB Learning Outcomes AUB NU AG AS EA HS NU SB 75 F 4.4 4.3 S 4.5 4.4

Table 6 presents subscale means by category of courses in every faculty. The lowest (red font) and highest (blue font) categories within each faculty are highlighted to facilitate comparison for improvement.
Table 6: Subscale Means per Category per Faculty
Faculty FAFS FAFS FAFS FAFS FAFS FAFS FAFS FAFS FAS FAS FAS FAS FAS FAS FEA FEA FHS FHS FHS FHS FHS FHS FHS HSON HSON HSON OSB OSB OSB OSB OSB OSB
CATEGORY AIII Graduate Lecture Lab Teaching Large Lecture Large Lecture & Lab Seminar Small Lecture Blended Learning Education-Method Education-NonMethod Humanities Sciences Social Sciences Blended Learning FEA Blended Learning Graduate Lecture Discussion Lecture Discussion Lecture + Assignment Lecture Lecture + Assignment Lecture + Lab Blended Learning Blended Learning HSON FORM A HSON FORM B ACCT BUSS FINA MKTG MNGT OPIM
Instructor Effectiveness / Course Effectiveness / Learning Outcomes (F S F S F S) 3.9 3.7 3.6 3.5 3.7 3.7 4.1 4.2 4.0 4.1 3.9 4.0 4.8 4.4 4.6 4.3 4.5 4.0 4.1 4.1 4 4.0 4.0 3.8 3.8 3.6 3.8 3.5 4.1 4.3 4.3 4.3 4.2 4.2 4.1 4.0 4.0 4.0 4.1 4.2 4.5 4.5 4.4 4.5 4.4
COUNT F S 12 11 15 13 14 3 22 14 26 16 6 4 15 22 2 11 7 7 39 377 565 213 4 360 1 26 335 537 201 17 391 4 2 12 18 18 17 10 18 18 55 48 22 57 21 14 17 30 3.9 4.1 3.9 4.1 4 3.8 3.3 4.0 4.1 4.0 4.1 4.2 3.8 4.0 4.4 3.9 24 16 7 8 5 15 67 93 18 80 26 15 13 31 11 3.9 4.0 3.9 4 4.0 4.1 3.9 4 3.8 3.9 3.6 4.0 3.7 3.8 4.5 3.9 4.8 3.8 3.9 4.1 4.2 3.9 3.8 4.0 4.1 3.9 3.9 3.9 4.1 3.6 4.0 3.2 4.1 4.1 4.2 4.5 4.0 4.0 4.1 4.0 4.1 4.0 4.2 4.2 4.1 4.5 4.2 4.2 4.4 4.3 4.1 4.1 4.0 4.0 3.9 3.1 4.0 4.0 3.8 2.9 4.0 4.0 4.2 4.3 3.9 3.8 3.9 3.9 3.9 3.8 4.2 4.2 4.3 4.1 4.1 4.0 3.9 3.9 4.3 4.4 3.7 3.8 3.9 4.0 3.9 3.7 4.1 4.1 4.2 4.1 4.1 4.0 3.8 3.6

Conclusion: Accomplishments and Areas of Improvement

ICE results show stability, with a slight improvement this spring compared to last fall. Response rates are increasing, and the whole process is being taken more seriously. The fall and spring administrations went smoothly, as we have become more organized and were able to anticipate problems ahead of time. The on-line administration was a success, reducing administrative and paper work, and the grade hold resulted in an increased response rate averaging around 80%. Faculty were provided with the option to view and monitor response rates in their respective sections, students were sent reminders, and periodic reports of progress in response rates were posted on the OIRA website and shared by email. The reports generated followed the format adopted last fall, and faculty members were provided with an interpretive guide. In addition, summary institutional, faculty, and departmental reports were issued to deans and department chairs. These summary reports were also published on the OIRA website for review by faculty and students, a step that provides evidence that the evaluations are taken seriously by faculty and by the administration. Procedures to produce the ICE reports were improved by automating most stages of the process and of report production. The building up of an ICE database enables us to produce trend reports by teacher, course, or department and/or by item. These reports are now available.

Despite the above accomplishments, several problems were encountered that we hope can be overcome in future administrations:

1. The problem of getting up-to-date, accurate information regarding courses/sections offered and their enrollment has improved, though it still exists in some faculties.
OIRA is now better provided with up-to-date information on courses with labs and lectures taught by different instructors, on courses taught by more than one instructor, and on sections that need to be combined. We obtain the needed information directly from the deans' offices; however, these offices also do not always have the most up-to-date information, especially with regard to part-timers who have not yet signed their contracts.

2. The on-line ICE makes use of the Banner survey module, which is not user friendly: it presents one item per page, which is time consuming with a long survey. In addition, Banner is not compatible with smart devices, which reduces student motivation to respond. Purchasing new software is being considered to overcome these problems.

Appendix: ICE Item Averages by Faculty
1 Faculty FAFS FAS OSB FEA FHS HSON F 4.12 4.14 4.15 3.95 3.99 4.33 4.11 2 S 4.13 4.16 4.25 3.92 4.19 4.22 4.12
6 Faculty F FAFS 3.87 FAS 3.87 OSB 3.84 FEA 3.61 FHS 3.68 HSON 4.21 3.83 F 3.8 3.92 3.83 3.63 3.72 4.21 3.86 S F 4.22 4.09 4.26 4.05 4.39 4.03 4.06 3.83 4.40 3.89 4.24 4.32 4.23 4.02 7 S F 3.87 3.91 3.92 3.91 4.02 3.95 3.64 3.73 3.99 3.8 4.13 4.31 3.89 3.9
12 Faculty FAFS FAS OSB FEA FHS HSON 3 F 4.29 4.25 4.27 4.13 4.18 4.39 4.24 4 S F 4.10 4.07 4.10 4.05 4.17 4.03 3.85 3.8 4.11 3.86 4.19 4.32 4.07 4.01 8 S F 3.94 4 3.98 3.99 4.18 4.01 3.75 3.8 4.03 3.8 4.18 4.31 3.97 3.97 13 S F 3.91 3.87 4.01 3.9 4.01 3.85 3.72 3.66 3.98 3.76 4.18 4.17 3.96 3.86
9 S F 4.03 3.94 4.03 3.94 4.15 3.85 3.80 3.75 4.06 3.87 4.18 4.29 4.01 3.91 14 S F 3.95 3.92 3.97 3.88 4.01 3.83 3.72 3.67 4.03 3.73 4.14 4.27 3.94 3.86 5 S F 4.07 3.89 4.09 3.85 4.19 3.81 3.81 3.61 4.11 3.68 4.22 4.2 4.05 3.82 S 3.93 3.91 4.03 3.68 3.93 4.09 3.89
10 S F 3.99 3.98 3.99 3.99 4.04 3.91 3.78 3.83 4.07 3.91 4.19 4.28 3.97 3.97 15 S F 3.93 3.86 3.95 3.71 4.03 3.68 3.75 3.95 3.68 4.14 4.3 3.93 3.75 11 S 4.03 4.04 4.11 3.86 4.10 4.22 4.03 16 F 3.86 3.88 3.77 3.61 3.67 4.24 3.83 S 3.92 3.96 4.00 3.68 3.92 4.13 3.92 S F 3.86 4.02 3.82 3.88 3.93 3.91 17 S F S 4.01 3.76 3.86 3.94 3.72 3.83 4.11 3.75 3.94 3.86 4.12 3.87 4.07 4.17 3.99 3.94 4.41 3.93 3.56 4.31 3.75 3.87 4.10 3.87