Draft changes to the Faculty Handbook associated with changing the student evaluation system from SPOT to IDEA

Faculty Senate Evaluation Committee
DRAFT
March 27, 2013
IV. D. 5. Applications for RTP: Instructions

b.i.3.II. Supporting Documentation

I. The supporting documentation normally can be presented in one, or perhaps two, three-ring binders. It is not necessary or desirable to submit boxes of voluminous files. Supporting documentation must include:

A. all of the candidate's IDEA student evaluation reports over at least the most recent two-and-one-half years, organized in reverse chronological order.

B. all refereed publications published since the candidate was appointed at the present rank. A "selected" sub-collection of those publications is not sufficient, nor is a collection of photocopies of title pages or tables of contents. Copies of publications will be returned to the candidate when the review process is completed.

A table of contents or explanation of the organization of the supporting documentation is helpful.

b.ii. Instructions to the Administrator Responsible for Making the Initial Recommendation

1. No later than four weeks prior to forwarding an application, for any SPOT results that may be required or requested for the application, request from the Office of Institutional Research a "Report of SPOT Question 16 by semester" for each RTP candidate, in time to include that report with Section IV of the application.

IV. H. 3. Teaching Evaluations

Because numerous studies have indicated that both peer and student evaluations are necessary for the equitable evaluation of teaching effectiveness, it is strongly suggested that peer and student evaluations be given similar emphasis in personnel recommendations.
a. Peer Evaluation

A variety of methods of peer evaluation are in use throughout the campus. There is no single instrument for peer evaluation, but typically faculty are requested to include in their review materials syllabi, course tests and examinations, statements about new courses developed, and results of student evaluations. For new and non-tenured faculty and graduate teaching assistants, peer review includes direct observation of classroom teaching.

b. Student Evaluation

Every faculty member is evaluated by students every semester in all courses (including summer courses) using an instrument developed by the IDEA Center, a non-profit organization that has provided validity-tested student rating services to colleges and universities since 1975. The central focus of the evaluation is to provide information on students' progress with respect to the learning objectives chosen by the faculty member. The use of the IDEA instrument is mandatory, but the instrument is flexible enough to allow departments to add supplementary questions. The IDEA Center provides normed summary ratings adjusted for five factors beyond the direct control of the instructor: work habits, course motivation, size of class, course difficulty, and student effort. Also provided are research-based insights on actions that might be taken to improve teaching. These reports are transmitted to the faculty member and department chairperson (or appropriate supervisor), together with a comparison to the IDEA Center's national database. Student evaluations of teaching are considered, along with other measures and instruments of evaluation, in annual evaluation; in evaluation for reappointment, promotion, and tenure; and in post-tenure review. The reliability of data gathered by way of student evaluation instruments depends, in part, on the establishment of a set of common practices for administration and use. UNC Wilmington has time-tested procedures that coincide with those recommended by the IDEA Center. See the section Guidelines for the administration of student evaluation instruments.

IV. H. 4. Guidelines for the administration of student evaluation instruments

The reliability of data gathered by way of student evaluation instruments depends, in part, on the establishment of a set of common practices for administration and use. Guidelines for the administration and use of the IDEA student rating instrument follow.

a. Administration
i. The IDEA instrument is administered online.

ii. Administration of the instrument shall ordinarily be conducted during the last ten class days of the semester (last five class days in a summer session). Administration at other times will be determined by agreement of the dean, the department chairperson, and the faculty member.

iii. The IDEA instrument shall ordinarily be used by all instructors in all courses every semester, including summer sessions. Recognizing, however, that some courses rely heavily on specialized, non-classroom learning experiences (e.g., field-based, hospital-based, laboratory-based, or performance-based courses), exceptions may be established at the departmental level by mutual consent of a faculty member and the appropriate chairperson. In such cases, some method of student evaluation shall be implemented by the department chairperson. Exceptions may also be established by such mutual consent for courses with enrollment less than five (5).

iv. Should departments wish to use additional evaluation instruments, these departmental instruments shall be administered after the administration of IDEA.

v. If the IDEA instrument is administered in a classroom setting with desktop computers or mobile devices, the instructor shall leave the classroom and its vicinity.
b. Analysis and reporting

i. All analysis shall be completed by the IDEA Center, and reports returned in a timely manner through the appropriate administrative channels to the faculty member.

ii. Every personnel action recommendation for reappointment, promotion, tenure, or post-tenure review should contain all IDEA reports from the most recent two-and-one-half years.*

iii. All RTP recommendations shall include a qualitative interpretation of student evaluation results by the department chairperson, and may include, at the individual's discretion, the individual's own qualitative interpretation.

iv. All statistical calculations and quantitative analysis processed by anyone other than the IDEA Center (which is discouraged) must be clearly identified as such.
*Some reports may also require (or include) courses evaluated by the previous SPOT student evaluation instrument. Until two-and-one-half years from the Fall 2013 implementation of IDEA, a summary, in a standard format, of the individual's SPOT results for Q16 (at least) must be combined with the IDEA reports, to cover the most recent two-and-one-half years.
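For illustration (hypothetical timing, not prescribed by the handbook): a dossier assembled in Fall 2015 would include IDEA reports for Fall 2013 through Spring 2015, together with a SPOT Q16 summary for Spring 2013, so that the combined record reaches back the required two-and-one-half years.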
c. Guidelines for appropriate use of student evaluation results

i. Data from individual faculty gathered through the use of student evaluations shall be treated with confidentiality. The data may, however, be legitimately used by anyone directly involved with evaluation for the purpose of reappointment, promotion, tenure, post-tenure review, or annual departmental review. Release of the data to anyone else is prohibited without the written permission of the faculty member.

ii. Because numerous studies have indicated that both peer and student evaluations are necessary for the equitable evaluation of teaching effectiveness, it is strongly suggested that peer and student evaluations be given similar emphasis in personnel recommendations.

iii. The Evaluation Committee (Senate Bylaws, Article V.B.1) of the Faculty Senate is charged with regularly reviewing both student and peer evaluation procedures, and with reporting and making recommendations for improvement to the Senate.

iv. In the case of a formal appeal of a reappointment, promotion, tenure, or post-tenure-review recommendation, all parties involved directly in the appeal process shall be allowed access to the archived data pertinent to that case.

v. Individual student evaluation results, when combined with qualitative interpretation by the department chairperson and with peer evaluations of teaching, can contribute to measuring an individual's teaching effectiveness and to identification of areas of strength and areas where improvement is possible. Under those conditions, student evaluation results are appropriately used for annual merit evaluation summaries, consideration for salary raises, RTP, and post-tenure-review decisions.
d. Warnings against inappropriate use of student evaluation results

i. Averages (raw or adjusted) from the IDEA reports must not be "averaged" to produce a "combined IDEA score."

ii. Averages (raw or adjusted) for two or more courses must not be averaged to obtain a "semester IDEA score" for an individual.

iii. Averaging IDEA scores from several different courses across several semesters to obtain an "overall individual IDEA score" is improper.

iv. Direct comparisons of ratings from the version of SPOT used from fall 1992 through summer 2004 to ratings from the revised version implemented in fall 2004, or to ratings from the IDEA system implemented in fall 2013, are not appropriate.
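The arithmetic behind these warnings can be shown with a small example (hypothetical enrollments and scores, not drawn from the handbook): a simple average of section averages ignores how many students stand behind each figure. Suppose a 10-student section has an average rating of 4.5 and a 100-student section an average of 3.5. Then

\[
\frac{4.5 + 3.5}{2} = 4.0,
\qquad \text{whereas} \qquad
\frac{(10)(4.5) + (100)(3.5)}{10 + 100} \approx 3.59 .
\]

The naive "combined score" of 4.0 overweights the small section by roughly 0.4 of a rating point; and because IDEA's adjusted averages are normed to each course's own context (class size, course difficulty, and so on), even the enrollment-weighted figure has no valid interpretation across courses.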
V. A. 1. h. Teaching Evaluations

Because numerous studies have indicated that both peer and student evaluations are necessary for the equitable evaluation of teaching effectiveness, it is strongly suggested that peer and student evaluations be given similar emphasis in personnel recommendations.

i. Peer Evaluation

A variety of methods of peer evaluation are in use throughout the campus. There is no single instrument for peer evaluation, but typically faculty are requested to include in their review materials syllabi, course tests and examinations, statements about new courses developed, and results of student evaluations. For new and non-tenured faculty and graduate teaching assistants, peer review includes direct observation of classroom teaching.

ii. Student Evaluation

Every faculty member is evaluated by students every semester in all courses (including summer courses) using an instrument developed by the IDEA Center, a non-profit organization that has provided validity-tested student rating services to colleges and universities since 1975. The use of the IDEA instrument is mandatory, but the instrument is flexible enough to allow departments to add supplementary questions. The IDEA Center provides normed summary ratings adjusted for five factors beyond the direct control of the instructor: work habits, course motivation, size of class, course difficulty, and student effort. Also provided are research-based insights on actions that might be taken to improve teaching. These reports are transmitted to the faculty member and department chairperson (or appropriate supervisor), together with a comparison to the IDEA Center's national database. Student evaluations of teaching are considered, along with other measures and instruments of evaluation, in annual evaluation; in evaluation for reappointment, promotion, and tenure; and in post-tenure review.
The reliability of data gathered by way of student evaluation instruments depends, in part, on the establishment of a set of common practices for administration and use. UNC Wilmington has time-tested procedures that coincide with those recommended by the IDEA Center. See the section Guidelines for the administration of student evaluation instruments.
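For readers unfamiliar with adjusted ratings, the following sketch shows the general form such an adjustment can take (illustrative only; the IDEA Center's actual statistical model is not specified in this document, and all symbols here are hypothetical):

\[
\hat{y}_{\mathrm{adj}} \;=\; \bar{y}_{\mathrm{raw}} \;-\; \sum_{k=1}^{5} \hat{\beta}_k \,\bigl( x_k - \bar{x}_k \bigr),
\]

where \(\bar{y}_{\mathrm{raw}}\) is a section's raw average rating, the \(x_k\) are section-level measures of the five factors listed above (work habits, course motivation, size of class, course difficulty, student effort), the \(\bar{x}_k\) are the corresponding means in the national database, and the \(\hat{\beta}_k\) are coefficients estimated from that database. The effect of an adjustment of this kind is to remove from the raw average the portion attributable to factors beyond the instructor's control before any comparison is made.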