SETC: Building a better tool for student evaluations
Dai Heide and Panayiotis Pappas, FASS Teaching Fellows
April 12, 2016

Agenda
(With thanks to Corinne Pitre-Hayes for material and advice)
1. Introduction
2. SETC project background
   i. Rationale
   ii. Values
   iii. Framework
   iv. Institution-wide questions
   v. Faculty of Science questions
3. SETC in FASS
   i. Timeline and process
   ii. Workshop #1: Type of questions
   iii. Workshop #2: Drafting questions
4. Tips for Departmental Questions
5. Complementing SETC

Introduction
• Evaluation forms at SFU were first developed 30 years ago
• Recommendations for changes to the commonly used forms were made by SCUTL and TFTL
• During consultation it became clear that the context of evaluation should also be reconsidered
• The strong commitment to teaching and learning at SFU needs to be more clearly reflected in the evaluation instrument and processes adopted going forward

What is the rationale for the project?
• Current research indicates that departments and instructors need to be able to customize the evaluation form
• The research also indicates:
  – Questions should focus on teaching and courses rather than on faculty characteristics (humour, warmth, etc.)
  – Students should be given a clearer grasp of pedagogical designs, intentions, and learning objectives so that they are better equipped to provide useful and informed feedback
  – Administrators, departments, and faculty members should be provided with guidelines for effective and responsible interpretation and use of the evaluation data, taking into account issues of bias, validity, and reliability

Key research findings to build on
• Importance of recognizing the complexity and multidimensionality of teaching
• Concerns regarding validity and reliability:
  – Developing appropriate questions for the evaluation instrument(s)
  – Interpreting the data to reduce "noise" that affects validity and reliability
• Multiple data sources are required to make valid judgments about overall teaching effectiveness
• SETC is one source, but no single source is sufficient

Input from the SFU community and beyond
• The initial phases of the Student Evaluation of Teaching and Courses (SETC) project ran from 2012 to 2013
• [Timeline graphic: Summer–Fall 2012; Fall 2012 – Spring 2013; Summer 2013; January 2014]

The goal of implementation
• The recommendations were in seven key areas:
  1. Validity
  2. Flexibility
  3. Responsible use of the data
  4. Use of evaluation data to improve teaching
  5. Efficiency
  6. Engagement
  7. Structured support
• A SETC Working Group was formed by SCUTL to identify institution-wide teaching and learning priorities and to review policy

The framework: fewer than 24 questions in total
• [Framework diagram; source: University of Toronto]

Implementation approach
• [Implementation timeline: Summer 2014 – Spring 2015]

Institution-Wide Questions
1. The course instructor explained course concepts clearly.
2. The course instructor identified difficult areas when explaining course concepts.
3. The course instructor created a respectful learning environment.
4. The course instructor was approachable when students asked for guidance.
5. The course instructor explained grading criteria clearly.

Institution-Wide Questions (cont'd)
6. The assessments in this course (tests, assignments, essays, etc.) allowed me to demonstrate my understanding of the course content.
7. Course materials (textbook, readings, handouts, assignments, etc.) improved my understanding of the course content.
8. Course activities (lectures, discussions, group work, labs, etc.) were engaging.
9. The different course parts/activities (lectures, labs, tutorials, online forums, discussions) were connected.

Faculty of Science Questions
1. The course instructor's feedback on course assignments, projects, tests, and/or papers provided guidance on how to improve my performance in the course.
2. The course instructor related course concepts to practical applications, current issues, or real-life situations.
3. The course instructor demonstrated an interest in student understanding when explaining course concepts.
4. The course instructor encouraged students to draw knowledge from other courses to understand course material.

Timeline for our decision
• April 12: Town hall provides input to the TFs for drafting the set of questions
• May 9: TFs distribute a draft set of questions and request feedback
• June 1: FASS finalizes the set of questions and publishes it
• July 15: Departments submit their set of questions
• Instructors choose questions in the 7th week of the Fall semester

Workshop #1: What type of questions?
• Instructor only?
• Course only?
• A mix (2 plus 2)?

Workshop #2: Draft, Course only
• The course expanded my understanding of important issues in the subject matter.
• The course inspired me to learn more about the subject matter.
• Course assignments, projects, tests, and/or papers helped me to develop skills I can use in other courses.
• Compared to other courses, the workload for this course was: (very light → very heavy)

Workshop #2: Draft, Instructor only
• The course instructor encouraged students to ask questions about the course material.
• The course instructor was enthusiastic about the course material.
• The course instructor was receptive to different perspectives in class.
• The course instructor identified difficult areas when explaining course concepts.

Workshop #2: Draft, Mixed
• The course expanded my understanding of important issues in the subject matter.
• Course assignments, projects, tests, and/or papers helped me to develop skills I can use in other courses.
• The course instructor was enthusiastic about the course material.
• The course instructor identified difficult areas when explaining course concepts.

Workshop #2: Creating our own question(s)
• We can also create our own question(s)
• If there is a strong preference for a question that is not available in the databank, we can formulate it with the help of TLC and IRP

FAQs
• How do we access the question databank?
  – The document will be made available to members of committees. It belongs to UofT and is not for public dissemination.
• Who handles questions?
  – You can ask the TFs, or Corinne.
• Who owns the questions?
  – The questions are owned at the local level.
  – Data is stored on the vendor's servers in Montreal under a FIPPA-compliant agreement.
  – Ownership rests with SFU.
  – Data is currently stored indefinitely.
Tips for Departmental Questions
• Try to set up a small ad hoc committee to drive the development of questions
• Leverage existing groups and existing channels for decision-making
• Focus on your teaching and learning priorities first
• Take advantage of the UofT questions, and of questions developed by other SFU units – see if any of them match your T&L priorities
• Ask for help – we are happy to assist in any way that's useful to you; IRP has great expertise in developing survey questions

Tips for Departmental Questions (cont'd)
• Ask yourselves: "Do these questions really apply to ALL courses in my department?" If a question doesn't seem to apply, it may belong at the instructor/course level
• Decide what to do about open-ended questions
• Don't generate data you are not committed to understanding
• Don't double dip
• SETC is not the place for assessing Educational Goals

Complementing SETC
• The evaluations are only one perspective on teaching and course design
• What other practices can provide a more complete picture?
• IRP plans to build a model once enough data has been generated
• Phase two of the working group: a best-practices guide (at SFU and beyond)
• Course design question
• Self-reflection opportunity

Future Considerations
• Understanding and using the survey tool
• Reading and understanding the reports
• Achieving high response rates