Experiences on Course Development Practices in a Power System Analysis Master Course
a.k.a. tales of a journey through the valley of death in providing a high-quality learning environment
Assoc. Prof. Dr. Ing. Luigi Vanfretti and Maxime Baudette, KTH SmarTS Lab, luigiv@kth.se
KTH Scholarship of Teaching and Learning

Acknowledgment
Many former students have been involved in this effort, through teaching assistant duties and beyond, and we cannot list them all here – but you know who you are, and we thank you very much! A very special thanks goes to Mostafa Farrokhabadi, who was the first student to help with the research work related to this talk, until Maxime Baudette got involved.

Outline
• Motivation
• Constructive Alignment Theory
• Implementation: T&L activities, assessment activities
• Evaluation of the implementation: feedback, R-SPQ-2F, ranking algorithm
• Filtering feedback: repertory grid
• Introducing peer instruction with clickers

EG2100 in Brief
Introductory course on Power System Analysis
• 6 ECTS (7.5 ECTS before 2014)
• Given over 2 periods (September to December)
• About 85 students
The course is divided into the following main topics:
• Modern Power Systems with an Introduction to Sustainable Energy Technologies and Smart Grids
• Fundamental Principles for Power System Analysis: AC Circuits
• Electrical Modeling of Generators, Transmission Apparatus and Networks
• Methods for Analysis and Design of Power Networks in Steady State and Unbalanced Operation
• Steady State Stability Analysis using the Power-Flow Formulation
• Methods for Analysis of Power Distribution Systems in Steady State

Motivation
“I never teach my pupils, I only attempt to provide the conditions in which they can learn.” – A. Einstein
The previous course (until 2010), which this one replaced, had several drawbacks in its pedagogic design and material:
• The course material was outdated, inconsistent and lacked appeal.
• No teaching method was used to encourage the learning process.
The failure rate of the course was very high.
These drawbacks required a complete overhaul of the course: from preparing new material to designing a new implementation based on explicit teaching methods.

Students’ Background – How to address a large variety of backgrounds?
Students arrive with very different backgrounds and often lack any prior exposure to power systems:
• Mandatory for the M.Sc. Program in Electrical Power Engineering (TELPM) and the M.Sc. Joint Program in Smart Electrical Networks and Systems (SENSE)
• Elective in other M.Sc. programs at the School of Electrical Engineering and some Erasmus Mundus M.Sc. programs
• Also offered to power and energy professionals within industry collaborations and to PhD students (including those from other universities)
This variety in the students’ backgrounds makes it challenging to design well-aligned and adequate T&L activities and proper assessment methods.
Redesign of the Course – Goals
In 2011 it was decided to re-design the course to achieve the following goals:
• Increase the success rate
• Substantially increase the depth, and moderately extend the coverage, of the course content
• Link the course to current developments in industry and academia, including the “Smart Grid” concept

Re-Design Approach
• Implement constructive alignment theory (CAT) as the didactic teaching approach of choice for the course
• Adopt the consensus-oriented decision-making (CODM) model to design the course
• Develop effective multiple feedback channels to gather the students’ perceptions and select relevant feedback

Constructive Alignment – Basic Principles (Biggs and Tang)
Basic principle: the teacher’s fundamental task is to get students to engage in learning activities that are likely to result in their achieving the desired outcomes in a reasonably effective manner. What the student does is more important in determining what is learned than what the teacher does.
Constructive Alignment Theory:
• Provides a set of principles
• These principles can be used to devise Teaching and Learning Activities (T&LAs)
• The activities can help in achieving the Intended Learning Outcomes (ILOs)

Using Principles of Constructive Alignment
The alignment triangle relates three elements, each guided by a question:
• Course Objectives (Intended Learning Outcomes): what should the student be able to do as a result of the course?
• Teaching and Learning Activities: what work is appropriate for students to do to reach the ILOs?
• Assessment Activities: what should the students do to demonstrate that they reached the objectives?
Using the principles of constructive alignment:
• Start by carefully aligning T&LAs and assessments.
• These activities should support the students in fulfilling the ILOs.
• The students’ role (MAJOR): the students use the activities to construct their knowledge and achieve the desired outcomes.
• The teacher’s role (minor): to design a learning environment that encourages the students to perform the T&LAs that help them construct their knowledge.

Variety of Background – Affects the Proper Implementation of CAT
It affects the students’ readiness to engage in the Teaching and Learning Activities. Diagnostic test results show a low average in the students’ preparation; fundamental knowledge varies a lot.
Diagnostic test results (% of correct answers per question):
• Q1–Q5: 99, 40, 90, 70, 53
• Q6–Q10: 84, 59, 62, 69, 27
• Q11–Q15: 32, 84, 69, 53, 77
• Q16–Q20: 90, 67, 63, 67, 74
• Q21–Q23: 44, 68, 51
• Total average: 64.8

CODM – Design Process (Hartnett)
The design process links the design group to the consensus model. It was carried out in five 2-hour sessions (10 hrs) and allows the group to deliver a shared proposal:
• Important if the design group will also be the implementation group (commitment is needed).
The CODM stages are:
• Framing the Problem
• Open Discussion
• Identifying Underlying Concerns
• Developing Proposals
• Choosing a Direction
• Developing a Preferred Solution
• Closing

Design Rule for CAT Implementation
[Cause-and-effect (fishbone) diagram for the written test: the effect “Effective Support of Course ILOs” is traced back to causes such as a well-designed test with a proper choice and mix of question types, alignment with the daily and weekly exercises and the lectures, proper use of time, staff preparation for the T&L activity (test preparation, manageable groups of students, teaching assistants, monitoring, in-class test-solution delivery), student preparation and readiness (learning from one’s own and others’ mistakes, issues raised by others), and the assessment method (peer assessment with clear instructions and fair grading, followed by self-reflection on the peer assessment).]
Example of using the consensus-based model – Step 1 (Framing the problem): the design rule used for one of the T&L activities, obtained through cause-and-effect analysis.

T&L Activities and Course Objectives (Intended Learning Outcomes)
• 1: Lectures (22), Invited Lectures (4)
• 2: Reading Quiz, Conceptual Quiz [1], [2]
• 3: Homeworks (3) [3], [4]
• 4: Tests (2) [5]
• 5: Final Exam
[Diagram aligning the Teaching and Learning Activities and the Assessment activities with the Course Objectives (ILOs).]

T&L Activities – Lectures
Lectures are intended to develop conceptual understanding.

T&L Activities – Invited Lectures
Invited lectures are intended to develop the interest of the students by providing insights from industry and from top researchers in academia.
Sample of invited lectures:
• Magnus Danielsson, System Design, R&D, Net Insight AB, Stockholm, Sweden – lecture on communication issues in Smart Grid applications
• Svein Harald Olsen, Statnett SF, Oslo, Norway – 2 lectures on IEC CIM
• Sonja Berlijn, SVP R&D, Statnett SF, Oslo, Norway – lectures on active R&D projects (Lean Line + voltage upgrading)
• David Petesch / Yannick Fillon, RTE – National Center for Grid Expertise, Paris La Défense, France – lecture on EMTP and real-time simulation at RTE
• Prof. Federico Milano, UCD – challenges for power system modelling and simulation

T&L Activities – Daily Reading Quiz and Conceptual Quiz (Peer Instruction Implementation)
In total, they are worth 10 BONUS points of the final grade.
The Reading Quiz consists of a few very basic questions about the content of the lecture:
• The questions are designed so that a student must go through the lecture content before arriving at the lecture!
Conceptual Quizzes ask about the lecture content during the lecture:
• The questions address different learning levels, focusing on developing deep learning. They sometimes involve a small amount of numerical calculation.
• The students answer the questions using a remote clicker.

T&L Activities – Homeworks
For the students to practice the methods and study the concepts covered during the lectures (partly in the classroom, partly at home).
There are 3 homeworks (30 points) covering the content learned in the respective lectures.
The homeworks are aimed at preparing the students for the assessment tasks (tests and final exam). In 2014, computer-assisted assignments were introduced.

Assessment – Tests and Final Exam
• 2 tests during the semester
• Final exam
The tests and the exam were built in two parts, with theory questions in the first part and problems in the second part. The problems were similar to those solved in the homeworks.
The assessment of the course is done through the different graded activities of the course. All the grades are added to give a total grade out of 100 points.

Course Activity   Points
Daily Quiz        10 (bonus)
Homeworks         30
Tests             30
Final Exam        40
Total             100 (+10 bonus points)

Total Points   Grade Letter
94 – 100       A
87 – 93        B
81 – 87        C
74 – 80        D
65 – 73        E
60 – 64        Fx
00 – 59        F

Evaluation of the Implementation
Three course evaluations:
• One after each test (end of P1 and end of P2)
• One after the final exam
R-SPQ-2F evaluation (at the beginning of the course):
• Two different question types (surface and deep approach)
• Students are scored based on their answers
• The results reveal the students’ learning approach
Interviews using the Repertory Grid Technique:
• A form of structured interview to find out a participant’s preferences on a given topic and the way these preferences are ordered on a rating scale
• Allows linking qualitative feedback to quantitative feedback
[Screenshots: explanation on KTH Social, instructions on the form, and sample quantitative and qualitative feedback gathered using Google Forms.]

Course Evaluations 2011–2014
After 2011, the form was improved by including an introductory part with instructions, explaining to the students the grading scale they should use.

Course Evaluations 2012–2014

Course Evaluations – Year-on-Year Improvements
The feedback provided by students who retook the course shows that the changes improved the course. The acceptance of the changes comes after the students realize that they have a positive impact on their learning.
Note: very small population for statistical analysis.

Sample Question
“Is the new course structure (lectures, daily exercises, weekly exercises, and weekly tests) preferable to the previous course structure (lectures, assignments, and final examination)?”
[Bar charts of the answers on a 1–5 scale for Evaluation 1, Evaluation 2 and the Final Evaluation.]

Activity No. 1 – Design of a Course Evaluation Process
Instructions: Discuss the following questions in small groups (no larger than 3). Using your answers, propose a design to evaluate a “sample” course.
What feedback do you need for your particular course?
• T&L activities: do they contribute to the students’ understanding? Do the T&LAs have an impact on the assessment activities?
In what form do you want it, and how much?
• Quantitative, qualitative, or both?
How do you gather it in an efficient way?
• What tools do you use?
How often do you gather it?
• Number of times and type of feedback to be gathered – (a) feedback that allows tracing changes, (b) new information (originating from the changes).
How do you use it?
• How do you show the students that their opinion is considered?

The Revised Two-Factor Study Process Questionnaire: R-SPQ-2F
Allows determining the students’ learning approach. The process is, however, exposed to some drawbacks!
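As a concrete illustration, the sketch below shows how the R-SPQ-2F answers can be scored per student and how the class can then be ranked by learning approach. This is a minimal sketch, not the exact algorithm used in the course: the item-to-subscale mapping follows the commonly reported scoring key of the questionnaire and should be checked against the version actually handed out, and the ordering rule (deep-minus-surface margin, with course points as tie-breaker) is an assumption made for illustration.

```python
# Minimal sketch (assumptions noted in comments) of R-SPQ-2F scoring and of
# ranking students by learning approach; it is NOT the exact EG2100 algorithm.
from dataclasses import dataclass

# Assumed scoring key: 10 deep-approach (DA) items and 10 surface-approach (SA)
# items, each answered on a 1-5 Likert scale, so each subscale spans 10-50.
DEEP_ITEMS = (1, 2, 5, 6, 9, 10, 13, 14, 17, 18)
SURFACE_ITEMS = (3, 4, 7, 8, 11, 12, 15, 16, 19, 20)

@dataclass
class StudentResult:
    student_id: str
    answers: dict         # item number -> Likert response (1..5)
    course_points: float  # total course points, used for the grade rank

def subscale_scores(answers):
    """Return the (DA, SA) scores, each between 10 and 50."""
    da = sum(answers[i] for i in DEEP_ITEMS)
    sa = sum(answers[i] for i in SURFACE_ITEMS)
    return da, sa

def rank_by_approach(results):
    """Order students from 'deepest' to 'most surface-oriented'.

    The ordering rule (DA-SA margin first, course points as tie-breaker) is an
    illustrative assumption; the course used its own ranking that correlates
    the grade rank with the DA and SA scores.
    """
    rows = [(r.student_id, *subscale_scores(r.answers), r.course_points)
            for r in results]
    rows.sort(key=lambda row: (row[1] - row[2], row[3]), reverse=True)
    return rows

# Example: a student answering 4 on every deep item and 2 on every surface item
# gets DA = 40, SA = 20 and would land near the deep end of the ranking.
```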
[Chart: deep-approach and surface-approach scores (points, max 50) for each student in the class.]

R-SPQ-2F: A New Ranking Algorithm
[Charts: the class split into 1st–4th quarters of (Grade Rank)_DA and (Grade Rank)_SA, i.e., the grade rank set against the deep-learning and surface-learning scores; the grade distribution and its correlation with the students’ learning approach.]

Filtering Feedback: Repertory Grids
The repertory grid method is a “person-centered” approach that allows the students to develop their own constructs using their own perception of the subject. Repertory grids serve as a feedback mechanism where the students can assess course elements using well-defined constructs and different scales.
The construction of each grid allows a methodological analysis of the grids to identify constructive feedback to improve the course: a formal quantitative method to select and discard feedback.
• Good grid → good feedback → further analysis
• Bad grid → bad feedback → discard

Implementation of Repertory Grids
The elements of the grid were determined by the T&L activities. The constructs were built from interviews with some students:
• A small group of students (about 10%) was carefully selected to represent the different learning approaches
• Each student was interviewed individually about the different elements (qualitative data collection)
• The constructs were identified by analyzing all the interviews
An empty grid was then sent to each student, who was asked to fill it in.

Repertory Grid Analysis
Different software packages (Rep IV, Idiogrid) provide methods for analyzing the grids, such as:
• Principal Component Analysis (PCA): allows a better interpretation of the feedback
• Cluster analysis: re-orders the elements in a tree so that similar objects are placed together, which helps interpret the feedback
• Descriptive analysis (mean, mode)

Activity No. 2 – Feedback Analysis
Instructions: Assume that you have a high-volume course (many participants). In this activity, you should analyze the feedback-analysis method presented and propose alternatives to relate the students to their learning approach and to classify which feedback to address in a systematic (reproducible) way.
• Are there other alternatives (methods) to analyze the course feedback and relate it to the students’ learning approach? How do you check whether your implementation of CAT is working?
• How can you use the proposed method to filter the different feedback and choose which to address?

Peer Instruction
Peer Instruction and the Use of “Clickers” – Reading Quiz and Conceptual Quiz
To implement peer instruction there is very good technology available:
• These devices are called clickers and are used for gathering student feedback in real time.
Clickers were given to each student and kept by them throughout the course. A flexible tool for different uses!
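As an illustration of what this real-time feedback enables, the sketch below tallies the clicker answers to one question and picks a follow-up action; the in-class rule actually used in EG2100 is described in the following slides, and the 30%/70% thresholds and wording here are assumptions, not the course's exact rule.

```python
# Minimal sketch (assumed thresholds, not the EG2100 rule) of tallying the
# clicker answers for one question and picking the follow-up action in class.
from collections import Counter

def next_step(responses, correct, low=0.3, high=0.7):
    """Decide what to do next from the share of correct clicker answers.

    The 30%/70% cut-offs are placeholders that a lecturer would tune.
    """
    if not responses:
        return "no answers received - check the clicker receiver"
    share = Counter(responses)[correct] / len(responses)
    if share < low:
        return "re-explain the material"        # poor results
    if share < high:
        return "peer discussion, then re-vote"  # mixed results -> peer instruction
    return "move on to the next module"         # most students got it

# Example: 6 answers to a question whose correct option is "B" (3/6 correct).
print(next_step(["A", "B", "B", "C", "B", "D"], correct="B"))
# -> "peer discussion, then re-vote"
```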
In the EG2100 course, clickers are used for the following:
• Unique identifier for each student
• Attendance registration during lectures
• Pre-requisite test at the beginning of the course (not graded)
• In-class questions:
– Tests on the mandatory pre-readings (5 questions)
– Questions during the lecture (~5 questions)
– Rewarded by bonus points

In-Class Interactions
Tests on the mandatory pre-readings (5 questions):
• Questions about the content of the pre-readings
• Regular questions, all asked in a row
Questions during the lecture (~5 questions):
• Questions about the subject being taught
• Regular questions, often one by one
• If “average results” are obtained, the students are invited to discuss with their neighbors and answer the question once again (peer instruction)
• If “bad results” are obtained, the lecturer re-explains the material

Preparation of Teaching Materials for Peer Instruction
Material has to be prepared to introduce peer instruction as a change of activity:
• The lectures need to be prepared in 10–15 min modules
• Questions need to be prepared to assess deep learning (or surface learning when needed)
Instead of a review at the end of the lecture, peer instruction allows for a review of deep-learning concepts during the lecture.
This has a cost: your time! – Think about whether you can afford it before doing it!

Pros/Cons
Pros:
• Greatly appreciated by the students
• Higher attendance and better preparation of the students
• Better in-class attention
• Helps the teacher get direct feedback (e.g. unclear notions)
• Flexible!
Cons:
• Additional data management
• Technical issues may happen in class (crashes, bad reception, empty batteries, etc.)
• Different lecturing style (might be hard for students to accept in the early phases)

Students’ Feedback about the Clickers
The feedback received is very positive and encourages continuing this peer-instruction implementation. Survey statements rated by the students:
• The mandatory pre-readings helped your understanding of the course material.
• It was helpful to discuss your answers with your classmates (peer instruction).
• The daily quizzes helped you keep up to date with the course materials.
• It was fun to use the clickers. It helped you be more attentive in class.

Activity No. 3 – Peer Instruction Method Using Clickers
Instructions: Assume that you already have your teaching materials prepared from previous course deliveries. In this activity, you should analyze how to modify the course materials to adapt them to the peer instruction method using clickers. Discuss the following questions and provide a proposal on how to implement peer instruction, together with an example slide for your particular course:
• At what learning depth level do you intend to use peer instruction?
• When using the peer instruction method, what factors do you need to consider so that you can use clickers to obtain quick feedback on the students’ understanding of a particular topic?
• How do you transform your course material (timing, amount of material and content) to adapt it to the peer instruction method?

Highlights and Words of Caution
• A wider variety of stakeholders can participate in the CODM process – this allows for a very good course design that addresses different points of view!
• CAT is a good vehicle to enhance the students’ learning, but its implementation has to be considered carefully. Implementing CAT is extremely expensive – DO NOT BELIEVE the propaganda that claims the contrary!
• The new R-SPQ-2F ranking algorithm efficiently classifies the depth of the students’ learning approach.
– Useful for sampling feedback from students with different learning approaches.
• The two main challenges for the course designers are:
1) To continuously modify the course so that surface-approach students move toward a deep approach.
2) To carry out (1) when there is absolutely no benefit to the teaching staff (a lot of work that brings no reward).

Discussion (1/2) – Continuity and Effort
• Continuity:
– The teaching staff changes from year to year depending on the availability of PhD students (Luigi is always involved and is the only one providing continuity)
– Course planning based on consensus requires continuity, in order to focus on improvements instead of challenging the whole structure
• Effort: workload and time
– The T&L activities require a large workload from the teaching staff to prepare and correct the assignments (very expensive and not properly funded)
– Course planning based on consensus requires time and involvement from the teaching staff
– The feedback process requires a lot of time for gathering and analyzing the results
– Integrating clickers into the lectures requires a lot of time (very expensive!) in order to provide a good experience for the students (questions should be well organized and well formulated)

Discussion (2/2)
• Systemic factors: brace yourself for large inertia and resistance to change.
• Students and other faculty will oppose didactic approaches different from those they are used to – you will face a large resistance to change.
– Not having a “single exam” with which to pass the course has been criticized and brought up to the program coordinator year after year…
– This attitude is hard to change in the 4th year. However, we have shown that the students’ acceptance improves over time, once they are able to reap the benefits of a good learning environment.
– You will waste your time with complaints from the student union, program coordinators, the department head… just for trying to do things differently (I would say BETTER!).
• There is no means to enforce student feedback at KTH; this weakens the statistical analysis and limits the number of samples.
– Changes in the rules, or more flexibility, are needed if KTH wants to take education seriously.
• No pain, no gain? More like lots of pain and no gain!
• The single most important thing we have learned is that there is no reward for good teaching at KTH – do it at your own risk and for your own self-development.
• The only benefit you could obtain, by investing even more time, is to try to generate some publications for your teaching portfolio (in case you want to find somewhere else to teach with quality).

Epilogue – A Tale Just Like That of the One Ring
Even though this has been a wonderful adventure in teaching and learning, all good things must come to an end.
• Luigi cannot afford to continue subsidizing the delivery of the course with time from his PhD students (which translates into using funds from research projects!).
– A loss of 290 000 SEK was generated in 2014.
– A good learning environment comes at a cost: we do not get paid enough to deliver a high-quality learning environment.
• Luigi will no longer be responsible for this course nor involved in it, and with that go 4 years of experience…
• But at least we enjoyed the walk through the valley of death, and we have learned a lot about how to deliver a high-quality learning environment!

Thank you!
“It's really hard to design products by focus groups.
A lot of times, people don't know what they want until you show it to them.” — Steve Jobs

Questions?
luigiv@kth.se
Good luck with the rest of your day!

References
[1] L. Vanfretti and M. Farrokhabadi, “Consensus-Based Course Design and Implementation of Constructive Alignment Theory in a Power System Analysis Course,” European Journal of Engineering Education, 2014. DOI: https://dx.doi.org/10.1080/03043797.2014.944101
[2] L. Vanfretti and M. Farrokhabadi, “Evaluating Constructive Alignment Theory Implementation in a Power System Analysis Course through Repertory Grids,” IEEE Transactions on Education, vol. 56, no. 4, pp. 443–452, Nov. 2013. DOI: http://dx.doi.org/10.1109/TE.2013.2255876
[3] L. Vanfretti and F. Milano, “Facilitating Constructive Alignment in Power System Engineering Education Using Free and Open Source Software,” IEEE Transactions on Education, vol. 55, no. 3, pp. 309–318, Aug. 2012. DOI: http://dx.doi.org/10.1109/TE.2011.2172211
[4] S. Taylor, “Managing postgraduate research degrees,” in H. Fry, S. Ketteridge and S. Marshall (eds.), The Effective Academic: A Handbook for Enhanced Academic Practice. London: Kogan Page.
[5] S. Taylor and N. Beasley, A Handbook for Doctoral Supervisors. New York: Routledge, 2005.