e-AFFECT Queen's University Belfast Institutional Story

Project Information
Project Title (and acronym): e-Assessment and Feedback for Effective Course Transformation (e-AFFECT)
Start Date: September 2011
End Date: August 2014
Lead Institution: Queen's University Belfast
Partner Institutions: N/A
Project Director: Ms Linda Carey
Project Manager & contact details: Mrs Linda Ryles, l.ryles@qub.ac.uk, telephone: 028 9097 1343
Project website: http://go.qub.ac.uk/e-AFFECT
Project blog/Twitter ID: http://blogs.qub.ac.uk/e-affect/
Design Studio home page: http://jiscdesignstudio.pbworks.com/w/page/50671059/eAFFECT%20Project
Programme Name: Assessment and Feedback Strand A
Programme Manager: Lisa Gray

1 Executive Summary

The aim of the e-AFFECT project is to build upon existing good practice and drive strategic change in assessment and feedback at Queen's, using University-supported technology where appropriate to enhance the student and staff experience. Queen's University Belfast is a broad-based, research-intensive institution which draws most of its full-time undergraduates from Northern Ireland. The project is coordinated by staff in the Centre for Educational Development (CED), who work with colleagues from academic Schools and other support services.

The e-AFFECT approach has been based on (a) existing work on assessment and feedback, particularly the principles for good assessment and feedback practice, with the development of a conceptual model of educational principles; (b) an overarching Appreciative Inquiry methodology, a positive approach to change; and (c) a phased approach to engagement with Schools, in which each phase works through the Appreciative Inquiry cycle over a three-year period.

To date (August 2014) 255 academic staff, 4,491 students and 19 administrative/clerical staff in 14 degree programme teams have been involved directly. The approach to stakeholders has been one of 'Changing Together', including senior management in the University and Schools, academic and support staff, students and key University personnel. The approach to evaluation is formative and based on the Context, Input, Process and Product (CIPP) model: a decision-focused model concerned with improving programmes and with involving and serving stakeholders.

The project has developed a toolkit which may be adapted for use by others. It includes:
- a baseline report template, questionnaires and timelines for reviewing current practice
- Appreciative Inquiry materials, including the script and interview schedule for capturing stories, dreaming and envisioning the future
- an action planning template to identify SMART actions to achieve the vision
- assessment and feedback timelines
- a literature review
- educational principles cards for assessment and feedback
- themed technology cards to help inform choice
- templates for feedback review
- best practice case studies.

The outcomes and benefits from the project include:
- A phased approach to change ('changing together') that provided a non-judgmental review of existing practice and a collaborative action plan for change. There is potential to use this model beyond the assessment and feedback arena.
- The creation of 'space' for dialogue around assessment, feedback and the curriculum amongst the degree programme team.
This has resulted in a growing understanding of the importance of good assessment and feedback practices; the project has positively affected attitudes to assessment and feedback.
- Increased uptake of the assignment tool in the Virtual Learning Environment, resulting in significant savings in time for administrative/clerical staff and greater efficiencies for academic staff (including external examiners) and students.
- In all individual modules that amended their assessment and feedback activities (with or without technology) there was an improvement in student performance.
- The development of a critical friend model and bespoke training/demonstrations for subject groups.
- We have moved beyond the technologies that we espoused at the start of the project (now including PeerWise, WebPA, GradeMark and VoiceThread).
- The use of QuestionMark Perception has expanded: question development is a growth area and several programme teams are using their student bursaries to pay PhD students to develop question banks.

In response to growing staff interest in exploring online marking and feedback, and influenced by positive findings from recent pilots in other institutions, in semester 2 of the 2013-14 academic year CED launched a small, controlled pilot of GradeMark in Politics, Creative Arts, Food Science and Psychology. Other disciplines have confirmed interest in taking part in the new academic year. The University has committed to covering the cost of the full Turnitin suite for the next two academic years to allow a meaningful pilot to take place.

In 2014-15 the project team will continue to work with:
- Phase 1 participants, in further embedding, developing and extending their activities
- Phase 2 participants, in refining and embedding their interventions
- Phase 3 participants, in implementing their action plans (this Phase will complete its three years of the project in 2015-16).
Support will also be offered to those subject areas that have so far not engaged with e-AFFECT.

The project has concluded that if headway is to be made in addressing the issue of assessment and feedback it needs to:
- take a collaborative approach with the active support of decision-makers in the University
- encourage dialogue between all stakeholders
- recognise and publicise existing good practice in the University
- lead to a culture change and not be seen as a 'quick fix'
- incorporate principles of assessment and feedback into wider University policies and processes, including Quality Assurance.
2 Headline achievements of e-AFFECT

The key achievements of the e-AFFECT project are:
- To date (August 2014) 255 academic staff, 4,491 students and 19 administrative/clerical staff in 14 degree programme teams have been involved.
- The development of a phased approach to change, using Appreciative Inquiry, that allows time to review, consider and develop activities that will bring about change.
- The creation of 'space' for discussion around the curriculum, assessment and feedback within degree programme teams.
- A conceptual model of educational principles for assessment and feedback has been developed.
- The project team has designed and refined two sets of cards that suggest ways in which these principles may be achieved and the technologies and software/licence requirements that may be used to support interventions and bring benefits to students and staff (these are shared across the sector via the Design Studio and may be customised to suit other institutions).
- Bespoke training sessions/demonstrations for subject groups.
- The emergence of a 'Critical Friend' model for dissemination and support of activities and use of technology.
- The increased use of the assignment tool in the Virtual Learning Environment.
- Evidence that e-submission/marking/feedback can create significant savings in time and cost in the management of coursework; in one School this is the equivalent of 20 working days.
- The launch of a small GradeMark pilot.
- In all cases where changes to assessment and feedback in individual modules were introduced as part of the project, the students performed better than in previous years, either in terms of the mean mark or in terms of the distribution of marks, i.e. fewer fails and more students achieving higher grades.

Some subject areas used support provided by the project to trial technologies such as VoiceThread and Adobe Acrobat and disseminated their findings at the Centre for Educational Development's annual conference in June 2014, entitled 'Assessment and Feedback: a road to success'. The event was well attended and provided an opportunity to foreground the good practice being developed across the University and beyond. Keynote speakers were Professor Margaret Price, National Teaching Fellow from the ASKe Pedagogy Research Centre at Oxford Brookes University, and Richard Osborne, Project Manager of the University of Exeter's Jisc-funded COLLABORATE project.

3 The key drivers and assessment and feedback context for e-AFFECT

3.1 Drivers

The key drivers for the e-AFFECT project were:
- the wish to build upon existing good practice, developed with the support of the Higher Education Academy Enhancement Academy, to enhance the student and staff experience of assessment and feedback
- the need to develop an effective institution-wide framework for the management of strategic change
- a desire to address a lack of consistency in assessment and feedback practice across the University, as evidenced in external (NSS) and internal student surveys
- to extend the use of technology already supported by the University to support assessment and feedback
- to support student attainment and retention in the University.

3.2 Context

3.2.1 Queen's University Belfast

Queen's is a broad-based research-intensive institution with 20 Schools, 11 Institutes, 2 University Colleges and 8 Directorates.
The student body is primarily full-time undergraduates from Northern Ireland (Table 1).

Table 1 Profile of students at Queen's University Belfast (2012-13)
- Mode of study: Full time 73.4%; Part time 26.6%
- Level of study of full-time students: First Degree 83%; PGT 8%; PGR 7%; Foundation 0.5%; Other UG 1%
- Domicile of full-time students: Northern Ireland 85%; Rest of UK 5%; EU 3%; Non-EU 7%

3.2.2 Organisational structure of the project

Figure 1 summarises the project's organisational structure: the project reports to the Supporting Student Attainment Action Group; the Project Director and Project Manager lead the Project Team, supported by a Central Support Group; and the Project Team works with the Phase 1, Phase 2 and Phase 3 participants.

Figure 1 e-AFFECT organisational structure

The project reports to the University's Supporting Student Attainment Action Group. This group is chaired by the PVC Students and Education. The other members are the three Faculty Deans, a Head of School, the Students' Union President and VP Education, the Director of Academic and Student Affairs and the Head of Education and Skills Development. The project provides updates to the group for each meeting.

The Central Support Group has representatives from Academic Affairs (QA and Regulations), Student Services and Systems, Disability Services and Information Services. The purpose of this group is to provide advice on institutional policies and procedures, to ensure that these and appropriate technology are in place to facilitate timely achievement of the project's objectives, and to enable the project's outcomes to inform institutional review and development processes.

3.2.3 Technology context

Two of the project's objectives are to:
- identify effective and efficient practices in assessment and feedback for learning across the institution, with a particular emphasis on the role of technology in enhancing these
- build capacity in the use of assessment and feedback technologies.

Table 2 sets out the University-supported technologies which may be used for assessment and feedback. It was considered important that these technologies were better utilised before asking Information Services to support other software on an ad hoc basis.

Table 2 Technologies supported by Queen's University Belfast
- Queen's Online VLE (SharePoint) (QOL): assignment tool for e-submission/marking/uploading feedback; assessment tool for QuestionMark Perception; discussion forums and wikis.
- QuestionMark Perception (QMP v5.7): accessed either through QOL or independently; the licence supports 18 question types.
- Personal Response Systems (TurningPoint): all centrally supported teaching rooms have TurningPoint software and handsets are available for collection; version 5 will be available for the 2014-15 academic year; four Schools have their own handsets.
- Turnitin UK: used for originality checking; the licence includes PeerMark for peer review and, since January 2014, GradeMark.
- MS Office: Quick Parts/Comments.
- WordPress: available for student blogs.

The assignment tool in QOL provides the opportunity for significant savings in staff time and cost in the management of coursework. Students are able to upload their work and download their feedback remotely, which is important for those who live some distance from the University. QMP is widely used in the University to deliver online formative and summative assessments.
In response to staff demand, and encouraged by the positive findings of recent pilots in other Russell Group institutions, the University extended its Turnitin licence in January 2014 to include GradeMark with PeerMark. This was done to enable more efficient and effective marking and feedback practice and to support the development of the important graduate attributes of self and peer review of performance. A small, controlled pilot of GradeMark was launched in the second semester of the 2013-14 academic year and this will be extended to include four more subject areas in 2014-15.

In addition, the project has included Jing, Audacity and PeerWise, freely available technologies that can be used to support assessment and feedback as well as learning, although these are not supported by Information Services.

There is now a growing body of staff developing experience in the use of technology in relation to assessment and feedback. Their expertise is being shared to support newer participants and to build capacity. Interventions introduced by teams in Level 1 modules are now being extended into Levels 2 and 3, confirming the value of participation in the project and its growing impact. An example of this is in Environmental Planning, where the new users of VoiceThread, Jing and Adobe Acrobat Pro see themselves as 'experts' in their chosen technologies, ready to provide advice to those who follow in subsequent modules and years.

3.3 Context of assessment and feedback prior to the project

Since the introduction of the National Student Survey in 2005, assessment and feedback have been shown to be the lowest rated aspect of the student experience at Queen's. Improving this has been a University priority given its impact on student learning and retention. The University's own First and Second Year Experience Surveys (introduced in 2007 and 2008) confirmed that the problem existed at all levels. Queen's sought to address this through participation in a Higher Education Academy Enhancement Academy project that began in 2009 and is described in the next section.

3.3.1 Previous initiatives on assessment and feedback

In 2006-07, the Centre for Educational Development evaluated three web-based marking tools with regard to their support for criterion-referenced marking and the generation of student feedback, as part of a Higher Education Academy (HEA) e-Learning Research project. The tools evaluated were: Electronic Feedback, developed by Phil Denton of Liverpool John Moores University (version 13); M2AGIC™, developed by Peter Nicholls of the University of Ulster at Jordanstown; and GradeMark™, part of the iParadigms Turnitin UK suite. The tools were trialled across the following subject areas in the University: Archaeology and Palaeoecology, Computer Based Learning, Computer Science, Drama, Environmental Planning and Medicine. Among the conclusions were that academic staff needed to be digitally literate, that they needed time to familiarise themselves with and set up the software before starting to mark, that comments used in feedback could be analysed as part of the evaluation of teaching, and that there should still be opportunities for one-to-one feedback (Jones 2007).

A further initiative to address assessment and feedback in the University was participation in a 2009-10 HEA Enhancement Academy project. This project involved senior staff, academics and representatives from the Students' Union.
The project consisted of three strands: (i) five School-based projects developed practical solutions to enhance practice; (ii) bespoke online resources based on the Re-engineering Assessment Practices (REAP) project principles of good practice were developed, with suggestions as to how they could be achieved, accompanied by exemplars from within the University; and (iii) an institution-wide feedback campaign was run in partnership with the Students' Union to enhance student understanding and use of feedback (Figure 2).

Figure 2 Feedback Campaign

The key messages that came out of this project were that, if headway is to be made in addressing the issue of feedback, it needs to:
- have a collaborative approach
- have the active support of decision-makers and senior managers in the University
- involve Students' Union officers
- encourage dialogue between all stakeholders
- recognise and publicise good practice from across the University
- lead to a culture change in the University and not be seen as a 'quick fix'.

3.3.2 Baseline report

The baseline report on the initial three participant groups revealed huge variation in the timing of assessment and feedback, in the ways in which feedback was provided on coursework and exams, and in the processes operating to manage assessment and feedback in the Schools. This finding was echoed in the baseline reports of the Phase 2 and 3 participants.

The approach to the project has been a non-judgmental one (see Sections 4.2 and 4.4) and there was, therefore, no attempt in the baseline report to make judgments about the practice of assessment and feedback in the Schools/subject areas. Where it was possible to identify areas of good practice, however, these were mapped against the principles for good practice in assessment and feedback (Nicol 2009) that had been used in the HEA Enhancement Academy project (Section 3.3.1).

The baseline activities included staff and student questionnaires about their perceptions of assessment and feedback. Some of the questions were common to both staff and students and were broadly based around the REAP principles of good assessment and feedback practice. Initial analysis of these questions demonstrates that there are some significant differences between the experiences of students across the University, as defined by those in Arts, Humanities and Social Sciences; Engineering and Technology; and Science, and between staff and students. Further analysis was carried out during 2013-14. The baseline questionnaires were revised at the beginning of the 2013-14 academic year in response to feedback from earlier phases. Version 2 takes approximately 10 minutes for staff to complete and 7 minutes for students.

4 Approach to the project

The e-AFFECT approach has been based on (i) existing work on assessment and feedback (Section 3.3.1), particularly the principles for good assessment and feedback practice (Nicol 2009), (ii) an overarching Appreciative Inquiry methodology and (iii) a phased approach to engagement with Schools.

4.1 Educational principles for assessment and feedback

The principles for good assessment and feedback practice used in earlier work were the starting point within the project for identifying good practice in the participating subject areas (see Section 3.3.2). These were used in the baseline reports.
Educational principles for assessment and feedback were reviewed as part of the literature review and, following David Nicol's webinar on educational principles, the Project Team took the view that there should be no more than about seven principles. The Project Team focused on those considered most important for the design of assessment and feedback activities and developed a conceptual model for their use in the project (Figure 3). The rationale was that all assessment and feedback activities should encourage positive motivational beliefs and self-esteem; within this, the application of the principles to the left and centre of the diagram should promote a positive impact on learning from summative assessment.

To facilitate and engender dialogue with the programme teams around the educational principles, eight cards (Table 3) were developed that set out the headline, the narrative behind it, suggested ways of accomplishing the principle and different technologies that might be used. These were initially developed from the existing web resources produced during the HEA Enhancement Academy project (Section 3.3.2).

Table 3 Educational principles
- Help clarify good performance (goals, criteria, standards)
- Encourage 'time and effort' on challenging learning tasks
- Deliver high quality feedback
- Provide opportunities to act on feedback
- Encourage interaction and dialogue around learning
- Give choice of topic, method, criteria, weighting or timing of assessments
- Development of self-assessment and reflection
- Create learning communities

Figure 3 e-AFFECT educational principles

It was decided that, since the overall rationale was that all assessment and feedback activities should encourage positive motivational beliefs and self-esteem, no card was necessary for this principle; equally, no card was developed for 'summative assessment has a positive impact on learning' because this is subsumed in the others.

In addition to the educational principles cards, a further set was developed to highlight the technologies that support assessment and feedback and their associated benefits to students and staff, and to identify their computing and logistical requirements. Itemisation of their key features affords easy comparison of the options available. Both sets of cards are available via the Design Studio and may be customised to suit other institutions.

4.2 Appreciative Inquiry methodology

Developed by Cooperrider and Whitney (2005) as a positive approach to change, Appreciative Inquiry is 'the cooperative, coevolutionary search for the best …[where] intervention gives rise to inquiry, imagination, and innovation …involv[ing] the art and practice of asking unconditionally positive questions …[and] assumes that every organization and community has many untapped and rich accounts of the positive…' (p. 8). Cooperrider and Whitney (2005) identify a 4-D cycle that can vary in length and formality depending on the nature of the project. The 4-Ds as expressed by Cooperrider and Whitney are Discovery, Dream, Design and Destiny, with the 'affirmative topic of choice' (p. 17) at the centre (Figure 4). This approach is about focusing on the positive, as opposed to 'what is wrong and this is what you need to do to fix it'.
Figure 4 The Appreciative Inquiry 4-D cycle of Discovery, Dream, Design and Destiny, with the affirmative topic of choice (here assessment and feedback) at the centre (after Cooperrider & Whitney, 2005)

More importantly, Cooperrider and Whitney (2005) note that 'Each AI process is homegrown, designed to meet the unique challenges of the organization and industry involved' (p. 15). Given this, the AI process developed for this project had educational principles for assessment and feedback as its affirmative topic of choice (Figure 5).

Figure 5 Activities in the Appreciative Inquiry cycle: Discovery (baseline; evaluate and review; assessment and feedback highlight stories), Dream (what will the assessment and feedback landscape be in 2-3 years' time?), Design (action planning and development of activities to achieve the dream) and Delivery (roll out of activities), with the educational principles for assessment and feedback at the centre (e-AFFECT, May 2013)

Table 4 details the activities, tools and timing of each stage in the AI cycle.

Table 4 Activities and timings in the Appreciative Inquiry cycle
- Discovery: baseline activities. Tools: baseline template; staff and student questionnaires; business process maps; assessment and feedback timelines. When: autumn term.
- Discovery: AI workshop discovering assessment and feedback highpoint stories. Tools: Appreciative Inquiry script; assessment and feedback interview. When: early spring term.
- Dream: AI workshop asking where the assessment and feedback landscape will be in 2-3 years' time. Tools: Appreciative Inquiry script; literature review (a synthesis of the assessment and feedback literature, the educational principles and evidence-based examples of where technology has supported assessment and feedback), provided as preparation for the next stage. When: early spring term.
- Design: Action Planning workshop, based around the principles cards and the literature review, asking how the dream will be realised; actions mapped against the educational principles. Tools: dreams from the AI workshop; educational principles cards; technology cards; Action Plan template. When: spring term.
- Design: development of activities/redesign, using technology where appropriate. When: summer term/summer.
- Delivery: roll out of activities. When: following academic year.
- Delivery: review and embedding. When: end of the roll-out year and the subsequent academic year.

4.3 Phased approach

The AI steps (Table 4) were carried out in each phase of the project (Figure 6).

Figure 6 AI in the context of the project's workflow: each phase (Phase 1 from 2011-12, Phase 2 from 2012-13, Phase 3 from 2013-14) moves through baseline, planning and development, interventions, evaluation and refinement, framed by ongoing project management, dissemination, evaluation and embedding activity

Three programmes were involved in Phase 1, seven in Phase 2 and four took part in baselining in Phase 3 (due to School commitments, Pharmacy did not move on to the AI and action planning stage). Programmes in Phase 3 will finish the process five years after the start of the project, i.e. in 2015-16.
Table 5 Numbers of stakeholders involved in each phase
- Phase 1: academic staff 65; clerical/administrative staff 6; students 1,722.
- Phase 2 (maximum anticipated numbers, as well as ongoing activity from Phase 1): academic staff 95; clerical/administrative staff 7; students 1,482.
- Phase 3: academic staff 95; clerical/administrative staff 6; students 1,287.

Schools have chosen which assessment and feedback changes they wanted to make, and as a consequence certain technologies surfaced. In Phase 1 of the project there was an emphasis on the use of the assignment tool in QOL and QuestionMark Perception. In Phase 2, screen capture using Jing, electronic voting, blogging and VoiceThread also emerged as choices. In 2014, use of these technologies grew and the University's Turnitin licence was extended to include GradeMark, facilitating a small pilot including some Phase 1 and 2 participants. WebPA was also introduced as a pilot within Computer Science.

4.4 Summary of the Appreciative Inquiry process in action

Phase 1 participants

The baseline activities for all participants took place during the first semester of 2011-12. Each Appreciative Inquiry workshop elicited the participants' stories of assessment and feedback, their wishes and their visions for the future assessment and feedback landscape (English and Civil Engineering). In all cases the programme teams were expecting to be told what was 'wrong' with their assessment and feedback and 'how' they should 'fix it'. There was surprise that this was not the approach and all expressed enjoyment of the process.

4.4.1 School of English

In the School of English four colleagues met for the first Appreciative Inquiry workshop: the Director of Education (DE) and the Year Coordinators (YC). The Examinations Liaison Officer (ELO) was also due to attend. The rationale offered was that these individuals formed the management team for the undergraduate programme. In the Action Planning workshop the DE, one YC and the ELO decided to pilot the assignment tool in QOL. The technology cards prompted a discussion about how to use computer-assisted assessment (QMP) for assessments in Linguistics and other areas, and it was agreed to provide a demonstration for one of the participants. Following the demonstration, a bespoke workshop was convened around the technology available to support assessment and feedback, which six staff attended. The Action Plan was subsequently updated. Following the successful pilot of the assignment tool in QOL, the whole School moved to use it from September 2012. QMP was used formatively in one module in Linguistics.

4.4.2 Civil Engineering

In Civil Engineering four colleagues who teach a particular thread of the programme met for the first Appreciative Inquiry workshop. They were accompanied by another colleague for the Action Planning workshop. In this instance, the team had engaged in some preliminary discussions about the kinds of activities that they would like to carry out. It became clear following this workshop that more detail would emerge as the team progressed their activities. The major activities developed were the use of QMP to deliver staged formative feedback; the use of on-screen marking of draft drawings and graphs; the development of workshop materials for students on the criteria and standards for reports; and the redesign of assessment and feedback in a third-level module (Figure 7) to incorporate peer review using PeerMark.
Figure 7 Redesign of a Civil Engineering module (colour coded to the educational principles in Figure 3). In outline: students complete one part, at random, of a four-part class test (pass/fail; the examination carries 90%); the tests are handed in, scanned and uploaded to PeerMark; for the coursework (10%) students take the full test away to complete in their own time on paper and using LUSAS; each student then randomly peer reviews one class test using the coursework and LUSAS calculations, and self-reviews; students see the peer review of their own work; the coursework is handed in and marked, staff review the peer reviews, and students receive feedback on the class test, the coursework and the peer review.

4.4.3 School of Psychology

The staff in the School of Psychology adopted a different approach to Appreciative Inquiry and Action Planning. The School had been using the assignment tool in QOL for about three years and the DE wanted a whole-School approach to any further developments. Six academic staff and two postgraduate teaching assistants (TAs) attended the Appreciative Inquiry workshop. The Action Planning took place in two parts. First, a discussion was held with 11 members of academic staff (no TAs were present); this included some who had been at the Appreciative Inquiry workshop and some who were joining the discussion at this later stage. As a result, the DE wanted to take the Action Planning to the School Away Day, where the discussion was led by the DE. The School decided to focus on feedback, and their action plan included a workshop led by the project team on the nature of the feedback they provide to the students.

Phase 2

The baseline activities for all participants took place during the first semester of 2012-13. As with Phase 1, the Appreciative Inquiry and Action Planning workshops were carried out in semester 2. Following the success of the bespoke technology session for the School of English, all participating teams were offered the same opportunity.

4.4.4 Business Management

In Business Management twelve academic staff attended the Appreciative Inquiry workshop. This was followed by the Action Planning workshop at which about twelve staff were present; however, all but one colleague were new to the process and had not been part of the first workshop. It was decided that online submission was a priority and that the best way forward was to trial the QOL submission tool across a smaller programme. To this end, the BA Honours (part-time) Management and Business Studies was selected. The new programme was supported by CED with vignette and handout material for staff and students on the use of the tool.

4.4.5 Computer Science

Six academic staff from Computer Science took part in the Appreciative Inquiry workshop and ten were present for the Action Planning session, including staff who had joined the programme team after the first workshop. One of the issues facing the team was the impending increase in student numbers. The programme team agreed that there should be electronic submission and marking for all first year modules. It was also agreed that QMP would be used to provide formative assessment opportunities and that a final year module would include an opportunity for peer review using PeerMark. The team was keen to look at WebPA as a potential tool to enable students to allocate marks for contribution to group projects.
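WebPA-style peer assessment asks each group member to rate every member's contribution and then scales the shared group mark into individual marks. The sketch below is a minimal illustration of that general idea, assuming equal weight per assessor and a simple normalisation of each assessor's ratings; it is not a description of WebPA's actual algorithm or of how the Computer Science team configured it, and all names and figures are hypothetical.

```python
from typing import Dict

def peer_weighted_marks(group_mark: float,
                        ratings: Dict[str, Dict[str, float]]) -> Dict[str, float]:
    """Scale a group mark by peer-assessment ratings.

    ratings[assessor][member] is the score that assessor gave that member
    (assessors normally rate themselves too). Each assessor's scores are
    normalised so every assessor contributes equally; a member's weighting
    factor is then their share of the total, scaled by group size, so an
    'average' contributor ends up with a factor of 1.0.
    """
    members = sorted({m for scores in ratings.values() for m in scores})
    n = len(members)
    weight = {m: 0.0 for m in members}

    for assessor, scores in ratings.items():
        total = sum(scores.values())
        if total == 0:
            continue  # ignore assessors who awarded nothing
        for member, score in scores.items():
            weight[member] += score / total  # fractional score from this assessor

    n_assessors = len(ratings)
    return {m: group_mark * (weight[m] / n_assessors) * n for m in members}

# Hypothetical example: a group of three awarded 60%, where Ann is rated more highly.
example = peer_weighted_marks(60.0, {
    "ann": {"ann": 4, "bob": 3, "cat": 3},
    "bob": {"ann": 5, "bob": 3, "cat": 2},
    "cat": {"ann": 4, "bob": 4, "cat": 2},
})
print(example)  # roughly {'ann': 78, 'bob': 60, 'cat': 42}
```

In this sketch a member rated above average receives more than the group mark and one rated below average receives less, which is the behaviour a contribution-weighted group mark is intended to produce.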
4.4.6 Creative Arts

The School of Creative Arts is made up of four subject areas: Drama, Film Studies, Music and Music Technology. The baseline activity revealed different practices between the subjects. Sixteen academic staff, including the Head of School and the Director of Education, attended the Appreciative Inquiry workshop. They were divided into four groups with representatives from each of the main subject areas. The follow-up Action Planning session was built into the School's Education Away Day. On this occasion there were three groups: Drama and Film Studies, Music, and Music Technology. The Action Plan resulted in a number of actions taking place in individual modules.

4.4.7 Environmental Planning

Ten academics from Environmental Planning took part in the Appreciative Inquiry workshop. The programme team was particularly interested in developing their students' feedback literacy. In the subsequent Action Planning session it was agreed that there should be workshops for students at all levels using exemplars and marking exercises, that the student coursework submission form would be amended to enable students to indicate how they had used previous feedback in preparing the assignment, that Jing and audio feedback would be used, and that VoiceThread tutorial and support material would be developed. This programme team took up the opportunity to see which technologies may be used to support assessment and feedback; this took the form of a 'marketplace' with staff moving round each of the stations.

Phase 3

The baseline activities for all participants took place during the first semester of 2013-14. As with Phase 1, the Appreciative Inquiry and Action Planning workshops were carried out in semester 2. Following the success of the technology 'marketplace' session for Environmental Planning, all participating teams were offered the same opportunity before the Action Planning session. This was to inform that session in terms of the types of technology available and how they might be utilised.

4.4.8 Biomedical Sciences

Ten staff from the Centre for Biomedical Sciences Education, including two administrative/clerical staff, took part in the Appreciative Inquiry workshop. Following the technology session, the Centre staff agreed to produce a matrix of assessment, content and feedback opportunities across the programmes to identify patterns and to demonstrate to students how the programme of work fits together across the three years. In addition, the Centre would develop an end-of-semester report for each student, including some feedback on examination performance. Individual staff would develop objective testing opportunities using QMP and student-generated MCQs using PeerWise.

4.4.9 Law

Twenty-two staff from the School of Law participated in the Appreciative Inquiry workshop. At School level the action plan includes activities to ensure more sharing of, and dialogue around, good practice, the mapping and embedding of skills across the programme, examination feedback, and student-led sessions on feedback and time management. Individual module convenors will develop their own activities, such as regular 'take home' online tests, vodcasts to cover parts of the course that can be dull to present in class, audio, screen capture and peer feedback on drafts, and student-generated MCQs.

4.4.10 Midwifery

Nine staff in Midwifery took part in the Appreciative Inquiry workshop.
The group agreed that they would significantly develop the use of assessment criteria as a means of enabling students to understand what was required and as a basis for the provision of feedback.

4.4.11 Social Work

Following the Appreciative Inquiry workshop, at which eleven academic staff were present, the programme team identified a number of high level actions to be undertaken. These include: a review of module content and assessment; mapping the content, skills and assessment of the programme for staff and students; and a workshop with students and staff to explore assessment requirements. Individual module convenors will introduce peer review, online marking and staged assessments.

4.5 Stakeholder approach

Engaging academic staff, in particular, to take part in the project was strengthened by University senior management support and by the kudos of taking part in a Jisc-funded project; this was then followed by a bottom-up approach to stakeholder engagement in the Schools. Colleagues in Phase 1 collaborated in writing the initial funding application, which helped to strengthen their sense of ownership. Each phase has learnt lessons from those preceding it and more experienced staff have acted as 'critical friends'.

Once programme teams had committed to the project, the approach to stakeholder engagement was one of 'Changing Together' (Figure 8). The AI approach that was adopted meant that there was a non-judgmental emphasis in the discovery phase. This engendered a degree of trust between the stakeholders and the project team. Directors of Education were interviewed about assessment and feedback, academic staff and students were asked to complete questionnaires about their experiences and perceptions of assessment and feedback, and administrative or technical staff in the Schools were interviewed about assessment and feedback processes in their areas. In addition, focus groups were held with some students to gain greater insight where this was needed.

Figure 8 Changing Together

It was considered important to understand how the office processes operated so that any proposed changes could be calibrated against their impact on the workload of the clerical and administrative staff. In addition, key University personnel were interviewed about University policies and processes as they affect assessment and feedback. This enabled any proposed changes to be set within this environment. To support this, the Central Support Group was established with representatives from Academic Affairs, Disability Services, Information Services, and Student Services and Systems.

Baseline reports 'belong' to the Schools/programmes, and the activities identified in the Action Plans came from the academic staff with the support of the project team. This approach gave a sense of autonomy within the boundaries of the University's procedures and regulations, while the gatekeepers of those boundaries recognised that they could be moved if there were a clearly defined pedagogic rationale. The project provided a supportive environment for academic colleagues to pilot activities that were clearly set within the educational principles. The bottom-up and ownership approach was considered important for the fulfilment of the action plans.
The action plans were kept under review and were considered to be organic and 'not set in stone'. Members of the project team kept in touch with the participants in the Schools to determine whether further support was required and how the activities identified in the action plans were progressing. For example, in Civil Engineering the nature of the activities undertaken evolved as the 'projects' moved forward.

Academic staff participating in the project encouraged their colleagues to use technologies that support assessment and feedback. For example, when one colleague from Linguistics was shown QMP she was keen to let others gain some experience of it, and a technology workshop was organised. This resulted in a number of colleagues being able to see how they might use it in the future. In Civil Engineering, participants presented their activities to their colleagues at an Education Away Day, including an opportunity to try QMP within the context of Civil Engineering (critical friend model and learning communities).

Students: PhD students were involved in the School of Psychology's AI discovery workshop and, in Civil Engineering, in the redesign of assessment in one module and the development of exemplar assessment materials for another. These students provided invaluable insights into their own experiences of assessment and feedback on these programmes. The use of QuestionMark Perception has expanded in 2013-14; question development is a growth area and several programme teams are using their student bursaries to pay PhD students to create question banks. In Environmental Planning, PhD students are using VoiceThread to develop tutorial material (videos with student-led questions and comments) so that innovations may be extended from Level 1 into modules at Levels 2 and 3. Also in Environmental Planning, PhD students have compiled Jing screencasts to support subject-specific skills. In Social Work a workshop was held with 15 undergraduate students and 4 staff to explore the requirements for assessed work. In 2014, project funding was used to support a postgraduate research student in Civil Engineering to recreate selected computer-assisted assessment quizzes from the HELM ('Helping Engineers Learn Mathematics') materials in QuestionMark as re-usable resources. Of the Phase 3 programmes, postgraduate students in Law are working in partnership with staff to develop question banks for QuestionMark.

4.6 Approach to the evaluation

4.6.1 Type of evaluation (e.g. formative, summative, internal, external)

The evaluation that has been undertaken is formative and reviews the project's progress and impacts at the end of the second year of the three-year Project Plan. Its focus is primarily on the activities of the Phase 1 cohort of degree programmes, namely Civil Engineering, English and Psychology. However, it also considered the baseline and planning activities of the Phase 2 cohort, which comprises Computer Science, Creative Arts, Environmental Planning and Management. The ongoing formative evaluation will continue to monitor and review activities throughout the third year of the project. The final Project Evaluation Report in August 2014 will include the outcomes from all phases of the project's development work, in so far as they can be reported at this stage of the project lifecycle.
The evaluation was carried out by an internal member of staff who, although working in CED alongside the project team, has not been directly involved in the project's development work. She was supported by an external evaluator who provided advice and guidance and assisted with data collection and analysis.

4.6.2 Approach to the evaluation

The approach taken by this evaluation was informed by the Context, Input, Process and Product (CIPP) model (Stufflebeam, 1996; Stufflebeam and Shinkfield 2007). This is a decision-focused model concerned with improving programmes and with involving and serving stakeholders. Table 6 provides a summary of its main stages.

Table 6 Stages of the CIPP model
- Context evaluation: planning decisions ('What should we do?')
- Input evaluation: structuring decisions ('How should we do it?')
- Process evaluation: implementing decisions ('Are we doing it as planned? And if not, why not?')
- Product evaluation: recycling decisions ('Did it work?')

The four aspects of evaluation in the CIPP model support different types of decisions and questions. The formative evaluation explores the Process and Product stages of the project, implementing and reviewing the effectiveness of decisions. The initial phases of this model, the Planning decisions about 'what to do' in respective programme contexts and the Structuring decisions about how best to implement plans, were addressed in the Baseline and Action Planning activities. The focus and emphasis of the evaluation is on the implementation of the decisions made at these earlier stages, and it seeks to establish how well the project is working for staff and students in the project Schools. Of particular relevance to the approach taken in the evaluation is the Product focus of the CIPP model, which in turn is divided into four stages: impact, effectiveness, sustainability and transferability. The evaluation questions outlined in Section 2.1.2 of the evaluation report align closely with this focus; therefore the main emphasis of the evaluation is on impact and effectiveness. The detail of the evaluation can be found in the evaluation report.

4.7 Changes to project

Whilst there were no major changes in direction during the project, minor adjustments were made in terms of the participating Schools/programmes. The School of Pharmacy had committed to Phase 1, but staffing issues and a professional accreditation visit meant that it was instead included in Phase 2. Whilst the baseline activities were carried out, staffing issues again meant that it was not possible to carry out the AI and action planning. The School deferred its participation until Phase 3, which commenced in September 2013, but subsequently deferred again.

The project was designed so that student bursaries would be available to enable some students to work with programme teams, either in training staff how to use particular technologies, producing technology guides for staff and/or students, or helping with the redesign of assessments and/or feedback. This did not initially work as we expected, however. What was found was that in the School of English and in Civil Engineering, for example, where QMP was to be used for the delivery of formative activities, the staff involved wanted to learn how to use the software by themselves following a demonstration.
On the other hand, two recent graduates in Civil Engineering helped with the redesign of assessment in one case and with the development of materials for engaging students in understanding the standards required in producing a report. These two students reported how they had benefitted from the engagement and were able to bring their views of what they would have liked when taking these modules. They also participated in the national student network. In 2014, project funding was used to support a postgraduate research student in Civil Engineering to recreate selected computer-assisted assessment quizzes from the HELM ('Helping Engineers Learn Mathematics') materials in QuestionMark as re-usable resources. Of the Phase 2 programmes, PhD students in Environmental Planning have developed VoiceThread tutorial material so that the innovation may be rolled out into other Level 1 modules and also into modules at Levels 2 and 3. Also in Environmental Planning, PhD students have compiled Jing screencasts to support subject-specific skills. Of the Phase 3 programmes, postgraduate students in Law are working in partnership with staff to develop question banks for QuestionMark.

5 Project outputs and resources

Toolkit: all the items listed in the toolkit were used for Phase 3 of the project and can be adapted for use by others beyond the Jisc-funded period, both within Queen's and beyond.

a. Baseline template
A template for the baseline report has been developed. Sections relating to Queen's can be revised in the light of changes in policy or process. Using evidence gathered in the evaluation process, refinements were made to the baseline template for its use with Phase 3 programmes. The headings in the template on the Design Studio include indicative content which can be tailored to the local situation, as can the order of the report.

b. Questionnaires
Staff and student questionnaires were developed to gain insight into experiences and perceptions of assessment and feedback and the technologies used in assessment and feedback. The questionnaires developed for this baselining activity drew on three existing tools:
1. The Assessment Experience Questionnaire (AEQ), adapted to elicit qualitative reasons for the score given.
2. The FAST project's written feedback self-evaluation questionnaire, which was used with staff.
3. The Assessment for Teaching and Learning Audit Benchmarks (ATLAB) project questionnaire, developed by Whitelock and Cross (2011) at the Open University; some of its questions map against Nicol's (2009) principles of good practice in assessment and feedback.
Each questionnaire also included questions about students with disabilities and about the respondents. These questions were designed with support from TechDis.

c. Timelines
The timelines approach to mapping assessment and feedback developed in the ESCAPE project was adapted for use in e-AFFECT. As well as being used in e-AFFECT baseline activities across all three Phases, the mapping exercise forms part of a Continuing Professional Development event for programme/course teams reviewing or preparing new degree programmes.

d. Appreciative Inquiry materials
The Appreciative Inquiry approach can be adapted to other topics where institutional or School change is required.
At the start of the AI workshop a short presentation was made to staff about the project and the purpose of AI. The Appreciative Inquiry workshop materials were developed from examples found in Stratton-Berkessel (2010).
Script: the workshop script includes a lead-in statement followed by a number of timed activities. The times are indicative and can be adjusted as required.
Interview schedule: the interview schedule provides participants with space to record their colleague's story(ies) for sharing. These can also be collected for the collation of notes from the session.

e. Literature review
A draft literature review which synthesises the pedagogic literature on assessment and feedback, introduces the principles for assessment and feedback and provides some examples of how technology has been used to support assessment and feedback.

f. Cards
To support the Action Planning workshop two sets of cards were developed: one set around the educational principles and one around technologies available within the University or freely available.
Educational principles: there are eight cards in this set. Each card has:
1. A headline statement
2. Questions to stimulate reflection on how the headline may be implemented
3. A table which suggests ways in which the principle may be achieved and technologies available to support these suggestions.
Technologies: these ten cards are themed by functionality and each card provides information on:
1. The type of technology
2. Technology requirements, e.g. licence, permissions, download
3. Benefits to students and staff in using the technology
4. Tips for using the technology (where these have been gleaned)
5. Implementation considerations
6. Key features, set out for easy comparison
7. Accessibility considerations.
An interactive version brings the two sets together.

g. Action planning template
The Action Plan template was designed to capture where, when and how an activity would take place in the programme. It also captures whether training for staff and/or students is required and any potential barriers to the completion of the proposed action. Each action is also mapped against the educational principles. An Action Planning Process was set out for running each session.

h. Checklists
Checklists were produced for each of the Appreciative Inquiry and Action Planning workshops run with staff.

i. Feedback review template
This is a condensed version of the feedback analysis developed by Glover and Brown (2006) and can be used to initiate discussion among staff around the consistency and quantity of feedback.

How to guides
The how to guides supplement the information provided on the technology cards. They offer a specific review of the software and a basic usage manual.

Best practice case study template
This template complements the work of the critical friend, offering insightful advice to those who wish to use the approach to develop their assessment and feedback practices, and includes an interactive assessment and feedback timeline.

Case studies
The case studies available at present are in the form of posters (Table 7) and will be written up in more detail to provide 'stories' from the project.
Table 7 Case Study Posters
- Civil Engineering: QMP in coursework
- Civil Engineering: On-screen annotation of draft drawings and graphs
- Civil Engineering: Exemplars, marking criteria and standards for the presentation of reports
- Civil Engineering: Using PeerMark for peer review of coursework
- English: e-Submission/feedback/marking
- English: The use of QMP in Linguistics
- Psychology: Review of feedback
- Management: Queen's Online assignment tool
- Environmental Planning: Using Jing to deliver formative feedback on visual work and to enhance skills development
- Environmental Planning: Using Acrobat Pro to provide electronic annotation over drawings as a way of giving feedback on design-based modules
- Environmental Planning: Using VoiceThread to provide videos with student-led questions and comments
- Drama: Using VoiceThread to provide lecture material before class in an effort to stimulate student engagement
- Computer Science: Offline objective marking using QuestionMark Perception

6 Project outcomes and benefits

6.1 Benefits for the University

Tangible benefits to date include those emerging around the project's model of working: the phased approach in which CED supports teaching staff initially to identify, and subsequently to implement, their planned interventions. This methodology is viewed as having potential to be applied to other initiatives across the University in support of institutional change.

The University's Assessment Policy has been in place since 2007, and experiences identified through the e-AFFECT project and other assessment review activities, along with a changing external landscape, require a review of the policy to ensure that it is fit for purpose. The review is planned for the 2014-15 academic year and its scope is currently being defined. The project team will contribute to this review. Engagement with academic colleagues and students across the University confirms that there is some confusion and wide variation in how the existing Assessment Policy is translated into practice. These issues are being followed up with Academic Affairs in an effort to develop and disseminate clearer institutional guidelines.

The project continues to be informed by TechDis and good practice across the sector. Support was also given to Disability Services in developing guidelines for assessing the work of dyslexic students.

6.2 Phase 1 Programme Teams

6.2.1 School of English

The baseline activities included the identification of the workflows operating in the Schools for the management of assessment and feedback. The resultant business process maps enable the roles of stakeholders to be identified and, where changes are introduced, how these affect individuals or groups. One example of this is in the School of English, where e-submission, e-marking and e-feedback were introduced (Figure 9).
[Figure 9 Workflows in the School of English: two flowcharts comparing the original workflow (submission of two hard copies plus one online copy with a cover sheet, manual matching of anonymous codes and distribution of work by office staff, feedback on Word sheets and marks spreadsheets collated by the office) with the revised workflow (online submission with anonymous codes, all coursework submitted through Turnitin, staff marking and uploading feedback to Queen's Online, and marks and feedback released to students online).]

All coursework (except for one module) in the School was managed electronically during semester 1. Staff have commented that the convenience outweighed any initial difficulties. For example: 'Reading essays on screen rather than on paper took some getting used to, but the convenience of being able to mark away from the office desk easily outweighed any initial hurdles. I also appreciate the fact that we no longer have to wait for essays to be coded and distributed by the office staff' (Poster). A focus group discussion as part of the evaluation elicited the following: 'This has changed my life'; 'It has changed all our lives' and 'Love it, love it'. The one module which is not managed online was described in the evaluation focus group as 'very clunky' now compared to others.
Student feedback has also been positive – for example: 'It is great not to have to rely on printers for hardcopies anymore, and the quality and speed of the feedback on my work was very impressive and helpful' (Poster). A final year student interviewed as part of the evaluation indicated that she preferred getting her feedback online rather than having to request it via email: 'everyone got it whether they liked it or not'. A comment from the 2013 NSS qualitative responses, 'Feedback, although improved substantially recently, was fairly non-existent in my first year', recognises the developments in the School.
Other benefits identified are for the External Examiners, who are now able to access material seven days prior to the exam board (evaluation focus group), and, where there are shared modules, staff are now able to coordinate more quickly (evaluation focus group).
There are 38 academic staff in the School and 903 students on taught (UG and PGT) programmes. Previously, five School Office staff would have spent the equivalent of 4 working days receiving, processing, distributing and filing away students' work through the academic year for all assessments (6,293 files for 208 assessments in 2012-13). The equivalent of 20 working days has been saved.
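A minimal worked reading of these figures, assuming the 4 working days is per member of office staff rather than in total:

\[
5 \ \text{office staff} \times 4 \ \text{working days each} \approx 20 \ \text{working days saved per year}
\]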
This move has been welcomed by administrative and clerical staff.
One member of staff used QMP to deliver formative MCQ activities in a Linguistics module with 65 students. She states 'It has re-energised my teaching'. It has been time saving for staff, and students receive feedback immediately rather than having to wait for staff to mark the work. Students comment: 'Fast feedback's really important in phonetics. You need to establish good habits from the start', 'We know immediately how we're doing' and 'it's the way to go. Definitely'. Comparison of the mean marks for this component of the module over the last three years indicates that there has been an increase from 66% to 70%. This could be explained by students having the opportunity to try formative activities a number of times.
In 2013-14 the use of QMP in this module was extended to include summative assessment. The member of staff's conclusion was that 'it revolutionises everything'. The main advantage for the teaching team was the time saved in marking. Student take-up of the optional weekly exercises was in excess of 70%, except in the 'reading week' when it was 48%. Students were able to revisit these exercises, thus providing some opportunity to catch up if they missed a session. When a student achieved a correct answer, the feedback pointed him or her to further areas to investigate. The results for this module in 2013-14 demonstrated a shift in the profile of marks.
During 2013-14 the School has done some preliminary planning for a small trial of GradeMark in 2014-15. This has included considering how best to deploy the software to fulfil the recommendation for anonymity, and adjusting the School's assessment procedures in preparation for the trial.
Student comments about assessment and feedback include:
'Student participation very much encouraged, with feedback being constructive and highly beneficial. With regard to exams, the English Department started to give individual feedback on exam papers, which I found very useful as before this we were just given general feedback.'
'Feedback on my essays and exams has been plentiful and of good quality. In first year, I was encouraged and well advised on how to improve my grades and as a result have since been able to increase my marks into a higher classification.'
It is encouraging to note that cohort analysis of the 2012 intake indicates a year-on-year improvement in student views on all three questions relating to feedback in the internal and external surveys.

6.2.2 Civil Engineering in the School of Planning, Architecture and Civil Engineering:
One member of academic staff used QMP to deliver a coursework assignment. The questions are delivered in 'batches' with feedback at the end of each 'batch'. Overall students have responded very positively. Whilst there was no significant difference in the mean marks for this module in 2012-13 compared with 2011-12, there was a change in the distribution of marks, with proportionately fewer fails, thirds and lower second class marks and more upper seconds and firsts (up from 68% to 74%). The intention is to build this question bank and release it to students for revision purposes.
In a Level 2 module with 98 students, in an attempt to bring on weaker students and to engage students with assessment criteria and feedback, three activities were introduced:
1. a workshop on marking reports and provision of a guide to report marking
2. on-screen provision of feedback to students on a draft graph for coursework
3. on-screen provision of feedback to students on a draft flownet for coursework
There is a significant difference between the mean module marks for 2012-13 compared with 2011-12 (p < 0.002) and a shift in the mark distribution. Proportionately there are fewer fails and third class marks and more first class marks following the interventions. The module coordinator analysed the students' marks, comparing the results for those who attended the workshop and/or submitted a draft. The conclusion was that students who participated were less likely to make errors in the final submission. In 2013-14, the results for this module presented a similar pattern to those in 2012-13.
The summative assessment in a Level 3 module with 121 students was changed from 100% exam to 90% exam with 10% coursework. The redesign of the assessment process is outlined in an article in the December 2012 issue of Reflections, the University's learning and teaching publication. Whilst there is no significant difference in the mean marks for the module overall, only one student failed the module compared to 8 in 2011-12, and proportionately fewer students achieved third class marks and more students achieved lower and upper second class marks.
As part of the coursework, students were asked to peer review a colleague's work, trialling PeerMark. Since this was the first time PeerMark had been used, some additional questions were included to ascertain the students' views on this technology. 39% of the students tried out the PeerMark tool (Table 8). Using PeerMark to distribute the papers meant that the office staff member was freed from having to sign out and receive student reviews, and this fitted with the School's policy of e-submission.

Table 8 Student evaluation of PeerMark
Usefulness of PeerMark for providing peer review and feedback on structural diagrams – Useful: 71%
Ease of use – Easy: 75%
Ease of access – Easy: 78%
Use of PeerMark to systematically criticise a colleague's work – Yes: 89%
Would you like to get feedback from a peer using a system such as this? – Yes: 76%

Whilst it is difficult to claim cause and effect between these results and the activities undertaken as part of e-AFFECT, it is encouraging to note that cohort analysis of the 2012 intake indicates that students perceive an improvement from the first year to the final year of their course on the items 'I have received detailed comments on my work' and 'Feedback on my work has helped me clarify things I did not understand'.

6.2.3 The School of Psychology:
The School of Psychology identified the following in their action plan: new guidelines for feedback on dissertations, an inventory of writing skills, new feedback sheets incorporating the University descriptors and an acceptance that staff should be exposed to each other's feedback. All of these actions have been delivered. Further actions have been taken following the Review of Feedback workshop in the School. These include attempts to standardise feedback across markers, including the sharing of good practice; the introduction of tutorial exercises designed to help students interpret feedback; the use of the comments function alone on documents, rather than track changes; the introduction of a new moderation policy to include a view of feedback provided to students; and a change to feedback sheets so that staff highlight the single most important aspect to consider for the next assignment.
Level 1 tutorials have been re-designed to incorporate the feedback exercise. Feedback sheets have been changed and examples of good practice were made available in time for the start of the new academic year.
Qualitative student comments from 2013 and 2014 indicate that students recognise that there have been improvements in the provision of feedback:
'The general handling of second year was quite poor in my opinion. Marking, feedback, support and organisation were all lacking, although I know they have since taken steps to improve these areas for subsequent classes.' (2013)
'Feedback is mostly beneficial but rarely explains what I can take from it to apply elsewhere, it is often coursework specific.' (2013)
'Feedback has greatly improved over the past 3 years and has been advantageous in helping me attain better results in other pieces of coursework.' (2013)
'Feedback on coursework has been poor over my three years, albeit that it has improved slightly in my third year.' (2013)
'Some lecturers will give excellent and extremely helpful feedback.' (2013)
'During my second year, the feedback on work was very poor. However, this was improved by 3rd year. They made it their priority to fix.' (2013)
'Feedback had been very slow, though it has improved greatly this year.' (2014)
'Improvement in the promptness of feedback.' (2014)

6.3 Phase 2 Programme Teams

6.3.1 Business Management
Business Management decided to proceed with one single action: a trial of the Queen's Online assignment tool in a different degree programme – the smaller BA Honours (PT) Management and Business Studies. The programme co-ordinator managed the trial, which involved 298 students on two campuses: 186 students at Queen's Belfast and 111 students at South West College.
The trial identified the following advantages for the Management School: the same submission procedures could be followed by students at both the main campus and the satellite college; it was easy to upload and deliver feedback for students, eliminating the need for multiple individual emails to each of the 298 students; it was much easier to monitor submission times with electronic submission; and it was possible to monitor when students viewed their feedback.
The advantages for these part-time students were clear: they did not need to take time off work to submit assignments, it was easier to view their feedback, and the deadlines could be set to midnight, making students feel that they had more time.
Issues to be addressed in the future include: the lack of a facility within the QOL tool to deliver a single feedback file to multiple students (e.g. for groupwork feedback); occasional staff unfamiliarity with zip files; and logistics/costs where moderators and external examiners require paper copies.
At the time of writing it is not clear how the Business Management programme will proceed, but the BA Honours (PT) Management and Business Studies will continue in its use of the QOL assignment tool.

6.3.2 Creative Arts

6.3.2.1 Drama
VoiceThread was introduced into a module in Drama to enable students to watch lecture content before the class and to add substantive comments and/or questions. 15% of marks were allocated for this active participation. The subsequent class time was used for discussion, groupwork, games and presentations.
Whilst the overall module marks have not shown any improvement, the module leader reported that the examination results were good. Student comments included: 'I like the online lectures', 'VoiceThread was helpful' and 'A great system for interactive learning'.

6.3.2.2 Film Studies
Audio feedback and an online repository (Vimeo Business) for student films were used in a Film Studies module. Students were very positive about the audio feedback they received and requested it for their second assignment. The online repository for student films overcame the problem of file size limits in the University's VLE and resolved issues of submission, archiving and access for External Examiners.

6.3.3 Computer Science
Computer Science reviewed WebPA as a tool that enables students to assess individual contributions to group work, allowing more equitable weightings for group participation. The tool also allows the tutor to support weaker students in the group work process. The School plans to use the tool in 2014-15.
In one Level 3 module, in order to develop students' skills in critically evaluating their own work and the work of others, PeerMark (part of the Turnitin suite) was used to enable students to peer review and self-review final project submissions. 290 scripts were submitted online and shared anonymously and electronically; 652 reviews were completed.

6.3.4 Environmental Planning
In Environmental Planning all modules are compulsory. Three modules in the first year undertook activities:
Workshops were facilitated on assessment and feedback. As part of the assessment for this module, students were required to indicate how they had used the feedback from the first assignment in the next.
Jing was used to provide screencasts to support subject-specific skills development and to provide formative feedback on students' design plans.
Four VoiceThread tutorial resources were developed based around four themes, with questions for the students to answer. The aim was to encourage year 1 students to express an opinion on the question posed and then to discuss this effectively with their peers. Tutors provided feedback in VoiceThread on the students' responses. Students viewed the resources as complementary to the standard face-to-face tutorials and as a useful resource for revision. The videos provided alternative learning formats and flexibility of access.
In all these modules, whilst there was no significant difference in the mean module mark between 2012-13 and 2013-14, there was a shift upwards in the profile of marks.
Acrobat Pro was used to provide feedback annotation on the first of three assignments in a second year design module. Students could access this feedback on their computers, smartphones or tablets. Analysis of the module marks in 2012-13 and 2013-14 demonstrates an upward shift in the profile of marks.

6.4 Phase 3 Programme Teams' plans for 2014-15
In Phase 3, Biomedical Sciences will provide a 'marks breakdown' on exams for each student with ranking, some statistics and a paragraph from the Module Convenor. Additional similar information on coursework performance will be added. Summer studentships are being used to enable students to collaborate with staff in the creation of feedback comment banks to be used with GradeMark and PeerMark. In semester 2, students will use PeerWise to generate MCQs. In the Research Project module, Jing will be used to provide screen capture and audio feedback to students.
In the School of Law, regular Teaching and Learning seminars will be held to share good practice and to explore different technologies that support assessment and feedback. Skills will be mapped throughout the degree programme in an effort to highlight where students have opportunities to be taught, to practise and to be assessed in the identified skills. Typed or audio feedback on exams will be provided at the end of both semesters. In an effort to engage students with course material throughout the year, ten online 'take home' class tests will be developed using QuestionMark Perception; students must take and pass seven. Supporting resources provided by the University's Learning Development Service (e.g. related to referencing and developing critical analysis) will be signposted to students. Generic feedback on common mistakes/issues will be provided in class, with Audacity used to record audio comments for upload to Queen's Online. Some parts of the course will be offered as vodcasts. Exemplars of past work will be circulated in an effort to engage students with standards and assessment criteria. Screen capture and audio will be used to upload formative feed-forward information to the assignment tool in Queen's Online – in some instances this may be provided on draft work.
In Social Work, a review of assessment methodology has taken place; this included providing a 'road map' for students showing skills, content and assessment throughout the course. In a variety of modules, the course team is exploring: the use of Queen's Online, MS Word and GradeMark to provide e-feedback to students; providing feedback on exams; and linking integrated assessment to shared learning outcomes involving service-users and carers. Students will be encouraged to peer review short draft outline papers in an effort to engage them with the assessment criteria and the construction of feedback. Students are to collaborate with staff to map criteria onto a standardised marking sheet. Students are to review past work and draft 250-300 words of constructive feedback. Three skills assessments are being introduced in place of end-of-semester assessment – this will include opportunities for formative feedback, with students self-assessing/evaluating videoed performance of role play.
In Midwifery, assessment criteria are to be provided for essays instead of guidelines, in an effort to make assessment more transparent to students and marking easier for staff; assessment criteria are to be used in feedback; an assessment rubric is to be developed using level descriptors; referencing is to be standardised and penalties defined (a guide for students is to be developed by students); a review of the timing of feedback will lead to the publication of dates for students and externals; generic feedback on common mistakes will be compiled into a bank; and students will be able to post questions on a discussion forum in Queen's Online.

6.5 Technology
The technologies that have been used in the assessment and feedback activities in Phases 1 and 2 (and will be used in Phase 3 in 2014-15) are summarised in Table 9. Other technologies are used in the subject areas, but these were not part of the e-AFFECT activities. For example, the Electronic Voting System is used in Civil Engineering.
Table 9 Technologies used in Phases 1, 2 and 3
Technology | Phase 1 staff | Phase 1 students | Phase 2 staff | Phase 2 students
Assignment tool in QOL¹ | 38 | 903 | 1 | 328
PeerMark in Turnitin | 2 | 138 | – | 618 (CSC & DRA)
QMP | 2 | 143 | – | –
Adobe Captivate (on screen marking) | 1 | 98 | – | –
VoiceThread | – | – | 2 | 98
Jing | – | – | 1 | 31
Acrobat Pro | – | – | 4 | 179
GradeMark (from Feb 2014 only) | 1 | 38 | 3 | 64
¹ At the beginning of 2012-13 the School of Planning, Architecture and Civil Engineering required all suitable work to be submitted and marked using the assignment tool. This included 38 academic staff and 492 taught students in Civil Engineering. The School of Psychology was already using the assignment tool prior to the start of the project (37 staff and 682 students). This School has already been using MS Word for annotating student work.

6.6 Educational principles
The educational principles that have been addressed by the assessment and feedback activities in Phases 1 and 2 are summarised in Table 10. These figures include whole School counts where an activity has been applied across the School, for example the use of the assignment tool to provide high quality feedback.

Table 10 Educational principles addressed in Phases 1 and 2
Principle | Phase 1 staff | Phase 1 students | Phase 2 staff | Phase 2 students
Help clarify good performance | 5 | 392 | 5 | 505
Encourage 'time and effort' on challenging tasks | 3 | 263 | 3 | 97
Deliver high quality feedback¹ | 75 | 1761 | 4 | 543
Provide opportunities to act on feedback | 4 | 272 | 7 | 525
Encourage interaction and dialogue around learning | 40 | 898 | 8 | 585
Give choice of topic, method, criteria, weighting or timing of assessments | 41 | 1074 | 7 | 581
Development of self-assessment and reflection | 2 | 73 | – | –
Create learning communities | – | – | – | –
¹ In 2013-14 uploads of feedback within the QOL assignment tool have not been included in these figures.

6.7 Other benefits from the project
The Appreciative Inquiry approach has demonstrated to academic staff that there is much that is positive in what they do and in their experiences. It has also provided a context in which they are not being 'told what is wrong and how to fix it', but a supportive environment in which to try out ideas. If this can be further developed, then the cultural change will be around academic staff looking to CED for greater support in developing their pedagogic practice. The project has enabled CED to develop a framework for managing institutional change and to test methodologies that may now be used to embed other innovations in learning and teaching.
In all participating Schools, the project has facilitated conversations around assessment and feedback which would have been unlikely without engagement with the project.
A model of critical friends has emerged around technologies and cognate subjects. This includes colleagues sharing their experiences in a programme at an Education Away Day. A colleague from Civil Engineering (Phase 1) who has used on-screen annotation of draft drawings demonstrated this to colleagues in Environmental Planning (Phase 2), who are part of the same School. They have now introduced this into their own feedback practice.
In addition, the showcasing of School projects in a dissemination event in March 2013 provided an opportunity for colleagues from across the University to see the work of the project. This stimulated interest in joining Phase 3.
CED's annual conference in June 2014 provided a similar platform for innovation, and at least one School has indicated interest in exploring its assessment and feedback practices in the 2014-15 academic year.

6.8 Unexpected consequences
Whilst not totally unexpected, and in line with the HEA Enhancement Academy, the project has identified a wider use of technology in assessment and feedback than was previously apparent. For example, work with colleagues in Creative Arts in 2012-13 identified a colleague who regularly uses blogs as part of the assessment and feedback process. WordPress blogs are supported by Information Services but not by the Learning and Teaching Support Team. One module in Computer Science was using Apache Subversion, an open source version control system. Through the project sessions, its potential for use in group projects was shared with other members of the School.
VoiceThread and WebPA were two technologies that CED had previously reviewed and not chosen to use. The Appreciative Inquiry and Action Planning process highlighted some specific instances where these tools provided functionality required to support either existing practice or planned innovations. This example and the one above illustrate how the e-AFFECT process has given the centre of the University a better understanding of Schools' needs.
A further unexpected consequence is the success of running subject-specific technology workshops. Colleagues who are routinely in contact with each other and who experience the same problems are more comfortable in this type of situation and use the same discipline-related language. These sessions are now an integral part of the process.
Another unexpected consequence has been how powerful the experience of one School has been in influencing the take-up of e-submission/marking/feedback across the institution. Two Schools, Creative Arts (Phase 3) and Education, have now adopted this as a result of the experiences in the School of English.

6.9 Impact on stakeholders
The internal stakeholders for the project were identified in the project plan as those listed in Table 11. In this table we assess the overall impact of the project on these stakeholders to date.

Table 11 Project impact on internal stakeholders

Stakeholder: Students on programmes in the project
Interest/stake: Opportunity to engage with a more dynamic teaching and learning environment through effective assessment and feedback.
Overall impact: It is difficult at this stage to assess this. Where activities have related to individual modules, this has been the case.

Stakeholder: Students with disabilities
Interest/stake: Ability to engage with new technologies used to enhance assessment and feedback practice.
Overall impact: Ensuring that assessment and feedback practices meet disability obligations (submission form).

Stakeholder: Programme teams
Interest/stake: Opportunity to transform assessment and feedback and to reduce time on marking/administration.
Overall impact: In the School of English the adoption of e-submission/marking/feedback has resulted in greater efficiencies and savings. In individual modules that have used QMP, time has been saved, as it has where PeerMark has been used to distribute work for peer review. The small number of modules from Phase 1 that have used GradeMark have not experienced efficiencies in the first use, as set-up and learning to use the tool have required an investment of time.

Stakeholder: Students working with programme teams
Interest/stake: Opportunity to influence the design and delivery of assessment and feedback on their programme and to enhance employability through Degree Plus accreditation.
Overall impact: There has been less uptake than expected. Two colleagues in Civil Engineering each worked with a PhD student to produce assessment and feedback activities – one of these included the use of PeerMark. Other staff wanted to use the technology themselves, but realised later the benefit of having a student build up question banks. There has been greater collaboration with students in 2013-14, particularly in the creation of question banks.

Stakeholder: PVC for Education and Students
Interest/stake: Improvement in student experience and satisfaction as measured by ratings of assessment and feedback (e.g. NSS).
Overall impact: 2013-14 scores do not demonstrate this. However, see sections 6.2.1 and 6.2.2, which demonstrate positive developments.

Stakeholder: Senior Manager of project Schools
Interest/stake: Improvement in student experience and satisfaction as measured by ratings of assessment and feedback (e.g. NSS).
Overall impact: 2013-14 scores do not demonstrate this. However, see sections 6.2.1 and 6.2.2, which demonstrate positive developments.

Stakeholder: Other School staff
Interest/stake: Observe progress and outcomes of developments and opportunity to adopt practice developed through the project.
Overall impact: Staff in all Schools have been kept up to date with the project through its blog, articles in 'Reflections' (the University's learning and teaching journal), the Guest Speaker event in March 2013, the CED Annual Conference in June 2014 and briefings to Directors of Education and School Boards. Good practice has been shared within Schools at learning and teaching Away Days, where e-AFFECT participants have presented on their innovations. The Schools of Planning, Architecture and Civil Engineering, Creative Arts and Education have moved to e-submission/marking/feedback on the basis of the experiences of the School of English.

Stakeholder: Central Support Services
Interest/stake: Furthering University policies to enhance assessment and feedback and embed technology-enhanced learning; opportunity to enhance and streamline the assessment processes.
Overall impact: Issues have been raised but not yet resolved. For example, discussion around aligning the use of GradeMark with existing internal processes is ongoing with colleagues in Academic Affairs and Student Services and Systems. These will feed into reviews of University policy and guidelines.

Stakeholder: Centre for Educational Development
Interest/stake: Opportunity to work with Schools to bring about change, to test a framework that will achieve institutional change, and to demonstrate the breadth and depth of skill sets within the unit.
Overall impact: This has been achieved and strong links have been developed between CED and the participating Schools. The AI approach has been tested and can be adapted for further institutional change.

Stakeholder: Future Alumni
Interest/stake: Opportunity to benefit from the project in terms of graduate/professional competencies.
Overall impact: It is too early to judge this, although some students have had the opportunity to engage in peer review through the project.

One stakeholder group that was not identified previously is the administrative and clerical staff in the Schools. This is a critical group that needs to be considered when introducing new activities, to ensure that any impacts on them are assessed.
For example, the introduction of e-submission/marking/feedback in the School of English has saved the clerical and administrative staff the equivalent of 20 working days. Administrative staff supporting all participating teams provided information for the baseline reports. In Phase 3, two administrative/clerical staff participated in the Appreciative Inquiry and technology/Action Planning workshops.

6.10 The wider sector
We have shared our practice and have had dialogue and some collaboration with our CAMEL colleagues (Assessment Careers, InterACT and TRAFFIC) and other projects in Strand A. For example, we took part in a webinar on approaches to institutional change with TRAFFIC and FASTECH and a webinar on analysing feedback with Assessment Careers and eFeP. Our CAMEL colleagues were at our event in March 2013 and were able to see the School projects. The event was attended by colleagues from the Republic of Ireland, Ulster University and Belfast Metropolitan College.
One team member participated in the HEA and ALT-C conferences in July and September 2013, talking about the experiences of co-working with students in the development and design of assessment and feedback activities. The Jisc Programme Manager has used our materials extensively in presentations to institutions across the UK. The educational principles for assessment and feedback were used as part of a workshop on assessment and feedback at the Belfast Bible College as part of their preparation for a QAA visit. Ferrell & Sheppard (2013) mentioned a number of the e-AFFECT activities in their paper at the EUNIS Congress.
In developing the project we have used and adapted ideas from both the ESCAPE (assessment timelines) and REAP (educational principles) projects. In addition, we received some help from our Critical Friend, Peter Chatterton, and the Jisc Technology Lead in trying to progress our QMP software issues, which we believe affect more institutions than just ourselves.
Following the trial of PeerMark, the evaluation report and recommendations were sent to TurnitinUK. In May/June 2014 one team member participated in MMU's online course on Assessment in HE, sharing experiences and resources from the project with the participants and colleagues from the TRAFFIC project.

7 Sustainability and further developments

7.1 Further activities 2013-14 and beyond
During 2013-14 the project team started the engagement process with a new tranche of programmes, as well as continuing its activities with the Phase 1 and Phase 2 programmes (Figure 6). Beyond 2013-14 the Phase 3 programmes will proceed through the project cycle, reaching the embedding stage in 2015-16. It is anticipated that at least one further degree programme team will engage in 2014-15 and beyond.
The AI approach will be used in future developments at University and School level. This methodology has encouraged ownership of action plans. The educational principles are now integral to the PGCHET course. The University's Assessment Policy is to be reviewed in 2014-15 and the educational principles will be incorporated. As part of the University's Quality Assurance processes for 2013-14, the special theme was innovation in assessment/learning and teaching. Where innovations were identified, they were mapped against the principles and disseminated as good practice.
The Phase 1 programme areas will continue to embed their activities in the light of the evaluation of these.
Where new activities can be introduced, support will be provided. The action plans for the Phase 2 programmes were revised in the light of the technology workshops, and implementation began in September 2013. During the third year of the project the team began the engagement with four new subject areas: Law, Midwifery, Social Work and Biomedical Sciences. Activity with the School of Pharmacy recommenced, but was subsequently deferred. The project still needs to produce more 'how to guides', briefing documents for critical friends and student facilitators, and talking heads videos.

8 Reflections and lessons learned
The key messages from this project are that, if headway is to be made in addressing the issue of assessment and feedback, it needs to:
take a collaborative approach
have the active support of decision-makers and senior managers in the University
encourage dialogue between all stakeholders
recognise and publicise good practice from across the University
lead to a culture change in the University and not be seen as a 'quick fix'
produce a sustainable toolkit to support assessment and feedback initiatives based on educational principles for assessment and feedback
incorporate principles of assessment and feedback into wider University policies and processes including Quality Assurance.

8.1 Change Management
1. Involve Schools in putting the project together – this ensures that there will be participants to 'kick start' the project and leads to greater ownership of change.
2. Whilst the project started with Senior Management support at the level of the University, the importance of the Director of Education (DE) being involved or providing support became paramount in moving projects forward at School level. For example, the DE was vital in the School of Psychology in ensuring that a School-wide discussion took place on the nature and consistency of feedback. Equally, the DE in the School of English was able to take e-submission/marking/feedback forward in that School because he had tried it himself. The project is working with two subject areas in the School of Planning, Architecture and Civil Engineering – Civil Engineering and Environmental Planning. In the case of Civil Engineering the participants have been a group of interested staff, not including the DE, bringing about change in individual modules, whilst in Environmental Planning the staff have been led by the DE and are effecting change across the programme. Civil Engineering staff have, however, shared their experiences with their colleagues and their innovations have been extended into other modules and at higher levels of the course. Across the University, School leadership at a decision-making level has been clearly effective in 2013-14 where the DE (or equivalent) has been proactive in promoting activities and encouraging staff to participate. In the School of Law, the Head of School has also been instrumental in the progress made.
3. Use 'hooks' such as professional accreditation, internal quality assurance processes, etc. to stimulate interest, while being mindful of timing.
4. The bottom-up approach works well with early adopters and those open to change and new ideas. Their experiences can help to champion activities and bring others on board.

8.2 Technology-enhanced assessment and feedback
1. Ensure that sufficient time is allowed for the considered roll-out of activities.
2. 'One size does not fit all' – this has been an underlying principle of the project; building a critical mass can be just as effective.
3. 'Don't assume' – it is easy to form a view of a School or subject area from previous encounters and assume that they will be less willing to try new things, particularly the use of technology. Our experience with the School of English is that they were almost waiting for the opportunity to do so.
4. Organise bespoke technology events for subject groups. This worked really well with colleagues who teach linguistics in the School of English – there was a sense of being able to try things out within the safe environment of one's colleagues. An event for Environmental Planning resulted in a number of staff realising how different technologies can be used to support assessment and feedback. These were subsequently included in their action plan. This model was adopted and followed in 2013-14.
5. Don't impose solutions – the Appreciative Inquiry approach has been one of facilitated change. It is, however, important to ensure that what is proposed by staff is manageable and pedagogically sound and that support is provided. The support could be help with technology, training, encouragement or funding.

8.3 What we would do differently
Our experience of the Appreciative Inquiry discovery and dreaming workshops and the Action Planning workshops with one subject in Phase 1 and seven programmes in Phase 2 has been that different people have attended each. This means that there is a lack of continuity in thinking between the sessions. It is important that all those involved are able to participate in each stage. In the case of one School (four subject areas) the workshops were held comparatively late in the year and the Action Planning workshop was cut from two hours to one. Experience also tells us that the workshops produce better outcomes when the two sessions take place within a period of no more than two or three weeks. In Phase 3, a timetable for the process was agreed at the first meeting, which resulted in technology sessions and action planning being scheduled much earlier in the academic year, allowing development to take place in good time. It is important that programmes understand the time commitment required and that dates are agreed well in advance. This includes ensuring that staff who attend the AI workshop are available to follow through to the technology session/action planning stage.
We would try to identify more clearly the impact of assessment and feedback events by recording the levels of participation in pilot activities among staff who attended dissemination events.

References
Brown, E., Glover, C. J., Freake, S. and Stevens, V. A. M. (2004) Evaluating the effectiveness of written feedback as an element of formative assessment in science, Proceedings of the Improving Student Learning: Diversity and Inclusivity Symposium, Birmingham, UK.
Cooperrider, D. and Whitney, D. (2005) Appreciative Inquiry: a positive revolution in change, Berrett-Koehler Publishers, Inc., San Francisco.
Ferrell, G. and Sheppard, M. (2013) Supporting assessment and feedback in practice with technology: a view of the UK landscape, paper presented at EUNIS 2013: ICT Role for Next Generation University, Riga, 12-14 June 2013.
Glover, C. and Brown, E. (2006) Written feedback for students: too much, too detailed or too incomprehensible to be effective, Bioscience Education E-Journal, 7. http://www.bioscience.heacademy.ac.uk/journal/vol7/beej-7-3.aspx
Nicol, D. (2009) Transforming assessment and feedback: enhancing integration and empowerment in the first year, The Quality Assurance Agency for Higher Education, Mansfield.
Stratton-Berkessel, R. (2010) Appreciative Inquiry for Collaborative Solutions: 21 strength-based workshops, Pfeiffer, a Wiley Imprint, San Francisco.
Stufflebeam, D. L. (1966) A depth study of the evaluation requirement, Theory Into Practice, 5(3), 121-133.
Stufflebeam, D. L. and Shinkfield, A. J. (2007) Evaluation Theory, Models, and Applications, Jossey-Bass, San Francisco.
Whitelock, D. and Cross, S. (2011) Assessment Benchmarking: Accumulating and accelerating institutional know-how for best practice, International Journal of e-Assessment (IJEA), 1(1). http://journals.sfu.ca/ijea/index.php/journal/article/view/18