Increasing learner success with
technology supported assessment:
findings from the REAP project
Catherine Owen, Project Manager
www.reap.ac.uk
September 2007
Practice in e-Assessment Conference
The REAP Project
• 3 HEIs (Strathclyde, Glasgow Caledonian Business
School, Glasgow University)
• Focus is on large 1st year classes
• Pedagogy: assessment for learner self-regulation
• Range of technologies: online tests, simulations,
discussion boards, e-voting, e-portfolios, peer/feedback
software, admin systems, VLEs, offline-online
• Goals: Learning quality and teaching efficiencies
• Outputs: case studies of redesign; advice to support
strategic change in institutions (transformation).
• Dissemination across HE/FE sector
Plan
• Review literature on formative assessment and feedback
• Suggest some simple ways of improving assessment
and feedback practices in relation to the research
• Outline some interesting case studies of first year course
redesign drawn from the REAP project and elsewhere.
• Analyse these case studies in relation to assessment
principles from literature
• Questions and discussion
Defining assessment and feedback
• Assessment – the narrow meaning is an exam for external
accreditation
• Feedback – the narrow sense is what the tutor writes
about/on a finished piece of student work
• But this talk is also interested in:
• Self-assessment and reflection – a form of internally
generated feedback
• Peer dialogue as feedback, and peer assessment
Definitions (2)
• Who is involved in formative assessment and feedback
• Tutor
• Peers
• External (e.g. placement supervisor)
• Computer generated
• Self
Why take formative assessment and feedback seriously?
• Assessment is a key driver of student learning
• Assessment is a major cost in HE (staff time)
• Widely reported that students don’t read the feedback
• Dropout/retention – linked to academic experience
• First year experience – students need regular and
structured feedback opportunities.
• National student survey – students are dissatisfied with
feedback.
• QAA reports – main area of criticism in England
Research on Assessment in HE
[Diagram] Teaching/learning paradigm: constructivist [student-centred] rather than transmission; assessment paradigm: transmission [teacher-centred]
Key messages
• Formative assessment and feedback by others can only
have an impact on learning when it influences a
student’s own self-regulatory processes (adapted from
Boud, 1995)
• Students are already self-assessing and generating
feedback so we should build on this capacity (Nicol and
Macfarlane-Dick, 2004)
Scaffolding self-regulation: 7 principles of
good feedback (assessment design)
1. Clarify what good performance is (goals, criteria,
standards).
2. Facilitate reflection and self-assessment in learning
3. Deliver high quality feedback to students: feedback that
enables students to monitor and self-correct
4. Encourage peer and tutor dialogue around learning
5. Encourage positive motivational beliefs & self esteem
through assessment
6. Provide opportunities to close the feedback loop
7. Use feedback information to shape teaching
Source: Nicol and Macfarlane-Dick (2006)
1. Help clarify what good performance is
• Documents with printed criteria / published online
• Tutor explains/discusses criteria with students
• Frequently asked questions database (FAQs)
• Exemplars of performance with feedback (e.g. essays – made available online)
• Students discuss good and bad examples of assignments and identify criteria
• Students rephrase criteria in own words
• Students generate their own criteria for a task
• Students construct own MCQs and post online
2. Facilitate self-assessment
and reflection in learning
• Online quizzes (e.g. MCQs/short answers) with feedback
(self monitoring)
• Self-assessment at submission (strengths/weaknesses)
• Open book use of online MCQs
• Peer processes where students judge work of others
against standards and give each other feedback
• Simulations – dynamic feedback shows effects of actions
• PDP involving ongoing reflection on learning (e-portfolios)
• Confidence ratings with MCQs increase reflection
3. Deliver high-quality feedback information
that enables students to self-correct
• Link feedback to pre-defined criteria (comment
databanks)
• Provide feedback soon after submission (clarify
expectations)
• Point to resources where answers can be found (hyperlinks)
• Have students request feedback
• Reader-response theory and feedback
4. Encourage teacher and student
dialogue around learning
• Break out discussions in tutorial to discuss written
feedback and plan strategies for improvement
• Students give descriptive feedback to peers before
submission
• Electronic voting technology – three levels of feedback
(computer, peer and teacher)
• Unique weekly assessed tutorial sheets
• Online discussion as peer feedback (psychology
example)
5. Encourage positive motivational beliefs
(self-efficacy) and self-esteem
• Online objective tests (repeatable, private, comparisons
with own learning goals)
• Using authentic ‘real life’ tasks that require the kinds of
thinking and skills required in employment
• Structured milestones with early low stakes tasks and
feedback
• Focus on learning goals not just performance goals (e.g.
rewarding effort and improvement)
• Two-stage assignments (drafts-feedback-marks)
• Supporting formation of learning communities around
tasks (discussion boards)
6. Provide opportunities to close
the feedback loop
• Increase opportunities for resubmission
• Two-stage assignments (drafts-feedback-marks)
• Action points as feedback strategy
• Personal development planning – students keep records
of work at different stages
• Initiate online peer-review processes like those available
to academics when writing journal articles
• Not releasing the grade until students have commented
on the feedback provided.
7. Provide information that can be used to
shape the teaching
• Student requested feedback (questions worth asking)
• Electronic voting systems
• Just-in-time teaching using MCQs
• Monitoring online discussions
• Monitoring project work within shared workspaces
• Blackboard/WebCT have some built-in reporting
functionality about student activity
Discussion
• Would any of this apply to your situation? How?
• How would you improve on the principles
described?
• Why don’t these ideas apply?
• What would you do instead?
• Any questions?
Two super principles
Super-principle 1: developing learner self-regulation
(empowerment), i.e. steers to encourage ownership of
learning – the seven principles discussed above.
Super-principle 2: time on task and effort (engagement),
i.e. steers on how much work to do and when –
Gibbs and Simpson's four conditions.
Case examples from REAP – applying these
principles/conditions
Case 1:
Using a VLE to support collaborative
group tasks
(University of Strathclyde Psychology
Department)
Case 1: Psychology
Context:
• 560 first year students
• Mixture of psychology majors (130) and those taking psychology only for one year (430)
• 6 topic areas, 48 lectures, 4 tutorials, 12 practicals
• Assessment: 2 x MCQs (25%), tutorial attendance (4%), taking part in an experiment (5%), essay exam (66%)
• 85 VLE groups of 7-8 students
Aims:
• To develop essay writing skills and provide effective feedback
• To reduce lectures and the burden on staff
Structure of assigned group tasks
6 cycles of 3 weeks (one per major course topic)
Each cycle:
• Week 1: "light" written task, ≈ 7 short answers, split between group members.
• Week 2: Reading task only
• Week 3: "Heavy" written task, typically 7 short answers followed by combining them into a single coherent essay.
Within each week:
• The Monday lecture, introducing material.
• Immediately afterwards, that week's task is posted online, for delivery before the next Monday.
• Model answers (a few selected from the best student answers submitted) are posted for the previous week's task.
Big success, apparently!
• Students set targets beyond what was required: contradicts commonly held beliefs about assessment and external motivation (marks)
• Produced work at a level 'not seen before', surpassing third year
• Spontaneous discussions about learning and learner responsibility
• Some students burdened by workload, but this was easily detected
• Some group participants moved at their own request (3 groups)
• 13,429 messages posted by groups (postings from 40-400 per group)
• Quality of interactions 'outstanding' across the board
• Atmosphere in lecture class improved and an online community developed
Case 1 and the principles
• Model answers and repeated task format provide progressive clarification of expectations (clear criteria, A1)
• Students encouraged to self-assess against model answer (A2)
• Online peer discussion aimed at reaching agreed response (A4)
• Staged complexity and focus on learning rather than marks (A5)
• Repeated cycle of topics and tasks (A6)*
• VLE captures all group interchanges, allowing course leader to monitor progress and adapt (A7)
• Tasks require significant study out of class (capture enough work B1)*
• They are distributed across topics and weeks (are spread out evenly B2)*
• They move students progressively to deeper levels of understanding (B3)
• There are explicit goals and progressive increase in challenge (communicates clear and high expectations B4)
Case 2:
Using a VLE to support collaborative
group tasks
(Glasgow University Biology Department)
Biology Case
• 700 students in first year class
• Module is core for biologists but also optional for
numerous other subjects
• Group task a key element of the class
• Encouraging communication and group working
skills, independent research and presentation
skills
Biology Case
• Students work in groups to create a poster investigating
a species (chosen by the group) and its environmental
impact
• Each group must argue for the extinction of a species in
a head-to-head debate with another group in class
• An overall group mark is given by the tutor (for poster
and debate), then peers moderate the marks in their
groups to reflect individual contributions
• Participants also asked to reflect on their own
contribution to the group and write a paragraph
explaining how they added value to the task
Biology Case
• Improve the students’ learning experience (with positive
impact on retention and progression)
• Support students to participate in the group discussions
regardless of the timing or other commitments (family,
travel, work etc.)
• Avoid problems arising from student absence and ‘no
shows’, since all students in a group will have access to
material ‘posted’ in the forum
• Reduce tutor mediation time by providing a permanent
record of the group interactions for both students and
staff
Biology Case
• 80 web forums for 80 groups of 8 members created on
Moodle
• Group tasks released by tutors as a series of
progressive mini-tasks with short submission deadlines
(e.g. rationale for species choice)
• Tutor feedback provided to whole class via open
message-board
• Peer-marking process refined to include pre-task group
definition of criteria and a post-task reflection on and
revision of criteria (reflection on criteria)
Benefits of the redesign
• Using the VLE forums was popular with students
• Positive impact on student confidence and the
development of face-to-face as well as online
interactions
• Peer feedback valued as an addition to tutor feedback
• Students reassured that contributions are monitored by
staff to support a fair peer-marking process
• Tutor mediation in group disputes significantly reduced
• Anecdotal evidence of improved class cohesion and
greater social interaction around learning
• Quantitative results pending…
Relation to seven feedback principles
• Group discussion around criteria for marking helps to define goals
and standards (principle 1, clear goals)
• Peer marking helps students to develop objectivity in judgement.
Students self-assess their own contribution to the group and give a
rationale for their contribution. (principle 2, self-assessment)
• GTAs monitor contributions of group members but teacher provides
generic feedback to the open discussion board (principle 3, teacher
feedback)
• Peer dialogue and feedback happens in groups and in producing a
group submission to forum. Online interactions support the
development of learning communities online and face-to-face
(principle 4, peer dialogue)
• Opportunity to communicate with peers online as well as F2F
is liberating for many shy or anxious students (principle 5)
• Discussion boards provide a record of all student outputs, which
teachers can use to make adjustments in how they teach (principle
7, shape teaching)
Relation to Gibbs and Simpson’s four conditions
• Structured, progressive tasks encourage time and effort on tasks
(condition 1, captures sufficient study time and effort)
• Requires students to work in and out of class and make regular
contributions (condition 2, distribute student effort)
• Debating with other students encourages deep rather than surface
learning (condition 3, productive learning)
• Marking criteria are negotiated and the tasks are challenging
(condition 4, clear and high expectations)
• Not forgetting efficiency gains in course delivery...
Case 3:
Podcasts, electronic voting and SATs
(University of Strathclyde Hospitality and
Tourism Department)
Case 3: Tourism and Management
• 230 students, first-year course, two semesters
• Traditionally delivered as two lectures per week
supported by four one-hour tutorials per semester
• Attendance at lectures was poor and declined markedly
throughout the year
• Changes: lectures halved and replaced by:
• Interactive lectures using EVS
• Regular online SATs replacing class tests
• Video-podcasts of course materials
• Group presentations supported by online micro-tasks
Case 3: Tourism and Management
• Attendance at lectures improved
• Number of non-qualifying students
reduced (from 40 to 30)
• Average coursework mark rose from
47.8% to 60%
Case 3 and the principles
• Online SATs and EVS allow students to self-assess against the given answer and the class's response (A2)
• Online SATs & EVS usually followed by the teacher's model answer (A3)
• Group presentation encourages peer dialogue around the task (A4)
• Success at tests (rather than uncertainty) promotes self-esteem (A5)
• Repeated cycle of tasks gives opportunities to close the feedback loop (A6)
• EVS also lets the teacher assess the state of the class as a whole (A7)
• Podcasts, texts to be used before class (capture enough work B1)
• They are distributed across topics and weeks (are spread out evenly B2)
Figure 1: A framework and ten principles for formative assessment and
feedback in the first year and beyond
[Diagram with two axes: EMPOWERMENT (adapt institution to student) vs. ENGAGEMENT (fit student to institution); ACADEMIC EXPERIENCE vs. SOCIAL EXPERIENCE]
• Involve learners in decision-making about assessment policy and strategy
• Give choice in assessment content and processes
• Facilitate reflection and self-assessment in learning
• Adapt teaching to student needs
• Encourage positive motivational beliefs & self-esteem
• Deliver quality feedback that helps learners self-correct
• Encourage time & effort on purposeful learning tasks
• Help clarify what good performance is
• Support the development of learning communities
• Encourage interaction and dialogue around learning (peer and teacher-student)
© d.j.nicol 2007
Relevant papers
• Nicol, D. (in press), Laying the foundation for lifelong learning: case studies of technology-supported assessment processes in large first year classes, British Journal of Educational Technology (to be published July 2007).
• Nicol, D. (2007), E-assessment by design: using multiple-choice tests to good effect, Journal of Further and Higher Education.
• Nicol, D. & Milligan, C. (2006), Rethinking technology-supported assessment in relation to the seven principles of good feedback practice. In C. Bryan and K. Clegg (eds), Innovations in Assessment, Routledge.
• Nicol, D. (2006), Increasing success in first year courses: assessment redesign, self-regulation and learning technologies, paper prepared for the ASCILITE conference, Sydney, Australia, Dec 3-6.
• Nicol, D. J. & Macfarlane-Dick, D. (2006), Formative assessment and self-regulated learning: a model and seven principles of good feedback practice, Studies in Higher Education, 31(2), 199-218.
• Nicol, D. & Boyle, J. (2003), Peer instruction versus class-wide discussion in large classes, Studies in Higher Education, 28(4), 457-473.
• Boyle, J. T. & Nicol, D. J. (2003), Using classroom communication systems to support interaction and discussion in large class settings, Association for Learning Technology Journal, 11(3), 43-57.