Designing a Work Integrated Assessment
A work-integrated assessment is one in which the tasks and conditions are more closely aligned to what you would experience within employment.
Module:
Co-ordinator:
Year:
Collaborate Project
Bringing together staff, students and employers to create employability-focused assessments enhanced by technology
Multiple Assessment Points
Move to a more distributed pattern of assessment; consider
introducing ‘surprise’ points
Assessments are often delivered in the form of one summative
assessment, e.g. an exam or essay, at the end of a period of formal
learning. In employment, however, ‘assessment’ or evaluation points tend to occur frequently. In addition, their timing is often outside the individual’s control, so it can be necessary to juggle competing tasks at short notice.
Using multiple assessment points helps to develop reflective
thinking, whilst ‘surprise’ points support task prioritisation.
Peer / Self Review
Include peer and/or self review explicitly in the assessment process
Typically the review of assessments (i.e. feedback) in formal
education is only provided by teaching staff. In employment,
however, much of the review process comes in multiple forms, e.g.
informal peer feedback from colleagues, formal and informal reviews
from clients, and self-review of personal performance.
Including peer and/or self review explicitly within an assessment
helps students to develop critical thinking skills, and encourages
articulation and evidencing.
Varied Audiences
Aim to set explicit audiences for each assessment point
In higher education the audience for an assessment is implicitly the academic that sets it, who will naturally already be aligned in some way with the course and/or module. This contrasts with
employment, where the audience can be peers, but is more often
the client or another external third party, with different values,
priorities and expectations.
Having to think for a different audience on an assessment provokes
greater reflective thinking, and requires new types of synthesis.
Overview
This assessment redesign sheet is designed to be used by collaborative teams of staff, students and/or employers who have already expressed a desire to create a work-integrated assessment. Six ‘dimensions’ are shown which are the hallmarks of a work-integrated assessment: Time, Review, Audience, Structure, Problem / Data and Collaboration.
The sheet is a thinking tool to help the design team reflect on how a current assessment might be marked on the six dimensions of the radar chart, and what changes might be made to move along these six dimensions.
The model is designed to be used in three stages:
1) An analysis stage to understand how the assessments for a module currently map to the dimensions, followed immediately by;
2) A design stage where changes to the assessments are proposed that will create movement along the dimensions, followed by;
3) An evaluation stage after the assessments have run to assess the impact of the designed changes.
Light Structure
Lightly structure the overall assessment; reward student
approaches
Most thinking on assessment suggests that there should be explicit
guidance to students concerning how and where marks are attained.
However, in employment, part of the challenge for the individual
and/or team is the structuring of the work that needs to be
completed. Tasks need to be identified, processes decided, and
priorities allocated.
Using a light structure approach encourages students to plan tasks
and goals in order to solve a bigger problem, strengthening their
project management and prioritisation skills.
‘Real World’ Problem / Data
Set an overall real world problem, supported by real
world data
Purely academic learning might require a theoretical problem in
order to test a theoretical understanding. In employment, though,
problems tend to be very real, and data rarely comes in coherent,
standardised forms. It is usually in 'messier' formats that need to be
interpreted to be of use.
Using a real world problem and real world data helps to develop skills
in analysis, interpretation and evaluation.
Collaborative Working
Create teams of students from the outset, encourage collaboration
Many forms of assessment require working alone, yet employment
invariably requires some form of collaboration and teamwork, often with unknown and perhaps even challenging individuals.
Encouraging students to work collaboratively and in teams improves
their ability to negotiate and discuss, and develops their
understanding of team roles and role flexibility.
1. Analysis Stage
The analysis stage is an opportunity to think about how the assessments for the module under discussion currently map to the dimensions model. It may well be the case that some aspects of the current assessments are already well developed in terms of these dimensions. The purpose of the tool is to help the collaborative team visualise this, and highlight areas which aren’t perhaps as well developed as others.
2. Design Stage
The design stage is where changes to the assessments are proposed that will create movement along the axes. The adding (or potentially removing) of components is discussed that will help to create movement along the axes as appropriate, bringing in a different audience, for example, or adding/splitting assessment points.
No explicit attempt has been made to quantify the scales of the axes. Teams are encouraged to use the boundaries implied by the top and bottom of the scales, together with their interpretation of the best and worst aspects of their current assessments, as overall guides for rating within the model.
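If a team wants to capture its ratings digitally rather than on paper, the sketch below shows one way the current and proposed polygons could be drawn as a radar chart in Python. It is only an illustration: the 0-5 scale, the example scores and the matplotlib plotting choices are assumptions, not part of the tool, which deliberately leaves the scales unquantified.

# A minimal sketch (not part of the tool) of drawing the six dimensions as a
# radar chart, with one polygon for the current assessment (analysis stage)
# and one for the proposed redesign (design stage). The 0-5 scale and the
# scores below are illustrative assumptions only.
import numpy as np
import matplotlib.pyplot as plt

DIMENSIONS = ["Time", "Review", "Audience", "Structure", "Problem / Data", "Collaboration"]

current = [1, 2, 1, 3, 2, 1]    # hypothetical ratings from the analysis stage
proposed = [3, 4, 3, 4, 4, 3]   # hypothetical targets from the design stage

angles = np.linspace(0, 2 * np.pi, len(DIMENSIONS), endpoint=False)

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, scores in [("Current (analysis)", current), ("Proposed (design)", proposed)]:
    closed = np.append(scores, scores[0])              # repeat the first point to close the polygon
    ax.plot(np.append(angles, angles[0]), closed, label=label)
    ax.fill(np.append(angles, angles[0]), closed, alpha=0.15)

ax.set_xticks(angles)
ax.set_xticklabels(DIMENSIONS)
ax.set_ylim(0, 5)
ax.legend(loc="upper right", bbox_to_anchor=(1.35, 1.1))
plt.show()

The same approach extends to the evaluation stage, where a third polygon can be overlaid once post-assessment scores are available.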
Technology “Top Trumps” have been designed to be used in conjunction with this model, which suggest digital technologies that could support the proposed changes. Examples might be a wiki to support the Collaboration dimension, a project management tool to support the Structure dimension, or perhaps a blog to support the Time dimension.
3. Evaluation Stage
The final stage using the dimensions model is an evaluation stage after the assessments have run, which will measure the overall impact of the designed changes.
A separate evaluation tool has been designed for use with the model, which creates a third polygon to illustrate visually how those involved with the assessment think they have developed in terms of the six dimensions. The dimensions themselves are not directly referenced; instead, forty-eight skill cards are rated by participants on a 7-stage Likert scale. The cards are pre-coded with one of the six dimensions, and hence a score for each dimension can be extracted from the data by evaluators, and applied to the model.
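To make that scoring step concrete, the sketch below shows one way an evaluator might average the card ratings into a single score per dimension. The card names, the card-to-dimension coding and the ratings are hypothetical examples; only the general idea, that each card is pre-coded with a dimension and rated on the 7-stage Likert scale, comes from the tool itself.

# A minimal sketch (not part of the tool) of extracting one score per dimension
# from rated skill cards. The card names, the card-to-dimension coding and the
# ratings below are hypothetical examples; the real deck has forty-eight cards.
from collections import defaultdict
from statistics import mean

CARD_DIMENSION = {                     # hypothetical pre-coding of cards to dimensions
    "negotiation": "Collaboration",
    "prioritisation": "Structure",
    "data interpretation": "Problem / Data",
    "reflective writing": "Review",
    # ... the remaining cards would be coded in the same way
}

ratings = [                            # hypothetical participant ratings, 7-stage Likert scale (1-7)
    ("negotiation", 6),
    ("prioritisation", 4),
    ("data interpretation", 5),
    ("reflective writing", 3),
    ("negotiation", 7),
]

def dimension_scores(ratings, coding):
    """Average each dimension's card ratings into a single score for the model."""
    by_dimension = defaultdict(list)
    for card, score in ratings:
        by_dimension[coding[card]].append(score)
    return {dimension: mean(scores) for dimension, scores in by_dimension.items()}

print(dimension_scores(ratings, CARD_DIMENSION))
# e.g. {'Collaboration': 6.5, 'Structure': 4, 'Problem / Data': 5, 'Review': 3}

The resulting per-dimension scores can then be plotted as the third polygon alongside the current and proposed ones.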
For more information contact Richard Osborne (r.m.osborne@exeter.ac.uk)
Project blog available at http://blogs.exeter.ac.uk/collaborate/