“Authentic Assessment”

Bob Knipe
Dean, Learning Technologies
Genesee Community College
rgknipe@genesee.edu
How will you be able to tell
(measurably, tangibly, observably,
demonstrably)
that your students really know
(or can do, or will feel)
what you want them to know
(or do, or appreciate)?
“what you want them to know”
= your student learning outcomes
(objectives)
The hard part:
“Backward design” requires that, rather than moving straight from target to teaching, we first identify what counts as evidence of learning.
Backward design:
1. Identify the desired outcome
2. How will you know that they know it?
3. Design instructional activities

Authentic:
The skill, knowledge, or attitude being measured is actually the desired one you have identified, or as close to it as possible.
In other words, the assessment approximates the application.

If the skill, knowledge or attitude
measured “isn’t really important…”
then change the outcome statement.
Conventional vs. authentic assessment:

  Conventional                          Authentic
  Content is covered                    Learning is demonstrated
  “Know” (math, history, nursing)       “Do” (math, history, nursing)
  What                                  How
  Selecting a response                  Performing a task
  Contrived                             Real-life
  Recall/recognition                    Construction/application
  Teacher-structured                    Student-structured
  Indirect evidence                     Direct evidence

An authentic assessment will:
• Last longer and be more meaningful
• Correlate highly with student success in subsequent courses, programs, and careers
• Not result in “bulimic learning”
• Be perceived by students as fair
• Result in better student course evaluations!
ASSESSMENT (how will you know that they know it?) can be applied:
• To individuals
• To small groups
• To large groups

ASSESSMENT can be applied at the:
• Activity level
• Unit level
• Course level
• Program level
• Institutional level
• System level
• National or international level
• Start with the active verb: what does the Student Learning Outcome (SLO) or instructional objective ask the student to do? What are the characteristic responses we’re looking for?
• Look for tangible evidence (measurable, observable, demonstrable).
• What tasks and evidence anchor the assessment to the curriculum (what’s the context)?
• How do we look for evidence of “understanding” (higher-order skills)? (See Bloom et al., http://www.odu.edu/educ/roverbau/Bloom/blooms_taxonomy.htm)






Knowledge: Recall facts and concepts.
Comprehension: Understand what the facts and concepts mean.
Application: Apply the understanding of facts and concepts in a given situation.
Analysis: Extract from a context the facts you need to know.
Synthesis: Combine facts and concepts you understand to achieve a specified goal.
Evaluation: Assess a situation where knowledge is partial or ambiguous.

Looking for mastery (criterion-referenced) or a relative scale (norm-referenced – the “curve”)?

How important is “grading”?

Is a range of assessment strategies employed?
• Scrapbook or portfolio, not snapshot
• Are learning styles accommodated?
• Are disabilities accommodated (Universal Design)?

More and varied assessment tools are preferable to fewer, monolithic ones.

For asynchronous (online, etc.) courses, assume “objective” tests are open-book, open-note, and done collaboratively.


Assessments can be learning activities

Objective tests… aren’t
http://www.ernweb.com/public/892.cfm
http://www.avc.edu/administration/organizations/slo/common/documents/ProsandConsofAssessmenttools.pdf

Determine true costs and benefits of
assessment activities
Sequential building toward 100% authenticity:
explain → exemplify → show → demonstrate → lab → simulation → supervised practice → actual skill assessment

How are assessments used by others?
What do you need to report, when, and to whom?
Is this a recurring assessment requirement (i.e., a need for longitudinal data)?
Bad SLOs (dated, ambiguous, unrelated to the desired outcome, etc.) can and should be rewritten.

Understand the requirements of the 2008 Higher Education Opportunity Act regarding learning assessment for online courses.
Some Authentic Assessment resources:
• http://wik.ed.uiuc.edu/index.php/Authentic_Assessment
• http://jolt.merlot.org/documents/vol1_no1_mueller_001.pdf
• http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm
• http://www.uwstout.edu/soe/profdev/assess.cfm
• http://www.park.edu/cetl2/quicktips/authassess.html
• http://pareonline.net/getvn.asp?v=2&n=2
OK, let’s try it….