MODULE 3
The Backward Design
[Diagram: the three stages of backward design, 1st to 3rd]
Learning Objectives
• What is the purpose of assessment?
• How do you determine what kind of evidence to look for?
• What kinds of methods can be used, and when?
• How do you create assessment tasks and evaluation criteria?
• How do you make sure the assessment is valid and reliable?
Why assess?
The purpose of assessment is to measure understanding, not to generate grades!
Assessment provides the professor with:
• Reliable information from which to infer student learning
• Feedback to improve their teaching methods
Assessment provides students with:
• Feedback on how well they understand the content
• Feedback to improve their learning
How to create assessments?
1. Assessment Objectives
2. Evidence of Learning
3. Assessment Tasks
4. Evaluation Criteria
5. Validity and Reliability
Step 1: Assessment Objectives
Prioritize the content to be assessed (the three rings of Understanding by Design):
• Worth being familiar with: superficial knowledge
• Important to know and do
• Big Ideas: the core concepts
Step 2: Evidence of Learning
Evidence is related to the ABILITY to do something: EVIDENCE refers to something that can be DEMONSTRATED!
Each assessment objective maps to a different kind of evidence:
• Worth being familiar with (superficial knowledge): knowing concepts and definitions. Micro, descriptive.
• Important to know and do: the ability to apply a specified framework to the contexts approached in class. Micro, domain-specific.
• Big Ideas (core concepts): the ability to transfer knowledge to different contexts. Macro, across domains, multi-disciplinary.
Step 2: Evidence of Learning (Bloom's revised taxonomy, 2001)
From the most basic to the most sophisticated evidence:
• Remember: recall definitions
• Understand: summarize ideas, explain concepts
• Apply: apply concepts to situations similar to the ones approached in class
• Analyze: break concepts into parts and understand their relationships
• Create: apply concepts to situations different from the ones approached in class; create new applications or interpretations of the concepts
• Evaluate: judge the results of applying the concepts and make decisions about the quality of the application
Step 3: Assessment Tasks
Two questions drive this step: When to assess? Which method?
When to assess?
Summative assessment alone is a snapshot; formative + summative assessment is a photo album.
Formative and Summative Assessment
Both are necessary, with at least 50% of each!
Formative:
• The objective is to give feedback to students
• Builds learning
• Students can adjust

Summative:
• More focused on the grade
• Happens at the end of the grading period
• There is no opportunity to adjust and show improvement

The combination of both leads to a good result: several formative assessments followed by a summative one (F, F, ..., S).
Continuous Assessment
Assess at different moments and with different methods!
Assessment Tasks
Match the task type to the assessment objective, from superficial knowledge to Big Ideas:
• Worth being familiar with / important to know and do: traditional quizzes and tests (paper-and-pencil, multiple-choice, constructed response)
• Big Ideas (core concepts): performance tasks and projects (complex, open-ended, authentic)
Adapted from "Understanding by Design", Wiggins and McTighe
Assessment Tasks (aligned with Bloom's revised taxonomy, 2001)
From the most basic to the most sophisticated:
• Quizzes and traditional tests: ask about definitions
• Open-ended questions
• Simple performance tasks: straightforward applications, exercises
• Analytical tasks: experiments, scenario simulations, cases
• Complex performance tasks: application to new contexts and situations; creating an artifact or project
• Result of analysis and decision: pros vs. cons, costs vs. benefits, reflection
The most sophisticated levels are covered by authentic tasks.
Authentic Task
A task that reflects possible real-world challenges; it is a performance-based assessment!
An authentic task:
• Is realistic and contextualized
• Replicates key challenging real-life situations
• Requires judgment and innovation
• Asks students to "do" the subject
• Assesses students' ability to integrate concepts and ideas
• Gives the opportunity to practice and get feedback
It is problem-based, NOT an exercise!
From "Understanding by Design", Wiggins and McTighe
Authentic Task vs. Exercise
Authentic Task:
• The question is "noisy" and complicated
• Various approaches can be used
• Requires integration of concepts and skills
• There is an appropriate solution, not a single right one
• Arguments are what matter
• Typically out of class, summative

Exercise:
• There is a right approach
• There is a right solution and answer
• Accuracy is what matters
• Typically in class, formative

From "Understanding by Design", Wiggins and McTighe
How to formulate an Authentic Task?
Use the GRASPS framework (a template sketch follows this list):
• Goal: What is the goal of the task? What is the problem that has to be solved?
• Role: What is the student's role? What will students be asked to do?
• Audience: Who is the audience? Who is the client? Who do students need to convince?
• Situation: What is the situation or context? What are the challenges involved?
• Performance: What product or performance will students create?
• Standards: By what criteria will the performance be judged?
From "Understanding by Design", Wiggins and McTighe
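As a quick illustration, here is a minimal sketch of how an authentic task could be captured as a reusable GRASPS template. The class name, field names, and the example task are hypothetical, not from Wiggins and McTighe.

```python
from dataclasses import dataclass

@dataclass
class GraspsTask:
    """Hypothetical template for drafting an authentic task (GRASPS)."""
    goal: str         # the problem the student has to solve
    role: str         # what the student is asked to be and do
    audience: str     # the client the student needs to convince
    situation: str    # the context and its challenges
    performance: str  # the product or performance to be created
    standards: str    # the criteria the work will be judged by

# Hypothetical example task for a statistics course
task = GraspsTask(
    goal="Recommend whether the city should expand its bike-share program",
    role="Data analyst hired by the city council",
    audience="Council members with no statistics background",
    situation="Noisy, incomplete ridership data from the past two years",
    performance="A short written report plus a five-minute presentation",
    standards="Sound use of evidence, clarity of argument, honest uncertainty",
)
print(task.goal)
```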
Step 4: Evaluation Criteria
Evaluation criteria must:
• Provide feedback for students
• Be clear
• Be communicated in advance
• Consist of independent traits (variables)
• Focus on the central cause of performance
• Focus on the understanding and use of the Big Idea
Types of Evaluation Criteria
Evaluation criteria can take the form of check lists or rubrics.
Check List
There are two types of check lists:
1. A list of questions and their correct answers
2. A list of individual traits with the maximum points associated with each of them
Check List: Questions and Answers
This type is used for multiple-choice, true/false, and similar formats; in other words, wherever there is a single correct answer.
Example answer key: 1-A, 2-C, 3-D, 4-B, 5-B, 6-D
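A minimal sketch of how such a check list could be applied automatically. The answer key is the example above; the student responses are hypothetical.

```python
# Answer key from the example above
answer_key = {1: "A", 2: "C", 3: "D", 4: "B", 5: "B", 6: "D"}

# Hypothetical responses from one student
responses = {1: "A", 2: "B", 3: "D", 4: "B", 5: "C", 6: "D"}

# Each question is simply right or wrong; the score is the fraction correct
correct = sum(1 for q, ans in answer_key.items() if responses.get(q) == ans)
score = correct / len(answer_key)
print(f"{correct}/{len(answer_key)} correct ({score:.0%})")  # 4/6 correct (67%)
```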
Check List: Traits and Their Value
Performance is broken down into individual traits:
• Trait 1: weight (%) or points
• Trait 2: weight (%) or points
• Trait ...: weight (%) or points
Grade = weighted average of the trait scores, or Grade = sum of points (see the sketch below)
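A minimal sketch of both grading schemes. The traits, weights, scales, and scores are hypothetical.

```python
# Weighted-average variant: each trait has a weight (fractions summing to 1)
# and a score on a 0-10 scale (hypothetical traits and numbers)
weights = {"Ideas": 0.5, "Organization": 0.3, "Grammar": 0.2}
scores = {"Ideas": 8.0, "Organization": 6.0, "Grammar": 9.0}
grade_weighted = sum(weights[t] * scores[t] for t in weights)
print(grade_weighted)  # 0.5*8 + 0.3*6 + 0.2*9 = 7.6

# Sum-of-points variant: each trait has a maximum number of points
earned = {"Ideas": 40, "Organization": 25, "Grammar": 18}  # out of 50/30/20
grade_points = sum(earned.values())
print(grade_points)  # 83 out of 100
```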
An Analytic Rubric is better
An analytic rubric:
• Provides more detailed feedback for students
• Provides students with information about how they will be evaluated
• Is clearer
• Evaluates each characteristic that composes performance independently
On the other hand…
• A holistic rubric is used when only an overall impression is required
Analytic Rubrics: How to Create Them?
Example: a simple rubric to evaluate an essay, with traits in the rows and levels of achievement in the columns:

Traits        | Excellent | Satisfactory | Poor
Ideas         |           |              |
Organization  |           |              |
Grammar       |           |              |
An analytic rubric can be created from a check list!
The difference is that each trait is broken down into levels of achievement, each with a detailed description:

Performance
• Trait 1, weight (%) or points: Excellent / Acceptable / Unacceptable
• Trait 2, weight (%) or points: Excellent / Acceptable / Unacceptable
• Trait ..., weight (%) or points: Excellent / Acceptable / Unacceptable
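To make the structure concrete, here is a minimal sketch of an analytic rubric as a data structure, with each trait carrying a weight and point values per level. The traits, weights, and point values are hypothetical; in a real rubric each level would also carry its detailed description.

```python
# Hypothetical analytic rubric: trait -> weight plus points per level
rubric = {
    "Ideas":        {"weight": 0.5, "levels": {"Excellent": 10, "Acceptable": 6, "Unacceptable": 2}},
    "Organization": {"weight": 0.3, "levels": {"Excellent": 10, "Acceptable": 6, "Unacceptable": 2}},
    "Grammar":      {"weight": 0.2, "levels": {"Excellent": 10, "Acceptable": 6, "Unacceptable": 2}},
}

# Grading one hypothetical essay: the grader picks a level per trait
chosen = {"Ideas": "Excellent", "Organization": "Acceptable", "Grammar": "Excellent"}
grade = sum(
    spec["weight"] * spec["levels"][chosen[trait]]
    for trait, spec in rubric.items()
)
print(grade)  # 0.5*10 + 0.3*6 + 0.2*10 = 8.8
```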
How to define traits?
Traits can be defined based on experience or on historical data:
1. Get samples of students' previous work
2. Classify the samples into different levels (strong, middle, poor, ...) and write down the reasons
3. Cluster the reasons into traits
4. Write down the definition of each trait
5. Select, among the samples, the ones that illustrate each trait
6. Continuously refine the traits' definitions
Traits can also be defined based on specific objectives and learning questions.
From "Understanding by Design", Wiggins and McTighe
How to build an Analytic Rubric?
The following website is a free tool that helps create rubrics:
http://rubistar.4teachers.org/index.php
Step 5: Validity and Reliability
Think of assessment as target practice:
• Target = desired understandings / objectives
• Shots = assessment outcomes
Validity is hitting the target (measuring what you intend); reliability is the shots clustering together (consistent results).
[Image: target diagram, http://ccnmtl.columbia.edu/projects/qmss/images/target.gif]
Checking for Validity
Self-assess the assessment tasks by asking yourself the following questions:
• Is it possible for a student to do well on the assessment task without really demonstrating the understandings you are after?
• Is it possible for a student to do poorly but still have significant understanding of the ideas? Would this student be able to show their understanding in other ways?
If yes, the assessment is not valid: it does not provide good evidence from which to make inferences.
(Note: for both questions, consider the task characteristics and the rubrics used for evaluation.)
Adapted from "Understanding by Design", Wiggins and McTighe
Checking for Validity
The previous questions can be broken down into more detailed ones.
How likely is it that a student could do well on the assessment by:
• Making clever guesses based on limited understanding?
• Plugging in what was learned, with accurate recall but limited understanding?
• Making a good effort, with a lot of hard work, but with limited understanding?
• Producing lovely products and performances, but with limited understanding?
• Applying a natural ability to be articulate and intelligent, but with limited understanding?
From "Understanding by Design", Wiggins and McTighe
Checking for Validity
How likely is it that a student could do poorly on the assessment by:
• Failing to meet performance goals despite having a deep understanding of the Big Ideas?
• Failing to meet the grading criteria despite having a deep understanding of the Big Ideas?
Make sure all the answers are "very unlikely"!
From "Understanding by Design", Wiggins and McTighe
Checking for Reliability
Assess rubric reliability by asking:
• Would different professors grade the same exam similarly?
• Would the same professor give the same grade if they graded the test twice, at different moments?
Assess task reliability by asking:
• If a student did well (or poorly) on one exam, would they do well (or poorly) on a similar exam?
Task reliability can be achieved by applying continuous assessment.
From "Understanding by Design", Wiggins and McTighe
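As a rough illustration of rubric reliability, here is a minimal sketch that compares two graders' scores on the same set of exams. The scores are hypothetical, and percent agreement within a tolerance is just one simple measure; more formal statistics (such as Cohen's kappa) exist.

```python
# Hypothetical scores (0-10) given by two professors to the same five exams
grader_a = [8.0, 6.5, 9.0, 5.0, 7.5]
grader_b = [7.5, 6.5, 8.0, 5.5, 7.5]

# Percent agreement: how often the two grades fall within a tolerance
TOLERANCE = 0.5
agreements = sum(abs(a - b) <= TOLERANCE for a, b in zip(grader_a, grader_b))
print(f"Agreement: {agreements / len(grader_a):.0%}")  # 4/5 pairs within 0.5
```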
Summary
Learning objectives lead to evidence of learning, which must be observable and demonstrable.
Evidence is gathered over time through formative assessment tasks, followed by a summative assessment task.
For each assessment task:
• Complexity depends on the desired level of understanding
• Use clear evaluation criteria (rubrics)
• The task and the criteria must provide accurate and consistent judgments
Learning Objectives
• What is the purpose of assessment?
• How do you determine what kind of evidence to look for?
• What kinds of methods can be used, and when?
• How do you create assessment tasks and evaluation criteria?
• How do you make sure the assessment is valid and reliable?
References
• The main source of information used in this module is the following book:
Wiggins, Grant and McTighe, Jay. Understanding by Design. 2nd Edition. ASCD, Virginia, 2005.
• Rubrics: http://rubistar.4teachers.org/index.php