PD Specialists and Cadre Assessment Workshop V2

Assessment
PD Cadre Workshop
June 26, 2014
“Not everything that counts can be counted. And not everything that can be counted, counts.”
~ Albert Einstein
Plan For The Day
• Curriculum, Instruction and Assessment
• What is Assessment?
• Instructional Design Model (UbD)
• Target-Method Match
• Feedback
• Data Conversations
• Rubrics
• Rigor and Relevance
Resources
• Information you may already know
• Resources to support you in your PD roles
• WikiSpace:
http://harlemassessment.wikispaces.com/
• Take a minute now to “join”. Then review the
site.
Change
Reminder!!
Harlem Learning Process
• What do we want students to know and be able to do? → Curriculum & Instruction
• How will we know that they’ve learned it? → Assessment
• What will we do if they don’t learn it? → Intervention
Students sit at the center of this cycle.
Curriculum, Instruction and Assessment
• Curriculum: curriculum guides list the skills or content that will be taught, aligned with standards and state accountability assessments and based on the New Illinois State Learning Standards.
• Instruction: delivery of curriculum content through engaged instructional strategies and time on task.
• Assessment: a balanced system, of and for learning, that informs instructional practice.
Data-Informed Decision Making Process
What Role Does Assessment
Play in the Instructional
Process?
What distinctions do you make
between "testing" and "assessment"?
• Turn and Talk
Why is it important that teachers
consider assessment before they
begin planning lessons or projects?
• Turn and Talk
Assessment inspires us to ask these
hard questions:
• “Are we teaching what we think we are
teaching?”
• “Are students learning what they are supposed
to be learning?”
• “Is there a way to teach the subject better,
thereby promoting better learning?”
When assessment works best, it does
the following:
Provides diagnostic feedback
• What is the student's knowledge base?
• What is the student's performance base?
• What are the student's needs?
• What has to be taught?
When assessment works best, it does
the following:
Helps educators set standards
• What performance demonstrates
understanding?
• What performance demonstrates knowledge?
• What performance demonstrates mastery?
When assessment works best, it does
the following:
Evaluates progress
• How is the student doing?
• What teaching methods or approaches are most
effective?
• What changes or modifications to a lesson are
needed to help the student?
When assessment works best, it does
the following:
Relates to a student's progress
• What has the student learned?
• Can the student talk about the new knowledge?
• Can the student demonstrate and use the new
skills in other projects?
When assessment works best, it does
the following:
• Motivates performance
For student self-evaluation:
• Now that I'm in charge of my learning, how am I
doing?
• Now that I know how I'm doing, how can I do
better?
• What else would I like to learn?
For teacher self-evaluation:
• What is working for the students?
• What can I do to help the students more?
• In what direction should we go next?
Think about your own assessment
practice.
• Turn and Talk
▫ How do you approach assessment in your
classroom?
Measuring Mastery
Comprehensive Balanced Assessment
A comprehensive balanced assessment system
includes:
• State (Accountability) Assessments
• Interim/Benchmark Assessments
• Classroom Assessments
▫ Formative
▫ Summative
Each should be aligned to standards
State Assessments
According to The US Department of Education,
The purpose of state assessments required under
No Child Left Behind is to provide an
independent insight into each child's progress, as
well as each school's. This information is
essential for parents, schools, districts and states
in their efforts to ensure that no child--regardless
of race, ethnic group, gender or family income--is trapped in a consistently low-performing school.
Benchmark/Interim Assessments
Benchmark assessments are assessments
that are administered periodically
throughout the school year, at specified
times during a curriculum sequence, to
evaluate students’ knowledge and skills
relative to an explicit set of longer-term
learning goals (generally a semester or
school year).
Universal Screener
• In the context of an RTI
prevention model, universal
screening is the first step in
identifying the students who are
at risk for learning difficulties.
• Universal screening is typically
conducted three times per
school year, in the fall, winter,
and spring.
• Universal screening measures
consist of brief assessments
focused on target skills (e.g.,
phonological awareness) that are
highly predictive of future
outcomes.
• Assists in identifying grade-wide
deficits in curriculum and
instruction.
• Provides a baseline for grade-wide
goal setting.
• Identifies students at risk of
academic or behavioral
difficulties.
• Can generate local norms and
benchmarks.
• Screening data, while brief, is
authentic, timely, and your first
indicator of difficulty with your
school, class, or student.
Progress Monitor
• Provides an ongoing indication of instructional effectiveness
• Informs decisions regarding changes in instructional programs/interventions
• Provides data on the level of responsiveness to intervention
• Uses a General Outcome Measure (GOM): application of skills learned
Summative Assessment
Summative assessments are a measure of
achievement to provide evidence of student
competence or program effectiveness.
Summative assessments are found at the
classroom, district and state level and can be
graded and used in accountability systems.
The information gathered from summative
assessments is evaluative and is used to
categorize students so performance among
students can be compared.
Formative Assessment
• A process used by teachers and students during
instruction that provides feedback to adjust
ongoing teaching and learning to help students
improve their achievement of intended
instructional outcomes.
• Formative assessment is found at the
classroom level and happens minute-to-minute
or in short cycles.
• Formative assessment is not graded or used in
accountability systems.
• The feedback involved in formative assessment
is descriptive in nature so that students know
what they need to do next to improve learning.
District Assessment Framework

Statewide
• Elementary: PARCC
• Middle School: PARCC
• High School: PARCC/PSAE

Interim/Benchmark
• Elementary: STAR
• Middle School: STAR
• High School: ACT Aspire

Classroom (Summative)
• Elementary: Writing Benchmarks; *Common Assessments; teacher developed/from textbooks
• Middle School: Chapter/unit assessments; *Common Assessments; teacher developed/from textbooks
• High School: Chapter/unit assessments; Common Finals; *Common Assessments; teacher developed/from textbooks

Classroom (Formative)
• Elementary: Determined by teacher
• Middle School: Determined by teacher
• High School: Common pre-assessments; determined by teacher

Universal Screener
• Elementary: K: STAR Early Literacy; 1st: STAR Early Literacy and STAR Math; 2nd-6th: STAR Reading and STAR Math
• Middle School: STAR Reading and STAR Math

Progress Monitor
• Elementary: Reading: K-6: STAR Reading, Running Records, AIMSWeb; Math: K-6: STAR, AIMSWeb
• Middle School (7-8): STAR, AIMSWeb, Running Records (reading and math)
• High School: Locally developed reading and math assessments
The Backwards Design Model
• Stage 1: Identify Desired Results
▫ Linked to Content Standards
• Stage 2: Determine Acceptable Evidence
▫ Ongoing Assessment
▫ Performance Tasks
• Stage 3: Plan Learning Experiences and
Instruction
▫ Sequence of experiences
▫ Scaffolded
▫ Differentiated
Before Instruction
• Determine what you want students to know and be
able to do
▫ Essential learning, power standards/priority standards
▫ Identify learning progressions
 What if students don’t know foundational or “prerequisite” skills?
• Review current data to determine students’ current
knowledge
▫ Collect additional data as needed
• Group students
• Develop differentiated classroom instruction based
on data
▫ Work with Special Education Teachers and Consultants
to determine how instruction will be supported for
students with IEPs
During Instruction
• What formative assessments (not just tests) will I use to determine if students are learning?
• How will I modify instruction based on that data?
• Examples of formative assessment (checking for understanding): http://wvde.state.wv.us/teach21/ExamplesofFormativeAssessment.html
• Observations
• Questioning
• Discussion
• Exit/Admit Slips
• Learning/Response Logs
• Graphic Organizers
• Peer/Self Assessments
• Practice Presentations
• Visual Representations
• Kinesthetic Assessments
• Individual Whiteboards
• Laundry Day
• Four Corners
• Constructive Quizzes
• Think Pair Share
• Appointment Clock
• As I See It
Explore
• Summative example links
• Share your suggestions/recommendations
Keys to Quality Classroom Assessment

Accurate Assessment
• Clear Purposes (Why assess?): What’s the purpose? Who will use results?
• Clear Targets (Assess what?): What are the learning targets? Are they clear? Are they good?
• Good Design (Assess how?): What method? Sampled how? Avoid bias how?

Sound Communication (Effectively Used)
• Communicate how? How manage information? How report?

Student Involvement
• Students are users, too.
• Students need to understand learning targets, too.
• Students can participate in the assessment process, too.
• Students can track progress and communicate, too.
Target-Method Match
Selecting The Right Type of Assessment
Clear Targets: Benefits to Students
Students who could identify what they were learning scored 27 percentile points higher than those who could not.
~ Marzano, 2005

Students can hit any target they can see that holds still for them.
A Math Example
• Subject: Geometry
• Topic: Pythagorean Theorem
• Resource: Chapter 10
• Activity: Use geometric tiles for proof
• Learning Target: The lengths of the three sides of a right triangle are related
Kinds of Targets
• Master content knowledge
• Use knowledge to reason and
solve problems
• Demonstrate performance skills
• Create quality products
Learning Targets with Associated Verbs
• Knowledge: List, Define, Understand, Recognize, Explain
• Reason: Predict, Infer, Classify, Evaluate, Summarize
• Skill: Measure, Demonstrate, Use, Operate, Calculate
• Product: Construct, Develop, Create, Produce
Converting Learning Targets to
Student-Friendly Language
• Identify an important or difficult learning goal.
• Identify word(s) needing clarification.
• Define the word(s).
• Rewrite the definition as an “I can” statement, in terms that students will understand.
• Try it out and refine as needed.
• Have students try this process.
Student-Friendly Language
Word to be defined: PREDICTION
A statement saying something will
happen in the future
Student-friendly language:
I can make predictions.
This means I can use information
from what I read to guess at what will
happen next.
4 Types of Learning Targets
• Knowledge
• Reasoning
• Performance/Skill
• Product
Why It’s Important to Determine
Target Type
• Know if the assessment adequately covers what
we taught
• Correctly identify what students know and don’t
know
• Keep track of student learning target by target or
standard by standard
• Helps determine HOW to assess (method)
Target-Method Match:
What is it?
• A way to design assessments that cover our
targets
• Answers “ASSESS HOW?”
Types of Assessment Methods
• Selected response & short answer
• Extended written response
• Performance assessment
• Personal communication
Selected Response (SR)
• Students select correct or best response from a
list provided
• Students’ scores are figured as the number or
proportion of questions answered correctly
• Formats include:
▫ Multiple choice
▫ True/false
▫ Matching
▫ Short answer
▫ Fill-in questions
Extended Written Response (EWR)
• Requires students to construct a written answer
in response to a question or task (not select one
from a list)
• Extended = several sentences in length
• Examples:
▫ Compare pieces of literature
▫ Solve a math problem, show & explain work
▫ Interpret music, scientific info. or polling data
▫ Analyze artwork
▫ Describe in detail an economics process
Extended Written Response
• Correctness judged by:
▫ Giving points for specific info. present OR
▫ Use of a rubric
• Scores can be:
▫ Percentage of points attained OR
▫ Rubric scores
Performance Assessment (PA)
• Based on observation & judgment
▫ Rubric
• Judgment made on quality
• Examples:
▫ Playing instrument; speaking in foreign language;
working in a group (the doing/process is important)
▫ Creating products like a lab report, term paper, work
of art (quality of product is important)
Performance Assessment
• 2 parts:
▫ Performance task or exercise
▫ Scoring guide/Rubric
• Scoring guide:
▫ Can award points for specific features of performance
or product
▫ Can take form of rubric: levels of quality described
• Scores could be number or percent of points
earned or a rubric score
Personal Communication (PC)
• Find out what students have learned through
interacting with them
• Often an informal assessment, but if clear &
accurate info. is gathered, can be used for
feedback to students, self-reflection for students,
goal setting
• Examples:
▫ Oral examinations
▫ Interviewing students in conferences
▫ Looking at & responding to students’ comments in
journals and logs
Personal Communication
• Student responses evaluated in 2 ways:
▫ Correct/incorrect (for short, simple answers;
parallels scoring of written selected response
questions)
▫ Evaluate quality (for longer, more complex;
parallels to extended written response)
 Could use a rubric or scoring guide
Matching Target and Assessment Methods
Accuracy in classroom assessment revolves around matching the different target TYPES with the appropriate assessment METHODS.
Target-Method Match

Knowledge Targets
• Selected Response: Good. Can assess isolated elements of knowledge and some relationships among them.
• Written Response: Strong. Can assess elements of knowledge and relationships among them.
• Performance Assessment: Partial. Can assess elements of knowledge and relationships among them in certain contexts.
• Personal Communication: Strong. Can assess elements of knowledge and relationships among them.

Reasoning Targets
• Selected Response: Good. Can assess many but not all reasoning targets.
• Written Response: Strong. Can assess all reasoning targets.
• Performance Assessment: Partial. Can assess reasoning targets in the context of certain tasks in certain contexts.
• Personal Communication: Strong. Can assess all reasoning targets.

Skill Targets
• Selected Response: Partial. Good match for some measurement skill targets; not a good match otherwise.
• Written Response: Poor. Cannot assess skill level; can only assess prerequisite knowledge and reasoning.
• Performance Assessment: Strong. Can observe and assess skills as they are being performed.
• Personal Communication: Partial. Strong match for some oral communication proficiencies; not a good match otherwise.

Product Targets
• Selected Response: Poor. Cannot assess the quality of a product; can only assess prerequisite knowledge and reasoning.
• Written Response: Poor. Cannot assess the quality of a product; can only assess prerequisite knowledge and reasoning.
• Performance Assessment: Strong. Can directly assess the attributes of quality of products.
• Personal Communication: Poor. Cannot assess the quality of a product; can only assess prerequisite knowledge and reasoning.
Effective Design
• Select a proper assessment method
• Select or create quality items, tasks, and rubrics
• Sample: gather enough evidence to demonstrate mastery
• Control for bias
• Design assessments so students can self-assess and set goals
Authentic assessment can include
many of the following:
• Observation
• Essays
• Interviews
• Performance tasks
• Exhibitions and demonstrations
• Portfolios
• Journals
• Teacher-created tests
• Rubrics
• Self- and peer-evaluation
Let’s try it….but first….
• Google Docs
• Set up your Harlem Gmail account
• Once you are logged in to your computer as yourself,
please visit: http://goo.gl/idcffu.
▫ Setting your password through this link will sync your
Harlem and Google account so you can activate your
account.
• Your login will be
Firstname.Lastname@h122.org. Example:
Albert.Einstein@h122.org. Your password will be
the same password you set in the password reset
portal.
• A how-to guide is on the Resources page of our Wiki so you can share with teachers in your building.
Let’s Try It!
• Go to the standards page on the WikiSpace
• In groups of 2 or 3, select one standard
• Identify the target and write in student friendly
language, “I-Can” statements
• Determine the type of target it is
• What is/are the best method(s) to assess this
standard?
• How would you assess mastery?
Share
Virtual Gallery Walk
• Debrief
Providing Students with
Effective Feedback
What is Feedback?
“Feedback is an objective description of a
student’s performance intended to guide future
performance. Unlike evaluation, which judges
performance, feedback is the process of helping
our students assess their performance, identify
areas where they are right on target and provide
them tips on what they can do in the future to
improve in areas that need correcting.”
~ W. Fred Miser
What Does the Research Say?
“Feedback seems to work well in so many
situations that it led researcher John Hattie
(1992) to make the following comment after
analyzing almost 8,000 studies:
‘The most powerful single modification that
enhances achievement is feedback. The simplest
prescription for improving education must be
dollops of feedback.’”
~ Robert Marzano
What is Feedback?
• “Research has shown that
effective feedback is not a
discrete practice, but an
integral part of an
instructional dialogue
between teacher and
student, (or between
students, or between the
student and
him/herself).”
From “Providing Students with Effective
Feedback”
What is Feedback?
• “Feedback is not about praise or blame, approval
or disapproval. That’s what evaluation is –
placing value. Feedback is value-neutral. It
describes what you did and did not do.”
~ Grant Wiggins
What is Feedback?
• “Effective feedback, however, shows where we are
in relationship to the objectives and what we
need to do to get there.
• “It helps our students see the assignments and tasks
we give them as opportunities to learn and
grow rather than as assaults on their self-concept.
• “And, effective feedback allows us to tap into a
powerful means of not only helping students learn,
but helping them get better at learning.”
~ Robyn R. Jackson
What is Feedback?
• “Effective feedback not only
tells students how they
performed, but how to improve
the next time they engage the
task. Effective feedback is
provided in such a timely
manner that the next
opportunity to perform the task
is measured in seconds, not
weeks or months.”
~ Douglas Reeves, p. 227
Primary Purposes of Feedback
• To keep students on course so they arrive
successfully at their predetermined
destination.
~ W. Fred Miser
“It is one thing to collect feedback about
students’ progress, but if you simply collect
this feedback and never use it to adjust your
instruction, then you are collecting it in vain.
The data you receive from grading your
assignments and assessments will give you
feedback about the effectiveness of your own
instruction.”
~ Robyn R. Jackson
What Does the Research Say?
“Academic feedback is more strongly and consistently related to achievement than any other teaching behavior….This relationship is consistent regardless of grade, socioeconomic status, race, or school setting….When feedback and corrective procedures are used, most students can attain the same level of achievement as the top 20% of students.”
~ Bellon, Bellon & Blank
What Does the Research Say?
“In a major review of the research on assessment, Paul
Black and Dylan Wiliam (1998) noted
The research reported here shows conclusively that
formative assessment does improve learning. The
gains in achievement appear to be quite considerable,
and as noted earlier, amongst the largest ever reported
for educational interventions. As an illustration of just
how big these gains are, an effect size of 0.7, if it could be achieved on a nationwide scale, would be equivalent to raising the mathematics achievement score of an ‘average’ country like England, New Zealand or the United States into the ‘top five’ after the Pacific Rim countries of Singapore, Korea, Japan and Hong Kong.”
~ What Works in Schools, p. 38
Power of Accurate Feedback
• Immediate impact on results
• Lower failures
• Better attendance
• Fewer suspensions
• Failure here undermines EVERY OTHER EFFORT in curriculum, assessment, and teaching
~ Douglas Reeves
Characteristics of Feedback
• Timely
▫ “The more delay that occurs in giving feedback, the less
improvement there is in achievement.” (Marzano)
▫ As often as possible, for all major assignments
• Constructive/Corrective
▫ What students are doing that is correct
▫ What students are doing that is not correct
▫ Choose areas of feedback based on those that relate to
major learning goals and essential elements of the
assignment
▫ Should be encouraging and help students realize that
effort on their part results in more learning (Marzano)
Characteristics of Feedback
• Specific to a Criterion
▫ Precise language on what to do to improve
▫ Reference where a student stands in relation to a specific
learning target/goal
▫ Also specific to the learning at hand
▫ Based on personal observations
• Focused on the product/behavior – not on the
student
• Verified
▫ Did the student understand the feedback?
▫ Opportunities are provided to modify assignments,
products, etc. based on the feedback
▫ What is my follow-up plan to monitor and assist the student in these areas?
Essential Elements of Feedback
1. Recognition of the Desired Goal
2. Evidence about Present Position (current
work)
3. Some Understanding of a Way to Close
the Gap Between the Two
~ Black & Wiliam
Recognition of the Desired Goal
Includes:
• Clarity of the Learning Goal
• Clarity about Content Area
• Clarity of Curricular Indicators
• Clarity of Mastery Objectives
• Clearly communicating the desired learning goal to students through instruction
• A “Vision of Excellence”
Methods to Ensure Student Understanding
of Learning Goals
• Have students define what successful achievement of the
goals looks or sounds like. (Developing a “criteria for
success”)
• Provide several samples, models, exemplars, etc. of products
that achieve the learning goal in exemplary fashion.
• Lead students through an analysis of the criteria of successful
achievement in terms of the samples provided. Could be
through the use of rubrics or descriptions of the
practice/product.
• Compare students’ product to the criteria for success
• Have students continue working on a task until they succeed.
The Language of Assessment
• “As a result of understanding the learning
destination and appreciating what quality work and
success look like, students:
▫ Begin to learn the language of assessment. This means
students learn to talk about and reflect on their
own work using the language of criteria and
learning destinations.
▫ Gain the knowledge they need to make decisions that
help close the gap between where they are in their
learning and where they need to be.”
~ Anne Davies, p. 38
Evidence About Present Position
• What student work/assignments/projects
look like – “what is”
▫ “I” statements: students tell what they know
▫ “I can explain the difference between fact
and opinion.”
• Current work samples
Ways to Close the Gap between
Goals & Current State
• Provide guidance on how to improve (strategies,
tips, suggestions, reflective questioning, etc.)
• Provide student-friendly version of learning
targets along with actual samples of student
work—use exemplars!
• Provide help to improve performance
• Provide time to work on the improvement, apply
the feedback
Sharing Feedback
• Oral, interactive (one-on-one) feedback is best whenever
possible
• Use descriptive, not evaluative language
• Focus on what went well and what can be improved in
language students understand
• Seek consensus with the student(s) – do you agree on the
assessment of this product?
• Focus on the performance and/or behavior – not the
student
• Focus on those behaviors that the student can do something
about.
• Provide a demonstration if “how to do something” is an
issue or if the student needs an example.
• Group/class feedback works when most students missed
the same concept, providing an opportunity for reteaching.
Feedback Timing
Good Timing
• Returning a test or assignment the next day
• Giving immediate oral responses to questions of fact
• Giving immediate oral responses to student misconceptions
• Providing flash cards (which give immediate right/wrong feedback) for studying facts

Bad Timing
• Returning a test or assignment two weeks after it is completed
• Ignoring errors or misconceptions (thereby implying acceptance)
• Going over a test or assignment when the unit is over and there is no opportunity to show improvement
~ Susan Brookhart
Amount of Feedback
• For students to get enough feedback so that they
understand what to do but not so much that the
work has been done for them (differs case by
case)
• For students to get feedback on “teachable
moment” points but not an overwhelming
number
~ Susan Brookhart
Amounts of Feedback
Good Amounts
• Selecting 2-3 main points about a paper for comment
• Giving feedback on important learning targets
• Commenting on at least as many strengths as weaknesses

Too Much
• Returning a student’s paper with every error in mechanics edited
• Writing comments on a paper that are more voluminous than the paper itself
• Writing voluminous comments on poor-quality papers and almost nothing on good-quality papers
~ Susan Brookhart
Strategies to Help Students
Learn to Use Feedback
• Model giving and using feedback yourself.
• Teach students self- and peer assessment
skills to:
▫ Teach students where feedback comes
from.
▫ Increase students’ interest in feedback
because it’s “theirs”.
▫ Answer students’ own questions.
▫ Develop self-regulation skills, necessary
for using any feedback.
~ Susan Brookhart
Strategies to Help Students
Learn to Use Feedback
• Be clear about the learning target and the criteria
for good work.
▫ Use assignments with obvious value and interest.
▫ Explain to the student why an assignment is
given – what the work is for.
▫ Make directions clear.
▫ Use clear rubrics.
▫ Have students develop their own rubrics or
translate yours into “kid-friendly” language.
▫ Design lessons that incorporate using the rubrics
as students work.
~ Susan Brookhart
Strategies to Help Students
Learn to Use Feedback
• Design lessons in which students use feedback
on previous work to produce better work.
▫ Provide opportunities to redo assignments.
(Comparing a rough draft to the
rubric/criteria/exemplar.)
▫ Give new but similar assignments for the same
learning targets.
▫ Give opportunities for students to make the
connection between the feedback they received
and the improvement in their work.
~ Susan Brookhart
Attaining Excellence
• “Students must have routine access to the
criteria and standards for the task they need
to master; they must have feedback in their
attempts to master those tasks; and they must
have opportunities to use the feedback to
revise work and resubmit it for evaluation
against the standard. Excellence is attained
by such cycles of model-practice-perform-feedback-perform.”
~ Grant Wiggins
Feedback Levels
Feedback may be directed at one of four levels:
1. The task
“The best task-level feedback corrects flawed
interpretations rather than a lack of knowledge and
helps students focus on using strategies to achieve their
learning goals.” ~ Center on Instruction
2. The processing of the task ~ facilitating depth in
learning (encouraging students’ use of strategies to
check their work, recognize errors, and self-correct)
3. Self-regulation ~ helping students internalize the
practice of self-monitoring their learning and work.
4. The student as an individual ~ least effective feedback
Video Example
• https://www.youtube.com/watch?v=0DAeiBB6zT0
HATTIE & TIMPERLEY’S FEEDBACK MODEL

Purpose: to reduce discrepancies between current understandings/performance and a desired goal.

The discrepancy can be reduced by:
• Teachers: providing appropriately challenging and specific goals, OR assisting students to reach them through effective strategies
• Students: increased effort and employment of more effective strategies, OR abandoning, blurring or lowering the goals

EFFECTIVE FEEDBACK ANSWERS THREE QUESTIONS
• Feed Up: Where am I going? (The Goals)
• Feed Back: How am I going?
• Feed Forward: Where to next?
Let’s Practice
• Exemplars
• Pairs
▫ 1 student, 1 teacher
▫ Focus on “Development of Details”
▫ Switch
Facilitating Data Conversations
You don’t need an advanced
degree in statistics and a room
full of computers to start
asking data-based questions
about your school, and using
what you learn to guide reform.
~ Victoria Bernhardt
Effective data conversations take place when…
• There is mutual trust among group members
• Data is not used to place blame or point fingers (“the data”, not “your data”)
• Everyone understands what the data being presented represents and how it was derived
• Everyone acknowledges that they play a role
Adapted from “The Data Dialogue”, Laurie Olsen
Propose Norms for Looking at Data – Why is This Important?
• Describe only what you see: just describe the data in front of you
• Resist the urge to immediately work on solutions
• Seek to understand differences
• Ask questions when you don’t understand
• Surface the lenses and experiences you bring to the data
• Surface assumptions and use data to challenge and support them
1. Begin with a question
2. Examine the data
3. Raise questions about the data
4. Interpret the data
5. Propose actions/interventions: set goals
Nancy Love (2002). Using Data/Getting Results. Norwood, MA: Christopher-Gordon.
Eisenhower National Clearinghouse for Mathematics and Science Education - http://www.enc.org
Data Driven Dialog
• Phase 1: Predictions - Surfacing of perspectives, beliefs,
assumptions, predictions, possibilities, questions, and expectations.
• Phase 2: Observations – Examining and analyzing the data for
patterns, trends, surprises, and new questions that “jump” out.
• Phase 3: Inferences – Generating hypotheses, inferring,
explaining, and drawing conclusions. Defining new actions and
interactions and the data that is needed to guide implementation
(monitor) and build ownership for decisions.
Based on the work presented by Nancy Love, author of
Using Data/Getting Results, (2002).
Phase 1: Predictions and
Assumptions
• Predictions: Informed by your knowledge of what
work your school has been engaged in for closing the
achievement gap, as well as your own critical findings,
make predictions.
• Assumptions: What thinking, beliefs, or expectations
drive your predictions?
• Hear and honor all assumptions and ideas
▫ Active listening, not a conversation
▫ Each person shares their own ideas
Phase 2: Observations
• Examining and analyzing the data for patterns,
trends, surprises, and new questions that “jump”
out.
Phase 3: Inferences
• Generating hypotheses, inferring, explaining,
and drawing conclusions. Defining new actions
and interactions and the data that is needed to
guide implementation (monitor) and build
ownership for decisions.
Practice
• Illinois Interactive Report Card
▫ Look at District Data
• Follow the three steps
Debrief
Process of Transforming Data Into Knowledge
Collecting → Organizing → Summarizing → Analyzing → Synthesizing → Decision making
Adapted from Keeping Teachers in the Center: A Framework of Data Driven Decision Making, Daniel Light, Education Development Center, Inc., Center for Children and Technology, USA, 2004
Using Data To Inform Your
Instruction
Using Data To Inform Instruction Thinking Sheet
(On Resources page in Wiki)
Rows (student groups):
▫ Students Above Proficiency
▫ Students At Proficiency
▫ Students Not Proficient
▫ Students Far Below Proficiency
Columns:
▫ Data Source
▫ Instructional Focus
CRITICAL Step:
▫ What Proficient Work Looks Like
▫ Obstacles/Misconceptions/Challenges
Rigor and Relevance
• What is the framework? What do you know?
• Read the article
• With 1-2 colleagues, discuss connection to
problem-based learning
• Be prepared to share your thinking
Tips For Effective Rubric Design
• How to:
▫ design a rubric that does its job
▫ write precise criteria and descriptors
▫ make your rubric student-friendly
Expert Input
Experts agree:
▫ Rubrics are hard to design.
▫ Rubrics are time-consuming to design.
▫ “A rubric is only as useful as it is good. Using a bad
rubric is a waste of time…”
--Michael Simkins in “Designing Great Rubrics”
Experts disagree:
▫ how to design a “good” rubric
Bottom line: Is it working for you and for your
students?
The Cookie
Task: Make a chocolate chip cookie that I would want to
eat.
Criteria: Texture, Taste, Number of Chocolate Chips,
Richness
Range of performance:
▫ Delicious (14-16 pts)
▫ Tasty (11-13 pts)
▫ Edible (8-10 pts)
▫ Not yet edible (0-7 pts)
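The point bands above can be sketched as a simple lookup. This is an illustrative sketch, assuming four criteria each scored 1-4 (so totals run 4-16), matching the bands on the slide:

```python
def cookie_level(total_points):
    """Map a total cookie-rubric score to a performance level.

    Bands follow the slide: Delicious 14-16, Tasty 11-13,
    Edible 8-10, Not yet edible 0-7.
    """
    if total_points >= 14:
        return "Delicious"
    if total_points >= 11:
        return "Tasty"
    if total_points >= 8:
        return "Edible"
    return "Not yet edible"

# e.g. chips=4, texture=4, color=3, richness=4 -> total 15
print(cookie_level(4 + 4 + 3 + 4))  # Delicious
```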
The Rubric
Criterion | Delicious (4) | Tasty (3) | Edible (2) | Not yet edible (1)
# chips | Chips in every bite | 75% chips | 50% chips | Less than 50% chips
Texture | Consistently chewy | Chewy | Crunchy middle, crispy edges | Like a dog biscuit
Color | Even golden brown | Brown with pale center | All brown or all pale | Burned
Richness | Buttery, high fat | Medium fat | Low-fat flavor | Nonfat flavor
Assess The Cookie
Overall score:
▫ Delicious
▫ Tasty
▫ Edible
▫ Not yet edible
By criteria:
▫ Number of chips
▫ Texture
▫ Taste
▫ Richness
Oops, What Went Wrong?
• Did the “product”
match expectations?
• Effective rubrics don’t
exist in a vacuum.
• The good news…
Holistic Or Analytic—Which To Use?
HOLISTIC—views product or performance as a whole;
describes characteristics of different levels of
performance. Criteria are summarized for each score
level.
(level=degree of success—e.g., 4,3,2,1 or “Tasty”)
(criteria= what counts, facets of performance—e.g.,
research or number of chips or presentation)
Holistic Or Analytic?
HOLISTIC—pros and cons
+Takes less time to create. Well…
+Effectively determines a “not fully developed”
performance as a whole
+Efficient for large group scoring; less time to assess
- Not diagnostic
- Student may exhibit traits at two or more levels at the
same time.
Holistic Example
Cookie, Delicious level (4):
▫ Chips in every bite
▫ Consistently chewy
▫ Even golden brown
▫ Buttery, high fat
Holistic Or Analytic?
Analytic=Separate facets of performance
are defined, independently valued, and
scored.
Example: Music—skill=string improvisation
development
Facets scored separately: melody; harmonics;
rhythm; bowing & backup; confidence
Holistic Or Analytic?
Analytic—pros and cons
+Sharper focus on target
+Specific feedback (matrix)
+Instructional emphasis
-Time consuming to articulate components and to
find language clear enough to define
performance levels effectively
The Debate
• Is the whole the sum of its parts?
• Wiggle room or valid criterion?
▫ Overall Development
▫ Overall Impression
▫ Overall Impact
• Weighting
• Number range
Tip #1
• Don’t use generic or “canned” rubrics without
careful consideration of their quality and
appropriateness for your project.
 These are your students, not someone else’s.
 Your students have received your instruction.
Tip #2
• Avoid dysfunctional detail.
▫ “…in most instances, lengthy rubrics probably
can be reduced to succinct…more useful versions
for classroom instruction. Such abbreviated
rubrics can still capture the key evaluative criteria
needed to judge students’ responses. Lengthy
rubrics, in contrast, will gather dust” (Benjamin
23).
▫ Dysfunctional detail includes wordiness, jargon, and negativity.
Tip #3
• Limit the number of criteria
▫ Don’t combine independent criteria.
 “very clear” and “very organized” (may be clear but
not organized or vice versa).
Tip #4
• Use key, teachable “criteria” (What counts)
▫ Don’t vaguely define levels of quality.
▫ Concrete versus abstract
 “poorly organized” (Organization: sharply focused
thesis, topic sentences clearly connected to thesis,
logical ordering of paragraphs, conclusion ends with
clincher)
 “inventive” “creative” “imaginative” UNLESS…
Key Question to ask yourself: What does it
look like?
Tip #5
• Use measurable criteria.
▫ Specify what quality, or its absence, looks like,
rather than comparatives ("not as thorough as")
or value language ("excellent content").
▫ Highlight the impact of the performance:
Was the paper persuasive? Was the problem
solved? (Note the importance of PURPOSE.)
▫ What are the traits of effective persuasion?
Tip #6
• Aim for an even number of levels
▫ Create continuum between least and most
▫ Define poles and work inward
▫ List skills and traits consistently across levels
Tip #7
• Include students in creating or adapting rubrics
• Consider using “I” in the descriptors
 I followed MLA documentation format precisely.
 I followed MLA documentation format consistently.
 I followed MLA documentation format inconsistently.
 I did not follow MLA documentation format.
Tip #8
Do they understand the criteria and descriptors? How do
you know?
When do you give the rubric to your students?
Tip #9
Provide models of the different performance
levels.
Don’t Forget the Check-in Stage
• Use your rubric as a formative assessment to
give students feedback about how they are
doing.
▫ Isolate a particularly challenging aspect
▫ Have student isolate an area of difficulty
▫ Center revision instruction around rubric
Steps in Developing a Rubric
• Decide on the criteria for the product or
performance to be assessed.
• Write a definition or make a list of concrete
descriptors—identifiable-- for each criterion.
• Develop a continuum for describing the range of
performance for each criterion.
• Keep track of strengths and weaknesses of rubric
as you use it to assess student work.
• Revise accordingly.
• Step back; ask yourself, “What didn’t I make clear
instructionally?” The weakness may not be the
rubric.
Steps in Modifying a “Canned” Rubric
• Find a rubric that most closely matches your
performance task.
• Evaluate and adjust to reflect your instruction,
language, expectations, content, students
▫ Criteria
▫ Descriptors
▫ Performance levels
It’s hard work…
• Expect to revise…and revise…
▫ One problem is that the rubric must cover all potential
performances; each should fit somewhere on the rubric.
• “There are no final versions, only drafts and deadlines.”
• When you’ve got a good one, SHARE IT!
The Mini-Rubric
These are the quick ones.
Fewer criteria and shorter descriptions of quality
▫ Yes/no checklists
▫ Describe proficient level of quality and leave other boxes for
commentary during grading.
▫ Use for small products or processes:
 Poster
 Outline
 Journal entry
 Class activity
Mini-rubric Example
Vocabulary Poster
Purpose: to inform
Content criterion (50%): 4  3  2  1
____ written explanation of denotation: accuracy/thoroughness
____ examples in action: accuracy/variety
____ visual symbol or cartoon conveys word meaning: accuracy/clarity
____ wordplay (weighs synonyms for subtleties of meaning): accuracy/thoroughness
Presentation criterion (50%)
4, 3, 2, 1: neat
4, 3, 2, 1: clear organizational pattern
4, 3, 2, 1: no errors in conventions
4, 3, 2, 1: uses visual space to catch and hold attention
Score = (Content ___ + Presentation ___) divided by 2 = ______ GRADE
Comments:
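The mini-rubric's grade arithmetic (Content + Presentation, divided by 2) can be sketched in a few lines. The slide does not say how the four sub-items roll up into each criterion's score, so this sketch assumes, as an illustration only, that each criterion is the average of its 4-3-2-1 sub-item ratings and the two criteria are weighted 50/50:

```python
def mini_rubric_score(content_items, presentation_items):
    """Sketch of the mini-rubric's scoring.

    Assumes each criterion's score is the average of its 4-3-2-1
    sub-item ratings (an assumption; the slide leaves this open),
    then applies the slide's formula:
    (Content + Presentation) divided by 2.
    """
    content = sum(content_items) / len(content_items)
    presentation = sum(presentation_items) / len(presentation_items)
    return (content + presentation) / 2

# Four content sub-items and four presentation sub-items, each rated 1-4
print(mini_rubric_score([4, 3, 4, 3], [4, 4, 3, 3]))  # 3.5
```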
Caution
▫ Don’t let the rubric stand
alone
▫ ALWAYS, ALWAYS
provide specific feedback
on your rubric and/or on
the student product itself.
Sentence Stems
To establish 4 levels of performance, try
sentence stems.
Example:
• Yes, I used surface texture and deep carvings
effectively to create individualizing detail.
• Yes, I used surface texture and deep carvings,
but I needed to include more for individualizing
detail.
• No, I did not use surface texture, but I did use
deep carvings (or vice versa) to create some
individualizing detail.
• No, I did not use surface texture or deep
carvings.
Rubric Criterion Across The Curriculum
• Content (substance, support, proof, details)
▫ Relevant
▫ Specific
▫ Thorough
▫ Synthesized
▫ Balanced
▫ Convincing
▫ Accurate
References
• Bellon, Jerry, Bellon, Elner, & Blank, Mary Ann. Teaching from
a Research Knowledge Base: A Development and Renewal
Process, New York: Macmillan Publishing Company, 1992.
• Black, Paul & Wiliam, Dylan. “Inside the Black Box: Raising
Standards through Classroom Assessment.” Phi Delta Kappan,
October 1998.
• Brookhart, Susan M. How to Give Effective Feedback to Your
Students. ASCD, 2008.
• Davies, Anne. “Involving Students in the Classroom
Assessment Process” Ahead of the Curve: The Power of
Assessment to Transform Teaching and Learning. Douglas
Reeves, Editor. Solution Tree, 2007.
• Jackson, Robyn R. Never Work Harder Than Your Students &
Other Principles of Great Teaching. ASCD, 2009.
• Marzano(1), Robert. Classroom Instruction that Works. ASCD,
2001.
• Marzano(2), Robert. “Designing a Comprehensive Approach to
Classroom Assessment.” Ahead of the Curve: The Power of
Assessment to Transform Teaching and Learning. Douglas
Reeves, Editor. Solution Tree, 2007.
References, page 2
• Marzano(3), Robert. What Works in Schools: Translating
Research into Action. ASCD, 2003.
• Miser, W. Fred. “Giving Effective Feedback”
• “Providing Students with Effective Feedback” Academic
Leadership LIVE: The Online Journal; Volume 4, Issue 4,
February 12, 2007.
• Reeves, Douglas. “Challenges and Choices: The Role of
Educational Leaders in Effective Assessment.” Ahead of the
Curve: The Power of Assessment to Transform Teaching and
Learning. Douglas Reeves, Editor. Solution Tree, 2007.
• Stiggins, Rick. “Assessment for Learning: An Essential
Foundation of Productive Instruction.” Ahead of the Curve: The
Power of Assessment to Transform Teaching and Learning.
Douglas Reeves, Editor. Solution Tree, 2007.
• “Synopsis of ‘The Power of Feedback’” by Center on Instruction,
2008. [Hattie & Timperley’s research]
• Wiggins, Grant. Educative Assessment: Designing Assessments
to Inform and Improve Student Performance. San Francisco:
Jossey-Bass Inc., 1998.