Assessment Workshop I
Creating and Evaluating High-Quality Assessments
Dr. Deborah Brady
Do Now
Good Morning!
 Please make sure you sign in (table at left with printer)
 Look over the 2 handouts: the PowerPoint and the Agenda/Handout
 Sit with team members if possible
 Please set your cell phones to vibrate
 Coffee and… are at the table at the back; help yourselves. Thank you, Central Mass Readiness Center and Tahanto District!
Norms
 Processing partners
 Movement
 Exit slips: planning the next class; questions; resources
Deborah Brady, dbrady3702@msn.com
Information Overload Ahead!
Agenda
I. Introductions: Overview
 Break at about 10:00, lunch at about 11:30, session ends at about 3:00
 Morning presentation (with frequent processing breaks) and afternoon time for beginning to plan
II. High-quality assessments (DESE criteria)
 Tools to evaluate assessments
 Tools to track all educators' DDMs: the Quality Tracking Tool and the Educator Alignment Tool
III. Measuring student growth
 Direct measures: local alternatives to determine growth (pre-/post, holistic rubrics, measures over time, post-test only); "standardization" is an alternative, but not required
 Indirect measures
IV. Piloting, preparing for full implementation in SY 2015
V. TIME to work
Where Are You in This Journey?
"Living" Likert Scale
Developing Assessments
• Adapting present assessments
• Creating new assessments
• Writing to text?
Assessing Quality
• Alignment of content
• Rigorous and appropriate expectations
• Plus: security, calibration of standards, rubric quality, analysis of results (high-moderate-low growth)
Piloting
• 2 DDMs per educator
• Directions for teachers
• Directions for students
• Organizing for the actual assessments
• Storing, tracking the information
2015 Full Implementation
• Data storage
• Data analysis
• L-M-H growth
Interpreting the Results
• Student impact
Carousel Walk/Living Likert Scale
1) CAROUSEL WALK
1. Take a walk past each of the phases in this process.
2. Put a check to the left of each area that you have addressed (even partially).
3. Put an ! next to each bullet/category that you have some concerns about.
4. Put a ? next to any area that seems problematic or is unfamiliar to you.
5. Add a + if you see something missing that is a concern.
2) LIVING LIKERT SCALE
After your walk, stand by the stage of DDM development where you (and your team, school, or district) are:
Developing | Assessing quality | Piloting | Fully implementing | Interpreting the results
Potential as Transformative Process
When curriculum, instruction, or assessment is changed….
(Elmore, Instructional Rounds, and "the task predicts performance")
Assessment ↔ Instruction ↔ Curriculum
District Determined Measures
DEFINITION
DDMs are defined as: "Measures of student learning, growth, and achievement related to the Curriculum Frameworks, that are comparable across grade or subject level district-wide."
TYPES OF MEASURES
 Portfolio assessments
 Approved commercial assessments
 District-developed pre- and post-unit and course assessments
 Capstone projects
The Role of DDMs
To provide educators with an opportunity to:
 Understand student knowledge and learning patterns more clearly
 Broaden the range of what knowledge and skills are assessed and how learning is assessed
 Improve educator practice and student learning
 Provide educators with feedback about their performance with respect to professional practice and student achievement
 Provide evidence of an educator's impact on student learning
Bottom line: Time to do this is critically important!
District Determined Measures
Regulations:
 Every educator will need data from at least 2 different measures
 Trends must be measured over a course of at least 2 years
 One measure must be taken from state-wide testing data such as MCAS, if available (grades 4-8 ELA and Math SGP for classroom educators)
 One measure must be taken from at least one District Determined Measure, which can include Galileo and normed assessments (DRA, MAP, SAT)
The Development of DDMs
Timeline
2013-2014: District-wide training, development of assessments, and pilot
2014-2015: All educators must have 2 DDMs in place and collect the first year's data
2015-2016: Second-year data is collected and all educators receive an impact rating that is sent to DESE
Performance & Impact Ratings
Performance Rating
Ratings are obtained through data collected from observations, walk-throughs, and artifacts (4 Standards plus 2 Goals):
 Exemplary
 Proficient
 Needs Improvement
 Unsatisfactory
Impact Rating
Ratings are based on trends and patterns in student learning, growth, and achievement over a period of at least 2 years of data gathered from DDMs and state-wide testing:
 High
 Moderate
 Low
Summative Rating
                         Impact Rating on Student Performance
                         Low                              Moderate or High
Exemplary, Proficient    1-yr Self-Directed Growth Plan   2-yr Self-Directed Growth Plan
Needs Improvement        Directed Growth Plan             Directed Growth Plan
Unsatisfactory           Improvement Plan                 Improvement Plan
Rating of Impact on Student Learning
Massachusetts Department of Elementary and Secondary Education
What kinds of assessments will work for administrators, guidance, nurses, school psychologists?
 Use school-wide growth measures
 Use MCAS growth measures and extend them to all educators in a school
 Use "indirect measures" such as dropout rates, attendance, etc., as measures
 Use Student Learning Objectives (SLOs)
 Or create measures
 Pre- and post-tests are generally required to measure growth, except with normed assessments
 Indirect measures of student learning, growth, or achievement provide information about students from means other than student work.
 These measures may include student record information (e.g., grades, attendance or tardiness records, or other data related to student growth or achievement, such as high school graduation or college enrollment rates).
 To be considered for use as DDMs, a link (relationship) between indirect measures and student growth or achievement must be established.
 For some educators, such as district administrators and guidance counselors, it may be appropriate to use one indirect measure of student learning along with other direct measures.
 ESE recommends that at least one of the measures used to determine each educator's student impact rating be a direct measure.
Indirect Measure Examples
 Consider the Student Support Team (SST) process for a team
 High school SST team example: increase in-depth studies
 Child Study Team example: make the process consistent district-wide
 RTI team example: follow the referral process
 High school guidance example
 Subgroups of students can be studied (school psychologist group example): school anxiety
 Social-emotional growth is appropriate (autistic/behavioral program example): saying hello
• Number of times each student says hello to a non-classroom adult on his or her way to gym or class
• Number of days (or classes) a student with school anxiety participates
• Assess level of participation in a class: "spot-check," for example, every Friday for 15 minutes
• Increase applications to college
 IEP goals can be used as long as they are measuring growth (academic or social-emotional)
GROWTH SCORES for Educators Will Need to Be Tabulated for All Locally Developed Assessments
MCAS SGP (for students) in this example:
 Scaled score 244, SGP 25
 Scaled score 230, SGP 35
 Scaled score 225, SGP 92
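To make "tabulated" concrete, here is a minimal sketch (not DESE's prescribed method) of how a district might summarize a roster of student growth percentiles into an educator-level Low/Moderate/High band. The median summary and the 35/65 cut points are illustrative assumptions only; districts choose their own.

```python
# Illustrative only: summarize student growth percentiles (e.g., MCAS SGP)
# with the roster median and map it to a Low/Moderate/High band.
# The 35/65 cut points are hypothetical placeholders.
from statistics import median

def impact_band(sgps, low_cut=35, high_cut=65):
    m = median(sgps)
    if m < low_cut:
        return "Low"
    return "High" if m > high_cut else "Moderate"

print(impact_band([25, 35, 92]))  # the three SGPs above; median 35 -> "Moderate"
```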
What are the requirements?
1. Is the measure aligned to content?
 Does it assess what is most important for students to learn and be able to do?
 Does it assess what the educators intend to teach?
Bottom line: "substantial" content of the course
 At least 2 standards
 ELA: reading/writing
 Math: unit exam
 Not necessarily a "final" exam (unless it's a high-quality exam)
 2.
Is the measure informative?
 Do
the results of the measure inform educators about curriculum,
instruction, and practice?
 Does
it provide valuable information to educators about their students?
 Does
it provide valuable information to schools and districts about their
educators?
Bottom Line: Time to analyze is essential
20
Five Considerations (DESE)
1. Measure growth
2. Employ a common administration procedure
3. Use a common scoring process
4. Translate these assessments to an Impact Rating
5. Assure comparability of assessments (rigor, validity)
Comparability
 Comparable within a grade, subject, or course across schools within a district
 Identical measures are recommended across a grade, department, or course
 Comparable across grade or subject level district-wide
 Impact ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor
Two Considerations for Local DDMs
1. Comparable across schools
 Where possible, measures are identical
 It is easier to compare identical measures
 Do identical measures provide meaningful information about all students?
 Exceptions: when might assessments not be identical?
• Different content (different sections of Algebra I)
• Differences in untested skills (reading and writing on a math test for ELL students)
• Other accommodations (fewer questions for students who need more time)
 NOTE: Roster verification and group size will be considerations by DESE
"Common Sense"
 The purpose of DDMs is to assess teacher impact
 The student scores and the Low, Moderate, and High growth rankings are totally internal
 DESE (in two years) will see MEPIDs, and an L, M, or H next to each MEPID
 The important part of this process needs to be the focus:
• Your discussions about student learning with colleagues
• Your discussions about student learning with your evaluator
• An ongoing process
Writing to Text and PARCC
The Next Step?
 The 2011 MA Frameworks shift to the Common Core:
• Complex texts
• Complex tasks
• Multiple texts
• Increased writing
A Giant Step? Increase in cognitive load
 Mass Model Units: PBL with performance-based assessments (CEPAs)
 PARCC assessments require matching multiple texts
2. Comparable across the District
 Aligned to your curriculum (comparable content) K-12 in all disciplines
 Appropriate for your students
 Aligned to your district's content
 Informative, useful to teachers and administrators
"Substantial" assessments (comparable rigor):
 "Substantial" units with multiple standards and/or concepts assessed (DESE recently began talking about finals/midterms as preferable)
 Quarterly assessments, benchmarks, mid-terms, and common end-of-year exams
See the Core Curriculum Objectives (CCOs) on the DESE website if you are concerned: http://www.doe.mass.edu/edeval/ddm/example/
NOTE: All of this data stays in your district. Only H-M-L goes to DESE, with a MEPID for each educator.
Approaches to Measuring Student Growth
 Pre-test/post-test
 Repeated measures
 Holistic evaluation
 Post-test only
Pre/Post Test
Description:
 The same or similar assessments administered at the beginning and at the end of the course or year
 Example: Grade 10 ELA writing assessment aligned to College and Career Readiness Standards at the beginning and end of the year, with the passages changed
Measuring growth:
 Difference between pre- and post-test
Considerations:
 Do all students have an equal chance of demonstrating growth?
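As a minimal sketch, assuming a district defines growth as the post-test score minus the pre-test score and sets local cut points on the gain (all names and numbers below are invented for illustration):

```python
# Hypothetical pre/post growth calculation: gain = post - pre,
# banded by locally chosen cut scores. Cut points 5 and 15 are placeholders.
def gain_band(pre, post, low_cut=5, high_cut=15):
    gain = post - pre
    if gain < low_cut:
        return "Low"
    return "High" if gain > high_cut else "Moderate"

scores = {"Student A": (12, 30), "Student B": (20, 24), "Student C": (15, 26)}
for student, (pre, post) in scores.items():
    print(student, gain_band(pre, post))
```

Note how the "equal chance" consideration surfaces here: a student who pre-tests near the ceiling cannot post a large gain, which is one reason cut scores remain a local judgment call.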
Repeated Measures
Description:
 Multiple assessments given throughout the year
 Examples: running records, attendance, mile run
Measuring growth:
 Graphically
 Ranging from the sophisticated to the simple
Considerations:
 Less pressure on each administration
 Authentic tasks
Repeated Measures Example: Running Record
[Chart: running record error rate (number of errors, 0-70) plotted by date of administration, with separate lines illustrating low, moderate, and high growth.]
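One simple way to turn a series like this into a single growth number (one option among many; the summary statistic is a local choice) is to fit a least-squares slope to each student's measurements, sketched below with invented data.

```python
# Illustrative: least-squares slope over equally spaced administrations.
# A falling error count (negative slope) indicates growth on a running record.
def slope(values):
    n = len(values)
    mean_x, mean_y = (n - 1) / 2, sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

errors = [62, 55, 41, 30, 18]  # hypothetical errors across five dates
print(slope(errors))  # about -11 errors per administration
```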
Holistic
Description:
 Assess growth across student work collected throughout the year
 Example: Tennessee Arts Growth Measure System
Measuring growth:
 Growth rubric (see example)
Considerations:
 Option for multifaceted performance assessments
 Rating can be challenging and time-consuming
Holistic Example
Growth rubric for "Details" (improvement across drafts):

1: No improvement in the level of detail (one is true):
• No new details across versions
• New details are added, but not included in future versions
• A few new details are added that are not relevant, accurate, or meaningful

2: Modest improvement in the level of detail (one is true):
• There are a few details included across all versions
• At least one example of a detail that is improved or elaborated in future versions
• Many added details are included, but they are not included consistently, or none are improved or elaborated upon
• There are many added details, but several are not relevant, accurate, or meaningful

3: Considerable improvement in the level of detail (all are true):
• There are many examples of added details across all versions
• There are multiple examples of details that build and elaborate on previous versions
• The added details reflect relevant and meaningful additions

4: Outstanding improvement in the level of detail (all are true):
• On average, there are multiple details added across every version
• Details are consistently included in future versions
• The added details reflect the most relevant and meaningful additions

Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/studentwork/butterfly-drafts
Post-Test Only
Description:
 A single assessment, or data that is paired with other information
 Example: AP exam
Measuring growth, where possible:
 Use a baseline
 Assume an equal beginning
Considerations:
 May be the only option for some indirect measures
 What is the quality of the baseline information?
MCAS Has 2 Holistic Rubrics

Topic/Idea Development (scored 1-6):
6: Rich topic/idea development; careful, subtle organization; effective, rich use of language
5: Full topic/idea development; logical organization; strong details; appropriate use of language
4: Moderate topic/idea development and organization; adequate, relevant details; some variety in language
3: Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
2: Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
1: Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task

Standard English Conventions:
 Control of sentence structure, grammar, usage, and mechanics (the length and complexity of the essay provide opportunity for the student to show control of standard English conventions)
 Errors do not interfere with communication, and/or few errors relative to the length of the essay or the complexity of sentence structure, grammar and usage, and mechanics
 Errors interfere somewhat with communication, and/or too many errors relative to the length of the essay or the complexity of sentence structure, grammar and usage, and mechanics
 Errors seriously interfere with communication, AND little control of sentence structure, grammar and usage, and mechanics
Post-Test Only
A challenge to tabulate growth:
 Portfolios: measuring achievement vs. growth
 Unit assessments: looking at growth across a series
 Capstone projects: may be a very strong measure of achievement
Selecting DDMs
"Borrow, Buy, or Build"
 PRIORITY: Use the Quality Tool to assess each potential DDM to pilot this year for your school (one district final copy on a computer)
 CCOs will help if this is a district-developed tool
 If there is additional time, use the Educator Alignment Tool to begin to look at developing 2 assessments for all educators for next year
"Tools" to Support the Process
 For determining what is important (Core Curriculum Objectives)
 For determining adequacy for use as a DDM (Quality Tool checklist)
 "Shifts" of Common Core examples and rubrics
 For making sure each educator has 2 DDMs (Educator Alignment tracker)
 For assessing rigor (Cognitive Complexity Rubric, CEPA Rubric)

Quality Tool checklist (excerpt):
Grade and Subject or Course _____________________
Potential DDM Name _____________________________
Potential DDM Source: Developed within district / From another district (indicate which one) / Commercial (indicate publisher)
Type of assessment: On-Demand (specific time for administration) / Performance/Project / Portfolio / Hybrid / Other
Item types: Selected Response (multiple choice) / Constructed Response (written, oral) / Performance/Portfolio / Two or more / Other
Alignment to Curriculum: Well-aligned / Moderately aligned / Poorly aligned / Not yet aligned
Alignment to Intended Rigor: Well-aligned / Moderately aligned / Poorly aligned / Not yet aligned
MCAS and PARCC
The Curriculum/Assessment Shifts
MCAS:
 ORQs
 Math: application of concepts
 ELA: ONLY comprehension, not writing quality
 MC questions: some application
 Emphasis on content
PARCC shifts to the Common Core:
 MC at a MUCH HIGHER cognitive level
 All writing is assessed as writing (unlike ORQs): personal narrative, persuasive essay, literary analysis of any novel
 NEW text types, writing at a far higher level: narratives, informational text, arguments
 Math: processes, depth of understanding, beyond application
 Emphasis on content plus literacy in ELA, math, social sciences, science, technology
Critically Important!
1) Rigor and 2) Alignment to Curriculum
Aligned to district curriculum:
 2011 Massachusetts Frameworks: shifted to new expectations
 Common Core shifts: shifted from MCAS expectations
 Consider PARCC: this is a district decision
 Math, Science, History/SS frameworks
Rigorous:
 Complex texts
 Complex tasks
 Writing to text
• Shift in persuasive essay (formal argument)
• Shift in narrative (more substantial and linked to content)
• Shift in informational text (organization, substantiation)
 Gradual increments? Giant steps?
Understanding the Research Simulation Task
• Students begin by reading an anchor text that introduces the topic.
• EBSR and TECR items ask students to gather key details about the passage to support their understanding.
• Students read two additional sources and answer a few questions about each text to learn more about the topic, so they are ready to write the final essay and to show their reading comprehension.
• Finally, students mirror the research process by synthesizing their understandings into a piece of writing that uses textual evidence from the sources.
Use what you have learned from reading "Daedalus and Icarus" by Ovid and "To a Friend Whose Work Has Come to Triumph" by Anne Sexton to write an essay that provides an analysis of how Sexton transforms Daedalus and Icarus.
As a starting point, you may want to consider what is emphasized, absent, or different in the two texts, but feel free to develop your own focus for analysis.
Develop your essay by providing textual evidence from both texts. Be sure to follow the conventions of standard English.
Thus, both comprehension of the 2 texts and the author's craft are being assessed, along with the ability of the student to craft a clear argument with substantiation from two texts.
Texts Worth Reading?
 Range: Example of assessing reading across the disciplines and helping to satisfy the 70%-30% split of informational text to literature at the 9-11 grade band. (Note: Although the split is 70%-30% in grades 9-11, disciplines such as social studies and science focus almost solely on informational text. English Language Arts teachers will have more of a 50%-50% split between informational and literary text, with informational text including literary non-fiction such as memoirs and biographies.)
 Quality: The texts in this set about Abigail Adams represent content-rich nonfiction on a topic that is historically significant.
 Complexity: Quantitatively and qualitatively, the passages have been validated and deemed suitable for use at grade 11.
Text Types, their Shifts, and Rubrics for each

Narrative
 Shift with the Common Core: no longer a personal story; content-bearing
 Essential elements: bears content; story elements support content
 CC rubric links: http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml, The 1.0 Guidebook to LDC, or http://www.parcconline.org/samples/english-language-artsliteracy/grades-6-11-generic-rubrics-draft

Informational Text
 Shift with the Common Core: content-area articles, non-fiction, biography, even literary historical
 Essential elements: provides information; many genres; scientific article, feature story, biography, speech
 CC rubric links: same as above

Argument
 Shift with the Common Core: not a persuasive essay with one voice, but a more academically balanced multiple perspective, with claims and evidence by the writer
 Essential elements: balanced presentation of multiple points of view; claims/evidence; citations
 CC rubric links: same as above
Shifted Analytical Writing
 Claims
 Evidence
 Use of textual evidence
 Multiple perspectives
Template for the Argument from They Say/I Say

They Say (major claims, quoted):
 Template: The character says…
 Connecting what they say to a paragraph
 Example: When Sidney Carton says, "It is a far, far better…."

I Say (what does this mean?):
 Template: This means… / More simply, this means…
 Connecting your interpretation to a paragraph
 Example: He is declaring that his sacrifice is something new for him, and this martyrdom will bring him to a better place, his own resurrection, than he has ever experienced in his corrupt life before this final act.

Your analysis as it connects to the thesis of the paper:
 The hope that Dickens sees for social justice is shown in Carton's selfless act to save Darnay.

Sample paragraph: In A Tale of Two Cities, Dickens uses the characters to represent the corruption and the hope for social justice in England and France. The final chapter shows the hope that Dickens sees despite the corruption. When Sydney Carton says, "It's a far, far….known" ( ), he symbolizes the possibilities for reform and redemption. Carton is declaring that his sacrifice is new for him and that he will find a better place, his own resurrection, than he has ever experienced in his corrupt life.
Templates to scaffold a smoothly written analysis or argument (James Burke)

They Say:
 What others say about this claim and topic
 Quoted appropriately
 Cited appropriately
 Worked into the whole essay smoothly

I Say:
 I make a claim for the whole argument
 I explain what "they say"
 I am responsible for organizing the claims, the evidence, and my explanations
 I am responsible for making links between/among the sources using transitional sentences and transitional words:
• In contrast,….
• Like…..
• Somewhat similar to…
Shifted Informational/Explanatory Writing
 Conveys information accurately
 Serves one or more of the following purposes:
• Increases a reader's knowledge about the subject
• Helps readers understand a procedure or process
• Provides readers with an enhanced comprehension of a concept
(Appendix A, CC, p. 23)
Shifted Narrative Examples
In the service of information:
 Science: read an article and retell the story from the perspective of the scientist who was in disagreement with the evidence
 Math: look at the solution to this problem, which has some problems. Create a dialogue between you and this student in a peer discussion in which you tell the peer what is good about the work and what he needs to do to improve it
 History: read the newspaper article written during Lincoln's time by one of his rivals. Write a narrative of a meeting between him and President Lincoln in which Lincoln answers some of this person's objections to his policy, based upon the information in the Gettysburg Address
Delaware Rubrics
http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml
 K-12 Argument Rubrics
 K-12 Informational Writing Rubrics
 K-12 Narrative Writing Rubrics
NOTE:
 Holistic rubric for faster scoring
 Multiple criteria provide more points
 No point system for rubrics is perfect; you'll need to validate the results with student work.
Standards-Based versus Common Core: Gatsby Unit
Examining Author's Purpose and Point of View
How Great Is Gatsby?
 Fitzgerald's purpose in writing the novel
 Filtered through Nick's perspective
 Interpreted by movies
Activities:
 Living Likert Scale
 Partnered evidence and counter-argument
 Living Likert; partnered evidence gathering
 Individual essay:
• Thesis
• Argument with rating (1-10)
• Counter-argument
• Conclusion
 Academic critique: scripts, words; images; modifications
 Authentic writing: Rotten Tomatoes
Writing to Text
Student Side of Notebook
 Daisy as distant dream/as foil
 "Gatsby is distant and never truly close to his dream, though he doesn't realize it."
 "DiCaprio's intensity is too strong for the cool dreamer."
Writing to Text Sample
Classroom Side of Notebook
How Great Was Gatsby?
 Fitzgerald's purpose
 Nick's point of view
 Daisy as dream/foil
 Evidence from the novel
 Three movie versions
 Select a scene, image, chapter, or a series of scenes.
 Which of the images portrays Fitzgerald's Gatsby?
 Thesis:
 3-5 examples
 So what?
Example of a Strong and Weak Text Set

Strong Text Set
Anchor Text: Fahrenheit 451, Ray Bradbury
Related Texts:
• "You Have Insulted Me: A Letter," Kurt Vonnegut (Informational)
• "Burning a Book" by William Stafford (Poem)
• "The Book Burnings," United States Holocaust Memorial Museum (Informational)
• Excerpts from The Book Thief, Marcus Zusak (Appendix B Exemplar)
• "Learning to Read and Write," Frederick Douglass (Informational)
• The Children's Story, James Clavell (Literary)
• "Learning to Read," Malcolm X (Informational)
• "Unto My Books So Good to Turn," Emily Dickinson (Poem)
• "The Portable Phonograph," Walter Van Tilburg Clark

Weak Text Set
Anchor Text: Fahrenheit 451, Ray Bradbury
Related Texts:
• "'Chaos:' Gunman Ambushes, Kills Two Firefighters at New York Blaze," Catherine Shoichet and Greg Botelho (CNN) (Informational)
• "Johannes Gutenberg and the Printing Press," Mary Bellis (About.com) (Informational)
• Fahrenheit 451, Francois Truffaut (Film)
• "About Ray Bradbury: Biography" (Informational)
• "The Pedestrian," Ray Bradbury (Literary)
Other Subjects and Courses
These assessments include both traditionally tested and non-tested grades. Districts may choose to select a DDM that meets the traditionally non-tested grade/subject or course minimum pilot requirement from this collection.
 ELA: ELA Literacy Assessments; ELA CCOs
 Math: Math Assessments; Math CCOs
 History and Social Sciences: History & Social Studies Assessments; History & Social Studies CCOs
 Science and Technology: Science and Technology Assessments; Science and Technology CCOs
 Arts: Arts Literacy Assessments; Arts CCOs
 Foreign Language: Foreign Language Assessments; Foreign Language CCOs
 Comprehensive Health: Comprehensive Health Assessments; Comprehensive Health CCOs
 Communications & Information Sciences and Other Subjects: Other Subjects Assessments
Core Curriculum Objectives
(CCOs; partial list for Writing to Text)
1. Students analyze how specific details and events develop or advance a theme, characterization, or plot of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
2. Students analyze how the structure, syntax, diction, and connotative or figurative meanings of words and phrases inform the central idea or theme of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
3. Students analyze how specific details, concepts, or events interact to develop or advance a central idea of a grade 9 informational text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
4. Students analyze how cumulative word choice, rhetoric, syntax, diction, and the technical, connotative, or figurative meanings of words and phrases support the central idea or author's purpose of a grade 9 informational text.
5. Students produce clear and coherent writing to craft an argument, in which the development, organization, and style are appropriate to their task, purpose, and audience, using such techniques as the following:
 introducing precise claim(s), distinguishing the claim(s) from alternate or opposing claims, and creating an organization that establishes clear relationships among claim(s), counterclaims, reasons, and evidence;
 developing claim(s) and counterclaims fairly, supplying evidence for each while pointing out the strengths and limitations of both in a manner that anticipates the audience's knowledge level and concerns;
 using words, phrases, and clauses to link the major sections of the text, create cohesion, and clarify the relationships between claim(s) and reasons, between reasons and evidence, and between claim(s) and counterclaims;
 establishing and maintaining a formal style and objective tone while attending to the norms and conventions of the discipline in which they are writing;
 providing a concluding statement or section that follows from and supports the argument presented; and
 demonstrating command of the conventions of Standard English.
ELA-Literacy, Grade 9
English 9-12: https://wested.app.box.com/s/pt3e203fcjfg9z8r02si
Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies (Publisher Website/Sample)
Designed to be a measure of student growth over time in high school ELA and social science courses. The student selects work samples to include and uploads them to an electronic site. Includes guiding questions for students and scoring criteria. The scoring rubric for the portfolio can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on specification of assignments.
Traditional Assessment:
 Traditional End-of-Grade Assessment
 Traditional End-of-Course Assessment
 Selected Response
 Short Constructed Response
 Writing Prompt/Essay
 Other:
Non-Traditional Assessment:
 Pre/Post or Repeated Measures
 Performance Task Rubric
 Portfolio or Work Sample Rubric
 Project-Based Rubric
 Observation Rubric or Checklist
Administration/Scoring:
 Paper/Pencil
 Computer Supported
 Computer Adaptive
 Machine Scored
 Scored Locally
 Scored Off-Site
Other Tools: MA Model Curricula and Rubrics (CEPAs)
(Also, Delaware rubrics for specific text types)

Criteria (habitat exhibit example):
 Topic development: the writing and artwork identify the habitat and provide details
 Evidence and content accuracy: the writing includes academic vocabulary and characteristics of the animal or habitat, with details
 Artwork: identifies special characteristics of the animal or habitat, to an appropriate level of detail

Score levels:
1: Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task. Little or no evidence is included and/or content is inaccurate. Artwork does not contribute to the content of the exhibit.
2: Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task. Use of evidence and content is limited or weak. Artwork demonstrates a limited connection to the content (describing a habitat).
3: Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language. Use of evidence and content is included but is basic and simplistic. Artwork is basically connected to the content and contributes to the overall understanding.
4: Moderate topic/idea development and organization; adequate, relevant details; some variety in language. Use of evidence and accurate content is relevant and adequate. Artwork is connected to the content of the exhibit and contributes to its quality.
5: Full topic/idea development; logical organization; strong details; appropriate use of language. Use of evidence and accurate content is logical and appropriate. Artwork contributes to the overall content of the exhibit and provides details.
6: Rich topic/idea development; careful and/or subtle organization; effective/rich use of language. A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission. Artwork adds greatly to the content of the exhibit, providing new insights or understandings.
Sample DDMs: Local Digital Portfolio (Hudson, MA)
 Buy, Borrow, Build
 Each sample DDM is evaluated
 Hudson's evaluation: "Designed to be a measure of student growth over time in high school ELA and social science courses. Student selects work samples to include and uploads them to electronic site. Includes guiding questions for students and scoring criteria. Scoring rubric for portfolio that can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on specification of assignments."
 Many are standardized assessments
Educator Alignment Tool (www.doe.mass.edu/)

School | MEPID | Last Name | First Name | Grade | Subject | Course | Course ID | Potential DDM 1 | Potential DDM 2 | Potential DDM 3
HS | 07350 | Smith | Abby | 10 | ELA | Grade 10 ELA | 01051 | MCAS ELA 10: NO | |
HS | 07350 | Smith | Abby | 9 | ELA | World Studies | 01058 | Writing to text 9 | |
HS | 07350 | Smith | Abby | 9 | ELA | Grade 9 ELA | 01051 | Writing to text 9 | |
HS | 07352 | Smith | Brent | 10 | Math | IMM 2 | | | |
HS | 07352 | Smith | Brent | 10 | Math | IMM 1 | | MCAS MATH 10: NO | |
HS | 07353 | Smith | Cathy | 11 | Science | Physics | | Physics (singleton) | |
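The bookkeeping this tool supports is simple to automate. A hypothetical sketch (the data rows and helper below are invented for illustration, not part of the DESE tool) that flags educators who do not yet have two potential DDMs:

```python
# Hypothetical check: every educator (keyed by MEPID) needs >= 2 DDMs.
from collections import defaultdict

rows = [
    ("07350", "Grade 10 ELA", "MCAS ELA 10"),
    ("07350", "World Studies", "Writing to text 9"),
    ("07352", "IMM 1", "MCAS MATH 10"),
    ("07353", "Physics", "Physics (singleton)"),
]

ddms = defaultdict(set)
for mepid, course, measure in rows:
    ddms[mepid].add(measure)

for mepid, measures in sorted(ddms.items()):
    if len(measures) < 2:
        print(f"MEPID {mepid}: only {len(measures)} potential DDM(s) so far")
```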
Next Class: Protocols to Use Locally for Inter-Rater Reliability; Looking at Student Work
Possible focus:
 Developing "text sets": resources, on-line and texts
 Developing effective rubrics for large-scale assessment
 Developing exemplars
 Calibrating scores
 Looking at Student Work (LASW): http://Nsfharmony.org/protocol/a_z.html
 Sample for developing rubrics from an assessment
Pilot Steps:
1. Prepare to pilot
 Build your team
 Identify content to assess
 Identify the measure (aligned to content; informative)
 Decide how to administer & score
2. Test
 Administer
 Score
3. Analyze
4. Adjust
Where to Begin Today
Quality Checklist Tool (Quality Alignment Tool)
 Alignment to content/curriculum
 Alignment to rigor
 If the assessment passes these criteria: then validity and reliability; then instructions, procedures for assessment, etc.
Educator Alignment Tool
 Preparing for the June 1, 2014 report
 All educators
 2 DDMs
A pause to remember what we are doing here: changing assessment positively can bring a positive change in instruction and curriculum.
Tentative Topics (Exit Slips)
 Testing protocols for consistency across grades, teams, departments, schools
 Protocols to maintain inter-rater reliability (blind assessment)
 Mock assessment using rubric/exemplars
 Rubric quality (I've found this to be a concern when I've looked closely at some assessments.)
 Data organization, analysis, assigning "scores" to teachers
 Determining "cut" scores with local assessments (see the sketch after this list)
 Organizing for the June report to DESE (2 assessments per educator)
 Planning for implementation of these many assessments in 2015
• When
• Accommodations
• Windows for assessment
• Security
• Do singleton teachers assess their own assessments?
 Time to develop local protocols and directions
 What is your priority? What do you need to be successful?
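On cut scores: one common starting point (a local judgment call, not a DESE rule) is to set them empirically from the distribution of growth scores, for example bottom quintile = Low and top quintile = High. The gains below are invented for illustration.

```python
# Illustrative empirical cut scores via nearest-rank percentiles.
def percentile(sorted_vals, p):
    k = max(0, min(len(sorted_vals) - 1, round(p * (len(sorted_vals) - 1))))
    return sorted_vals[k]

gains = sorted([2, 4, 5, 7, 8, 9, 11, 12, 15, 20])  # hypothetical gains
low_cut, high_cut = percentile(gains, 0.20), percentile(gains, 0.80)
print(low_cut, high_cut)  # gains below 5 would band Low; above 12, High
```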
“Perfect is the enemy of good.”
Voltaire