New Directions in Assessment
PARCC Prototype Tasks
Susan H. Hull, Katey Arrington, Lisa Brown,
Ann Roman, Cathy Seeley
The Charles A. Dana Center
The University of Texas at Austin
April 17, 2013
In this session
•  Why do the Common Core State Standards for Mathematics call for a new kind of assessment?
•  Overview of a new model for large-scale assessment (Evidence-Centered Design)
•  What constitutes an innovative next generation summative assessment task?
Why new assessments? Shifts in the CCSSM
Modeling
  From: Using algorithms to solve a set of standard problems
  To: Using mathematics to solve a broader array of problems, including problems from real-world settings

Coherence
  From: Learning topics in isolation
  To: Learning mathematics in the context of natural cross-grade-level progressions

Focus
  From: Learning mathematics at a surface level
  To: Learning fewer topics in greater depth for true mastery (fewer, clearer, higher)

Rigor
  From: Emphasizing mostly procedural competence
  To: Developing broad math proficiency and habits of mind (Standards for Mathematical Practice)

Modernization
  From: Using a 19th-century algebra-based curriculum
  To: Using a broader 21st-century curriculum (focus on math and statistics central to making sense of uncertainty and using technology to solve real-world and theoretical problems)
The Mathematics Prototyping Project: Dana Center
•  Charged to develop prototypes of possible assessment tasks/items that reflect the priorities of the Partnership for Assessment of Readiness for College and Careers (PARCC) for summative assessment
•  Partners
   ~  Agile Mind, Inc.
   ~  University of Arizona (Bill McCallum; Center on Mathematics & Education; Illustrative Mathematics Project)
   ~  City University of New York
•  Dana Center leadership: Cathy Seeley, Susan Hull, Kathi Cook, Ann Roman, Lisa Brown, Katey Arrington, and colleagues
•  Consultants and authors: David Foster, Randy Charles, Steve Leinwand, Sue Eddins, David Hughes, Vicki Massey, University of Arizona
Dana Center’s Role
Keep in mind…
Prototyping is for learning, and it can be (and was) messy.
Assessment Design
[Diagram: individual standards each mapped directly to individual tasks]
Assessment Design
[Diagram: Standards for Mathematical Practice and content standards (domain, cluster headings, standards) connected to tasks through claims, evidence, and task models]
Evidence-Centered Design (ECD) for PARCC
Model Content Frameworks: To make claims about what students know, we must operationalize the standards.
Evidence Statements: Based on analysis, evidence drives task development.
Tasks: Tasks are designed to elicit specific evidence from students.
ECD is a deliberate and systematic approach to assessment
development that will help establish validity of the
assessments, increase comparability of year-to-year
results, and increase efficiencies/reduce costs.
slide from PARCC
Overview of PARCC Mathematics Task Types
Task Type I: Tasks assessing concepts, skills, and procedures
•  Balance of conceptual understanding, fluency, and application
•  Can involve any or all mathematical practice standards
•  Machine-scorable, including innovative, computer-based formats
•  Will appear on the End-of-Year and Performance-Based Assessment components
•  Sub-claims A, B, and E

Task Type II: Tasks assessing expressing mathematical reasoning
•  Each task calls for written arguments/justifications, critique of reasoning, or precision in mathematical statements (MP.3, 6)
•  Can involve other mathematical practice standards
•  May include a mix of machine-scored and hand-scored responses
•  Included on the Performance-Based Assessment component
•  Sub-claim C

Task Type III: Tasks assessing modeling/applications
•  Each task calls for modeling/application in a real-world context or scenario (MP.4)
•  Can involve other mathematical practice standards
•  May include a mix of machine-scored and hand-scored responses
•  Included on the Performance-Based Assessment component
•  Sub-claim D
Slide from PARCC; for more information see PARCC Task Development ITN Appendix D.
Nature of Tasks
•  Machine-scorable selected-response and short items, including innovative, computer-based formats; balance of conceptual understanding and procedural knowledge; brief applications; includes practices
•  Practice-forward tasks
•  Technology-enhanced tasks
•  Constructed/extended-response items: hand- or machine-scored, set in real-world scenarios, tasks highlighting practices
What Might These Items Look Like with Technology?
  The Agile Mind technology platform was used for PARCC prototyping
  Vendors are using or creating their own technology platforms and capabilities
  We suggested possible item formats that are enhanced by, and make use of, technology…
What is the Role of Technology?
How will it be used?
•  To cost-effectively measure a wider range of standards.
•  Technology enhancements may range from the incremental to the transformative.
•  Technology can draw students into the mathematics.
•  Enhancements can support wider accessibility for students.
What is the role of context?
  A context is designed to:
  make the problem interesting to the student
  describe a real-world problem that mathematics can solve
  set up a mathematical idea
  lead the student to a certain sort of mathematical work
  A context does not have to be realistic, but it has to serve a purpose, either mathematical or pedagogical.
  A context is not warranted:
  if it is distracting or phony,
  if the task would be essentially the same without the context, or
  if it distracts from the mathematics or points to a problematic practice.
Explore PARCC Prototypes
Go to:
www.ccsstoolbox.com
Click on:
  Resources for Implementation
  PARCC Prototyping Project
  Desired grade span
Explore PARCC Prototypes
What makes these tasks Next Generation?
A Close Look at PARCC Prototype Assessments
parcconline.org/samples/item-task-prototypes
www.ccsstoolbox.org
  In what ways does the item reflect the shifts in the CCSSM?
  What is the item asking students to do?
  What is different about this item in comparison to current items?
  How does the technology enhancement challenge student thinking?
  How might the design of this item inform curriculum and instruction?
  Considering all parts of this performance task, what are some implications for professional learning?
Implications of “Next Generation” Assessments
From what you’ve seen of the summative task prototypes, what are the implications for…
  Curriculum
  Instruction
  Formative assessment
  Professional development
Recommendations for Next Steps
  Engage in deep study of the CCSSM
  Study the prototype items, both on the PARCC website and on the Dana Center–Agile Mind toolbox
  Begin implementing the Standards for Mathematical Practice (especially 3, 4, and 6) along with CCSSM content
  Investigate the features of any machine-scored item banks that you are using or considering: do they allow for multiple answers/multiple choice? Fill-in-the-blank? Drag-and-drop? Other interactivity? Partial scoring?
  How frequently are your teachers assigning 5+-minute problems? 10+-minute problems? Multipart problems? Cross-standard tasks?
Contact us
Susan H. Hull, Director of Organizational Learning and Mathematics,
shhull@austin.utexas.edu
Katey Arrington, Manager, K–12 Systems Services;
Prototype–High School Lead, katey.arrington@austin.utexas.edu
Lisa Brown, Program Coordinator; Prototype–Middle School Lead,
lisabrown@austin.utexas.edu
Ann Roman, Program Coordinator; Prototype–Elementary Lead,
romana@austin.utexas.edu
Cathy Seeley, Dana Center Senior Fellow Emeritus,
cseeley@utexas.edu
Charles A. Dana Center, www.utdanacenter.org