COMMON CORE STATE STANDARDS

Assessments based on the Common Core State Standards
Vince Dean, Ph.D.
Office of Educational Assessment & Accountability
Race to the Top Assessment Competition

Assessments based on the Common Core State Standards

• RTTT Assessment Competition
  - $350 million total
  - $320 million for at least grades 3-8 and one H.S. grade
  - $30 million for an H.S. solution, likely end-of-course
• Alternate Assessments based on Alternate Achievement Standards Grant Competition
• English Language Proficiency Grant Competition (next federal fiscal year)
Race to the Top Assessment Competition

• Assessment Consortia
• Development of an infrastructure and content for a common assessment measuring the CCSS in English Language Arts and Mathematics
• Two consortia:
  - SMARTER Balanced Assessment Consortium (SBAC)
  - Partnership for Assessment of Readiness for College and Careers (PARCC)
Race to the Top Assessment Competition

U.S. Education Department Requirements
• Measure the full breadth of the Common Core State Standards
• Extend the range of high-quality measurement in both directions
• Assessments operational by 2014-15
• Consortia must offer an online version
• Must take advantage of technology for reporting speed and be instructionally relevant
Race to the Top Assessment Competition

The consortia:
• SMARTER Balanced: 31 states
  - 17 governing states
  - Computer Adaptive Testing (CAT) beginning in 2014-15
• PARCC: 26 states
  - 11 governing states
  - Computer-Based Testing (CBT) beginning in 2014-15
Introduction to the SMARTER Balanced Assessment Consortium (SBAC)
History

[Diagram: Smarter, Mosaic, and Balanced, with the themes Computer, Adaptive, Formative, Capacity, Integrated, and System.]
Smarter Balanced Assessment Consortium

Governing States (17): CT, HI, ID, KS, ME, MI, MO, MT, NC, NM, NV, OR, UT, VT, WA, WI, WV
Advisory States (14): AL, CO, DE, GA, IA, KY, NH, NJ, ND, OH, OK, PA, SC, SD
Theory of Action
Goal: To ensure that all students leave high school prepared for postsecondary success in college or a career, through increased student learning and improved teaching.
Theory of Action
The SMARTER Balanced Assessment Consortium is shaped by the following principles:
1. Integrated system
2. Evidence of student performance
3. Teachers integrally involved
4. State-led, transparent, and inclusive governance structure
Theory of Action
The SMARTER Balanced Assessment Consortium is shaped by the following principles (continued):
5. Continuously improve teaching and learning
6. Useful information on multiple measures
7. Design and implementation adhere to established professional standards
Theory of Action
Creating a policy environment that supports:
a. innovative systems,
b. high expectations, and
c. increased opportunities for students
Aligned to the Common Core Standards:
a. clearly defined college and career expectations,
b. learning progressions,
c. content/curricular frameworks,
d. test maps, and
e. instructional processes
Theory of Action
SBAC policies and standards are effectively communicated to districts and schools:
a. Multi-media communications plan
b. Score reports
SBAC Specific Priorities

• Ensure all students have access to the technology needed to participate in each component (summative, interim/benchmark, formative)
• Support research on how to use technology to increase access for all students, in particular those needing accommodations
SBAC Specific Priorities

• Use technology to efficiently deliver training, resources, reports, and data; social networks for teachers to develop and disseminate effective CCSS curriculum and instructional tools
• Create innovative item types that utilize technology and represent real-world contexts
SBAC Specific Priorities

• Use a Computer Adaptive Testing (CAT) engine to maximize accuracy for individual students across the CCSS (see the illustrative sketch below)
• Standardize accommodations policy and administration practices across states to ensure comparability
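
The deck treats the CAT engine as a black box. As a rough illustration only (not the SBAC engine), the sketch below assumes a simple Rasch item model, a made-up item bank, and a crude ability-update rule: after each response the ability estimate moves by the response residual, and the next item is whichever unused item is most informative at the current estimate.

```python
import math
import random

def prob_correct(theta, b):
    """Rasch model: probability of a correct response for ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at theta; largest when item difficulty is near theta."""
    p = prob_correct(theta, b)
    return p * (1.0 - p)

def next_item(theta, bank, used):
    """Choose the unused item that is most informative at the current ability estimate."""
    candidates = [i for i in range(len(bank)) if i not in used]
    return max(candidates, key=lambda i: item_information(theta, bank[i]))

def update_theta(theta, b, correct, step=0.5):
    """Crude ability update: move the estimate by the response residual (observed minus expected)."""
    p = prob_correct(theta, b)
    return theta + step * ((1.0 if correct else 0.0) - p)

def run_cat(bank, true_theta, test_length=10):
    """Administer a short adaptive test to a simulated examinee and return the final estimate."""
    theta, used = 0.0, set()
    for _ in range(test_length):
        i = next_item(theta, bank, used)
        used.add(i)
        correct = random.random() < prob_correct(true_theta, bank[i])
        theta = update_theta(theta, bank[i], correct)
    return theta

if __name__ == "__main__":
    bank = [d / 10.0 for d in range(-20, 21)]  # hypothetical item difficulties from -2.0 to +2.0
    print(round(run_cat(bank, true_theta=1.2), 2))
```
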
SBAC Assessment Design Proposal

• Summative Assessment
  - Measure the full range of the CCSS
  - Computer Adaptive Testing for precision
  - Timely results
  - Engage Institutions of Higher Education to ensure achievement standards reflect college and career readiness
  - Scale scores help inform the growth model
SBAC Assessment Design Proposal

• Interim Benchmark Assessment
  - Allow for a finer grain of measurement (e.g., end of unit)
  - Inform teachers whether students are on track to be proficient on summative assessments
  - Multiple opportunities for students to participate
  - Scale scores help inform the growth model
SBAC Assessment Design Proposal

• Formative Assessment
  - Repository of tools available to teachers to support quick adjustment and differentiated instruction
  - Help define student performance along the CCSS learning progressions
  - Concrete strategies for immediate feedback loops
SBAC Assessment Design Proposal

• Teacher Engagement
  - Integral role in developing test maps for each grade and content area
  - Item writing, specifications, reviewing, and range-finding for all test types
  - Teacher-moderated scoring of performance events to inform professional development
Technology Enhanced Item

• Prototype items courtesy of the Minnesota and Utah Departments of Education
Technology Enhanced Item

• Minnesota Science Item
SBAC Assessment Design Proposal

• Assessment window vs. single-day administration
• Multiple opportunities to assess
• Quick results available to support instruction
• Emphasis on problem-solving and critical thinking
Alternate Assessments Based on the Common Core State Standards
Dynamic Learning Maps Alternate Assessment Consortium
State Participants
Iowa
Kansas
Michigan
Mississippi
Missouri
New Jersey
North Carolina
Oklahoma
Utah
West Virginia
Wisconsin
Other Participants

• University of Kansas
  - Center for Educational Testing and Evaluation
  - Center for Research Methods and Data Analysis
  - Center for Research on Learning
  - Special Education Department
• AbleLink Technologies
• The ARC
• The Center for Literacy and Disability Studies at the University of North Carolina at Chapel Hill
• Edvantia
Feature Overview

• Learning maps
• Dynamic assessment
• Inclusion of instructionally relevant tasks
• Instructionally embedded and stand-alone versions
• Advanced feedback and reporting systems (including growth modeling)
• Technology platform
• Universal design
• Evidence-centered design, including cognitive labs
• Scaffolding
• Development of over 14,000 tasks/items
• Professional development
Major Changes

• Include:
  - Moving Online
  - Scoring
  - Reporting
Moving to Online Assessment

• Survey of state testing directors (+ D.C.)
• 41 responses
  - 5 of 41 states have no CBT initiatives
  - 36 of 41 states have current CBT initiatives, including:
    - Operational online assessment
    - Pilot online assessment
    - Plans for moving online
Moving to Online Assessment

• Survey of state testing directors (+ D.C.)
• Of the 36 states with some initiative:
  - 21 states currently administer large-scale general population assessments online
  - 9 states have plans to begin (or expand) online administration of large-scale general population assessments
  - 8 states currently administer special populations assessments online
  - 2 states have plans to begin (or expand) online administration of special populations assessments
Moving to Online Assessment

• Survey of state testing directors (+ D.C.)
• Of the 36 states with some initiative:
  - 5 states currently use Artificial Intelligence (AI) scoring of constructed-response items
  - 4 states currently use Computer Adaptive Testing (CAT) technology for general population assessment
  - 0 states currently use CAT technology for special populations assessment
  - 7 states offer online interim/benchmark assessments
  - 7 states offer online item banks accessible to teachers for creating "formative"/interim/benchmark assessments
Online Assessment - The Michigan Stage

• Michigan's online initiatives
  - Pilot in 2006
  - Pilot in 2011 (English Language Proficiency)
  - Pilot in 2012 (Alternate Assessments)
  - Pilots leading up to operational adoption of SMARTER Balanced Assessment Consortium products in 2014-15
• Constitutional amendment barring unfunded mandates
Scoring

• Maximize objective scoring by:
  - Automated scoring of objective items
  - AI scoring of extended written-response items, technology-enhanced items, and performance tasks wherever possible
  - Distributed hand-scoring of tasks not scorable using AI

Scoring (maximize objective, distribute subjective)
Scoring as Professional Development

• Human scorers drawn from the ranks of teachers
• Online training on hand-scoring
• Online certification as a hand-scorer
• Online monitoring of rater performance
• Validation hand-scoring of samples of AI-scored tasks (see the sketch below)
• Our experience with teacher-led scoring and range-finding indicates that it is some of the best professional development that we provide to educators
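
The deck does not specify how the validation hand-scores would be compared with the AI scores. As one illustrative possibility (not an SBAC specification), the sketch below computes exact agreement and quadratic weighted kappa between hypothetical human and AI scores on an assumed 0-4 rubric; the sample scores are made up.

```python
from collections import Counter

def exact_agreement(human, ai):
    """Share of responses where the human validation score matches the AI score."""
    matches = sum(1 for h, a in zip(human, ai) if h == a)
    return matches / len(human)

def quadratic_weighted_kappa(human, ai, min_score, max_score):
    """Quadratic weighted kappa, a common agreement statistic for rubric-scored tasks."""
    scores = list(range(min_score, max_score + 1))
    n = len(human)
    # Observed and chance-expected disagreement, weighted by squared score distance.
    obs = sum((h - a) ** 2 for h, a in zip(human, ai)) / n
    h_counts, a_counts = Counter(human), Counter(ai)
    exp = sum(
        h_counts[i] * a_counts[j] * (i - j) ** 2
        for i in scores for j in scores
    ) / (n * n)
    return 1.0 - obs / exp if exp else 1.0

if __name__ == "__main__":
    # Hypothetical validation sample: human hand scores vs. AI scores on a 0-4 rubric.
    human = [3, 2, 4, 1, 3, 2, 0, 4, 3, 2]
    ai = [3, 2, 3, 1, 3, 2, 1, 4, 3, 3]
    print(f"exact agreement: {exact_agreement(human, ai):.2f}")
    print(f"quadratic weighted kappa: {quadratic_weighted_kappa(human, ai, 0, 4):.2f}")
```
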
Reporting

• Current reports can be difficult to read and are poorly used
• Need online reporting of all scores for all stakeholders, including:
  - Policymakers (aggregate)
  - Administrators (aggregate and individual)
  - Teachers (aggregate and individual)
  - Parents (aggregate and individual)
  - Students (individual)
Reporting Portal

• The reporting portal needs to be able to integrate reports from classroom metrics all the way to large-scale secure assessment metrics:
  - Overall achievement & growth scores
  - Unit achievement scores
  - Growth scores based on learning progressions
  - Classroom achievement scores
Challenges

• LEA capacity for online assessment
• Bandwidth issues, especially in rural areas
  - Minnesota challenge
  - Utah example
  - USED working with the FCC on the National Broadband Initiative
Challenges

• Item development for computer-adaptive testing
• Field-testing
  - Item types
  - Demographic coverage
  - AI scoring validation
Challenges

• Psychometrics
  - Comparability across years and student populations
  - Equating from year to year
• Accommodated versions for students with disabilities (SWD) and English language learners (ELL)
  - Contrast, read-aloud, enlarged print
  - Braille
• All challenges will be resolved by 2014-15
Timeline for Transition
2010-2011
• Getting to know the CCSS / alignment work
• 2010 MEAP / 2011 MME remain the same
• State focus will be on technical assistance

2011-2012
• Implementation of the CCSS in classrooms
• 2011 MEAP / 2012 MME remain the same
• State focus will be on instruction/professional development
Timeline for Transition
2012-2013
• 2012 MEAP minimally modified as necessary to reflect the CCSS
• 2013 MME remains the same
• State focus will be on student learning

2013-2014
• 2013 MEAP based on the 2012 model
• 2014 MME remains the same
• State focus will be on preparing for new assessments from the SMARTER Consortium

2014-2015
• Full implementation: instruction and assessment based on the CCSS
Contact Information

• Office of Educational Assessment & Accountability
  - www.michigan.gov/oeaa
• SMARTER Balanced Assessment Consortium
  - http://smarter.k12partners.org/