Massachusetts Comprehensive Assessment System (MCAS)
Standard Setting: Grade 3 Mathematics
Sheraton Four Points Hotel
Norwood, MA
August 15-16, 2007
Wednesday, August 15
Overview of Plenary Session
• Welcome/Introductions
• Overview of MCAS Program
• Purpose of 2007 Standard Setting
• Body of Work Method and Procedures
• Ground Rules for Standard Setting
• Agenda (Wednesday-Thursday)
Department of Education
• Bob Bickerton, Associate Commissioner
• Wayne Fernald, MCAS Mathematics Lead Developer
• Haley Freeman, MCAS Mathematics Development Specialist
• Mark Johnson, Director of MCAS Test Development
• Bob Lee, MCAS Chief Analyst
• Matt O’Connor, Administrator for Administration, Analysis and Reporting
• Kit Viator, Director of Student Assessment
Measured Progress
• Sally Blake, MCAS Lead Developer, Mathematics
• Lee Butler, Administrative Assistant
• Lisa Ehrlich, Assistant Vice President
• Kevin Haley, Manager of Data Analysis
• Renee Jordan, Service Center Representative
• Mark Peters, Program Assistant
• Miechelle Poulin, Program Assistant
• Michael J. Richards, Program Manager
• Kevin Sweeney, Assistant Vice President, Research & Analysis
• David Tong, Assistant Director, MCAS Program Management
• Eric Wigode, Director of MCAS Test Development
Standard Setting Facilitator
• Sally Blake
Welcome Grade 3 Mathematics Panelists
Karen Anderson | Associate Professor & Chair, Education Dept. | Stonehill College
Nancy Buell | Elementary Mathematics Specialist | William H. Lincoln School
Bruce Carter | Case Manager | Urban League of Eastern Mass.
Robert Cote | 3rd Grade Classroom Teacher | Jordan/Jackson Elementary
Linda Gauthier ** | Curriculum Coordinator | Saugus Public Schools
Cheryl Goguen ** | Grade 4 General Educator | Miriam F. McCarthy School
Rebecca Gutierrez | 4th Grade Teacher | Newton Elementary School
Steven Kaczmarczyk | Special Education Teacher | Ellen Bigelow School
Kristine Klumpp ** | Grade 3 Teacher | Alden Elementary School
Carol LaPolice ** | Math Instructional Leadership Specialist-Elementary | Daniel B. Brunton School
Marlena McCoy | Grade 4 Teacher | Mittineague Elementary School
Elaine McNamara | Title I Director and Teacher | Parker Avenue School
Lyudmila Moiseyeva ** | ELL Teacher | Baker Elementary School
Judy Moore ** | Grade 3 Teacher | Harvard Elementary School
Stephanie Morris ** | Grade 4 Teacher | Craneville School
Judith Richards | Mathematics Teacher | Graham & Parks School
Jennifer Rubera ** | Grade 4 Teacher | Pentucket Lake Elementary
Michael Stanton ** | Principal | Boyden Elementary School
Deborah Stewart | Community Representative | Urban League
Elizabeth Sweeney ** | Assistant Program Director | Boston Public School
Denise Young ** | Grade 3 Teacher | Brown School
**Served on 2006 panel
Historical Background of the MCAS Tests
1993: Massachusetts Education Reform Law passed
1998: First MCAS operational tests introduced (ELA, Math, and Science & Technology; grades 4, 8, and 10)
2001: Grade 3 Reading, grade 6 Math, and grade 7 ELA tests introduced; NCLB requires states to annually test reading & math in grades 3-8
2003: Class of 2003 first graduating class required to earn a CD (ELA and Math)
2006: Grade 3 Math test administered; Grade 3 Math standard setting
2007: Grade 3 Math standard setting revisited
Purpose of MCAS Program
• Inform/improve curriculum and instruction
• Evaluate student, school, and district performance according to Curriculum Framework content standards and MCAS performance standards
• Certify eligibility for high school Competency Determination (CD)
Selected Features of MCAS
• Custom developed based on Massachusetts Curriculum Framework content standards and MCAS performance standards
• 100% of questions used to determine student scores released annually
• Measures performance of ALL students educated with public funds
• Results reported according to raw scores and performance levels
Overview of 2006 Standard Setting Event and Outcomes
• Cut scores successfully established at Warning/Needs Improvement and at Needs Improvement/Proficient
• Some panelists expressed concern about whether any test questions existed at the Above Proficient level; cut score at Proficient/Above Proficient set at 40 (out of 40)
• 2007 test designed to have sufficient questions at the Above Proficient level
Purpose: 2007 Grade 3 Mathematics Standard Setting
Primary purpose:
• Establish a cut score at Proficient/Above Proficient
Secondary purpose:
• Validate cut scores at Warning/Needs Improvement and Needs Improvement/Proficient
Standard Setting vs. Standards Validation
• Standard setting (top cut point)
  – Process of establishing original cut scores
  – Panelists are not provided initial cut points
• Standards validation (bottom two cut points)
  – Process of validating cut scores
  – Panelists are provided initial cut points
2007 Standard Setting/Validation
• Warning / Needs Improvement: cut score to be validated
• Needs Improvement / Proficient: cut score to be validated
• Proficient / Above Proficient: cut score needed
Development of Content Standards
2000: Mathematics Curriculum Framework content standards written for grade spans (e.g., grades 5-6 and grades 7-8)
2004: Supplement to the CF created, pulling out specific content standards for grades 3, 5, and 7; no “brand-new” standards were written
Content Standards vs. Performance Standards
• Content standards = “What”: describe the knowledge and skills students should acquire in a particular content area and grade
• Performance standards = “How well”: describe student work on MCAS tests at the Needs Improvement, Proficient, and Above Proficient levels
General MCAS Performance Level Descriptors
Needs Improvement: Students at this level demonstrate partial understanding of subject matter and solve simple problems.
Proficient: Students at this level demonstrate a solid understanding of challenging subject matter and solve a wide variety of problems.
Above Proficient: Students at this level demonstrate a comprehensive and in-depth understanding of rigorous subject matter, and provide sophisticated solutions to complex problems.
Linking Performance Standards with Student Work
• What is standard setting? Establishment of cut scores to distinguish between performance levels.
• What is your job? Use the PLDs to evaluate student work and make a recommendation for the Proficient/Above Proficient cut score.
Purpose of Standard Setting
• Determine cut scores for reporting assessment results
• Answer the question: How much is enough?
General Phases of Standard Setting/Standards Validation
• Data-collection phase
• Policy-making/decision-making phase
Standard-Setting Methods
• Angoff
• Bookmark
• Body of Work
Choosing a Standard-Setting Method
• Prior usage/history
• Recommendation/requirement by policy-making authority
• Type of assessment
Body of Work method chosen for MCAS test in Grade 3 Mathematics
What is the Body of Work Procedure?
Panelists examine student work (actual responses to test questions) and make a judgment regarding the performance level to which the student work most closely corresponds.
Top cut (Standard Setting): Panelists examine student work that has not been previously classified and determine how that work should be classified.
Lower cuts (Standards Validation): Panelists examine student work that has been initially classified into a performance level based on starting cut points and determine if they agree with these classifications or recommend changes to them.
Initial Classification of Student Work
Initial classification of student work in grade 3 mathematics is based on 2006 test results.
Step 1: Equate the 2007 grade 3 mathematics test to the 2006 test.
Step 2: Find the raw score cuts on the 2007 form that are equivalent to the cut points established in August 2006.
Step 3: Select student work with scores ranging from very low to very high; classify it into performance levels based on the preliminary cut points found in Step 2.
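As a concrete illustration of Step 3, the minimal Python sketch below sorts raw scores into preliminary performance levels once equated cut points are known. The cut values and the sample scores are hypothetical placeholders for illustration only, not the actual 2007 grade 3 mathematics cuts.

```python
# Minimal sketch of Step 3: classify bodies of work into preliminary
# performance levels using equated raw-score cut points.
# The cut values below are hypothetical placeholders, not the actual
# 2007 grade 3 mathematics cuts.

# Lowest raw score needed to enter each level above Warning, ascending.
PRELIMINARY_CUTS = [
    ("Needs Improvement", 16),   # hypothetical Warning/Needs Improvement cut
    ("Proficient", 27),          # hypothetical Needs Improvement/Proficient cut
    ("Above Proficient", 38),    # hypothetical Proficient/Above Proficient cut
]

def classify(raw_score: int) -> str:
    """Return the preliminary performance level for a raw score."""
    level = "Warning"
    for name, cut in PRELIMINARY_CUTS:
        if raw_score >= cut:
            level = name
    return level

# Example: a sample of bodies of work ranging from very low to very high scores.
for score in [5, 16, 22, 30, 40]:
    print(score, classify(score))
```

The rule is simply “the highest cut the raw score meets or exceeds,” which mirrors how Step 3 assigns preliminary levels from the equated cut points.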
Selected Student Work
[Figure: Example distribution of selected student work for Grade 3 Math, with each X representing one body of student work, spread across the Warning, Needs Improvement, Proficient, and Above Proficient levels.]
How to Classify Student Work
Materials you will need:
• Performance Level Definitions
  – General
  – Grade and content specific
• Bodies of Student Work
  – Responses to constructed-response questions
  – Multiple-choice summary sheet
• Rating Forms
How to Classify Student Work
• Examine the student’s responses to multiple-choice questions
• Examine the student’s responses to open-response questions
• Judge the student’s knowledge and skills demonstrated relative to the PLDs
• Panelists do not need to reach consensus on the classifications
How to Classify Student Work
To help prepare you to do these ratings, you will spend time becoming familiar with the following:
• Grade 3 mathematics test
• General MCAS and grade 3 math Performance Level Descriptors
• Bodies of student work
  – Responses to multiple-choice items AND constructed-response items
How to Classify Student Work
• You will have the opportunity to discuss your classifications and change them if desired.
• Don’t worry! We have procedures, materials, and staff to assist you in this process.
What Next?
• Take the assessment
• Complete the Item Map
• Discuss the Performance Level Definitions
• Complete training round
• Complete individual ratings
• Receive feedback from first round of ratings
• Discuss feedback and provide final ratings
• Complete an evaluation form
Top 8 Most Misunderstood Things about Standard Setting
8. Standard setting is a great opportunity to rewrite Curriculum Framework standards.
7. The process is rigged.
6. This is a good time to vent about all the things you hate about MCAS.
5. We should use this time to rework Math performance level definitions.
Top 8 Most Misunderstood Things about Standard Setting (continued)
4. Standard setting is scoring.
3. Only Mathematics scholars should be doing this work.
2. Only teachers should be doing this work.
1. Disagreement is bad.
Ground Rules
• Role of facilitator is to “facilitate” and keep process on track
• Process solely focused on recommending performance standards (cut scores) for MCAS
• MCAS performance level definitions are integral to process but are not up for debate
• Panelists’ recommendations are vital; however, final cut scores determined by the MDOE
• Each panelist must be in attendance for the duration of the process for his/her judgments to be considered
• Each panelist must complete evaluation form at the end of the event
• Cell phones off, please!
Agenda
Wednesday, August 15
Breakfast: 8:00 am – 9:00 am
Work session: 9:00 am – 12:00 pm
Lunch: 12:00 pm – 1:00 pm
Work session: 1:00 pm – 4:00 pm
Thursday, August 16
Breakfast: 8:00 am – 9:00 am
Work session: 9:00 am – 12:00 pm
Lunch: 12:00 pm – 12:45 pm
Work session: 12:45 pm – until completion
Room Assignment: Grade 3 Math – 105/106
Questions?