Building a Comprehensive Student Achievement System for the

WELCOME.
Have you downloaded WSSDA 2014?
Search for the app in iTunes or Google
Play Store and join the conversation.
1. Tap Agenda
2. Locate this session in the agenda
3. Tap Check In
Adding Depth and Breadth:
Crafting a Balanced Assessment System
Mindset
(Dweck, Stanford University)
Fixed Mindset vs. Growth Mindset:
• Challenges: avoid challenges vs. embrace challenges
• Obstacles: give up vs. persist in the face of setbacks
• Effort: see effort as worthless vs. see effort as the path to mastery
• Criticism: ignore negative feedback vs. learn from criticism
• Success of others: feel threatened by others' success vs. find lessons and inspiration in it
Essential Learning Outcomes
• Understand the components of assessment types and of a comprehensive, balanced assessment system
• Gain a detailed understanding of the data elements that can provide key information to teachers, administrators, and other stakeholders
• Learn the steps to crafting a balanced assessment system
Is Using Student Data Effective?
• The What Works Clearinghouse reported, after a meta-analysis of studies, that the overall evidence is "weak" that using achievement data to support instructional decision-making will result in increased measured student performance.
But, why?
• The reason for the weak empirical support is that most schools do not implement key strategies with fidelity and in a rigorous enough fashion to achieve measurable positive outcomes.
Key Strategies of
Performance-Driven
School Systems
(University of Southern California)
1. Build a foundation for data-driven decision making.
2. Establish a culture of data use and continuous improvement.
3. Invest in an information management system.
4. Select and collect the right data.
5. Build school capacity for data-driven decision making.
6. Analyze and act on data to improve performance.
University of Southern California (USC) Conceptual Framework
for Data-Driven Instruction in High-Performing School Districts
Framework flow: a balanced assessment system (diagnostic, formative, summative, and progress monitoring data) feeds the data analytic system, which drives improvement plans, protocols, progress monitoring, follow-through, and accountability.
Characteristics of a High-Quality
Balanced Assessment System
• Multiple levels of assessment data that are relevant and actionable at each stage of the teaching and learning process.
• Designed to meet the needs of key educational decision-makers, including parents.
• Ability to adjust and adapt to meet the learning needs of all students through collaboration and intervention implementation.
• Fits the district mission, vision, and goals.
Type of Assessments within
a Balanced Assessment System
Categories of Assessments
• Summative
• Formative
• Criterion-Referenced
• Norm-Referenced
Types of Assessments
• Universal Screening
• Diagnostics
• Benchmarks or Interims
• Common Formatives
• Progress Monitoring
• End of Term
No assessment is inherently summative or formative. An assessment becomes a formative assessment when you use the information to guide current, ongoing instruction.
Basic Balanced Assessment System
Formative
• TYPE: Short-term, ongoing evaluation strategies
• FOCUS: Student-centered
• FEEDBACK: Frequent, immediate feedback
Benchmark
• TYPE: Periodic diagnostic/progress assessments
• FOCUS: Classroom/school-centered
• FEEDBACK: Multiple data points across time
Summative
• TYPE: Large-scale assessments
• FOCUS: School/district/state-centered
• FEEDBACK: Annual snapshot
Some assessments show characteristics of more than one type.
Continuum of Formative Assessment
Stiggins (2005)
Formative assessment encompasses assessment as learning and assessment for learning.
Assessment as Learning
• Daily student self-assessment
• Student-teacher conferences
• Student self-assessment
• Metacognition
Assessment for Learning
• Frequent assessments
• Teacher-student conferences
• Guides instruction
• Responds to student needs
Assessment of Learning
• Evaluating students
• Determines status
• Grading
Traditional School Assessments
Benchmark, Common & Summative Assessments
Benchmark or Interim Assessments
• Measure progress on larger units of a district's curriculum (EX: "quarterly benchmarks").
• Administered several times per year
(EX: fall, winter, spring, quarterly).
• Economical assessment.
• Schools identify students in need of further intervention.
• The information becomes “dated” rather quickly due to
on-going instruction.
Formative (Common) Assessments
• Used to inform instruction
• Sometimes known as “mastery monitoring” or
measurement of skill mastery.
• Known as “common assessments” administered after
smaller segments of instruction (skills, standards, units or
chapters).
• Provide rich diagnostic information and target specific areas of learning deficits (EX: specific skills, knowledge, standards).
• Examples: classroom assessments, observations,
quizzes, unit exams, locally-developed standards-based
assessments.
Summative Assessments
• Administered well after the material is taught.
• Does not generally affect the current, on-going instruction for
students.
• Often used for grading students.
• Can be used to measure student growth.
• Provides feedback to teachers as to how to improve
classroom instruction in the future.
• Informs stakeholders how students or the system progressed
during a given time frame.
• Examples: state NCLB assessments, end-of-course (EOC) tests, ACT, SAT, AP.
Summative Assessment Data
May Suggest:
• Changes in curriculum
• Changes in instructional strategies
• Changes in staffing ratios or class sizes
• Changes in course offerings
• Changes in entire programs
• Focused professional development
• Determination of value added & teacher effectiveness
But it will not affect the teaching and learning of current students.
Moving Towards a More
Comprehensive Assessment Plan
Universal Screener, Diagnostic, Progress Monitor, Computer Adaptive
Universal Screening Assessments
• Skill based, economical assessments, not
standards based and typically administered in
reading and math.
• Used as an early warning system for prediction of
performance on high stakes accountability
assessments.
• Screening for students at risk within an RtI
Model.
• Excellent for measuring growth over time.
• Examples: AIMSWeb, NWEA, aFAST, DIBELS
Diagnostic Assessments
• Standards-based assessments often provide data that is
not specific enough to design targeted instruction.
• Assess micro skills or learning targets.
• Can be a formal or informal tool; often a performance-based assessment to measure student response.
• Students receiving Tier II or Tier III interventions may require assessments that drill down to measure very specific sub-skills in reading or math.
• Examples: a phonics assessment that identifies each phonics skill the student knows and does not know; a math fact fluency assessment diagnosing specific math algorithms.
Progress Monitoring
• Used as a type of formative assessment for RtI.
• The assessment is very efficient to administer.
• Tier II or Tier III interventions require very frequent progress monitoring assessments and allow for measurement of small incremental improvement.
• Highly focused on the one or two skills/standards targeted by the intervention.
• The assessment often has national norms for expected growth trajectories.
• The data can be displayed to show the discrepancy between where the student is performing and the expected level.
• The data can be graphed to measure a change in rate of progress relative to expected growth over time.
[Chart: Progress Monitoring, Rate of Progress - Low Slope. Performance Score plotted against Measurement # or Date.]
[Chart: Progress Monitoring, Rate of Progress - High Slope. Performance Score plotted against Measurement # or Date.]
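To make the low-slope/high-slope idea concrete, here is a minimal sketch, not from the deck, of how a student's observed rate of progress could be computed from progress monitoring scores and compared with an expected growth rate; the scores, number of measurements, and the expected gain of 1.5 points per week are all hypothetical illustration values, not national norms.

```python
import numpy as np

# Hypothetical weekly progress monitoring scores for one student
weeks = np.arange(1, 11)                      # measurements 1..10
scores = np.array([18, 20, 19, 22, 23, 25, 24, 27, 28, 30])

# Ordinary least-squares slope = observed rate of progress (points per week)
observed_slope, intercept = np.polyfit(weeks, scores, 1)

# Hypothetical expected growth rate (would come from a norm table)
expected_slope = 1.5  # points per week, illustrative only

print(f"Observed rate of progress: {observed_slope:.2f} points/week")
print(f"Expected rate of progress: {expected_slope:.2f} points/week")
if observed_slope < expected_slope:
    print("Low slope: consider intensifying or changing the intervention.")
else:
    print("High slope: the intervention appears to be working.")
```

The same slope comparison can be re-run after each new data point, which is what makes frequent, efficient progress monitoring measures useful for Tier II/III decisions.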
Computer Adaptive Assessments
• Can drill down to the content, skill, and standard at each grade level.
• Nationally-standardized in reading, math,
science and English (K-12 depending on
test).
• EX: NWEA MAP, aFAST System, Let’s Go
Learn
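The "drill down" behavior of a computer adaptive test can be illustrated with a deliberately simplified sketch: after each response the engine moves the working grade level up or down and draws the next item from that level. This is only an assumption-laden toy; real engines such as NWEA MAP use IRT-based ability estimation rather than a simple step rule, and the item pool below is invented.

```python
import random

# Hypothetical item pool keyed by grade level (real pools are standards-tagged)
item_pool = {
    2: ["2.OA item A", "2.NBT item B"],
    3: ["3.OA item A", "3.NF item B"],
    4: ["4.OA item A", "4.NBT item B"],
    5: ["5.NF item A", "5.MD item B"],
}

def run_adaptive_session(answer_fn, start_level=3, num_items=6):
    """Very simplified adaptive loop: step the working level up after a
    correct response and down after an incorrect one, bounded by the pool.
    The working level oscillates around the student's instructional level."""
    level = start_level
    history = []
    for _ in range(num_items):
        item = random.choice(item_pool[level])
        correct = answer_fn(item, level)
        history.append((item, level, correct))
        if correct and level < max(item_pool):
            level += 1
        elif not correct and level > min(item_pool):
            level -= 1
    return level, history

# Simulate a student who reliably answers items at or below grade 4
final_level, history = run_adaptive_session(
    answer_fn=lambda item, level: level <= 4)
print("Final working level:", final_level)
```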
INVESTING IN THE RIGHT DATA SYSTEM
The Right Data System
• Robust platform to store and report all student data in effective ways.
• All the data must be in one convenient place for staff and administration.
• A few key elements:
• Student/parent portal access
• Customizable, interactive data dashboards for teachers and admin
• System integration and data importing
• Delivery of a variety of assessment formats, including paper
• Available item banks
• Assessments linked to standards for reporting
• High-level data analysis for single and multiple measures
• Longitudinal analysis capabilities
• Captures and reports student performance in all areas (EX: behavior)
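As one illustration of "assessments linked to standards for reporting," here is a minimal sketch of the kind of record such a data system might store per student per assessment; the field names, standard codes, and mastery cut score are hypothetical and do not represent any vendor's actual schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AssessmentResult:
    """One student's result on one assessment, tagged to standards so
    dashboards can roll scores up by standard, class, school, or district."""
    student_id: str
    assessment_name: str            # e.g., "Grade 3 Math Benchmark 2"
    assessment_type: str            # screening | diagnostic | benchmark | formative | summative
    administered_on: date
    scores_by_standard: dict = field(default_factory=dict)  # standard code -> proportion correct

# Hypothetical example record
result = AssessmentResult(
    student_id="S-000123",
    assessment_name="Grade 3 Math Benchmark 2",
    assessment_type="benchmark",
    administered_on=date(2014, 1, 15),
    scores_by_standard={"3.OA.A.1": 0.80, "3.NF.A.1": 0.55},
)

# A dashboard view might flag standards below an illustrative mastery cut score
mastery_cut = 0.70
needs_reteaching = [std for std, pct in result.scores_by_standard.items()
                    if pct < mastery_cut]
print("Standards below mastery:", needs_reteaching)
```

Storing results at this grain is what makes the standards-mastery, item-analysis, and longitudinal views on the dashboard possible without re-entering data.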
Creating the Assessment Vision
at the District Level
The literature suggests that the majority of districts simply do not develop a systematic, longitudinal plan around assessment and data utilization.
Balanced Assessment System – Step 1
Develop a District Assessment Leadership Team
• Creates the vision
• Drives implementation
• Sets goals and targets
• Monitors progress
• Collects feedback to improve the system
• Includes various stakeholders: curriculum & instruction authority, assessment staff, teacher reps, admin reps, special group reps (psychs, counseling, health, non-core areas)
Collaboration is KEY!
Balanced Assessment System – Step 1
School Assessment Leadership Teams
• Each school has a team that reports back to the district leadership team
• Implements the vision
• Drives school- and classroom-level implementation
• Monitors goals and targets
• Helps design and create assessments as appropriate
• Provides a feedback loop to the district team to improve the system
• Includes team members from various areas: principals, teacher reps from all core areas, special group reps (psychs, counseling, health, non-core areas)
Collaboration is KEY!
Balanced Assessment System –
Step 2
Inventory All Current Assessments Being Administered
– Identify types of assessments being given across the various
grade levels in the target content area(s) and for what specific
reason.
– How is the data being used? Are the results demonstrating the
data protocols are working?
– Why is each assessment being administered? What are the
gaps?
Balanced Assessment System
Step 3
Stakeholder Group Analysis
Does your current inventory of assessments meet the data needs of all groups?
– Which stakeholder groups need assessment data?
– Parents, admin, teachers, students, school board, community, state,
federal
– What are the decisions and questions that need to be addressed by
each group?
– What specific assessment information needs to be provided to each
stakeholder group?
Rarely do districts analyze their system from this perspective
Balanced Assessment System
Step 4
Identification of curricular areas that require
assessment data
• Reference back to the stakeholder analysis
• Core academic areas: Math, ELA, Science, Social
Studies
• Non-Core areas: music, PE, world languages
• ESL, Gifted/talented, Special education
• AP, CTE/ROP
Balanced Assessment System
Step 5
Create an Assessment Alignment Map for each curricular area
identified in Step 4
• Map the information from stakeholder groups in Step 3 to the appropriate assessment approaches.
• Stakeholder groups – What data is needed from each assessment method?
• For each curricular/programmatic need, what info is obtained from each
method?
• What are the strengths of each method?
• **This is where mistakes are made! Do not jump to the selection of specific tools and vendors; doing so can lead to the point where people ask, "Why do we give this assessment anyway?"
Math – Grade 3: Sample Assessment Alignment Map (C. Balow, 2011)
Rows are stakeholder groups; columns are assessment types (screening, diagnostic, benchmark, formative, summative); each cell lists the data that group needs from that assessment type. For example:
• Students: areas of strength and areas to grow, % correct, standards mastery, proficiency level
• Teachers: identification of at-risk students and their needs, grouping needs, item analysis, standards mastery, classroom %, instructional needs, strand data, proficiency level, overall effect
• Admin: at-risk identification by teacher, grade, and school; broad areas of need/strength; RtI and program selection criteria; data compared by teacher, building, and district overall; growth levels
• Parents: areas of concern, strengths, comparisons, national percentile rank (NPR) three times per year, growth level, grades, proficiency and on-track status, yearly growth data
• Community, board, and state/federal: % of at-risk students by grade, district trends, district overall results, AYP status, state growth
Balanced Assessment System
Step 6
Create Assessment Infrastructure
• Determine the timeline for assessments to be in place.
• Determine administration frequency & timelines.
• Determine who will conduct the assessments & the logistics.
• Determine where the data will go and how it will be accessed and reported.
• How will your assessments be created? Prebuilt or homegrown?
• Will your assessments use purchased item banks, or will you create your own items?
Assessment Infrastructure
• Determine cut-scores, targets, and proficiency levels for the various criterion-based measures.
• Determine specific program selection criteria.
• Use logistic regression prediction models to determine if students are "on track" to achieve specified outcomes (a minimal sketch follows this slide).
• Who maintains & reports out the data?
C. Balow, 2011
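Here is a minimal sketch of the kind of "on-track" prediction model the bullet above refers to, assuming scikit-learn is available. The predictor choice (fall screening score and attendance rate), the training values, and the outcome labels are hypothetical illustration data; a district would fit such a model on its own historical cohorts and validate it before acting on the flags.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: [fall screening score, attendance rate],
# with 1 = student went on to meet the state proficiency standard.
X_train = np.array([[165, 0.98], [180, 0.95], [150, 0.85], [190, 0.99],
                    [140, 0.80], [175, 0.92], [155, 0.88], [185, 0.97]])
y_train = np.array([1, 1, 0, 1, 0, 1, 0, 1])

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Current students: predicted probability of being "on track"
X_current = np.array([[160, 0.90], [188, 0.96], [145, 0.82]])
on_track_prob = model.predict_proba(X_current)[:, 1]

for features, p in zip(X_current, on_track_prob):
    flag = "on track" if p >= 0.5 else "at risk - consider intervention"
    print(f"screening={features[0]:.0f}, attendance={features[1]:.2f} -> "
          f"P(on track)={p:.2f} ({flag})")
```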
A Quick Summary
1. Create district and school assessment teams
2. Inventory current assessments in use
3. Identify key stakeholder groups (key questions)
4. Identify content areas in need of assessment data
5. Determine data requirements by stakeholder group (Assessment Data Alignment Map)
6. Complete the assessment approaches/measures map by content area and grade
7. Design the assessment delivery infrastructure and timelines
Thank You
Abram Jimenez
ajimenez@illuminateed.com
(213) 280-1122
THANK YOU FOR ATTENDING.
We’d love to have your feedback. Take
a moment to participate in a quick survey
about this session in WSSDA 2014.
1. Tap Agenda
2. Locate this session in the agenda
3. Tap Check In (if you haven’t already)
4. Tap Take Survey