Outcomes-Based Management
Barbara Montero, Vice President of the SARGE Affiliate Network
Presented: June 27th, 2013
Renaissance Denver Hotel
Denver, CO
Introductions &
Background
Mission of Acelero Learning
The mission of Acelero Learning is to bring a
relentless focus on positive child and family
outcomes to close the achievement gap and
build a better future for children, families,
and communities served by the Head Start
program.
What is Acelero Learning?
 Acelero Learning is a proud provider of Head Start/Early
Head Start and other comprehensive early child programs.
 We were founded in 2001 (by a former Head Start teacher
& a former Children’s Defense Fund staffer) and we partner
with local communities to manage the highest quality Head
Start/Early Head Start programs.
 Today, we directly administer a network of Head
Starts/Early Head Starts and other early childhood
programs serving more than 4000 young children and
families around the country. Our National Support Center
(based in Harlem, NY), supports 3 local delegates in
Monmouth/Middlesex, NJ; Camden, NJ/Philadelphia, PA;
and Clark County, NV.
Acelero Learning Team
Aaron Lieberman, CEO
• Former Head Start teacher, CDA holder
• Founder and CEO of Jumpstart
• Chairman of SchoolSuccess.net; designed and built the first online assessment system for HighScope and Teaching Strategies

Henry Wilde, SVP of Operations
• State of Wisconsin Deputy Secretary for Children and Families, overseeing childcare under Governor Doyle
• Former special assistant to Marian Wright Edelman at the Children's Defense Fund

Ellen Frede, SVP of Education
• Architect of the NJ Abbott Pre-K program and evaluation system
• Co-director of National Institute for Early Education Research
• Former Head Start teacher; HighScope Training of Trainers co-designer

Lori Levine, SVP of HS Services
• Queens Borough Commissioner for Child Protection, NYC Administration for Children's Services
• Deputy Director, Free to Grow, National Program Office, Columbia University SPH, a 20-site national Head Start demonstration program

Barbara Montero, VP of SAN
• Manages the T/TA network of Acelero partner programs/customers
• Deputy Head Start Director for Clark County (Las Vegas), NV
What is Acelero Learning Training and T/A?
 In-person trainings
• 1.5 to 2-day in-person trainings, typically sponsored in partnership with NHSA and/or Regional Head Start Associations
• We have trained over 500 Head Start leaders over the last three years in using data for monitoring and in our outcomes-driven approach
 SARGE
• A comprehensive online library of service area plans, policies, and tools to assist Head Start and Early Head Start programs in providing high-quality services and achieving Head Start compliance with huge time savings
• Over 90 Head Start programs (serving nearly 80,000 children and families) are SARGE subscribers, and 80% renew year after year
• Currently includes an optional set of Re-competition Resources, with a new series of Transition Resources coming soon!
 SARGE Affiliate Network
• Provides intensive partnerships with a limited number of programs for extensive RFP assistance and ongoing support in PDM, ECE, and FCP
• We currently work with 8 programs serving over 20,000 children each year
Introduction to Outcomes-Based
Management in Head Start
 Why now? Head Start is under attack.
• 2010 House budget proposed a $1 billion cut
• Sequestration caused a 5.7% cut for FY2013 funding
beginning in March 2013
• Head Start Impact Study shows gains from Head Start
are not as great as we would want them to be
Clear, demonstrable outcomes are more
important than ever before!
The decisions made in this room
directly impact the lives of
thousands
of children and families!
* Concentrate * Focus * Participate *
Agenda
 Cornerstone #1
• A Look into Innovative Programs and their Outcomes
 Cornerstone #2
• Using Data to Make Decisions – A Head Start Case
Study
• An Overview of Compliance-Based Monitoring
 Breakout #1
• Moving Beyond Compliance into Outcomes-Based
Monitoring (School Readiness & Parent, Family &
Community Engagement Framework)
 Breakout #2
• Strategy & Goal-Setting, Family Engagement
A Look at Innovative Programs
and their Outcomes
ECE Innovations that Achieved Outcomes:
Older Programs with Proven Long-Term Benefits
• High/Scope Perry Preschool Project
• Abecedarian Project
• Chicago Child Parent Centers
High/Scope Perry Preschool
What was innovative?
 Four half days of center-based education, school
year only
 One 1.5 hour home visit per week
 Extremely qualified staff
 1:6 ratio
Cost: $11,300 per child per school year (in 2007 dollars)
What were immediate effects on child outcomes?
 .8 effect size on the PPVT
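Note (illustrative only): the effect sizes cited throughout this deck are standardized mean differences. Below is a minimal sketch of the usual Cohen's d calculation; the numbers are made up for illustration and are not from the Perry study.

```python
def cohens_d(treatment_mean, control_mean, treatment_sd, control_sd):
    """Standardized mean difference using a pooled standard deviation.

    Assumes roughly equal group sizes; the studies cited in this deck
    report effect sizes computed from their own designs.
    """
    pooled_sd = ((treatment_sd ** 2 + control_sd ** 2) / 2) ** 0.5
    return (treatment_mean - control_mean) / pooled_sd

# Illustrative numbers only: a 12-point PPVT advantage over a 15-point SD
print(round(cohens_d(95, 83, 15, 15), 2))  # 0.8
```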
High/Scope Perry Preschool
What were the long-term effects on child outcomes?
Educational Effects
Berrueta-Clement, J.R., Schweinhart, L.J., Barnett, W.S., Epstein, A.S., & Weikart, D.P. (1984). Changed lives: The
effects of the Perry Preschool Program on youths through age 19. Ypsilanti, MI: High/Scope Press.
High/Scope Perry Preschool
Economic Effects at 40
Schweinhart, L. J., Montie, J., Xiang, Z., Barnett, W. S., Belfield, C. R., & Nores, M. (2005). Lifetime effects: The
High/Scope Perry Preschool study through age 40 (Monographs of the High/Scope Educational Research
Foundation, 14). Ypsilanti, MI: High/Scope Educational Research Foundation.
High/Scope Perry Preschool
Crime Effects at 40
Schweinhart, L. J., Montie, J., Xiang, Z., Barnett, W. S., Belfield, C. R., & Nores, M. (2005). Lifetime effects:
The High/Scope Perry Preschool study through age 40 (Monographs of the High/Scope Educational
Research Foundation, 14). Ypsilanti, MI: High/Scope Educational Research Foundation.
Abecedarian Project
What was innovative?
 Full-day, full-year center-based services for children
 Served children between birth and five years old
 Home visits
 Low teacher-to-child ratios: 1:3 for infants and 1:6 for five-year-olds
Cost: $17,099 per child in 2011 dollars
What were the immediate effects on child outcomes?
 .79 effect size on reading achievement with IQ
Campbell, F. A., Pungello, E. P., Miller-Johnson, S., Burchinal, M., & Ramey, C. T. (2001). The development of
cognitive and academic abilities: Growth curves from an early childhood educational
experiment. Developmental Psychology, 37(2), 231-242.
Abecedarian Project
What were the long-term effects on child outcomes?
Academic Benefits
Barnett, W. S., & Masse, L. N. (2007). Early childhood program design and economic returns: Comparative
benefit-cost analysis of the Abecedarian program and policy implications, Economics of Education Review, 26,
113-125; Campbell, F.A., Ramey, C.T., Pungello, E., Sparling, J., & Miller-Johnson, S. ( 2002). Early childhood
education: Young adult outcomes from the Abecedarian Project. Applied Developmental Science, 6(1), 42-57.
Chicago Child Parent Centers
What was innovative?
 Half-day center-based services, school year only
 17:2 child-to-teacher ratio in preschool
 ½ day a week of parent involvement, and home visits
Cost: $8,700 in 2007 dollars
What were the immediate effects on child outcomes?
 Age 5 Kindergarten Readiness at 47th Percentile
(vs. 28th Percentile for comparison group)
Chicago Child Parent Centers
What were the long-term effects on child outcomes?
Academic and Social Benefits at School Exit
Temple, J. A., & Reynolds, A. J. (2007). Benefits and costs of investments in preschool education: Evidence from
the Child-Parent Centers and related programs. Economics of Education Review, 26(1), 126-144
Meta-Analysis of Pre-K Data
What was it?
 Meta-Analysis of the Effects of Early Education
Interventions on Cognitive and Social Development
• Gregory Camilli, Sadako Vargas, Sharon Ryan, &
W. Steven Barnett
 Looked at 123 different intervention studies over
last 50 years to synthesize their outcomes
 Goal was to determine the potential benefits and
costs of implementing specific preschool programs
Meta-Analysis:
What Determines Cognitive Gains?
• Time of Follow-Up: Negative
• Research Design Quality: Positive
• Intentional Teaching: Positive
• Individualization (Small groups and 1-on-1): Positive
• Comprehensive Services: Negative

n = 123 Studies
Barnett, W. S. (Nov. 13, 2012). Investing in Early Childhood Education and Care (ECEC)
Meta-Analysis: What Produces The Most
Significant Gains?
[Chart: Effect sizes (sd), on a scale from 0 to 1, by age at follow-up (Treatment End, Ages 5-10, Age >10), comparing All Designs, HQ Designs, and HQ Programs.]
Barnett, W. S. (Nov. 13, 2012). Investing in Early Childhood Education and Care (ECEC)
So what programs have weak or
inconclusive results?
 Multi-generation and home-visiting programs
• Even Start unless combined with high quality preschool
• Home Based
• Parent Child Development Centers
 K-3 interventions
• Follow Through
• Project Developmental Continuity
• Head Start Transition Study
 One possible conclusion: education has the biggest impact on education, especially when added to no education
Current Innovations that are
Demonstrating Outcomes
• Educare
• AppleTree
• Acelero Learning
• Who Else?
Educare
What is innovative?
 Full-day, full-year birth-to-five education services
 Maintain small class size and high staff/child ratios
• Infant-toddler - 3:8 per classroom
• Preschool rooms - 3:17 per classroom
 Maintain high staff qualifications, Master Teachers
with advanced degrees in early childhood supervise
3 or 4 classrooms
 Provide continuity of care, primary teachers remain
with children for 3 years, from birth to 3 years old
 Second team remains with children from ages 3 to 5
Cost: $18,000 to $20,000 per year
Educare
What were the immediate effects on child outcomes?
• Average exiting score of 95 points on the PPVT for
non-ELL children (82.5 for DLL)
• Children had higher PPVT scores the more years they stayed in Educare
Educare
Educare Implementation Study Findings – August 2012. “Results.”
http://www.educareschools.org/about/pdfs/Demonstrating-Results.pdf
AppleTree
What is innovative?
 Lead teachers with a Bachelor’s degree, three teachers
in every classroom
 Low child-to-teacher ratios between 5:1 and 7:1
 Full-day program, school year only
 Direct assessment progress monitoring every 6 weeks
 Weekly coaching for teachers
What were the immediate effects on child outcomes?
 Average of 99 on the PPVT for 3 year olds with 1 year
 Average of 93 on the PPVT for 4 year olds with 1 year
 .53 effect size on the PPVT
AppleTree
"The AppleTree Approach to School Readiness: A Case Study Using a Longitudinal Population-Referenced Evaluation Framework," by Craig T. Ramey & Nancy Cromwell. August 2010. "Results." http://www.appletreeinstitute.org/files/doc/AppleTreeApproach.pdf
Acelero Learning
What is innovative?
 Full-day, full-year
 Highly-credentialed staff, 75% with Bachelor’s degrees
 Extensive use of data
• Weekly MBI report with 100 different summary indicators
• Monthly MBI + report with 54 different indicators
• Quarterly MBO report looking at outcomes in education,
family services and health
 An ECE focus on coaching, data, and curriculum
 An FCE focus on giving families more information,
and having a common approach to family life practice goals
Cost: Head Start funding of $8250 per child
Acelero Learning Results
Our Results: In our latest evaluation report, returning children enrolled for
one full year in our Head Start programs achieve 3.5 times the average
gains recorded on the 2009 Head Start FACES study on the Peabody
Picture Vocabulary Test (PPVT).
So How Does It All Stack Up?
PPVT Effect Size by Program *
[Chart: PPVT effect sizes, on a scale from 0 to 1, for the Perry Preschool Study, FACES Study (2009), Acelero Learning, and 8 State UPK.]
*Caution should be used in drawing conclusions from these comparisons since the research designs were not the same.
Innovations Within Your Programs
1. Look for research-based, proven approaches
2. Pilot to gain experience, test ideas on small scale
3. If successful, evaluate, modify, and expand scope
4. Do not let history dictate your plan for next year!
“It is our attitude at the beginning of a difficult task which,
more than anything else, will affect its successful outcome.”
- William James
But remember….
 It is never about just ONE thing!
• Focus on Coaching, Curriculum, Families, Data…
 In the end, everything comes down to intensity
and focus.
 Southwest Airlines vs. Ted
Generating great outcomes is not a function of
one decision you make, but the sum total
of EVERY decision you make.
Using Data to Make Decisions:
Head Start Case Study
Introduction: Using Data
DATA
da·ta
’dā-tə, ‘da-tə
n. pl.
Factual information
(as measurements or statistics)
used as a basis for reasoning,
discussion, or calculation
Merriam-Webster Online Dictionary,
www.merriam-webster.com
How Do We Use Data to Make
Decisions…
… at home?
… at work?
Case Study: PFT Head Start
PROCESS:
 Read case study individually.
 Review and summarize key points together.
 In groups, discuss & identify one solution for
program improvement.
i.e. What would you address, and how?
 Report out to the group.
Case Study: Key Points
Butterflies
• Teacher (Atkins): Veteran; initially skeptical of curric., now enthusiastic
• Asst Teacher (Perez): Recently hired; open relationship w/ teacher
• F.A. (Jones): Can-do attitude, works well w/ team

Ladybugs
• Teacher (Berrera): Recently hired; excited abt curric.; eager to learn and share ideas
• Asst Teacher (Smith): Veteran; comfortable relationship w/ teacher
• F.A. (Watson): Works easily w/ team, integration of family svcs in classroom
Case Study: Key Points (cont’d)
Bluebirds
• Teacher (Cortines): Veteran; does just enough; intimidating
• Asst Teacher (Vasquez): Recently hired; timid; follows teacher rather than speaking up
• F.A. (Jones): Keeps his distance; gives in to teacher; has asked supervisor for help

Bunnies
• Teacher (DeNardo): Recently hired; overwhelmed by paperwork; little contact w/ FA
• Asst Teacher (Simpson): Veteran; competent, quiet, few contributions
• F.A. (Watson): Little contact w/ teacher; says disorganization affects her work; frustrated
Case Study: Solutions
Case Study: Solutions
Case Study “Plus” – Exhibit A
Case Study “Plus” – Exhibit B
Case Study “Plus” – Exhibit C
Case Study “Plus” – Exhibit D
Case Study “Plus”: Solutions
Case Study Reflection
Compare discussion/solution #1 (based on
case study only) and discussion/solution #2
(based on case study + data).
What were the key differences …
• In your proposed solutions (results)?
• In the discussion itself (process)?
• In how you felt about it (relationships)?
Summary of Key Points
 Managing with data enables us to make more
objective evaluations/comparisons and more
informed choices.
 We can “manage by exception” and allocate
resources more strategically.
 Using data, we typically identify problems and
generate solutions in a more specific, focused
and effective way.
An Overview of Compliance-Based
Monitoring:
Manage By Information (MBI)
Basics
What is the Manage by Information (MBI)
report?
 Literally, the MBI is a reporting tool which measures
Head Start compliance indicators
 Key Aspects of an MBI:
 Should include information by the unit that is managed to and held accountable for results (e.g., classroom, center, or zone)
 Should show trends from the prior month (or time period)
 Should evolve over time as you use it
 Should be used both as a tool to identify areas that need more support and to hold people accountable
 To be effective, information must be up to date, real-time, and run every other week (or monthly, at a minimum)
Importance of Using Data to Manage
 Managing with data enables us to make more
objective evaluations/comparisons and more
informed choices
 We can “manage by exception” and allocate
resources more strategically
 Using data, we typically identify problems and
generate solutions in a more specific, focused
and effective way
A Sample Section of the MBI
• Rows represent centers, and the dark purple row shows the program average or total.
• Columns are numbered sequentially for reference.
• Historical trend data is included at the bottom.
• Conditional formatting with colors helps visually highlight areas.
Types of data to enter, capture, and monitor
 ERSEA
• Current enrollment, openings, waitlist
• Eligibility, by category
 Disabilities / Mental Health
• Active IEPs / IFSPs
• Completed or incomplete or past-due screenings (Dev and SE)
 Family Services
• Completed Strength Assessments and Family Partnership Agreements
 Health
• Completed, Past-Due, Expired Health Screenings and Immunizations
• Medical / Dental Homes
 Early Learning
• Completed Home Visits and Parent/Teacher Conferences
• Completed Assessments, by checkpoint
 Don’t forget to capture and review follow-up, as that is also a requirement
 Can be absolute numbers and/or percentages
De-Mystifying the MBI
It’s a manual process!
• Use of an Excel spreadsheet, with pre-populated formulas and conditional formatting
• Report running via your data system (COPA, Child Plus)
• Entering data into a pre-formatted Excel worksheet
• Creating a simplified and brief analysis to include in the distribution
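Note (illustrative only): as a sketch of this manual workflow, the snippet below aggregates a hypothetical data-system export by center, appends a program-total row, and flags values below a threshold in place of the color-based conditional formatting. The file, column names, and 90% threshold are assumptions, not the actual MBI template.

```python
import pandas as pd

# Hypothetical export from the data system (Child Plus, COPA, etc.):
# one row per child, with 1/0 compliance fields.
records = pd.DataFrame({
    "center": ["North", "North", "South", "South", "South"],
    "health_screening_complete": [1, 0, 1, 1, 1],
    "family_agreement_complete": [1, 1, 0, 1, 0],
})

# Percent complete by center (rows = centers, as in the sample MBI section)
mbi = records.groupby("center").mean().mul(100).round(1)

# Program-wide row, like the "program average or total" row on the sample
mbi.loc["PROGRAM TOTAL"] = records.drop(columns="center").mean().mul(100).round(1)

# Flag cells below an illustrative 90% threshold instead of color formatting
below_threshold = mbi < 90

print(mbi)
print(below_threshold)

# mbi.to_excel("mbi_sample.xlsx")  # optional: paste into the pre-formatted
#                                  # Excel worksheet (requires openpyxl)
```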
MBI Relies on Accurate,
Current Data in Your Systems
If it is not in _________
(Child Plus, PROMIS, COPA, etc.)
it did not happen!
MBI Exercise: Leadership Team Meeting
Simulation
1. Get in groups of ten
2. Assign each person one of six roles:
• Director
• ERSEA Coordinator
• Disabilities/MH Coordinator
• Family Services Coordinator
• Health Coordinator
• Early Learning Coordinator
3. Each person reviews data for their area; if time permits, you
can review additional areas
4. Directors come up front and meet with me
5. Each coordinator should prepare their report to be shared
with the team based on their review of data
MBI Exercise: Leadership Team Meeting
Simulation
How does this compare with your typical Leadership Team
meeting? What were the differences?
Why?
Managing by Information
Not just reporting, but acting…
 Distribution of MBI report among key leaders and
governance groups increases accountability for all
players
 Ideally allows leadership to focus on data – not
personalities – in problem solving and tracking
progress
 Ultimately it is about identifying problem areas, and
then making changes to improve, when key
numbers indicate concerns
Use of child and family system
(criteria 1 from the Program Use of Data Rubric)
Inadequate: No electronic system is in place; all child and family information is paper-based.
Beginning: An electronic child and family data system (Child Plus, COPA, PROMIS, etc.) is in place and used for basic child and family information.
Implementing: The child and family data system is used to track most program services, including family services, health, and disability follow-up.
Mastering: The child and family data system is used to track ALL program services, and a review of electronic records effectively "tells the story" of all interactions with each child and family.
Next Steps for Creating Your Own MBI:
Beginning
 Revisit where your program is on the program use of data rubric
• Recommended: do it as an exercise with your leadership
team
 If your program is at BEGINNING, identify steps to:
• Verify accuracy of currently captured data
• Identify the initial items (10-15) you want to include on
your initial MBI (use following slides to guide you)
• Plan to expand use of data system to include
family services info
Next Steps for Creating Your Own MBI:
Beginning
 Verify accuracy of currently captured data
• Audit paper files against info in Child Plus to assess accuracy
• Sample a # of files by person responsible for input (family service worker, etc.)
• Have coordinators review reports to assess whether data is being entered in a consistent fashion
 With your management team, identify 10-15 items you
want to look at every month
 Agree on rows (items) and columns (centers, delegates,
classrooms?)
Next Steps for Creating Your Own MBI:
Beginning
Plan to expand use of data system to include
family services info
• Focus on service area by service area
• Provide specific guidance on HOW to track each
item in your data system
• Pilot in small groups, get the kinks out, and
expand!
Use of child and family system
Inadequate: No electronic system is in place; all child and family information is paper-based.
Beginning: An electronic child and family data system (Child Plus, COPA, PROMIS, etc.) is in place and used for basic child and family information.
Implementing: The child and family data system is used to track most program services, including family services, health, and disability follow-up.
Mastering: The child and family data system is used to track ALL program services, and a review of electronic records effectively "tells the story" of all interactions with each child and family.
Next Steps for Creating Your Own MBI:
Implementing
 If your program is at IMPLEMENTING, identify steps to:
• Identify initial items (20+) you want to include on your
initial MBI
• Shoot for a frequency of every two weeks
• Plan to expand use of data system to include any
program services not currently captured
Use of child and family system
Inadequate: No electronic system is in place; all child and family information is paper-based.
Beginning: An electronic child and family data system (Child Plus, COPA, PROMIS, etc.) is in place and used for basic child and family information.
Implementing: The child and family data system is used to track most program services, including family services, health, and disability follow-up.
Mastering: The child and family data system is used to track ALL program services, and a review of electronic records effectively "tells the story" of all interactions with each child and family.
Next Steps for Creating Your Own MBI:
Mastering
 If your program is at MASTERING:
• Look at other measures, beyond compliance
• Next steps may be database customization through special filters, user-defined fields, custom reports, etc.
For Mastering:
MBI Report: The Next Generation
Here is a sample of the 15 new indicators for our MBI:
• # Children Absent 20-49% of days in last 4 weeks
• # Children Absent >50% of days in last 4 weeks
• # Children Late (15 mins after start time) in last 30 days
• # of Children with an Expired IEP
• # of Parent/Staff Concerns without follow-up in 30 days
• # Families in Emergency/Crisis with no follow-up in 30 days
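Note (illustrative only): indicators like the absence buckets above can be computed from a daily attendance log. The sketch below uses made-up column names and data; it is not the actual MBI logic.

```python
import pandas as pd

# Hypothetical attendance log: one row per child per scheduled day
# over the last 4 weeks (1 = present, 0 = absent).
attendance = pd.DataFrame({
    "child_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "present":  [1, 0, 1, 1, 1, 0, 0, 0],
})

# Absence rate per child
rates = 1 - attendance.groupby("child_id")["present"].mean()

print("# children absent 20-49% of days:", ((rates >= 0.20) & (rates < 0.50)).sum())
print("# children absent >50% of days:", (rates > 0.50).sum())
```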
MBI: What to expect?
 When you first run (and distribute) the MBI, this is what will happen:
• "The screenings are done, but I did not have a chance to put them in the computer!"
• "But this info is not correct!"
• "I put them in the stupid computer and they are not showing up!"
MBI: How You Should Respond
 Help people understand how the system works
 Build a culture of "low stakes" for the first few MBIs
 Let people know we will get it right!
• "That must be frustrating! Let's dig in and see where the process broke down."
• "Remember – getting it in the system is the last step in doing the screening!"
• "Let's troubleshoot to see how we get the data in on time before the report is run."
Your Commitment
Managing By Information
Within the next three days I will…
Within the next 30 days I will…
Outcomes-Based Monitoring:
Managing By Outcomes (MBO)
A Review: What is the Manage by Information
(MBI) report?
 The MBI is a reporting tool which measures
Head Start compliance indicators
 Key Aspects of an MBI:
 Should include information by the unit that is managed to and held accountable for results (e.g., classroom, center, or zone)
 Should show trends from prior month (or time period)
 Should evolve over time as you use it
 Should be used as both a tool to identify areas that need more
support and to hold people accountable
 To be effective, information must be up to date, real-time and run
monthly, at a minimum
The MBI:
The Limitations
 Almost exclusively focused on process indicators (as opposed to outcomes)
 Short-term by nature – tracking things that change every week
 Can lead you to "miss the forest for the trees" if there are no other measurement systems in place
The MBI:
The Limitations
For example…
• What if we conduct all of your educational assessments,
but children are leaving our program not prepared
for kindergarten?
• What if we complete our family partnership agreements on
time, but our families never meet those goals?
• What if we finish all our screenings on time, but never
successfully help families follow-up on health issues?
Managing by Outcomes
Why it is needed…
 The Head Start Performance Standards, and the
corresponding OHS monitoring process, are almost
completely focused on PROCESS and COMPLIANCE,
not outcomes
 Only with the recent focus on School Readiness Goals
has OHS ever even asked to see outcome reports!
 For that reason, to be a GREAT Head Start program,
it is NOT ENOUGH to just meet the Head Start Performance
Standards
 Head Start Performance Standards are a floor, not a ceiling!
Managing by Outcomes
What it does…
 The MBO Report allows us to:
• Set program-wide goals for key outcomes we want
to accomplish
• Manage and measure to see if we accomplish them,
and how we can improve
• Review these outcome reports quarterly throughout
the year, and adjust our work accordingly
Introduction to Outcomes
OUTCOME
out·come
Pronunciation: \au̇t-kəm\
Function: noun
Date: 1788
: something that follows as a result or consequence
Merriam-Webster Online Dictionary
Word origin: 1788, "that which results from
something," originally Scottish, from out + come (v.).
From Dictionary.com
Managing by Outcomes
Process vs. Outcome
Process → Outcome
• Completing assessments on time → Children mastering skills
• Completing family strengths assessments → Families achieving goals
• Completing required screenings on time → Children receiving treatment to help them become healthier!
Managing by Outcomes
How do we choose?
 In Head Start, choosing key outcomes to focus on is a
HUGE challenge
 The Head Start Performance Standards mandate many
different processes, most of which (hopefully) have
corresponding positive outcomes, but we cannot
track everything
 While we have to have school readiness goals, there is
no one definitive way to do this
 We have chosen to focus on three areas that we think
are essential for Head Start programs: child, family, and
health outcomes
Three Key Outcome Areas
Child Outcomes
What will our children be able to do as a result of
our program?
Now: School Readiness Goals
Family Outcomes
What will our families accomplish as a result of
our program?
Now: PFCE Outcomes
Health Outcomes
How will our children be healthier as a result of
our program?
Key Elements of
Manage by Outcomes
 Each outcome should focus on what will be different as a result of your program
 For each outcome, you identify specific objectives, measures, and targets

Objectives: What is the goal?
Measure: How will you measure it?
Target: What is the specific level you want to achieve based on your measure?
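Note (illustrative only): one way to make the objective/measure/target structure concrete is a small record per outcome, as sketched below. This is a hypothetical structure, not Acelero's actual MBO schema.

```python
from dataclasses import dataclass

@dataclass
class OutcomeGoal:
    objective: str   # What is the goal?
    measure: str     # How will you measure it?
    target: float    # What specific level do you want to achieve?

    def on_track(self, observed: float) -> bool:
        """Compare an observed result against the target."""
        return observed >= self.target

# Example in the spirit of the school readiness goal on the next slide
goal = OutcomeGoal(
    objective="Children enter kindergarten prepared to succeed",
    measure="% of children at mastery on the program's assessment (e.g., GOLD)",
    target=0.90,
)
print(goal.on_track(0.84))  # False -> an area to focus support on
```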
Exercise: School Readiness Goals
Objective: Children enter kindergarten prepared to succeed.
Measure:
Target:
Objectives: What is the goal?
• We chose three domains of our school readiness goals that we thought were crucial for positive child outcomes
• Within each domain, we identified an average score that would indicate "mastery" (i.e., where we wanted children to be upon transitioning to kindergarten)

Measure: How will you measure it?
• We chose whatever assessment system the program was using (GOLD, WSS, ELS) as the measure

Target: What is the specific level you want to achieve based on your measure?
• We chose a specific % of children (e.g., 90%) that we want to reach the mastery level
Objectives, Measures and Targets
OHS Guidance on School Readiness Goals
Each agency should aggregate and analyze assessment data
at multiple points in a year and use that data in combination
with other program data to:
 Determine progress toward meeting the established goals;
 Inform parents and the community of results;
 Direct continuous program improvements related to
curriculum, teaching, and instructional strategies, professional
development, program design, and other program decisions;
OHS Guidance on School Readiness Goals
 OHS recommends that agencies aggregate and analyze
assessment data at least three times during the year (to
provide baseline, midpoint, and year end progress); or, for
programs operating 90 days or less, two times during the
program
 OHS recommends programs use data in combination with
input from parents and families to determine each child's
status and progress with regard to, at a minimum,
language and literacy, cognition and general knowledge,
approaches toward learning, physical well-being and motor
development, and social and emotional development, and
to individualize the experiences, teaching, and services to
best support each child.
Use of Data on Child Outcomes for School Readiness
(criteria 3 from the Program Use of Data Rubric)
Inadequate: Data on child outcomes is not used for school readiness goals.
Beginning: Data on child outcomes is used for assessing progress on school readiness goals at least three times per year at the program level.
Implementing: Data on child outcomes is used for assessing progress on school readiness goals three times a year at the program, center, and classroom level. There is a quality assurance process in place to understand reliability of child outcome data.
Mastering: Data on child outcomes is used to measure progress toward a "school ready" target, and data is analyzed at the classroom and child level to help teachers individualize. A quality assurance process confirms data is accurate.
School Readiness Goals At the Program,
Center, and Child Level
 At the program level
• Delegate gets roll-up of child outcome data
• Reviewed by leadership team and education leadership team
• Demonstrates global trends
• Likely least useful view
• Break out by program options or other sub-groups as much as possible (OHS requirement)
 At the center level
• Allows you to see differences by center
• Ultimately, you want to see data by the person who is
accountable -- for us that is center directors
• Helps education staff troubleshoot and prioritize
School Readiness Goals At the Program,
Center, and Child Level, cont.
 At the child level
• Ultimately, this is the most useful view
• Should help teachers differentiate instruction
• Should help center directors focus coaching/problem solving
• Ultimately, the teacher is accountable for reaching school readiness goals
Quality Assurance Process To Understand
Reliability of Child Outcome Data
 Look closely: These are complicated systems!
• Sample each teacher's data in the system
• Look at quality of observations (specific, objective, factual)
• Verify assessment item(s) to which the observation is assigned
• Assess accuracy of scoring (the level the teacher chose for each item)
 What % of child assessment data is truly accurate?
• Less than 50%? Do not start analyzing outcome data
• Ensure all teachers are reliable. Some systems now have built-in
reliability (GOLD, ELS).
• Explore opportunities to streamline assessment process
 Repeat!
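Note (illustrative only): the sampling step above could look roughly like the sketch below, with a hypothetical set of audited observation records per teacher and the 50% rule from this slide.

```python
import random

# Hypothetical audit results: for each sampled observation, an auditor marked
# whether the documentation, item assignment, and scoring were all accurate.
audited_observations = {
    "Teacher A": [True, True, False, True, True],
    "Teacher B": [True, False, False, True, False],
}

SAMPLE_SIZE = 3
random.seed(0)  # reproducible sample for this example

for teacher, results in audited_observations.items():
    sample = random.sample(results, min(SAMPLE_SIZE, len(results)))
    accuracy = sum(sample) / len(sample)
    print(f"{teacher}: {accuracy:.0%} of sampled records accurate")
    if accuracy < 0.50:
        print("  -> below 50%: hold off on outcome analysis, work on reliability")
```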
Use of Data on Child Outcomes for School Readiness
(criteria 3 from the Program Use of Data Rubric)
Inadequate: Data on child outcomes is not used for school readiness goals.
Beginning: Data on child outcomes is used for assessing progress on school readiness goals at least three times per year at the program level.
Implementing: Data on child outcomes is used for assessing progress on school readiness goals three times a year at the program, center, and classroom level. There is a quality assurance process in place to understand reliability of child outcome data.
Mastering: Data on child outcomes is used to measure progress toward a "school ready" target, and data is analyzed at the classroom and child level to help teachers individualize. A quality assurance process confirms data is accurate.
School Readiness Goals Are Used to Measure
Progress Toward A “School Ready” Target
 Set a TARGET for what % of children you want to reach
your School Readiness Goals
• Must separate out data for three-year-olds and four-year-olds
• Again, look at data by program, center, and classroom level
 If your data is accurate
• Time 1 tells you where you need to focus (baseline)
• Time 2 tells you the score at half-time -- need it quickly
to make mid-course adjustments
• Time 3 tells you where children are at the end of the year
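Note (illustrative only): a sketch of tracking progress toward a "school ready" target across checkpoints, split by age group and center. Field names and the 90% target are assumptions.

```python
import pandas as pd

# Hypothetical checkpoint data: one row per child per checkpoint,
# with 1 = at mastery on the school readiness goal, 0 = not yet.
scores = pd.DataFrame({
    "checkpoint": ["Time 1"] * 4 + ["Time 2"] * 4,
    "age_group":  ["3s", "4s", "3s", "4s"] * 2,
    "center":     ["North", "North", "South", "South"] * 2,
    "at_mastery": [0, 0, 1, 0, 1, 1, 1, 0],
})

TARGET = 0.90  # illustrative program-wide target

# % of children at mastery by checkpoint, age group, and center
summary = scores.groupby(["checkpoint", "age_group", "center"])["at_mastery"].mean()
print(summary)

# Groups still below target at the midpoint (Time 2)
midpoint = summary.loc["Time 2"]
print(midpoint[midpoint < TARGET])
```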
The MBO and Parent, Family &
Community Engagement Outcome
Goals
Sample MBI Family Measures
The MBI:
The Limitations
For example…
• What if we complete all of our Family Partnership Agreements, but families don't ever meet their goals?
• What if we captured that families received services, but those services didn't provide the support necessary to stabilize or meet the families' long-term needs?
• What if parents attended a workshop on job readiness, but never actually looked for, or secured, a job?
Managing by Outcomes
Why it is needed for families…
 The Office of Head Start has not yet integrated Family
Outcomes into any of the monitoring systems, but…
 The recently released Parent, Family and Community Engagement Framework identified seven family outcome areas
 The links in the research between child and family outcomes suggest that HS programs will not achieve school readiness goals without a more intensive focus on targeted family outcomes
 To be a great HS program, we should be leveraging HS's two-generation model to achieve significant family impact that will enhance children's lives long after they leave HS
Key Outcome Areas
Child Outcomes
What will our children be able to do as a result of
our program?
Now: School Readiness Goals
Family Outcomes
What will our families accomplish as a result of
our program?
Now: PFCE Outcomes
Health Outcomes
How will our children be healthier as a result of
our program?
Family Outcomes
How do we choose?
 In Head Start, choosing key outcomes to focus on is a
HUGE challenge
 The Head Start Performance Standards mandate many
different processes, most of which (hopefully) have
corresponding positive outcomes, but we cannot
track everything
 There are currently no expectations set by the OHS for
tracking family outcomes
 We’ve started simply, with expectations to seek other
external measures that better capture behavioral
change as our work progresses
Exercise: PFCE Goal
Objective: Families leave our program prepared to support their child's future success.
Measure:
Target:

Objectives: What is the goal?
• We chose four domains of family outcomes that research links with positive child outcomes
• Within each domain, we chose to track progress so that we could help staff track benchmarks towards family outcomes

Measure: How will you measure it?
• We chose to link our measurements to our Family Strengths Assessment and goal-setting data

Target: What is the specific level you want to achieve based on your measure?
• We set targets of percentages of families where we hoped to see progress, and tested other measures before setting targets
Objectives, Measures and Targets
Tracking tangible family
outcomes – such as
securing a job or
completing a GED
Office of Head Start Parent, Family and Community
Engagement Framework
From the PFCE Framework
 The PFCE is a roadmap for progress in achieving the types
of outcomes that lead to positive and enduring change for
children and families
 When parent and family engagement activities are systemic
and integrated across program foundations and program
impact areas, family engagement outcomes are achieved,
resulting in children who are healthy and ready for school
 Parent and family engagement activities are grounded in
positive, ongoing, and goal-oriented relationships with
families
PFCE Framework: Research-Based
Families play a critical role in helping their children to prepare for
school and a lifetime of academic success. Research indicates that:
 Children with supportive home learning environments show
increased literacy development, better peer interactions, fewer
behavior problems, and more motivation and persistence during
learning activities
 Among the youngest children, daily parent-child reading from
infancy prompts cognitive skills as well as early vocabulary gains
that lead to more reading and vocabulary growth, a pattern of
growth that has been compared to a snowball
 Continued family engagement is important through the school years. Longitudinal studies of low-income children show that high family involvement offsets the risks of children growing up in low-income households and in households with low parent education
OHS PFCE Framework and Acelero Learning’s
Family Services Approach
Targets explicit
family
outcomes
Managing By Outcomes, Step 1
Objectives
What is the goal?
 Simply put, our objective is assisting families to
achieve outcomes in four areas:
• Family Life Practices
• Self-Sufficiency
• Supporting Children with Chronic Health Conditions and IEPs/IFSPs
• Stabilizing families in crisis and reducing the impact of
substance abuse, depression, domestic violence on
young children’s development
Managing By Outcomes, Step 2
Measure
How will you measure it?
 Pre/Post Family Strengths Assessment:
• We assess families at the beginning and end of program
year and track changes in family circumstances
 Goal Progress:
• We track progress on family goals in each of the four
areas
 Goal Attainment:
• We track the # of families who have achieved tangible self-sufficiency outcomes, such as GED completion, securing a job, or English proficiency
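Note (illustrative only): a small sketch of the three family measures above (pre/post strengths assessment change, goal progress, goal attainment). The fields and scale are made up, not the actual Family Strengths Assessment.

```python
import pandas as pd

# Hypothetical family records: fall/spring strengths scores and goal status
families = pd.DataFrame({
    "family_id":   [1, 2, 3],
    "fsa_fall":    [2.0, 3.5, 1.5],   # pre assessment (illustrative scale)
    "fsa_spring":  [3.0, 3.5, 2.5],   # post assessment
    "goal_status": ["achieved", "in progress", "achieved"],
})

# Pre/post change on the strengths assessment
families["fsa_change"] = families["fsa_spring"] - families["fsa_fall"]

print("Average change on strengths assessment:", families["fsa_change"].mean())
print("Families with gains on the assessment:", (families["fsa_change"] > 0).sum())
print("Tangible goals attained:", (families["goal_status"] == "achieved").sum())
```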
Use of Data for Family Outcomes
(criteria 4 from the Program Use of Data Rubric)
Inadequate: There is no process in place to quantify or track progress on family goals.
Beginning: There is a process in place to track and quantify progress on family goals that helps standardize progress measures with similar goals.
Implementing: There is a process in place to track and quantify progress on all family goals, and data is run and analyzed at the program level and family advocate level. There is a quality assurance process in place to understand reliability of family outcome data.
Mastering: Family goal progress is used to measure progress toward a program-wide target. Family outcome data is regularly analyzed on the family, advocate, and program level. Quality assurance process confirms data is accurate.
Family Outcomes At the Program and Center Level
 At the program level
• Delegate gets roll-up of family outcome data
• Reviewed by leadership team and family services
leadership team
• Demonstrates global trends, but hard to see differences in
family engagement across workers/Centers
 At the center level
• Allows you to see differences by center
• Ultimately, you want to see the data by persons responsible
– by Advocate and FSC
• Allows us to see trends among workers and supervision
Use of Data for Family Outcomes
(criteria 4 from the Program Use of Data Rubric)
Inadequate: There is no process in place to quantify or track progress on family goals.
Beginning: There is a process in place to track and quantify progress on family goals that helps standardize progress measures with similar goals.
Implementing: There is a process in place to track and quantify progress on all family goals, and data is run and analyzed at the program level and family advocate level. There is a quality assurance process in place to understand reliability of family outcome data.
Mastering: Family goal progress is used to measure progress toward a program-wide target. Family outcome data is regularly analyzed on the family, advocate, and program level. Quality assurance process confirms data is accurate.
Quality Assurance Process To Understand
Reliability of Family Outcome Data
 Look closely: These systems rely on accurate data! First, the family strengths assessment:
• Look at FSA trend data across programs, Centers, and Advocates to compare to the level of need one would expect to see based upon family demographics
• Conduct qualitative audits of FSA instruments to assess accuracy of family assessment scoring
• Review a stratified sample of goal-setting data to assess accuracy of reporting of goal progress
• Utilize reliability data to inform professional development and supervision
Managing By Outcomes &
Health Outcome Goals
The MBI:
The Limitations
For example…
• What if we finish all our screenings on time, but never
successfully help families follow-up on health issues?
• What if we screen children, but never follow-up to ensure
children receive the appropriate Disabilities and/or Mental
Health services?
Three Key Outcome Areas
Child Outcomes
What will our children be able to do as a result of
our program?
Now: School Readiness Goals
Family Outcomes
What will our families accomplish as a result of
our program?
Now: PFCE Outcomes
Health Outcomes
How will our children be healthier as a result of
our program?
Key Elements of
Manage by Outcomes
 Each outcome should focus on what will be different as a result of your program
 For each outcome, you identify specific objectives, measures, and targets

Objectives: What is the goal?
Measure: How will you measure it?
Target: What is the specific level you want to achieve based on your measure?
Health Outcomes
How do we choose?
 In Head Start, choosing key outcomes to focus on is a
HUGE challenge
 The Head Start Performance Standards mandate many
different processes, most of which (hopefully) have
corresponding positive outcomes, but we cannot
track everything
 We’ve started simply, with a focus on health indicators
that are reported in the PIR, with a few “beyond Z”
innovations. We focus on how quickly we get children in
treatment, and we focus on measuring our success at
helping resolve obesity issues for our children
Exercise: Health Services Outcome Goal
Objective: Children are healthy (or healthier) when they leave our program.
Measure:
Target:

Objectives: What is the goal?
• We chose five areas of health follow-up that are critical to a child's healthy development
• Within each area, we track whether or not children are receiving treatment within set time frames (i.e. 90 days, etc.)

Measure: How will you measure it?
• We chose to link our measurements to our internal processes of documenting follow-up

Target: What is the specific level you want to achieve based on your measure?
• We set targets for the % of children we wanted to receive treatment in defined time periods (Q1, Q2, Q3, or Q4)
Objectives, Measures and Targets
Managing By Outcomes, Step 1
Objectives
What is the goal?
 Simply put, our objective is assisting children and
families to receive treatment, with a focus on:
• Lead
• Hemoglobin/Hematocrit
• Dental
• Other non-chronic medical (i.e. hearing & vision
screenings, accidents, and illnesses)
• Obesity Resolution
Managing By Outcomes, Step 2
Measure
How will you measure it?
 Health Requirements & Results:
• We assess children’s need for treatment, based on the
results of their required health requirements
 Treatment Received:
• We track the timing of when a health issue is identified to
when treatment is received
 Obesity
• We track the difference in BMI from the fall assessment to
the most recent assessment
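Note (illustrative only): a minimal sketch of the two measures above, i.e. time from an identified health need to treatment, and BMI change since the fall assessment. Dates, measurements, and the 90-day window are assumptions.

```python
from datetime import date

# Hypothetical child health record
identified = date(2013, 1, 10)   # need identified (e.g., referred after screening)
treated = date(2013, 3, 5)       # treatment received

days_to_treatment = (treated - identified).days
print("Days from identification to treatment:", days_to_treatment,
      "(within 90 days)" if days_to_treatment <= 90 else "(past 90 days)")

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

fall_bmi = bmi(18.0, 1.00)       # fall assessment
recent_bmi = bmi(18.5, 1.04)     # most recent assessment
print("BMI change since fall:", round(recent_bmi - fall_bmi, 2))
```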
Managing by Outcomes (Health)
Why it is needed for children & families…
 Increasingly, children we serve in our Head Start programs
are at risk for experiencing negative health outcomes….
• We are raising a generation of children who are likely to be less
healthy than their parents and may even have a shorter life
expectancy
• Chronic diseases remain the number one factor in poor health
outcomes, many of which have their origins in childhood
• Over the last three decades, the number of overweight children has
more than tripled, and one in every five children has a mental health
concern
 Impact of Healthy Lifestyles on Children's Development
• Poor diets are linked to disproportionate rates of obesity and Type 2 Diabetes in low-income communities
• Access to green space and safe parks and playgrounds impacts social-emotional and gross motor development
Impact of Poorly Managed Health Conditions on
School Success
 Asthma is the most common chronic childhood illness
affecting approximately 6.3 million (8.7%) children in the
United States. Since 1980, the prevalence of asthma has
increased by 75%. In inner city communities, as many as
one child in three is diagnosed with asthma.
 Asthma accounts for more school absenteeism than any
other chronic disease, and 60% of students with asthma
miss school annually due to respiratory symptoms.
 Research suggests that young children with severe and poorly managed asthma are more likely to repeat a grade or fall behind their peers by third grade.
Use of Data for Health Outcomes
Inadequate: There is no process in place to quantify or track results for health requirements.
Beginning: There is a process in place to track and quantify health screenings and follow-up that helps standardize timing expectations for follow-up and treatment.
Implementing: There is a process in place to track and quantify health screenings and follow-up, and data is run and analyzed at the program level and center level. There is an audit process in place to get qualitative data on health follow-up.
Mastering: Health follow-up and treatment is used to measure progress toward a program-wide target. Health outcome data is regularly analyzed on the center and program level. Quality assurance process confirms data is accurate.
Health Outcomes At the Program and
Center Level
 At the program level
• Delegate gets roll-up of health outcome data
• Reviewed by leadership team and health services
leadership team
• Demonstrates global trends, but hard to see differences in
health treatment across centers
 At the center level
• Allows you to see differences by center
• Ultimately, you want to see the data by persons responsible
– by Advocate and/or Health Coordinator
• Allows us to see trends among workers and supervision
Critical Health Audit/Quality Assurance Steps
 Consistent Data Entry Standards are Critical
• Specific, screen by screen guidance is often needed
• Flexibility in systems is a detriment to good data
 Quality Audits are Key
• Included a survey on understanding of health roles and responsibilities
• Directors and coordinators sampled records, reviewed for
quality and timeliness of follow up
 Continuous improvement cycle often leads to process
improvements
Health and Nutrition Take-Aways
 Health Outcomes directly linked to effective
understanding and execution of roles and
responsibilities
 Based on audits, we refined MBI follow-up tracking
to bring health execution “out of the shadows” and
front and center to all delegates
 Roles and responsibilities as designed work when
they are well-executed!
Managing By Outcomes &
Mental Health & Disabilities Goal
Examples
Disabilities & MH Outcomes
Managing by Outcomes (Dis. and Mental Health)
Why it is needed for children & families…
 The links in the research between child, health, and mental
health outcomes suggest….
• Intense stress linked to poorer child and adult health outcomes, as
well as changes in brain maturation
• Children’s challenging behavior sometimes a proxy or “warning
signal” of family circumstances
Things to Think About
 What data structures do you have in place to track children with chronic
health conditions and/or special needs? Could your Family Advocates
identify these families on their caseloads?
 What information do you capture about the status of children’s health
conditions? Can you identify those children whose chronic health
conditions are affecting their attendance or program participation?
 What structures do you currently have in place to assess the family
dynamics of children with challenging behaviors?
 How can this intentional goal-setting structure be integrated into
existing service coordination systems?
• What adaptations/enhancements would be needed?
In the long run, men hit only
what they aim at.
Therefore, they had better aim
at something high.
Henry David Thoreau
Your Commitment
Managing By Outcomes
Within the next three days I will…
Within the next 30 days I will…
Thank you VERY much for your participation and
attendance in Day 4 of the NHSA Manager &
Director Academy!
Please keep in touch and remember to network with your
Head Start peers (including us!)
Acelero Training & Technical Assistance Contacts:
Katherine Molina-Powell: katherine@acelero.net
Barbara Montero: barbara@acelero.net
Like us on Facebook
www.facebook.com/acelero
to stay up-to-date on what's happening in the world of Head Start and to receive info on upcoming Acelero events.