Narrowing the Gap between Evaluation and Developmental Science: Implications for Evaluation Capacity Building

Tiffany Berry, PhD
Research Associate Professor
Associate Director
Claremont Evaluation Center
Claremont Graduate University
June 29, 2015
[Diagram: presenter's context of work — a research university, working as a full-time educational evaluator — and disciplinary training in child and adolescent development and in evaluation]
[Diagram: two parallel processes — the evaluation process studies youth programs to improve programs and youth outcomes; the developmental research process moves between the research community and research to improve theory and knowledge, with evaluation in between]
How can we strategically narrow the gap between developmental science and evaluation practice? How can we build capacity within both fields?
[Diagram: “developmental sensitivity” bridges developmental science and evaluation practice, contributing tools, mechanisms, value, and a view of development “in context”]
Where does developmental sensitivity conceptually fit within the discipline of evaluation?
#1: AEA Guiding Principles
“Evaluators have the responsibility to understand and respect differences among participants, such as differences in their culture, religion, gender, disability, age, sexual orientation and ethnicity, and to account for potential implications of these differences when planning, conducting, analyzing, and reporting evaluations” (AEA, 2004, emphasis added).
#2: Cultural Competence
The culturally competent evaluator is one who “draws upon a wide range of evaluation theories and methods to design and carry out an evaluation that is optimally matched to the context” (AEA Cultural Competence Statement, 2011).
Childhood as a context to understand
#3: Need
[Three slides of statistics on ECD programs illustrating the need; the percentages shown (57%, 47%, 45%, 70%, 46%, 88%) appear without recoverable labels]
How do we define developmental sensitivity?
Principles of Development
[Diagram: age; developmental domains (cognitive, physical, social-emotional, moral, behavioral); environment; individual interactions; sensitive periods; milestones]
How do we apply developmental sensitivity to educational evaluation practice?
CDC Evaluation Framework
Phase I: Engage Stakeholders
Phase II: Describe the Program
Phase III: Focus the Evaluation Design
Phase IV: Gather and Analyze Evidence
Phase V: Justify Conclusions
Phase VI: Ensure Use and Share Lessons Learned
Phase I: Engage Stakeholders
Realistic expectations for development? Knowledge of child development?
Phase II: Describe the Program
Developmentally appropriate? Capitalize on multiple developmental domains? Aligned with sensitive periods?
Phase III: Focus the Evaluation Design
Account for maturation? Stability and variability across domains? Mediators and moderators? Ecological context?
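One way to account for maturation when focusing a design is a mixed-effects growth model, which separates each child's own developmental trajectory from the program effect. The sketch below is illustrative only and not from the talk; the data file and the column names (student_id, wave, treated, score) are hypothetical placeholders.

```python
# Minimal sketch: separating program effects from normal maturation
# with a mixed-effects growth model. File and column names are
# hypothetical placeholders, not from the talk.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("longitudinal_scores.csv")  # hypothetical long-format data

# Random intercepts and slopes per student capture each child's own
# maturation trajectory; the wave-by-treated interaction estimates
# the program effect over and above that baseline growth.
model = smf.mixedlm(
    "score ~ wave * treated",   # fixed effects: time, treatment, interaction
    data=df,
    groups=df["student_id"],    # one growth curve per student
    re_formula="~wave",         # random intercept and slope over time
).fit()
print(model.summary())
```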
Phase IV: Gather and Analyze Evidence
Developmental precursors? Standardized assessments across domains? Sensitive measures? Mixed methods?
Phase V: Justify Conclusions
Understanding youth in context? Embrace and measure variability; use statistical modeling to identify pathways.
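"Statistical modeling to identify pathways" often takes the form of mediation analysis: testing whether a program shifts an intermediate developmental process that in turn shifts the outcome. The sketch below shows a simple regression-based version; the data file and column names (dosage, engagement, reading_score) are assumptions for illustration, not from the talk.

```python
# Minimal sketch of a regression-based mediation check. File and
# column names are hypothetical placeholders, not from the talk.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("evaluation_data.csv")  # hypothetical evaluation data

# Path a: does program exposure predict the hypothesized mediator?
path_a = smf.ols("engagement ~ dosage", data=df).fit()

# Paths b and c': does the mediator predict the outcome, controlling
# for program exposure (c' is the direct effect that remains)?
path_b = smf.ols("reading_score ~ engagement + dosage", data=df).fit()

# Point estimate of the indirect (mediated) effect a*b; in practice,
# bootstrap this product to get a confidence interval.
indirect = path_a.params["dosage"] * path_b.params["engagement"]
print(f"Indirect effect estimate: {indirect:.3f}")
print(f"Direct effect (c'): {path_b.params['dosage']:.3f}")
```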
Phase VI: Ensure Use and Share Lessons Learned
Communicate findings back to the program AND to scholarly outlets to promote understanding of “youth in context.”
[Summary diagram: developmental sensitivity surrounds every phase of the CDC Evaluation Framework]
[Diagram revisited: “developmental sensitivity” bridges developmental science and evaluation practice]
How can we conceptualize evaluation practice in a way that will inform developmental science?
Afterschool Programs
[Diagram: in the afterschool program context, evaluation links service providers with the research community; through collaboration and dissemination mechanisms, evaluation practitioners draw on program theory and social science theory to improve programs and, in turn, improve theory — connecting evaluation practice back to developmental science]
Implications for Building Evaluation Capacity
Programs: take an organizational learning approach; let evaluators learn from you.
Evaluators: tools and measurement; social science theory; age as a unique context.
Institutions: promote interdisciplinary training; develop and reward scholar-practitioners.
Evaluation is a bridge between theory and practice.
Tiffany.berry@cgu.edu; 909.607.1540