It's an Evolution: Changing Roles and Approaches in the Evaluation of the Pittsburgh Science of Learning Center
Brian Zuckerman
American Evaluation Association
November 2, 2011
Pittsburgh Science of Learning Center
PI: Ken Koedinger (CMU)
Co-PIs: Chuck Perfetti (UPitt), David Klahr (CMU), Lauren Resnick (UPitt)
• Purpose: Leverage cognitive theory and computational modeling to identify the conditions that cause robust student learning.
• Goals: Fundamentally transform
– translational research in education
– generation of learning science theory
Ed technology + wide dissemination = “Basic research at scale”
PSLC: Transforming Translational Research
• LearnLab = social & technical infrastructure to support field-based basic research
[Diagram: LearnLab connects Researchers and Schools; screenshot: Algebra Intelligent Tutor]
– Controlled experiments in real courses
– Educational technologies => Data!
• Practice-relevant discovery
– What lab-based theory survives translation?
– Field-based data drives discovery
[Screenshots: Chemistry Virtual Lab, English Reading Tutor]
PSLC: Transforming Theory Generation
Emerging “Computational Learning Science”
• Data mining & computational modeling techniques
• New data sources: Brain imaging, classroom video, student interactions with ed tech
PSLC Capacity Building
• Vast student data repository
• New field: Educational Data Mining
• 2010 KDD Cup
– Annual competition of the Knowledge Discovery and Data Mining (KDD) conference
– Task: Predict step-by-step performance of 10,000 algebra students across a school year (see the sketch below)
pslcdatashop.web.cmu.edu/KDDCup/
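To make the prediction task concrete, here is a minimal, hypothetical sketch of step-level performance prediction in the spirit of that task: a logistic regression over student, skill, and practice-opportunity features, fit to synthetic data. The feature names and data are illustrative assumptions, not the actual KDD Cup dataset or any competitor's pipeline.

```python
# Illustrative sketch only: predict whether a student's first attempt at a
# problem step will be correct, given the student, the skill the step
# exercises, and how many practice opportunities the student has had.
import math
import random

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

random.seed(0)

# Fabricate step-level records: one row per (student, step) interaction.
students = [f"S{i}" for i in range(200)]
skills = ["slope-intercept", "distribute", "combine-like-terms", "isolate-variable"]
rows, labels = [], []
for student in students:
    ability = random.gauss(0, 1)                 # latent student proficiency
    for opportunity in range(1, 21):             # 20 practice opportunities each
        skill = random.choice(skills)
        difficulty = skills.index(skill) * 0.3   # later skills are harder
        logit = ability - difficulty + 0.05 * opportunity
        correct = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
        rows.append({"student": student, "skill": skill, "opportunity": opportunity})
        labels.append(correct)

# One-hot encode student and skill; the opportunity count stays numeric.
vectorizer = DictVectorizer()
X = vectorizer.fit_transform(rows)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0
)

# Fit the model and score held-out steps by predicted probability of success.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
predictions = model.predict_proba(X_test)[:, 1]
print("Held-out AUC:", round(roc_auc_score(y_test, predictions), 3))
```

The real competition worked from DataShop transaction logs with far richer fields (knowledge components, hints, timing), but the underlying shape of the problem, predicting step-level correctness from logged tutor interactions, is the same.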
PSLC: Research to Practice
• Translation is built in
– LearnLab embeds experiments & scientific data collection within running courses
– Many outcomes of 200+ learning studies incorporated into courses
• Ed tech dissemination partners
– Carnegie Learning, Inc.: >600,000 K-12 math students a year
– Open Learning Initiative: 1000s of college student users a semester
Cognitive Tutor 2010/11 release uses PSLC results (Butcher & Aleven, 2008); in use in all 50 states
PSLC Activities Relative to Program Goals
• Conducts large-scale research
– Large-scale theory development
– Formation/nucleation of new fields or subdisciplines
• Develops and maintains infrastructure useful to community
– Large-scale infrastructure development (LearnLab, DataShop, tools)
• Educates a diverse, highly competent, and globally engaged workforce
– Education of graduate students and postdoctoral researchers
– Broadening participation (e.g., PSLC summer internships)
– Conducts ancillary education efforts that require a center's critical mass for the broader learning sciences community (e.g., PSLC summer school)
• Forges valuable partnerships
– Among PSLC researchers in interdisciplinary collaborations
– With industry/external stakeholders
Changes in PSLC Organization
• Reorganization around renewal
– Four clusters become three thrusts
– Change in co-PI on the Executive Committee
Evaluation Context
• Evaluator context
– STPI (Science and Technology Policy Institute) is funded by the Center
– STPI came on board around the first site visit in 2005
– Center passed through its five-year review; now in Year 7
• Shifts in PSLC activities and logic model
– Change in organization
– Shifting emphasis on goal of theoretical framework development
– Other smaller changes (e.g., shift in diversity goals toward long-term expansion of the field)
Evolution of Evaluation Effort Matches Changes in PSLC Lifecycle
• Years 1-2: Predominantly focused on growth of “Centerness,” with data collection internal to the Center
– Management processes (interviews)
– Collaboration formation (interviews, collaboration survey)
– Development of Center-wide language and culture (interviews)
• Years 3-4: In preparation for the site review, focus shifted to
– External investigators’ knowledge of Center research and predictions of future value
– Theoretical framework development/wiki analysis
– Bibliometric analysis of publications to date
• Years 5-6: Center reorganization led to a refocus on “centerness”
– Return to the interview approaches of Years 1-2
– Analysis of changes in thrust plans over time
• Present: Evaluation effort largely dormant until plans for an SLC-wide evaluation become evident
– Some continuing activities around sustainability
Data Collection Changes Over Time (Partial list of measures and approaches)

| Topic | Data Collection Strategy | Years 1-2 | Years 3-4 | Years 5-6 |
|---|---|---|---|---|
| Theoretical framework | Internal interviews/growth of common language | X | X | X |
| Theoretical framework | Analysis of PSLC theory wiki | | X | |
| Theoretical framework | External interviews/value of PSLC theory development to date | | X | |
| Theoretical framework | Analysis of thrust plans | | | X |
| Theoretical framework | Bibliometrics | | X | |
| New fields | External interviews/role of PSLC in educational data mining community | | X | X |
| Value of infrastructure | Internal interviews/internal use of PSLC DataShop and tools | | | X |
| Value of infrastructure | External interviews/knowledge of PSLC DataShop and tools | | | X |
Data Collection Changes Over Time (cont.)

| Topic | Data Collection Strategy | Years 1-2 | Years 3-4 | Years 5-6 |
|---|---|---|---|---|
| Education of students | Internal interviews/perception of value of PSLC participation | X | X | |
| Education of students | Tracking next steps of PSLC graduates | X | X | X |
| Broadening participation | Participant observation and interviews with interns | X | X | |
| Broadening participation | Tracking next steps of interns | | | X |
| Other ancillary educational efforts | Follow-up with participants in summer school | X | X | |
Data Collection Changes Over Time (cont.)

| Topic | Data Collection Strategy | Years 1-2 | Years 3-4 | Years 5-6 |
|---|---|---|---|---|
| Collaboration formation | Internal interviews/value and success of center-wide collaboration formation approaches | X | | X |
| Collaboration formation | Internal interviews/collaborativeness in the Center | X | | X |
| Collaboration formation | Collaboration survey | X | | |
| Collaboration formation | Bibliometrics | | X | |
| External collaborations | External interviews/value of collaboration | | | X |
| Management | Value and effectiveness of Center structures and processes | X | X | X |
Reflections
• Evaluation in a complex context
– Pressure from the government funder to demonstrate results even during the first five-year period
– Change in PSLC organization and goals over time
• Required a nimble evaluation approach as a result
– Evaluation plans developed in Years 1 and 5 served as points of departure rather than blueprints
– Difficult to maintain the balance between the need for continuity in data collection and shifting priorities