Developing and Testing Math Games for
Learning: Best Practices, Lessons Learned,
and Results From a Large-Scale Randomized
Controlled Trial
Gregory Chung
Assistant Director for Research Innovation
National Center for Research on Evaluation, Standards, and Student
Testing (CRESST)
University of California, Los Angeles (UCLA)
Global 3D Technology Forum
Seoul – 15 October 2014
Structure of Talk
• Brief introduction
• Why develop math games?
• Large-scale randomized controlled trial of the effectiveness of our games
• Impact, best practices, and lessons learned
Introduction to CRESST
• National Center for Research on Evaluation, Standards, and Student Testing
• Co-Directors: Eva Baker, Li Cai
• 15+ PhD and 10+ MA/MS/BS research staff; graduate students, programmers, graphic artists, business and administrative support
• Supported by federal (DoE, NSF, DARPA, ONR), state, and local agencies, foundations, non-profits, and business
CRESST Games for Learning R&D
• Research: In what ways and under what conditions can video games be used to teach, assess, and engage students?
• Engineering: Develop or adopt design processes that lead to effective and engaging games
• Analytics: Develop or adopt methods of collecting and analyzing large-scale game data
[Table: R&D activities checked off per project (develop ontology and knowledge specifications, develop games, develop testbed, instrument games, instructional/game designs, modeling user understanding, mining gameplay data, in-game adaptivity, rapid assessment generation, novel assessment formats, executing RCT, crowdsourcing data collection) across five projects: IES/CATS (Grade 6, math), DARPA/Engage (K-3, physics), ONR/CSUSB (college, math), PBS (K-2, math), Mobilize (9-12, computer science)]
Background
Why Develop Math Games?
Focus on Pre-algebra
• Pre-algebra provides students with the fundamental skills and knowledge that underlie algebra
• Gateway to more advanced mathematics and STEM majors
• 3rd- and 4th-grade concepts remain unmastered well into college
What’s Wrong With This Picture?
[Items from an 8th-grade national sample: 25% and 33% of students answered incorrectly]
Remedial Math in College: Intermediate Algebra
[Items answered incorrectly by 27%, 26%, and 11% of students]
Tensions: Games for Learning Math
game <--> learning
fun <--> math
play time <--> efficiency
choose to play <--> have to play
commercial grade <--> research grade
implicit instruction <--> explicit instruction
unobtrusive (embedded) measurement <--> obtrusive (external) measurement
simple tasks <--> complex tasks
abstract math <--> concrete math
conceptual <--> fluency
Project Goal
• Develop and evaluate video games aimed at improving students’ knowledge and skills in pre-algebra topics
- Focus on rational numbers and solving equations (Grade 6)
- Use an engineering approach to systematically develop and test games
- Conduct a randomized controlled trial to test the effectiveness of our games
Randomized Controlled Trial of
Game Effectiveness
Research Question
• Does playing CRESST-developed rational number games result in more learning of rational number concepts compared to playing a comparison set of games?
Sample
• 62 classrooms randomly assigned to condition
- 30 treatment classrooms
- 29 comparison classrooms
- 3 classes dropped because of schools’ technology issues
- ~1,500 students
• Final sample: 59 classrooms, 9 districts, 23 schools in California and Nevada; ethnically diverse
Procedure
• Teachers integrated math games into their curriculum
- Rational number or control topic
- 3 hours of professional development
- Implementation window: January to April (pre-state testing)
- Pretest (30 min)
- 10 gameplay days within window (40 min per occasion)
- Delayed posttest 1 week after last game (30 min)
• $600 honorarium to teachers
Measures
16 / 9
Results
• The treatment condition performed significantly higher on the posttest than the comparison condition
- Treatment condition (n = 30) played 4 games related to fractions
- Comparison condition (n = 29) played 4 alternative games (solving equations)
- No differences between conditions on pretest
Students’ Perceptions
[Bar chart, 0-100% scale, percent Agree vs. Disagree: “I learned from at least two games,” “I really got into at least two games,” “I wanted to play longer in at least two games”]
Teachers’ Perceptions
[Bar chart, 0-100% scale, percent Agree vs. Disagree: “At least two games helped students learn,” “Students were engaged in almost every game,” “Would use games again in classroom”]
Efficacy Trial Results
• Multi-level model shows a significant treatment effect; effect size = .6

Type of Intervention                     Mean Effect Size
Instructional format                     .21 [a]
Instructional component/skill training   .36 [a]
Focused on individual students           .40 [a]
Simulation and games                     .34 [b]
Computer-assisted instruction            .31 [b]

[a] Lipsey, M. W., et al. (2012). Translating the Statistical Representation of the Effects of Education Interventions Into More Readily Interpretable Forms (NCSER 2013-3000).
[b] Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Routledge.
Game Development Methodology
Hope Is Not a Strategy
Launch Schedule
[Timeline: Y1-Y3, game development studies; Y4, study lock, procedure test, classroom dry run, launch]
Game Development Studies:
2 Years Before RCT
• Total of 24 small-scale studies, Grades 4-12
- Schools, an after-school program, a private school
- 18 sites; 8 school districts; 4 states
• Types of studies
- Small-scale tests, pre-post, think-alouds: 7 studies; total N = 238
- Experimental studies (instructional variations): 10 studies; total N = 1,588; typically 30-40 minutes
- Assessment and methodology studies: 7 studies; experimental data plus 711 new subjects
Game Development and Testing Process
[Pyramid, top to bottom:]
LARGE-SCALE EFFICACY TEST: 59 classrooms (1,700 kids)
GAME EFFECTIVENESS TRIALS: 60+ kids per condition
GAME PROTOTYPE + TESTING: small-scale (1-5 students), classroom-scale (30 students)
PLAY/PAPER TESTING: small-scale (1-5 students), classroom-scale (30 students)
DEVELOPMENT: establish instructional sequence + assessments
Fractions Games
Wiki Jones (number line)
Save Patch (unit, fractional pieces, adding fractions)
Tlaloc’s Book (inverse operations)
Rosie’s Rates (functions)
Solving Equations Games
Monster Line (operations on positive and negative integers)
Expresso (transforming expressions)
Zooples (solving equations)
AlgebRock (solving equations)
Game Development Studies:
Results
• Empirical
- Instruction/feedback: less is more; narrative and avatar choice matter for engagement; collaboration helpful for low performers; repeated play improves learning; math measure validation; engagement scale validation; misconceptions and play strategies detectable via data mining; ...
• Student and teacher self-reports, classroom observations
- Game features: music, graphics, achievements, avatar selection
- Math not really an issue; high engagement; surprises
- Technology issues
Game Development Studies:
Results
• Developing games for learning: hard but doable
- Big difference between teaching and practice
- Games don’t have to be AAA quality: classroom context sets a low bar
- Stickiness possible: the more challenging, the more sticky
- Game mechanics that require use of knowledge are a key design feature
- Anecdotal collateral benefits of gameplay: more student engagement, more individual attention, better feedback to the teacher, increased student confidence, productive student-student interactions
Impact, Best Practices, and Lessons Learned
Impact: Developed Learning-Effective Games
• 6th-grade students learned from our fractions games as measured by an external transfer test
Impact: New Methodology, Dissemination
• New statistical methodology developed at CRESST (Li Cai, Kil-Chan Choi)
• Technical reports craziness
- CATS technical reports downloaded 69,456 times (4 yrs)
• Journals (15), proceedings (7), chapters (9), technical reports (22), conference presentations (96)
• Transfer of development approaches and methodologies to other game-focused projects at CRESST
Best Practices: Coherent Design Process
• Ontologies and knowledge specifications
• Guide what to teach in the game, what to measure, and what to focus on in teacher training
[Figure: mapping to Common Core State Standards]
Best Practices: Game Testbed to Accelerate R&D

<root>
  <verifylogins> false </verifylogins>
  <loginfile> RNLogins.xml </loginfile>
  <hidescroll> false </hidescroll>
  <cheats> true </cheats>
  <newFormat> true </newFormat>
  <showbuild> true </showbuild>
  <lockcharacters> true </lockcharacters>
  <levelselect> true </levelselect>
  <onlyvisited> false </onlyvisited>
</root>

<LevelName> Level 50 </LevelName>
<LevelWidth> 3 </LevelWidth>
<LevelHeight> 2 </LevelHeight>
<LevelDenominator> 2 </LevelDenominator>
<LevelScale> 2 </LevelScale>
<NumKeys> 0 </NumKeys>
<NumCoins> 0 </NumCoins>
<Sign>
  <xNum> 0 </xNum>
  <xDenom> 2 </xDenom>
  <yNum> 4 </yNum>
  <yDenom> 2 </yDenom>
  <required> false </required>
  <optional> false </optional>
  <goal> false </goal>
  <start> true </start>
</Sign>
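As a sketch only (the element names follow the slide's level-definition fragment, but the `<Level>` wrapper and the `load_level` helper are illustrative, not part of the CRESST testbed), a testbed can read such a level definition with a standard XML parser:

```python
import xml.etree.ElementTree as ET

# Minimal level fragment; <Level> wrapper added here for well-formedness.
LEVEL_XML = """
<Level>
  <LevelName> Level 50 </LevelName>
  <LevelWidth> 3 </LevelWidth>
  <LevelHeight> 2 </LevelHeight>
  <LevelDenominator> 2 </LevelDenominator>
</Level>
"""

def load_level(xml_text: str) -> dict:
    """Parse a level definition into a plain dict, trimming whitespace."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text.strip() for child in root}

level = load_level(LEVEL_XML)
print(level["LevelName"])  # prints "Level 50"
```

Driving level content from data files like this is what lets researchers vary instructional sequences without touching game code.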
Best Practices: Gameplay as a Data Source
• Key design properties
- Game telemetry reflects descriptions of (not inferences about) events connected to learning
- Game telemetry packaged in a structured format (i.e., usable)
- Game mechanic requires use of math knowledge
- Game allows player to fail (i.e., commit errors)
- Game requires player to make decisions (do I do this or that?)
- Player cannot progress without requisite knowledge
- Stage/level design useful (stage = 1 concept; levels = variations of the concept); sawtooth curve
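As an illustration of these properties (none of the field names below come from the CRESST games; they are hypothetical), a telemetry record that stays descriptive and structured might look like:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class GameEvent:
    """One descriptive gameplay event: what happened, not what it means."""
    session_id: str
    stage: int        # stage = one concept
    level: int        # level = a variation of that concept
    timestamp_ms: int
    action: str       # raw player action, e.g. "place_piece"
    detail: dict      # action-specific payload

def to_json(event: GameEvent) -> str:
    """Package the event in a structured, machine-readable format."""
    return json.dumps(asdict(event))

# Record only the fact that the player placed a 1/3 piece on a 2/3 target.
# Whether this reflects a misconception is an inference left to analysis.
event = GameEvent(
    session_id="s-042",
    stage=3,
    level=2,
    timestamp_ms=81_250,
    action="place_piece",
    detail={"piece": "1/3", "target": "2/3", "correct": False},
)
print(to_json(event))
```

Keeping inference out of the log is what makes the same telemetry reusable for data mining, assessment, and validation studies later.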
Best Practice: Assume the Worst
• 1994: Is there a network?
• 2014: Can I access the network?
• Same problem 20 years later: networked vs. stand-alone
- Option 1: Put everything on a USB stick (EXE, data)
- Option 2: Bring 40 laptops to classroom
Lessons Learned
• Assumption by game designers that making the game too “mathy” would destroy the game
• Assumption by game designers that to be learning effective, the player has to have a fluid and seamless experience
• Assumption by researchers that to be learning effective, every content detail had to be covered
• Assumption by all that using games for learning purposes is a good idea
Lessons Learned
• Developing games for learning: hard but doable
• Learning issues should drive game design
• Common design specifications focus development effort
• In-house design and development facilitate rapid iteration
• Gameplay interaction can be a rich data source
• Testbed capability is an R&D accelerant
• Technology infrastructure is a source of uncertainty
• Logistics is the last meter
Acknowledgements
cresst.org
cats.cse.ucla.edu
ies.ed.gov
greg@ucla.edu
Design: Game mechanics
• Design game mechanics to require targeted knowledge
• Leverage interaction to exercise players’ knowledge
- Allow correct and incorrect player actions
- Create decision points as part of gameplay
- Use misconceptions to create game “traps” (puzzles)
• Examples from a fractions game
Design: Game mechanics around misconceptions
Partitioning Misconception
Students believe the denominator is the number of dividing lines rather than the spaces between
[Figure: a unit divided into 4 equal spaces, each correctly labeled 1/4, summing to 4/4]
Design: Game mechanics around misconceptions (continued)
[Figure: the same unit of 4 spaces, each mislabeled 1/3 (counting the 3 dividing lines instead of the 4 spaces), summing to 4/3]
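A hypothetical sketch (the function names are illustrative, not from the CRESST games) of why this misconception yields a predictable wrong answer that a game "trap" can detect:

```python
from fractions import Fraction

def correct_label(num_spaces: int) -> Fraction:
    """Size of each piece when a unit is split into num_spaces equal spaces."""
    return Fraction(1, num_spaces)

def misconception_label(num_spaces: int) -> Fraction:
    """Partitioning misconception: the denominator is taken to be the
    number of internal dividing lines (num_spaces - 1), not the spaces."""
    return Fraction(1, num_spaces - 1)

def flags_misconception(answer: Fraction, num_spaces: int) -> bool:
    """A trap fires when the answer matches the misconception's predicted
    value, rather than being some other wrong answer."""
    return answer == misconception_label(num_spaces)

# A unit divided into 4 spaces: the correct piece size is 1/4, but the
# misconception predicts 1/3 (three dividing lines), so 4 pieces sum to 4/3.
assert correct_label(4) == Fraction(1, 4)
assert flags_misconception(Fraction(1, 3), num_spaces=4)
```

Because the misconception predicts a specific wrong answer, telemetry can distinguish it from random errors and target feedback accordingly.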