Setting Up Learning Objectives and
Measurement for Game Design
Girlie C. Delacruz and Ayesha L. Madni
Serious Play Conference
Los Angeles, CA – July 21, 2012
Overview
• Assessment validity
• Components of assessment architecture
• Create assessment architecture (your example)
What is so hard?
What are some of your challenges?
[Diagram: gameplay and log data linked to "passed the game" and to the underlying domain]
Challenges We Have
• Translating objectives into assessment outcomes
– Purpose of assessment information
– Communication between designers and educators
• Game is already developed, yet we still need to assess its effectiveness
– Cannot change the code; limited to wraparound measures
How can we meet the challenge?
Front-end Efforts Support Effectiveness
• Instructional requirements
• Assessment requirements
• Technology requirements
Model-Based Engineering Design
• Communication
• Collaboration
Part One
ASSESSMENT VALIDITY
What Is Assessment?
Assessment (noun) = Test
Assessment As A Verb
ASSESSMENT
=
Process of drawing reasonable
inferences about what a person
knows by evaluating what they say
or do in a given situation.
Games As Formative Assessment
Formative Assessment:
Use and interpretation of task performance information
with intent to adapt learning, such as provide feedback.
(Baker, 1974; Scriven, 1967).
Games As Formative Assessment
Games as Formative Assessment:
Use and interpretation of game performance information
with intent to adapt learning, such as provide feedback.
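To ground the definition, here is a minimal sketch in Python of the interpret-then-adapt loop. The counts, thresholds, and adaptation labels are hypothetical illustrations, not prescriptions from the talk.

# Minimal sketch of formative assessment in a game: interpret task
# performance, then use the interpretation to adapt learning.
# All names and thresholds are hypothetical.

def interpret_performance(correct_moves: int, total_moves: int) -> str:
    """Translate raw gameplay counts into a coarse mastery judgment."""
    if total_moves == 0:
        return "no_evidence"
    accuracy = correct_moves / total_moves
    if accuracy >= 0.8:
        return "mastered"
    return "developing" if accuracy >= 0.5 else "struggling"

def adapt(judgment: str) -> str:
    """Choose an adaptation: the 'use' half of use-and-interpretation."""
    return {
        "mastered": "advance to next level",
        "developing": "give targeted feedback and retry",
        "struggling": "show worked example, reduce difficulty",
        "no_evidence": "repeat task to gather evidence",
    }[judgment]

print(adapt(interpret_performance(correct_moves=7, total_moves=10)))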
What is Validity?
Assessment Validity as a Quality Judgment, informed by:
• Critical analysis
• Legal judgment
• Scientific process
Assessment Validity
ASSESSMENT
VALIDITY
=
Bringing evidence and analysis
to evaluate the propositions of
an interpretive argument.
(Linn, 2010)
How Does This Relate to Design?
① Identification of the inferences to be made.
• What do you want to be able to say?
② Specificity about the expected uses and users of the learning system.
• Define the boundaries of the training system
• Determine the need for supplemental resources
③ Translation into game mechanics.
④ Empirical analysis of judgments of performance within the context of assumptions.
What do you want to be able to say
about the gameplayer(s)?
• Player mastered the concepts.
• How do you know?
– Because they did x, y, z (player history)
– Because they can do a, b, c (future events)
Identify Key Outcomes:
Defining Success Metrics
• Quantitative Criteria (Generalizable)
– % of successful levels/quests/actions
– Progress into the game
– Changes in performance
• Errors
• Time spent on similar levels
• Correct moves
• Qualitative Criteria (Game-specific)
– Patterns of gameplay
– Specific actions
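For concreteness, here is a minimal sketch of computing the quantitative criteria above from a gameplay log. The log schema (level, passed, errors, seconds) is a hypothetical illustration.

# Sketch: quantitative success metrics from a gameplay log.
from statistics import mean

log = [
    {"level": 1, "passed": True,  "errors": 2, "seconds": 95},
    {"level": 2, "passed": True,  "errors": 1, "seconds": 80},
    {"level": 3, "passed": False, "errors": 5, "seconds": 140},
]

pct_successful = 100 * sum(e["passed"] for e in log) / len(log)
progress = max(e["level"] for e in log)             # progress into the game
error_trend = log[-1]["errors"] - log[0]["errors"]  # change in performance

print(f"{pct_successful:.0f}% of levels passed; reached level {progress}; "
      f"error change {error_trend:+d}; "
      f"mean time {mean(e['seconds'] for e in log):.0f}s")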
[Figure: five-layer measurement model]
• BACKGROUND LAYER (pre1–pre5): prior knowledge, game experience, age, sex, language proficiency
• CONSTRUCT LAYER (o1–o8; e.g., motion, duration, direction, speed): construct, subordinate constructs, and interdependencies
• INDICATOR LAYER: behavioral evidence of the construct
• FUNCTION LAYER: fn(e1, e2, e3, ...; s1, s2, s3, ...) computes an indicator value given raw events and game states
• EVENT LAYER: game events and states (e1, e2, e3, ...; s1, s2, s3, ...), i.e., player behavior and game states
General Approach
• Derive structure of measurement model from
ontology structure
• Define “layers”
– Background: Demographic and other variables that
may moderate learning and game performance
– Construct: Structure of knowledge dependencies
– Indicator: Input data (evidence) of construct
– Function: Set of functions that operate over raw event
stream to compute indicator value
– Event: Atomic in-game player behaviors and game
states
• Assumptions
– The chain of reasoning among the layers is accurate
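A minimal sketch of the function layer: a Python function over the raw event stream and game states that returns an indicator value, in the spirit of fn(e1, e2, ...; s1, s2, ...). The event fields and the indicator itself are hypothetical illustrations.

# Sketch of one function-layer computation: maps raw events and game
# states to an indicator value for the construct layer. Field names
# and the "push accuracy" indicator are hypothetical.
from typing import Iterable

def push_accuracy(events: Iterable[dict], states: Iterable[dict]) -> float:
    """Indicator: fraction of pushes whose direction matched the goal state."""
    goals = {s["trial"]: s["goal_direction"] for s in states}
    pushes = [e for e in events if e["type"] == "push"]
    if not pushes:
        return 0.0  # no evidence observed yet
    hits = sum(e["direction"] == goals[e["trial"]] for e in pushes)
    return hits / len(pushes)

events = [{"type": "push", "trial": 1, "direction": "left"},
          {"type": "push", "trial": 2, "direction": "right"}]
states = [{"trial": 1, "goal_direction": "left"},
          {"trial": 2, "goal_direction": "left"}]
print(push_accuracy(events, states))  # 0.5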
Part Two
ASSESSMENT ARCHITECTURE
Components of Assessment Architecture
DOMAIN REPRESENTATION
• instantiates domain-specific information and practices
• guides development
• allows for external review
COGNITIVE DEMANDS
• defines targeted knowledge, skills, abilities, and practices
• domain-independent descriptions of learning
TASK SPECIFICATIONS
• defines what the students will see and do (tasks/scenarios, materials, actions)
• defines rules and constraints
• defines scoring
Cognitive Demands
What kind of thinking do you want to capture?
• Adaptive, complex problem solving
• Conceptual, procedural, and systemic learning of
content
• Transfer
• Situation awareness and risk assessment
• Decision making
• Self-regulation
• Teamwork
• Communication
Domain Representation
• External representation(s) of domain-specific models
• Defines the universe (or boundaries) of what is to be learned and tested
Example (math): ontologies, knowledge specifications, item specifications
Task Specifications
① Operational statement of content and behavior for the task
• Content = stimulus/scenario (what will the users see?)
• Behavior = what the student is expected to do/response (what will the users do?)
② Content limits
③ Rules for generating the stimulus/scenario posed to the student
• Permits systematic generation of scenarios with similar attributes
• Response descriptions
④ Maps user interactions to cognitive requirements
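As an illustration of rule ③, here is a minimal sketch of systematically generating scenarios with similar attributes from a specification. The attribute names and ranges are hypothetical, loosely echoing the force-and-motion example that follows.

# Sketch: generate scenarios with similar attributes from a task spec.
import random

SPEC = {
    "push_strength": ["small", "medium", "big"],  # content limits
    "object_mass_kg": (0.5, 3.0),                 # allowable variation
    "fulcrum_movable": False,                     # constraint
}

def generate_scenario(spec: dict, rng: random.Random) -> dict:
    """Sample one scenario within the spec's limits and constraints."""
    lo, hi = spec["object_mass_kg"]
    return {
        "push_strength": rng.choice(spec["push_strength"]),
        "object_mass_kg": round(rng.uniform(lo, hi), 2),
        "fulcrum_movable": spec["fulcrum_movable"],
    }

rng = random.Random(42)  # seeded for reproducible scenario sets
print([generate_scenario(SPEC, rng) for _ in range(2)])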
Force and Motion
• Pushes and pulls can have different strengths and directions.
• Pushing and pulling on an object can change the speed or direction of its motion and can start or stop it.
• Each force acts on one particular object and has both strength and direction.
• Energy: The faster a given object is moving, the more energy it possesses.
Example task specifications (two tasks):

Task 1
• NGSS performance expectation: Plan and conduct an investigation to compare the effects of different strengths of pushes on the motion of an object (K-PS2-1).
• Content limits:
– Effects: change in position; increased or decreased acceleration
– Strengths of pushes: qualitative (small, medium, big) or quantitative
– Type of motion: rotational
– Constraints on planar objects: must be something that can be pushed horizontally and attached to its fulcrum (e.g., the door to a house)
– Allowable variations on objects: mass, height and width, location of object
– Constraints on fulcrum objects: must be attached to the planar object; position of fulcrum object cannot be changed
• Targeted science and engineering practice(s): Ask questions that can be investigated based on patterns such as cause-and-effect relationships.
• Response description: Ask questions: query the MARI about the properties of the objects (e.g., what is the distance between the hinge and where I pushed) based on observed outcomes (e.g., how hard it was to push the door, or how far the door moved).
• Task complexity: Student has only 4 attempts to pass the ball to the girl and can vary only the position and strength of the push.

Task 2
• NGSS performance expectation: Analyze data to determine if a design solution works as intended to change the speed or direction of an object with a push (K-PS2-2).
• Content limits:
– Data: distance, slope, time, speed
– Speed change: increase in acceleration
– Direction: vertical movement
– Constraints on planar objects: must be something flat (e.g., book, frame, ruler) that can be placed on another object and can be pushed in a downward movement
– Allowable variations on planar objects: mass, height and width, location of object in the room, surface material
– Constraints on fulcrum objects: the structural properties of the fulcrum should support some, but not all, of the set of planar objects; position of fulcrum object can be changed
• Targeted science and engineering practice(s): Use observations to describe patterns and/or relationships in the natural and designed world(s) in order to answer scientific questions and solve problems.
• Response description: Use observations: use snapshot images of activity in the HRLA with overlaid measurement data generated by the MARI to sort situations based on physical features, behaviors, or functional roles in the design.
• Task complexity:
– Easy: Student can vary the position and strength of the push, but must apply force by placing additional objects on the planar object and pushing downward with both hands (to connect the kinesthetic experience of applying the force with hands-on experience of the object).
– Harder: Student can vary both the position and strength of the push and how the planar object is placed on the fulcrum (e.g., load is moved closer to or further from the fulcrum).
• Available resources: Iconic and graphical representations of the underlying physics laws will be on the screen and will change based on student actions. Guided questions will ask students about distance, mass, force magnitude and direction, height, and slope based on observed outcomes.
Components of Computational Model
Components of Decision Model
Courses of Action
• Do nothing: move on, end task
• Get more evidence or information: repeat same task, perform similar task, ask a question
• Intervene (instructional remediation): give elaborated feedback, worked example or added scaffolding, more supporting information
• Intervene (task modification): new task (reduced or increased difficulty), new task (qualitatively different)
Components of Decision Model
Decision Factors
• Confidence of diagnosis: How certain are we about the hypothesized causal relation?
• Consequence of misdiagnosis: What happens if we get it wrong? What are the implications of ignoring other possible states or causal relations?
• Effectiveness of intervention: How effective is the intervention we will give after diagnosis?
• Constraints: Do we have efficiency concerns with respect to time or resources?
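A minimal sketch of how these factors might combine into a decision policy. The 0-1 scales, thresholds, and action labels are hypothetical; the slides name the factors but do not prescribe a formula.

# Sketch: choose a course of action from the decision factors above.
def choose_action(confidence: float, misdiagnosis_cost: float,
                  effectiveness: float) -> str:
    """All inputs on a hypothetical 0-1 scale; returns one course of action."""
    if confidence < 0.5:
        # Uncertain about the hypothesized causal relation:
        # defer intervention and collect more evidence.
        return "get more evidence (repeat/similar task, ask a question)"
    if confidence * effectiveness <= (1 - confidence) * misdiagnosis_cost:
        # Expected benefit does not outweigh the risk of acting
        # on a wrong diagnosis.
        return "do nothing (move on, end task)"
    if effectiveness >= 0.5:
        return "intervene: instructional remediation"
    return "intervene: task modification"

print(choose_action(confidence=0.8, misdiagnosis_cost=0.3, effectiveness=0.7))
# -> intervene: instructional remediation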
Part Three
ASSESSMENT ARCHITECTURE
(YOUR EXAMPLE)
Assessment Architecture
Fixed variables (grounded in assumptions and design rationale):
• Task characteristics
• Context (test, simulation, game)
• Person (prior knowledge and experience)
Performance to be assessed:
• Observed event(s): What happened? (Raw data, scored information?)
Judgment of performance:
• Translation: What does this mean?
• Inferences: What are the potential causes of the observed events? Characteristics of the task? Context? Lack of knowledge? Not sure?
Assessment validation evaluates this chain of reasoning, from observed events through translation to inferences.
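To make the chain concrete, here is a minimal sketch of observed event, then translation, then inference. The event fields, thresholds, and labels are hypothetical illustrations.

# Sketch: observed event -> translation (judgment) -> candidate inference.
def translate(event: dict) -> str:
    """What does this mean? Score the raw observation."""
    return "incorrect" if event["errors"] > 0 else "correct"

def infer(judgment: str, attempts: int) -> str:
    """What are the potential causes of the observed events?"""
    if judgment == "correct":
        return "knowledge demonstrated"
    if attempts < 2:
        return "not sure: need more evidence (task? context? knowledge?)"
    return "likely lack of knowledge"

event = {"errors": 2}
print(infer(translate(event), attempts=3))  # likely lack of knowledge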
Potential Courses of Action
• No intervention: move on, end task
• Get more evidence or information: repeat same trial, perform similar task, ask a question
• Intervene (instructional remediation): give elaborated feedback, worked example, add scaffolding, more information
• Intervene (modify task): new task with reduced difficulty