
2012 Seventh IEEE International Conference on Wireless, Mobile and Ubiquitous Technology in Education
Evaluation of Serious Games in Mobile Platforms with QEF
QEF (Quantitative Evaluation Framework)
Paula Escudeiro
Nuno Escudeiro
Departamento de Engenharia Informática
Instituto Superior de Engenharia do Porto
Porto, Portugal
pmo@isep.ipp.pt
Departamento de Engenharia Informática
Instituto Superior de Engenharia do Porto
Porto, Portugal
nfe@isep.ipp.pt
Abstract— In this paper we present the overall evaluation of the Quantitative Evaluation Framework (QEF) approach, which has been applied in an operational teaching environment for the last year. This environment includes the development of a serious game, supported by a web platform and extended to mobile platforms, supervised by the research group GILT (Graphics, Interaction & Learning Technologies) under the frame of an international project. The serious game that has been developed is called "Time Mesh". Time Mesh is a funded quasi-experimental educational software project being developed under the frame of a quality evaluation environment, the Quantitative Evaluation Framework (QEF) (Escudeiro Paula and Bidarra José, 2006), which measures system quality throughout its development life cycle. The quality evaluation process started with a careful planning phase, which defined the purpose of the evaluation, its timing, and who should conduct the evaluation process. Moderating the development of "Time Mesh" with QEF assures the quality of the final product.
Keywords—Assessment; Quantitative Evaluation; Methodology; Serious Game; Mobile Learning

I. INTRODUCTION

Developing serious games is a very demanding process requiring efficient control activities to detect and correct failures at an early stage. By monitoring the quality of the product along its life cycle we are able to prevent the majority of its failures. To evaluate serious games we propose QEF, the Quantitative Evaluation Framework, a generic quality evaluation framework. This framework may also be applied in other settings.

"Time Mesh" is a serious game being developed under SELEAG, a Comenius project funded by the European Union. The objective of the SELEAG project is to evaluate the added value of serious games in learning history, culture and social relations.

Time Mesh is a collaborative and social game platform for sharing and acquiring knowledge of the history of European regions. It is an extensible, online, multi-language, multi-player platform. This serious game involves students/players in the evolution of three different but interrelated European places over the last 600 years in their social, cultural and economic aspects.

The notion of European citizenship is brought to the learners/players through their interactions, either in real worlds (students in different countries interacting through the platform) or in fictional ones (game scenarios). By understanding the shaping of Europe as a result of history, students become familiar with the formation and evolution of its countries and regions. This understanding is expected to help students identify and respect cultural diversity. Furthermore, the project's geographical coverage of Europe brings its multicultural and multi-linguistic aspects into play. Time Mesh, the serious game, was extended to mobile learning platforms. The QEF framework was used to assess not only the final product but also its quality during the development life cycle.

978-0-7695-4662-9/12 $26.00 © 2012 IEEE
DOI 10.1109/WMUTE.2012.65

II. QEF - THE ASSESSMENT METHODOLOGY
A simple question for any educational software should be: "Can this product actually teach what it is supposed to?" It is a simple question to ask, but often difficult to answer, because the product may have so many beguiling features. Answering it requires the evaluator to recognize his/her own view of the way in which students learn, to relate that view to the learning objectives, and to determine how and whether those objectives are carried out in the software.
The application of QEF throughout the development life
cycle of “Time Mesh”, in both Web and mobile platforms,
highlights the flaws that are present in the current version at
the time of evaluation allowing the development team to
focus on those flaws guiding the product to achieve the
desirable requirements.
Moderating the development of “Time Mesh” with QEF
assures the quality of the final product.
QEF evaluates the educational software quality (ISO
9126 is the standard of reference) [Scalet et al, 2000] in a
three dimensional space. Every dimension aggregates a set of
factors. A factor is a component that represents the system
performance from a particular point of view. The dimensions
of our Cartesian quality space are: Functionality (F);
Efficiency (E) and Adaptability (A).
The quality of a given system is defined in this tri-dimensional Cartesian quality space and measured, as a percentage, relative to a hypothetically ideal system represented in the quality space by the coordinates (1, 1, 1). The quality space aggregates, in its dimensions (Functionality, Efficiency and Adaptability), a set of factors that measure the relevant characteristics of the object being evaluated.

The Functionality dimension reflects the characteristics of the educational software related to its operational aspects. It aggregates two factors: ease of use and content quality.

The Efficiency dimension aggregates, in the case of educational software, four factors: audiovisual quality; technical and static elements; navigation and interaction; originality and use of advanced technology. Through this dimension we measure the system's ability to present different views on its content with minimum effort.

The Adaptability dimension is the aggregation of five factors: versatility; pedagogical aspects; didactical resources; stimulation of initiative and self-learning; cognitive effort of the activities. Through them we measure to what extent the scenario and system content are efficacious, that is, whether they are focused and able to present different instructional design theories and different learning environments in a common platform.

The coordinates of a given system in the quality space may be obtained through one of several aggregation forms. We compute each coordinate as the average of the factors that contribute to it; the average is simple and gives the same relevance to all factors.

QEF has previously been applied to control the quality of several products throughout their life cycle with very good results.
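The simple-average aggregation described above can be sketched as follows. This is an illustrative example, not the authors' implementation; the factor names and the scores are invented:

```python
# Aggregate factor scores (each in [0, 1]) into a QEF dimension
# coordinate by simple average, giving every factor the same relevance.
def dimension_coordinate(factor_scores):
    """Unweighted average of the factor scores of one dimension."""
    return sum(factor_scores.values()) / len(factor_scores)

# Invented example scores for two of the three dimensions.
functionality = {"ease_of_use": 0.90, "content_quality": 0.80}
efficiency = {"audiovisual_quality": 0.70, "technical_static": 0.85,
              "navigation_interaction": 0.95, "originality": 0.60}

print(dimension_coordinate(functionality))  # 0.85
print(dimension_coordinate(efficiency))     # 0.775
```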
III. QUALITY DIMENSIONS

Quality dimensions are based on the following factors:

Table 1. Educational software requirements for the Functionality dimension

Dimension: Functionality
  Factor: Ease of use
    R1: Does the student use the educational software without having to read the manuals exhaustively?
    R2: Does an on-line help system exist to help the user overcome difficulties?
  Factor: Content quality
    R8: Is the information well structured and does it adequately distinguish the objectives, context, results, multimedia resources...?
    R9: Is the content validated? Is it free of orthographic errors?
    R10: Have the alert messages been checked? Are they free of pervasive or negative messages and of racial or religious discrimination?
    R11: Is the content related to situations and problems of the students' interest?
    R12: Are examples, simulations and graphs part of the system?

Table 2. Educational software requirements for the Adaptability dimension

Dimension: Adaptability
  Factor: Versatility
    R3: Is the educational software easily integrated with other educational environments?
    R4: Does it allow for configuration (level, number of users on-line, language...)?
    R5: Does it include an evaluation system during the development process?
  Factor: Pedagogical aspects
    R18: Does it allow for new techniques and better learning?
    R19: Does it allow for activities that keep the curiosity and interest of the students in the content, without provoking anxiety?
  Factor: Didactical resources
    R20: Does it provide different activity types, concerning knowledge acquisition, that allow for different forms of using the system?
  Factor: Stimulates initiative and self-learning
    R21: Does it provide help for students such as tutoring actions, guiding activities and reinforcements?
    R22: Does it allow for students' decisions concerning the tasks to carry through, the choice of study module and the study of subject matter?
  Factor: Cognitive effort of the activities
    R23: Does it allow for easy memorization, interpretation, synthesis and experimentation?
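One possible way to encode such requirement tables in software is shown below. This is purely illustrative (the requirement weights and fulfilment values are invented), using the weighted-fulfilment rule that QEF applies per factor:

```python
# Each factor maps to a list of (id, pr, pc) triples, where pr is the
# requirement weight and pc its fulfilment percentage in [0, 1].
# Requirement IDs follow Table 1; weights and fulfilments are invented.
requirements = {
    "ease_of_use": [
        ("R1", 1.0, 0.9),   # usable without reading manuals exhaustively
        ("R2", 1.0, 1.0),   # on-line help system exists
    ],
    "content_quality": [
        ("R8", 2.0, 0.8),   # information well structured
        ("R9", 1.0, 1.0),   # content validated, no orthographic errors
    ],
}

def factor_score(reqs):
    """Weighted fulfilment: sum(pr * pc) / sum(pr)."""
    total_weight = sum(pr for _, pr, _ in reqs)
    return sum(pr * pc for _, pr, pc in reqs) / total_weight

print(factor_score(requirements["content_quality"]))  # (2*0.8 + 1*1.0)/3 ≈ 0.867
```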
Table 3. Educational software requirements for the Efficiency dimension

Dimension: Efficiency
  Factor: Audiovisual quality
    R6: Is there no excess of information?
    R7: Has it a rigorous scenario design which includes title, menus, video, sound, photos, metaphor and color rules?
  Factor: Technical and static elements
    R13: Does the educational software have a good program structure that allows easy access to content and activities?
    R14: Is the speed of communication between the program and the user (animation, presentation of contents, reading of data...) adequate?
    R15: Is the program execution efficient and free of operational errors?
  Factor: Navigation and interaction
    R16: Is the navigation system transparent, allowing the user to control actions?
  Factor: Originality and use of advanced technology
    Has the system been developed with originality?

Each factor is evaluated by:

factor = ( Σ_m pr_m × pc_m ) / ( Σ_m pr_m )

where m ranges over the M valid requirements of the factor, pr_m is the weight of requirement m, and pc_m is the fulfilment percentage of requirement m.

For each system being developed we have to identify the importance p_n of each factor to its dimension. The dimension coordinate is then computed as the weighted mean of these factors:

Dim = Σ_n ( p_n × factor_n ), with Σ_n p_n = 1 and p_n ∈ [0, 1]

where n is the number of relevant factors for the dimension.

The quality of a system is measured from the distance between the ideal system (IS, the projected system), placed at the coordinates (1, 1, 1) of the quality space with axes f, e and a, and the real system (RS, the final system). The dissimilarity D between the system under evaluation and the ideal system is given by:

D = √( Σ_j ( 1 − Dim_j / 100 )² )

Finally, the quality of the system is computed as:

Q = 1 − D / √n ∈ [0, 1], or Q = ( 1 − D / √n ) × 100 ∈ [0, 100]

where n is here the number of dimensions. The system quality is in inverse proportion to the distance between the Ideal System (IS) and the Real System (RS): if D = 0 then Q = 1, and if D reaches its maximum, D_max = √n, then Q = 0.
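Putting the formulas together, a minimal sketch of the final quality computation (assuming n = 3 dimensions scored as percentages, not the authors' own tooling) could look like this:

```python
import math

# QEF quality: each dimension coordinate Dim_j is a percentage in
# [0, 100]; D is the Euclidean distance to the ideal system
# (100, 100, 100) in normalized units, and Q = 1 - D/sqrt(n).
def qef_quality(dims):
    n = len(dims)  # n = 3: Functionality, Efficiency, Adaptability
    d = math.sqrt(sum((1 - dim / 100) ** 2 for dim in dims))
    return (1 - d / math.sqrt(n)) * 100  # quality as a percentage

print(qef_quality([100, 100, 100]))      # ideal system: 100.0
print(qef_quality([0, 0, 0]))            # worst case: 0.0
print(round(qef_quality([85, 77.5, 90]), 1))
```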
270
Authorized licensed use limited to: b-on: Instituto Politecnico do Porto. Downloaded on September 29,2023 at 09:12:57 UTC from IEEE Xplore. Restrictions apply.
We say that the system quality is q%, which means that the system is able to perform q% of its initial specifications.

IV. CONCLUSIONS

In this work we described the use of a method to quantitatively assess the quality of a given educational system supported on Web and mobile platforms.

Quality evaluation frameworks, like the one we propose, are crucial to help validate educational systems and to ensure that they are adequate and follow their original specifications before they are used in the learning environment.

REFERENCES

[1] A.B. Smith, C.D. Jones, and E.F. Roberts, "Article Title", Journal, Publisher, Location, Date, pp. 1-10.
[2] Dias Figueiredo, A., Context Engineering: and its Development Research Agenda, Universidade de Coimbra, 2005.
[3] Alessi, S. and Trollip, S., Computer Based Instruction: Methods and Development, Prentice Hall, Englewood Cliffs, New Jersey, USA, 1985.
[4] Bates, A. W. Tony, Managing Technological Change: Strategies for College and University Leaders, San Francisco, 2000.
[5] Bloom, B. S., Mesia, B. B., and Krathwohl, D. R., Taxonomy of Educational Objectives: The Affective Domain & The Cognitive Domain, David McKay Co Inc, New York, 1964.
[6] Booch, G., Object Oriented Analysis and Design With Applications, 2nd Edition, Benjamin/Cummings, Menlo Park, California, 1994.
[7] Gery, G. J., Making CBT Happen, Weingarten, Boston, 1994.
[8] Clark, R. E., "Media will never influence learning", Educational Technology Research and Development, 1994.
[9] Coad, P. and Yourdon, E., OOA - Object Oriented Analysis, 2nd Edition, Prentice Hall, Englewood Cliffs, New Jersey, 1991.
[10] Crossley, K. and Green, L., Le Design des Didacticiels: Guide Pratique pour la Conception de Scénarios Pédagogiques Interactifs, ACL-Editions, Paris, France, 1990.
[11] Darby, M. R., Zucker, L. G., and Armstrong, J., "Commercializing Knowledge: University Science, Knowledge Capture, and Firm Performance in Biotechnology", Management Science, 48, pp. 138-153, 2002.
[12] Eckerson, W. W., "Three Tier Client/Server Architecture: Achieving Scalability, Performance, and Efficiency in Client Server Applications", Open Information Systems 10, 1 (January 1995): 3(20).
[13] Escudeiro, P. and Bidarra, J., "X-TEC: Techno Didactical Extension for Instruction/Learning Based on Computer - A new development model for educational software", WEBIST 2006, Setúbal, Portugal, 2006.
[14] Finch, J., Research and Social Policy, Falmer, 1986.
[15] Gagné, R. M. and Medsker, K. L., The Conditions of Learning: Training Applications, Harcourt Brace & Company, Florida, 1996.
[16] Harland, J., "Evaluation as realpolitik", in Scott, D. and Usher, R. (Eds), Understanding Educational Research, Routledge, 1996.
[17] Jacobson, I., Object Oriented Systems Engineering, Addison-Wesley, 1992.
[18] Keller, M. and Back, A., Blended-Learning-Projekte im Unternehmen, Learning Center der Universität St. Gallen, St. Gallen, 2004.
[19] Laurillard, D., Rethinking University Teaching, 2nd Edition, Part II, "Analysing the Media for Learning and Teaching" (pp. 79-172), Routledge/Falmer, London and New York, 2002.
[20] Merrill, M. D., Kowallis, T., and Wilson, B. G., "Instructional design in transition", in F. Farley and N. Gordon (Eds), Psychology and Education: The State of the Union, 1981.
[21] Minken, I., Stenseth, B. and Vavik, L., Educational Software, ULTIMA-Gruppen A/S, Halden, Norway, 1998.
[22] Oliver, M., "Evaluating online teaching and learning", Information Services and Use, vol. 20 (2/3), 2000.
[23] Pressman, R. S., Software Engineering: A Practitioner's Approach, 5th Edition, McGraw-Hill Companies Inc, 2001.
[24] Valiathan, P., ASTD - Linking People, Learning & Performance, Learning Circuits, American Society for Training & Development, 2005.
[25] Rumbaugh, J., Blaha, M., Premerlani, W., Eddy, F., and Lorensen, W., Object Oriented Modeling and Design, Prentice Hall, Englewood Cliffs, New Jersey, 1991.
[26] Scalet, S. et al., "ISO/IEC 9126 and 14598 integration aspects", The Second World Congress on Software Quality, Yokohama, Japan, 2000.
[27] Timmers, P., Electronic Commerce: Strategies and Models for Business-to-Business Trading, Wiley, 2000.
[28] Yourdon, E., Managing the System Life Cycle, 2nd Edition, Yourdon Press/Prentice Hall, Englewood Cliffs, New Jersey, 1998.