Psychology, Pedagogy,
and Assessment in
Serious Games
Thomas M. Connolly
University of the West of Scotland, UK
Thomas Hainey
University of the West of Scotland, UK
Elizabeth Boyle
University of the West of Scotland, UK
Gavin Baxter
University of the West of Scotland, UK
Pablo Moreno-Ger
Universidad Complutense de Madrid, Spain
A volume in the Advances in Game-Based
Learning (AGBL) Book Series
Published in the United States of America by
Information Science Reference (an imprint of IGI Global)
701 E. Chocolate Avenue
Hershey PA 17033
Tel: 717-533-8845
Fax: 717-533-8661
E-mail: cust@igi-global.com
Web site: http://www.igi-global.com
Copyright © 2014 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in
any form or by any means, electronic or mechanical, including photocopying, without written permission from the publisher.
Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or
companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark.
Library of Congress Cataloging-in-Publication Data
British Cataloguing in Publication Data
A Cataloguing in Publication record for this book is available from the British Library.
All work contributed to this book is new, previously-unpublished material. The views expressed in this book are those of the
authors, but not necessarily of the publisher.
For electronic access to this publication, please contact: eresources@igi-global.com.
CIP Data – Pending
ISBN 978-1-4666-4773-2 (hardcover)
ISBN 978-1-4666-4774-9 (ebook)
ISBN 978-1-4666-4775-6 (print & perpetual access)
This book is published in the IGI Global book series Advances in Game-Based Learning (AGBL) (ISSN: 2327-1825;
eISSN: 2327-1833)
Managing Director: Lindsay Johnston
Production Manager: Jennifer Yoder
Publishing Systems Analyst: Adrienne Freeland
Development Editor: Allyson Gard
Acquisitions Editor: Kayla Wolfe
Typesetter: Christina Barkanic
Cover Design: Jason Mull
Chapter 17
DOI: 10.4018/978-1-4666-4773-2.ch017
A Brief Methodology for
Researching and Evaluating
Serious Games and Game-
Based Learning
ABSTRACT
In this chapter, the authors present a methodology for researching and evaluating Serious Games
(SG) and digital (or other forms of) Game-Based Learning (GBL). The methodology consists of the
following elements: 1) frame-reflective analysis; 2) a methodology explicating the rationale behind a
conceptual-research model; 3) research designs and data-gathering procedures; 4) validated research
instruments and tools; 5) a body of knowledge that provides operationalised models and hypotheses;
and 6) professional ethics. The methodology is intended to resolve the dilemma between the “general-
ity” and “standardisation” required for comparative, theory-based research and the “specificity” and
“flexibility” needed for evaluating specific cases.
Igor Mayer
Delft University of Technology, The Netherlands
Geertje Bekebrede
Delft University of Technology, The Netherlands
Harald Warmelink
Delft University of Technology, The Netherlands
Qiqi Zhou
Delft University of Technology, The Netherlands
INTRODUCTION
The growing interest in digital and other forms of
game-based learning (GBL), serious games and
simulation gaming (both abbreviated as SG) is
accompanied by an increasing need to know the
effects of what we are doing and promoting (Mayer,
Bekebrede et al., 2013; Mayer, Warmelink, &
Bekebrede, 2012). Meeting this need requires
proper methods, tools and principles that can be
agreed upon, validated and applied by the frag-
mented GBL and SG communities. In other words,
we must move towards a ‘science of game-based
learning’ (Sanchez, Cannon-Bowers, & Bowers,
2010). Considerable efforts and resources are cur-
rently being devoted to researching and evaluating
SG and GBL, thereby increasing both the number
and the quality of such evaluations (see discus-
sion below). Considerable weaknesses remain,
however, including the following:
• A lack of comprehensive, multipurpose
frameworks for comparative, longitudi-
nal evaluation (Blunt, 2006; Meyer, 2010;
Mortagy & Boghikian-Whitby, 2010;
Vartiainen, 2000.)
• Few theories with which to formulate and
test hypotheses (Mayer, 2005; Noy, Raban,
& Ravid, 2006.)
• Few operationalised models with which to
examine ‘causal’ relations (e.g. in structur-
al equation models) (Connolly, Stansfield,
& Hainey, 2009; Hainey, 2011; Hainey &
Connolly, 2010.)
• Few validated questionnaires, constructs
or scales, whether from other fields (e.g.
psychology) or constructed especially for
SG and GBL (Boyle, Connolly, & Hainey,
2011; Brockmyer et al., 2009; Mayes &
Cotton, 2001.)
• A lack of proper research designs that can
be used in dynamic, professional learning
contexts, beyond the less preferable randomised
controlled trials (RCTs), which are impractical,
unethical or uncommon in almost every domain
except medicine, therapy and related fields
(Connolly, Boyle, MacArthur, Hainey,
& Boyle, 2012; Kato, Cole, Bradlyn, &
Pollock, 2008; Knight et al., 2010; Szturm,
Betker, Moussavi, Desai, & Goodman,
2011; van der Spek, Wouters, & Van
Oostendorp, 2011; van der Spek, 2011.)
• The absence of generic tools for unobtru-
sive (‘stealth’) data-gathering and assess-
ment in and around SG (Kickmeier-Rust,
Steiner, & Albert, 2009; Shute, Masduki,
& Donmez, 2010; Shute, Ventura, Bauer,
& Zapata-Rivera, 2009; Shute, 2011.)
In short, despite a promising increase in
publications, methods, tools and findings, we
continue to lack an overarching methodology for
SG research. The fragmented SG and GBL com-
munity faces the enormous challenge of aligning
itself in order to evaluate and research gaming
for learning in a comparative, systematic fashion
using procedures, frameworks and methods that
can be validated, checked and reproduced. It is
valuable to identify the effects of playing games
through randomised, controlled trials (RCT), many
of which involve students in laboratories. Never-
theless, it is also essential to identify the effects
of GBL and SG in uncontrolled circumstances
and within the context of objectives that truly
matter for real-life performance (e.g. emergency
management, leadership), as is usually the case
in professional learning and training. The chal-
lenge is to gather data on the quality, application
and outcomes of a broad range of SG on different
topics and with different objectives, within and for
different institutional contexts, at different times
and under uncontrolled conditions.
Research Objective
The objective of this chapter is to present a con-
densed methodology for researching and evaluat-
ing SG and GBL. The methodology comprises a
framework, conceptual models, research designs,
data-gathering techniques, hypothesis formula-
tion, directions for developing and using evaluation
constructs and scales, and procedures for testing
structural equation models. The methodology is
derived from the results of a long-term research
project (2005-2013) involving the systematic,
uniform and quantitative evaluation of several
hundred SG sessions with 12 SG, aimed at creating
a dataset containing information on 2 500 respon-
dents in higher education or work organisations.
Our ambition is to resolve the dilemma between
the generality and standardisation required for
comparative, longitudinal, theory-based research
and the specificity and flexibility needed for
evaluating singular cases.
IN SEARCH OF A METHODOLOGY
A social scientific discipline of SG research should
comprise the following elements (see Figure 1):
• Frame-Reflective Analysis: The multiple,
often conflicting ways in which we per-
ceive and discuss SG and GBL (Chong &
Druckman, 2007; Shaffer, 2006; Squire,
2001.)
• Methodology: The rationale and prin-
ciples upon which a conceptual model for
SG and GBL research is founded (Lawson
& Lawson, 2010; Mackenzie & Knipe,
2006.)
• Research Designs and Data-Gathering
Procedures: What works, why and when?
(De Vaus, 2001; Schneider, 2005.)
• Validated Research Instruments and
Tools: Questionnaires, surveys and instru-
ments for logging and tracking SG and
GBL, including their validation, for SG
and GBL (Boyle et al., 2011; Brockmyer
et al., 2009; Chertoff, Goldiez, & LaViola,
2010; Egenfeldt-Nielsen, 2006a; Mayes &
Cotton, 2001; Wright, 2009.)
• A dynamic body of knowledge identifying
the state of the art and knowledge gaps lead-
ing to research questions, operationalised
models, hypotheses for testing (Janssen &
Klievink, 2010; Ma, Williams, Prejean, &
Richard, 2007; Raphael, Bachen, Lynn,
Baldwin-Philippi, & McKee, 2009; Young
et al., 2012.)
• Professional ethics for SG researchers,
professionals and users (Babbie, 2007;
Chandler & Torbert, 2003; Kimmel, 1988.)
There is a great need for such a discipline, for
the following reasons:
• Accountability: ‘Users’ (clients, players,
learners) are increasingly becoming ex-
posed to and familiar with SG. They have
the right to know what they are actually
buying, using or playing, and they are en-
titled to insight into the reasons for and the
effects or consequences of the application
of SG and gamification. We also expect
that users will become more demanding,
critical and sceptical.
• Responsibility: This is the opposite of ac-
countability. A discipline that advocates
the use of SG and gamification to repair
a broken reality (McGonigal, 2011), es-
pecially in contexts involving vulnerable
groups in society (e.g. children, patients,
immigrants), has a great responsibility to
engage in critical reflection on the short-
term and long-term value and structural
consequences of the gamification tools that
they are developing, promoting and using.
In this chapter, we discuss the design of the
methodology with regard to the aforementioned
six blocks of methodology (see Figure 1.)
FRAME-REFLECTIVE ANALYSIS
For any emerging science or methodology, it is
necessary to provide a clear specification of the
locus and focus of its research. One way to ac-
complish this is to provide clear definitions and/
or develop classifications and taxonomies (Göbel,
2010; Jantke, 2010; Krathwohl, Bloom, & Betram,
1973; Maier & Grössler, 2000; Sawyer, 2008).
Words can have different meanings, depending
upon the disciplinary backgrounds in which they
are used, the interests upon which they are based
and other aspects. ‘Learning’, ‘serious games’,
‘effectiveness’, ‘evaluation’, ‘research’ and similar
concepts can easily become a source of confusion
or disagreement for different sub-communities.
Definitions or taxonomies are generally problem-
atic, however, especially in newly emerging and
highly interdisciplinary research disciplines like
SG, for the following reasons:
•	‘Anti-essentialists’ have argued that
‘games’ cannot be defined (Rockwell &
Kee, 2011; Wittgenstein, 1953.)
• Definitions and taxonomies of SG and
GBL are political. They have a tendency
to exclude diverging views from access to
resources (e.g. funding and publication).
• Definitions and taxonomies kill innova-
tion. By definition, new combinations
(Schumpeter’s neue Kombinationen) do not
fit into any of the boxes of taxonomy.
• Definitions and taxonomies reify. They
focus on games as things and not on ex-
periences. Reification does not promote
a critical discourse about the underlying
worldviews and assumptions.
In our view, framing theory (Benford & Snow,
2000; Fisher, 1997; Giddens, 1988; Scheufele,
Iyengar, Kenski, Hall, & Eds, 2002) and frame-re-
flective (discourse) analysis (Adams & Goffman,
1979; Hart, 2008) can provide some necessary
foundations for an emerging scientific discipline of
SGs research. As noted above, words can take on
different meanings when used in different contexts,
possibly generating confusion or disagreement
within research communities. Framing is the act
of attributing meaning to events and phenomena.
It is a way in which to create order out of chaos
by providing critical analysis of the multiple,
often conflicting ways in which we perceive and
discuss particular topics, including the utility of
games in society, business and politics. In addition
to definitions and taxonomies, it is important to
develop a better understanding of the frames that
people construct and use when addressing and
answering the following questions:
Figure 1. Building blocks for a science of SG and GBL
• Does frequent playing of digital games
affect leadership styles, preferences for
team collaboration, motor skills and other
characteristics?
• Are games effective as interventions de-
signed for change or learning (e.g. at
school, work or in therapy, healthcare or
the military)?
• How is innovation achieved in the game in-
dustry, and what can other businesses learn
from these practices (Kim & Kim, 2011)?
• Can games promote (sell) products and
services or influence ideas and beliefs?
• Can the use of game principles or game
technology help societal and political com-
munities and institutions to organise them-
selves or improve their self-organisation
(Zichermann & Linder, 2010)?
Frame Analysis
A frame is “an instrument for defining reality”,
as opposed to “an instrument for describing real-
ity” (Donati 1992 in Fisher, 1997: 5.4). Frames
are neither mutually exclusive nor easily suited
to specific individuals. They exist in parallel, and
many researchers (including the authors) implicitly
or explicitly switch frames or adopt several frames
simultaneously. Other constructs closely resemble
frames, including ‘lenses’ and ‘belief systems’.
In sociology, frame analysis (FA) originates in
Goffman’s (1974) sociological theory and studies
on the organisation of experiences:
I assume that definitions of a situation are built
up in accordance with principles of organization
which govern events – at least social ones – and
our subjective involvement in them; […] that is
my definition of a frame. […] frame analysis is a
slogan to refer to the examination of the organiza-
tion of experience (Goffman, 1974:11).
In other words, frame analysis is a reflection
on “how people understand an issue, and to track
the way in which this understanding changes over
time” (Fisher, 1997, 6.2).
One appropriate question therefore concerns
the dominant frames through which researchers
currently view and discuss the utility of games for
society, business and politics. Another question
concerns how framing lays the foundation for a
methodology of SG research. In the following
subsections, we discuss:
• Frames on SG.
• Frames on learning.
• Frames on research.
Frames on SG
Any science of SG and any methodology for SG
research should reflect upon the assumptions
underlying its ontology (being) and epistemology
(knowing). A detailed discussion of the philosophy
of science would obviously exceed the scope of
this chapter. For our purposes, we need only define
two ‘drivers’ with which to construct four frames
on the utility of games (see Table 1.)
1. Whether the world as we know it is more
likely to be real (ontological realism) or
constructed (ontological idealism): If the
world is real, we are more likely to be able
to observe it, measure it and come as close
as possible to understanding it as it really
is. If it is rooted in our ideas (mind), we
Table 1. Four frames
can only explore and try to understand our
relationship to the world as we think it is,
expanding our understanding through inter-
action with others who may think differently
(phenomenology).
2. How we consider change in the world (and in
‘ourselves’ within it): If we assume that the
subject (‘I’/‘we’) can exercise some degree
of control in changing its environment, we
acknowledge ‘interventionism’. We then as-
sume that we can ‘decide’ (build, construct,
repair, steer) parts of the world in which we
live as we see fit. If we assume that actual
change is less the creation of one or several
individuals than it is the emergent result of
various intentional and unintentional forces
within a system, we accept a type of ‘evo-
lutionism’ or ‘determinism’. The system is
assumed to influence the subjects to a much
greater extent than the subject can influence
the system.
We thus construct a two-dimensional space in
which we place four frames on the utility of games
or SG (see Table 2). We discuss these frames in
the sections below.
•	SG as Tool, Therapy: This frame covers
the majority of, and the most frequently cited,
examples of SG for a wide range of purposes
Table 2. Frame-reflective discourse analysis
(e.g. therapy, education, health, decision-
making, training). Through this frame, we
see a ‘thing’ that can be measured, indexed
and taxonomised. In other words, we see a
‘tool’ that might or might not work (De
Caluwé, Hofstede & Peters, 2008). The
language in this frame is pervaded with
words like ‘effectiveness’, ‘efficacy’, ran-
domised controlled trials (RCTs) and ‘evi-
dence-based’. The tool itself is measured in
terms of ‘metrics’ and its effects in terms
of ‘analytics’. Especially within the context
of health, it is treated as a new type of ther-
apy, the effectiveness of which must be as-
sessed in clinical trials (Fernández-Aranda
et al., 2012). Research revolves around the
question of whether games offer a more ef-
fective tool for learning, education, health
and training. Proponents do their best to
prove it and understand how that works.
Opponents might argue that game-play
does not work, that there is inconclusive
evidence or even that it has countervailing
effects, like addiction (see Table 3).
Watching the Healseeker game
(Healseeker|Yulius Academie, n.d.) and
reading its documentation, it is interesting
to note how the words used by the design-
ers, researchers and sponsors reflect a med-
ical frame (Kato et al., 2008), directed to-
wards the search for scientific evidence
that patients (in this case, young children
with ADHD) might improve by playing the
game.
• SG as Creative Innovation: From this
frame, we see SGs as a part of evolution-
ary change, and especially as a significant
factor in the competitive race among na-
tions, regions, companies and even indi-
viduals. The argument in this frame is that
the phenomenon of digital games is built
upon highly competitive business models
that might be more suitable for Society 2.0
and that the games are surrounded by tech-
nological innovation, creativity and other
processes that could generate a competitive
advantage in design, production and or-
ganisation (Nieborg, 2011; Schrage, 2000).
Failure to use game technology, game prin-
ciples or related resources comes close to
stepping out of the race. The arguments of
a great many policymakers and business
leaders are derived from within this frame,
promoting SG as ‘a way to the future’ or
‘a chance for innovation’. Watching and
reading the case of the FORD virtual real-
ity factory (Virtual Reality at Ford Motor
Company, n.d.), it is interesting to note that
it is presented as an almost unavoidable
and self-evident innovation. If the compa-
ny does not go virtual, others will, and the
company will lose its competitive advan-
tage. Whereas games like Healseeker are
aimed at curing and repairing that which
is broken, games within this frame aim to
build a new future. Many examples from
national or EU policy documents (e.g. on
the creative industries or innovation pol-
icy) can illustrate how this frame colours
the ways in which policymakers interpret
SG. Obvious criticisms concern the rela-
tive novelty, validity and uniqueness of this
view on games as ‘creative innovation’ or
‘competitive advantage’. Research issues
revolve around understanding the prin-
ciples of creativity and innovation in and
around games (and the game industry) and
finding ways to utilise them. Counterarguments
might assert that the political-economic
support for the creative-game industry pro-
motes incumbent winners while eliminat-
ing true innovators and entrepreneurs.
• SG as Persuasion: From this frame, we
see the world as engaged in a power strug-
gle of beliefs and ideas. Games are a pow-
erful new means of communication, and an
even more powerful means of persuasion
and rhetoric (Bogost, 2007a). This can be
used for selling products or services (e.g.
advergames, many forms of gamification,
games for branding), as well as for effecting
change in social behaviour (e.g. bullying
prevention) or political ideas. Examples of
such SG are numerous. Some of these SG,
like September 12 (Frasca, 2007), are well-
known and have put a mark on the debate
about SG. Many others (e.g. the Wikileaks
examples) are known only within small
communities. The vast majority have sim-
ple, non-engaging game-play, although
their procedural rhetoric remains very
clear and strong (Bogost, 2007). The de-
velopment of relatively complicated games
has been driven by a few large institutions
and companies, including America’s Army
(AA; Nieborg, 2004) and €conomia (€co-
nomia, n.d.). In our view, the case of PING
(Poverty Is Not A Game (PING), n.d.) falls
somewhere between Frame I and Frame
III. In its presentation, however, it contains
much of the rhetoric of intervened social
change (i.e. making children aware of pov-
erty). Although researchers have investi-
gated the types of ideas that are expressed
through games, most studies focus primar-
ily on how discourses in society respond
to such games and ideas, on whether and
how they influence the discourse in society
or certain communities, and on how this
works.
• SG as Self-Organisation: Through this
frame, we see games as part of an evolution
in society and cultures at large. Adherents
argue that we are witnessing the ludifica-
tion (Raessens, 2006, 2009) of cultures,
due to the growing pervasiveness of digi-
tal games, especially amongst the younger
generation. Ludification (or gamification)
affects the ways in which people organise
and interact in everyday life (e.g. in social-
political-cultural life or at work). For many,
this cultural change might be subtle, slow
and unnoticed. It might also become sub-
merged in self-organising communities on
the web or in our efforts to gamify science
by using games to organise crowd sourc-
ing or political participation. One of the
best examples of SG as self-organisation
is Foldit (Cooper et al., 2010). Although
some researchers attempt to explain ludi-
fication within this frame, most try to find
and exploit game principles for self-organ-
isation as part of gamification (A World
without Oil, McGonigal, 2011). Critics
might argue that ludification and gamifica-
tion could potentially create a new divide
based upon access to and literacy in digital
games. Furthermore, a wide range of ethi-
cal questions exists with regard to the use
of games for self-organisation (e.g. in the
work place.)
Frames on Learning
There are numerous frames (theories, models)
for learning. Although space limitations prevent
us from reconstructing them in this chapter, SG
research reflects a strong bias towards ‘individual’,
‘educational’ learning. We would like to draw at-
tention to the fact that games are connected to other
forms of learning (not restricted to education),
including the various forms of learning systems.
System change can be brought about by uncon-
trollable, external factors (e.g. disasters, crises)
or as the result of internal, intentional processes
(i.e. learning) that are intended to improve control
over the system’s environment. The first type of
change can trigger the second; failure to learn can
cause the first.
Systems and system boundaries can obviously
be established in different ways and at differ-
ent levels (e.g. individual, group, organisation,
society). In other words, individuals are not the
only parties that learn; groups, organisations and
societies learn as well. Many theories have been
developed on group, organisational and system
learning (Argyris & Schön, 1974; Argyris, 1977;
Meadows, 1999; Senge, Roberts, & Ross, 1994;
Senge, 1990; Sweeney & Meadows, 2001).
Individuals (e.g. students) are thus not the only
parties that can learn from playing games; systems
and organisations can also learn through playing
and gaming (Schrage, 2000). For this reason, SG
can be used as deliberate interventions to change
the performance of groups, organisations or sys-
tems (Duke & Geurts, 2004; Mayer & Veeneman,
2002). They can also be used to foster public
awareness and critical discourse concerning par-
ticular issues (Bogost, 2007; Rebolledo-Mendez,
Avramides, & de Freitas, 2009.)
The emerging discipline of GBL is roughly
segregated into ‘SG for education’ and ‘simula-
tion games for societal and organisational change’.
Relationships between the individual and system
levels of GBL and SG are insufficiently understood
(Argyris, 1977, 1982, 1995; Berends, Boersma,
& Weggeman, 2003; Simon, 1991.)
Gaming and associated forms of playful ex-
perimentation (e.g. simulations, policy exercises)
can be viewed as built-in mechanisms with which
systems can prepare for or cope with change. They
offer a way for individuals/systems to determine
whether change is needed, whether this would be
a time to change, what would happen if the change
were to take place and whether they are capable
of changing when necessary.
For systems, gaming is adaptation, although
this adaptation has no intrinsic morality. Whether
the adaptation should be regarded as ‘good’ or
‘bad’ depends upon the individual’s value frame-
work. A person can ‘learn’ to be an effective serial
killer; the state can ‘learn’ to become an effective
dictatorship. Institutionalised norms and values
obviously do indicate the value of learning (e.g.
through self-actualisation, progress, achievement,
commercial success.)
Although SG is commonly associated with
‘education’, education is a changing system in
itself. It is a subsystem institutionalised within
societies in order to manage change in learning.
The effectiveness of SG and GBL (an issue in
Frame I) can be considered in different ways:
1. GBL at the individual, group, organisational,
network or system level.
2. GBL in formal education, as well as out-
side formal education (e.g. professional or
post-academic training), as with on-the-job
training, learning within the context of
management, decision-making, planning or
politics.
3. GBL by different actors in and around the
game (e.g. the actual player, the spectator, the
designer, the analyst): GBL is not restricted
or limited to those who actually play the
game.
GBL is thus much wider and richer than the
dominant frame of GBL as games in/for education
suggests (Wu, Hsiao, Wu, Lin, & Huang, 2011)
(see Table 3.)
Table 3. Evaluation and learning at three levels

•	System level: Analytical (learning from a game): (ex ante) validation, assessment of a complex system. Learning (learning during a game): enhancement of learning systems, organisations. Instrumental (development, training through a game): development of concrete strategy, decision, policy.
•	Network/chain level: Analytical: (ex ante) validation, assessment of a network or chain. Learning: enhancement of a learning network or chain. Instrumental: development of governance, co-ordination or other processes.
•	Actor/group level: Analytical: (ex ante) validation, assessment of individuals, teams. Learning: situational awareness, sense making, team learning. Instrumental: SOPs, job-oriented training (JOT), skills and competencies.
Frames of SG Research
Like ‘learning’, the notion of ‘game’ can take on
different meanings within contexts of ‘research’
or ‘science’. The following are several frames on
the use of ‘game’ within the context of research:
• Research Theory: Game theory as used
in economics, political science and simi-
lar fields (Leyton-Brown & Shoham,
2008; Ordeshook, 1986; Shubik, 1999;
Varoufakis, 2008.)
• Research Concept: Organisation, man-
agement, decision-making as a strategic or
political game (Firestone, 1989; Scharpf,
1997; Steunenberg, Schmidtchen, &
Koboldt, 1999.)
• Research Object: Studying game cultures,
game economics, game politics and simi-
lar aspects (Castronova, Williams, Ratan,
& Keegan, 2009; Castronova, 2005; Ermi
& Mäyrä, 2003; Salomon & Soudoplatoff,
2010; Shaw, 2010.)
• Design Artefact: Game as a socio-techni-
cal design, as an artefact or as another ob-
ject (Annetta, 2006; Björk & Holopainen,
2005; Harteveld, 2011; Kankaanranta &
Neittaanmki, 2008; van der Spek, 2011).
• Research Method: Game as a research
method comparable to simulation or exper-
imentation (Barnaud, Promburom, Trebuil,
& Bousquet, 2007; Ducrot, 2009; Lempert
& Schwabe, 1993; Mayer, Carton, Jong,
Leijten, & Dammers, 2004; Meijer, 2009;
Meijer, Mayer, van Luipen, & Weitenberg,
2011; Taylor, 1971; Tykhonov, Jonker, &
Meijer, 2008.)
• Intervention Method: Game as therapy
or as a method for education, learning,
change or decision support (Geurts, Duke,
& Vermeulen, 2007; Preschl, Wagner,
Forstmeier, & Maercker, 2011; Wenzler,
2008; Whitlock, McLaughlin, & Allaire,
2012.)
• Data-Gathering Method: Game as an
environment for observation, group in-
terview or data modelling (Cooper et al.,
2010; Good & Su, 2011; Khatib et al.,
2011; Wood, Griffiths, & Eatough, 2004.)
Similarly, the concepts of ‘research’, ‘evalu-
ation’ and ‘assessment’ can also have different
meanings in different contexts. Table 4 provides
an overview of several frames on research, evalu-
ation and assessment.
Types of Evaluation Research
There are several different types of SG research
and evaluation:
1. Constructivist vs. Objectivist Evaluation:
Constructivists emphasise that evaluation is
an interactive, interpretative process among
stakeholders; objectivists emphasise that the
characteristics of evaluation should include
factuality, distantiality, impartiality and
neutrality (Guba & Lincoln, 1989.)
2. Theory-Based vs. Explorative Evaluation:
Theory-based evaluation emphasises that
hypotheses should be derived in advance
from previous scientific research and theory;
explorative evaluation emphasises that ex-
planatory theories and hypotheses can be
derived from the ‘ground up’ (Weiss, 1997.)
3. Summative vs. Formative Type of
Evaluation: Summative evaluation focuses
on ex post goal achievement; formative
evaluation focuses on adaptation purposes
(Bloom, Hastings, & Madaus, 1972.)
4. Learning vs. Accountability Type of
Evaluation: The learning type of evalua-
tion emphasises lessons for the future; the
accountability type of evaluation emphasises
‘responsibility for the past’ (Edelenbos &
Van Buuren, 2005.)
5. Broad vs. Narrow Evaluation: Broad
evaluation considers many different perspec-
tives, aspects, dimensions and disciplines;
narrow evaluation focuses on one or only a
few specific details.
6. Rigorous vs. Generic Evaluation: Rigorous
evaluation emphasises depth and rigour at
the expense of resources; generic evaluation
emphasises resources at the expense of depth.
The dimensions described above are obviously
not restrictive. They are nevertheless important
for demonstrating that the issue of evaluation re-
search on SG can be placed within a much wider
methodological context. This is not a plea for doing
haphazard research; it is simply a reminder of the
various epistemological and ontological perspec-
tives that have been developed throughout the
history of science and that should not be ignored
in a science of SG and GBL.
Our frame analysis clearly indicates that our
methodology for SG research fits within Frame
I and that it is of the evaluation type (primarily
objectivist, theory based, summative). Methodolo-
gies for Type II (e.g. economics of innovation),
Type III (e.g. political science, sociology), and
Type IV (e.g. cultural and organisational studies)
have yet to be developed.
Table 4. Research, evaluation and assessment of SG

Research
•	Social scientific research: to understand or explain something without a specific predefined, societal purpose, using scientific methods.
•	Action research: a scientific research approach that assumes that research and change are and should be intertwined.
•	Applied research (e.g. contract research, policy analysis, consultancy): to improve or change, primarily with a predefined societal purpose and often for a problem owner, client, issue or stakeholder group, using methods that derive their legitimacy and credibility from scientific criteria.
•	Evaluation = a specific type of applied research.

Evaluation
•	Applied research with the specific intention of determining the ‘value’ of something or someone in the light of past, present or future objectives, tasks, functions or other aspects.
•	Evaluator = the person who is evaluating; evaluans = the object/person being evaluated.
•	Output/outcome evaluation = evaluating something or someone primarily according to outputs or outcomes, regardless of how they have been achieved.
•	Process evaluation = evaluating something or someone according to internal processes, regardless of their outputs or outcomes.
•	Evaluation of learning = establishing the value of a learning process and/or output.
•	Learning type of evaluation = evaluation primarily intended to learn from the past, with the goal of improving something or someone in the future.
•	Performance evaluation of process, output, outcome or person (i.e. assessment).

Assessment
•	Evaluation with the specific intention of establishing/judging the performance, suitability and/or cost-effectiveness of something or someone in the light of past, present or future objectives, tasks, functions or other aspects.
•	Assessor = the person who is assessing; assessans = the object/person being assessed.
•	Assessment = the process of assessing or being assessed.
METHODOLOGY: RATIONALE
AND CONCEPTUAL MODEL
Literature Overview
A great many PhD theses have now been published
on the effects of game-based learning and/or SG
experiments (Becker, 2008; Bekebrede, 2010;
Blunt, 2006; Bremson, 2012; Calleja, 2007;
Copier, 2007; Djaouti, 2011; Egenfeldt-Nielsen,
2005; Gasnier, 2007; Hainey, 2010; Harteveld,
2012; Houtkamp, 2012; Hussaan, 2012; Kuit,
2002; Leemkuil, 2006; Meijer, 2009; Squire,
2004; Steinkuehler, 2005; van der Spek, 2011;
van Houten, 2007; van Staalduinen, 2012). Several
review articles on game-based learning have also
been published, and such articles are now appear-
ing with increasing frequency (Adams, 2010; Bar-
lett, Anderson, & Swing, 2008; Beyer & Larkin,
1978; Boyle, Connolly, Hainey, & Boyle, 2012;
Connolly et al., 2012; Coulthard, 2009; Egenfeldt-
Nielsen, 2006b; Girard, Ecalle, & Magnan, 2012;
Gosen & Washbush, 2004; Graafland, Schraagen,
& Schijven, 2012; Greenblat, 1973; Hays, 2005;
Jenson & de Castell, 2010; Ke, 2009; Lee, 1999;
Leemkuil, Jong, & Ootes, 2000; Mayer, 2009;
Papastergiou, 2009; Randel, Morris, Wetzelf, &
Whitehill, 1992).
Few of these publications provide high-quality
evaluation frameworks regarding what should be
measured in a comparative fashion and how to do
so, taking into account the real-life and dynamic
setting of the project (De Freitas & Oliver, 2006).
Hainey and colleagues have recently published
a useful overview of 11 evaluation frameworks
(Connolly et al., 2009; Hainey & Connolly,
2010). The frameworks reviewed include the four-
dimensional evaluation framework proposed by
De Freitas and colleagues (2010). Other models
that can be considered include the framework
for theory-based evaluation developed by Kriz
and Hense (Bekebrede, 2010; Kriz & Hense,
2004, 2006). In general, few publications present
evaluation frameworks for game-based learning
in higher education, let alone within the context
of professional, in-company training or group and
organisational learning.
Limitations of Existing Models
Most of the existing models and frameworks are
high-level models. They specify a limited number
of generic concepts that can or should be con-
sidered when evaluating SG. These models and
frameworks nevertheless offer:
1. Few indications for how to use the models,
for what purpose, with what scope and under
which conditions.
2. Few procedures for validating the conceptual
research/evaluation model.
3. Few research hypotheses and research
designs.
4. Few definitions of or relationships and in-
terrelationships between the concepts in the
model.
5. Few operationalisations and validations of
constructs.
Furthermore, the application of the models is
characterised by:
• The dominance of single case-studies, sin-
gle games, single contexts of application.
• A lack of information on the question-
naires used.
• A focus on the GBL of children in for-
mal education, with little attention to ad-
vanced–professional learning outside of
education.
• A focus on the learning of individuals in
formal training or the educational con-
text, with little attention to the learning
of teams, groups, organisations, networks
or systems within policy or organisational
contexts.
Requirements
One important question therefore concerns the
requirements for a good evaluation framework
for SG evaluation research. A generic evaluation
framework (and corresponding procedures) for
GBL and SG research should ideally have the
following characteristics:
1. Broad Scope: It should consider the broad
range of educational contexts, games, learn-
ing objectives and topics.
2. Comparative: It should allow the use of
particular data from different games for
comparison.
3. Standardised: The use of pre-/quasi-
experimental research designs requires the
standardisation of materials and procedures.
4. Specific: It should measure data precisely
by pinpointing variables.
5. Flexible: Given that game play cannot
always be predicted, data gathering should
remain flexible, while meeting all other
requirements (e.g. standardisation,
specificity.)
6. Triangulated: It should use a mixed-method
approach with qualitative and quantitative
data.
7. Multi-Level: It should consider the indi-
vidual, game, team, organisation and system
levels.
8. Validated: It should use validated research
methods (e.g. research method and game
design.)
9. Expandable: It should offer the possibility
of measurement on new variables.
10. Unobtrusive: The use of gaming for sys-
tematic and extensive data gathering (e.g.
research, comparative or theory-based evalu-
ation) should be unobtrusive.
11. Fast and Non-Time Consuming: The
use of real-world cases for data gathering
implies that tools and methods should be
fast and non-time-consuming, given that
real-world projects do not allow much time
and resources to be devoted to research.
12. Multi-Purpose: It should persuade stake-
holders to extend their data-gathering efforts
beyond the obvious and the minimal.
An evaluation of GBL or SG should be broad
in scope but light in operation. It should address
both the formative and the summative purposes
of evaluation (Bloom et al., 1972), as well as the
evaluation interests of the designers, players,
financers and other stakeholders. At the other
end of the spectrum, the data should be suitable
for deeper analysis, in order to understand what
happens and why.
Comparative, Theory-
Based Evaluation
The establishment of learning effectiveness and
contributing factors requires an evaluation frame-
work that allows:
1. The operationalisation of independent, de-
pendent and mediating/context variables,
for example ‘engagement’ (in this case, inde-
pendent), ‘learning effectiveness’ (in this
case, dependent), age (mediating) and
psychological safety (context.)
2. A systematic, unobtrusive process for data
gathering and data analysis.
3. The formulation of research questions and
hypotheses based on a conceptual research
model.
Conceptual Framework
A generic model for social scientific research,
evaluation and assessment regarding SG in real-
world contexts should provide:
• A flexible and generally applicable re-
search model from which we can derive:
• A set of research questions and hypotheses.
• A research design for applying the model.
• A suite of research tools and instruments.
• Guidelines, practices and rules for apply-
ing, falsifying, validating and improving
the elements specified.
• Empirical testing of the robustness of the
model.
The core of the model portrayed in Figure 2
is a deconstruction of GBL into the following
elements:
• The pre-game condition (the subject’s
attitudes, knowledge, skills and behaviour
relevant to GBL and SG and/or the case at
hand before playing the game): We mea-
sure a variety of items and constructs,
including attitudes towards GBL and or-
ganisational commitment (see 3.1–3.4 in
Figure 2.)
• The quality of the GBL intervention:
subdivided into the quality of the actual
game design, game-play, interaction with
the facilitator/instructor and interaction
with the digital game environment (see
4.1–4.2 in Figure 2.)
• The post-game condition: the subject’s
attitudes, knowledge, skills and behaviour
relevant to the GBL and related matters
(see 5.1–5.4 in Figure 2.)
Background Variables
•	Socio-demographic variables (e.g. age, sex,
nationality) (see 1.1 in Figure 2.)
•	Professional and student characteristics (e.g.
position, work experience, level of education)
(see 2.1 in Figure 2.)
Mediating Variables
• Individual as a participant (e.g. personality
characteristics; Big 5, Hexaco) (see 1.2 in
Figure 2.)
• Individual as a learner (e.g. learning styles)
(see 1.3 in Figure 2.)
• Individual as a gamer (e.g. game skill,
game experience, game attitudes, game-
play style) (see 1.4 in Figure 2.)
• Professional/student as a serious gamer
(e.g. previous experience with SG in a pro-
fessional context) (see 2.4 in Figure 2.)
• Professional/student as a participant (e.g.
intrinsic/extrinsic motivation) (Ainley &
Armatas, 2006.)
• Context Variables: Organisational/in-
stitutional climate in which the GBL/SG
takes place (e.g. commitment to the or-
ganisation, identification with leader or or-
ganisation, psychological safety) (see 6.1
in Figure 2.)
• First-Order Learning: Direct influence
of playing the game on the individual,
small-group attitudes, knowledge, skills or
behaviour (see 7 in Figure 2.)
• Second-Order Learning: Direct/indirect,
short or long-term influence of the game as
a whole (incl. design process, sessions, dis-
cussions, publications, other interventions
and other factors) at the group, network,
organisational and system levels (see 8 in
Figure 2.)
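To make this deconstruction concrete, the following minimal sketch (in Python) shows how the blocks of the conceptual framework might be mapped onto a per-respondent data record. All field names are illustrative assumptions, not part of the framework or of any existing tool.

```python
# Minimal sketch of a per-respondent record mirroring the conceptual framework.
# All field names are illustrative assumptions, not taken from the chapter.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class RespondentRecord:
    respondent_id: str
    background: Dict[str, str] = field(default_factory=dict)      # e.g. age, sex, nationality (1.1/2.1)
    mediating: Dict[str, float] = field(default_factory=dict)     # e.g. personality, game experience (1.2-1.4, 2.4)
    context: Dict[str, float] = field(default_factory=dict)       # e.g. psychological safety (6.1)
    pre_game: Dict[str, float] = field(default_factory=dict)      # pre-game condition (3.1-3.4)
    intervention: Dict[str, float] = field(default_factory=dict)  # quality of the GBL intervention (4.1-4.2)
    post_game: Dict[str, float] = field(default_factory=dict)     # post-game condition (5.1-5.4)

record = RespondentRecord(
    respondent_id="r001",
    background={"age": "34", "sex": "f"},
    pre_game={"attitude_gbl": 3.8},
    post_game={"attitude_gbl": 4.4},
)
print(record)
```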
QUASI-EXPERIMENTAL
RESEARCH DESIGN
The model can now be translated into a quasi-
experimental design. In other words, it can be
translated from the simple ‘post-test only’ design
into a ‘pre-test/post-test’ design involving ‘ran-
domisation (R)’, ‘control group (C)’ and ‘repeated
measurement’ (Campbell & Stanley, 1963; Cook
& Campbell, 1979; Creswell, 2002). Figure 3
illustrates the basic translation of the conceptual
model into a quasi-experimental design (R and C
not included in Figure 3.)
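As an illustration of how such a design might be analysed, the sketch below runs an ANCOVA-style regression of post-test scores on group membership, controlling for pre-test scores. The column names and toy data are assumptions for demonstration only, not data from the studies discussed here.

```python
# Minimal sketch: analysing a pre-test/post-test design with a control group.
# Column names (pre_score, post_score, group) and values are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "pre_score":  [3.1, 2.8, 3.5, 3.0, 2.9, 3.2],
    "post_score": [3.9, 3.6, 4.1, 3.1, 3.0, 3.3],
    "group":      ["game", "game", "game", "control", "control", "control"],
})

# ANCOVA: post-test regressed on group, controlling for pre-test differences.
model = smf.ols("post_score ~ pre_score + C(group)", data=df).fit()
print(model.params.round(2))
print(model.pvalues.round(3))
```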
DATA-GATHERING
One of the special features that SG can offer
for advanced learning is that the games provide
excellent environments for mixed-method data
gathering (i.e. triangulation), including crowd
sourcing, panel discussions, surveys and obser-
vations (including video observations). Figure 4
provides a visual impression of the methods that
can be mixed with SG.
The following are a few examples of different
forms of data gathering:
Figure 2. Conceptual framework
• MSP Challenge 2011: We used a group of
70 international players as an expert panel
in a survey assessing the state of marine
spatial planning (MSP) in the 13 countries
in which the players worked (Mayer, Zhou,
et al., 2012, 2013.)
• XVR Training Simulators: We used vid-
eo observations and network analysis of 8
sessions with 100 first responders to anal-
yse team communication patterns and ef-
fectiveness (Ruijven, n.d.)
• Levee Patroller (Deltares Deltabrain,
n.d.): We conducted pre-game, in-game
and post-game knowledge tests to measure
the increase in knowledge of geo-mechani-
cal levee failures as a result of playing eight
exercises in the 3D SG Levee Patroller
(Harteveld, 2012.)
• Servant-Leadership Game: We used
validated pre-game and in-game ques-
tionnaires on relevant psychological con-
structs, including servant-leadership and
commitment to change (Kortmann et al.,
2012.)
• TeamUp: We performed in-game logging
and tracking on hundreds of events and re-
sults, including distances, paths, play time
and avoidable mistakes, in combination
with questionnaires (Bezuijen, 2012.)
• SimPort–MV2 (TU Delft, Tygron
Serious Gaming, & Port of Rotterdam,
n.d.): We used pre-game and post-game
questionnaires on such aspects as learning
satisfaction, game play and motivation, in
combination with maps and strategic deci-
sions on the second Maasvlakte port area
(Bekebrede, 2010.)
Evaluation data are gathered through mixed
methods, in most cases combining pre-game and
post-game surveys amongst the players with live
or video observations, transcripts of after-action
reviews and game results. In a few cases, meth-
ods are applied more rigorously through in-game
knowledge tests or network and communication
analyses from video observations. Table 5 provides
an overview of how to mix the various methods
in the pre-game, in-game and post-game stages.
Figure 3. Generic quasi-experimental design for GBL and SG
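A minimal sketch of this kind of triangulation is given below: pre-game and post-game survey data are joined with aggregated in-game log data on a shared respondent identifier. All column names and values are hypothetical and serve only to illustrate the mechanics of combining the data sources.

```python
# Minimal sketch: joining a pre-game survey, in-game log aggregates and a
# post-game survey on a shared respondent identifier for triangulated analysis.
import pandas as pd

pre = pd.DataFrame({"respondent_id": ["r1", "r2"], "attitude_gbl_pre": [3.2, 4.0]})
post = pd.DataFrame({"respondent_id": ["r1", "r2"],
                     "attitude_gbl_post": [3.9, 4.1],
                     "learning_satisfaction": [4.0, 3.5]})
logs = pd.DataFrame({"respondent_id": ["r1", "r1", "r2"],
                     "event": ["avoidable_mistake", "decision", "avoidable_mistake"]})

# Aggregate raw in-game events (here: count of avoidable mistakes) per respondent.
mistakes = (logs[logs["event"] == "avoidable_mistake"]
            .groupby("respondent_id").size()
            .rename("avoidable_mistakes").reset_index())

merged = (pre.merge(post, on="respondent_id")
             .merge(mistakes, on="respondent_id", how="left"))
print(merged)
```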
VALIDATED INSTRUMENTS AND
TOOLS
In our comparative research, we gradually built
a set of validated and reusable questions for the
following constructs and items:
Pre-Game
• Socio-demographic factors (e.g. sex,
age, nationality and culture) (Bekebrede,
Warmelink, & Mayer, 2011; Boyle &
Connolly, 2008; Brown, Ley, Evett, &
Standen, 2011; Brown, Hall, Holtzer,
Brown, & Brown, 1997; Erfani et al., 2010;
Hofstede, 1986; Jenson & de Castell, 2010;
Kinzie & Joseph, 2008; Pfister, 2011.)
• Previous experiences/skills (e.g. with
computers, games and virtual learning en-
vironments) (Erfani et al., 2010; Harper et
al., 2007; Mortagy & Boghikian-Whitby,
2010.)
• Attitudes (e.g. change, conflicts, intrinsic
and extrinsic motivation, learning styles)
(Ashton & Lee, 2009; Garris, Ahlers,
& Driskell, 2002; Guay, Vallerand, &
Blanchard, 2000; Huang, 2011; K. Lee &
Ashton, 2004; Malone & Lepper, 1987a,
1987b.)
Figure 4. SG and data-gathering methods
• Skills: personal competence (e.g. with
games, learning, particular professional
skills) (Brown et al., 2011; Enochsson
et al., 2004; Holsbrink-Engels, 1998;
Verdaasdonk et al., 2009; Wolfe & Box,
1988.)
• Behaviours (behavioural intentions.)
• Group, team, organisational character-
istics (e.g. team/group conflict, psycho-
logical safety, psychological collectivism,
team and organisational commitment)
(Brockner & Higgins, 2001; Carmeli,
Brueller, & Dutton, 2009; Edmondson,
1999; Ferris, 2005; Jackson, Colquitt,
Wesson, & Zapata-Phelan, 2006.)
In-Game
• Game performance: based upon in-
game scores (e.g. time, avoidable mis-
takes) (Baba, 1993; Blumberg, 2000;
Oslin, Mitchell, & Griffin, 1998; Tallir,
Lenoir, Valcke, & Musch, 2007; Trepte &
Reinecke, 2011.)
• Game-play (e.g. effort; dominance, influ-
ence, power.)
• Game experience (e.g. flow, immersion,
presence) (Admiraal, Huizenga, Akkerman,
& Ten Dam, 2011; Csikszentmihalyi,
1991; Martin & Jackson, 2008.)
Table 5. What to measure, how and when?

Self-reported (qualitative): personality, player experiences, context, etc.
•	Pre-game: interviews, focus group, logbook.
•	In-game: logbook, interviews or small assignments as part of the game.
•	Post-game: interviews, focus group, after-action review.

Self-reported (quantitative): socio-demographics, opinions, motivations, attitudes, engagement, game quality, learning, power, influence, reputation, network centrality, learning satisfaction, etc.
•	Pre-game: survey, questionnaires, individual or expert panel.
•	In-game: in-game questionnaires.
•	Post-game: survey, questionnaires, individual or expert panel.

Tested (qualitative): behaviour, skills, etc.
•	Pre-game: actor role-play, case analysis, assessment, mental models, etc.
•	In-game: game-based behavioural assessment.
•	Post-game: game-based behavioural assessment.

Tested (quantitative): values, knowledge, attitudes, skills, personality, power.
•	Pre-game: psychometric and sociometric tests (e.g. personality, leadership, team roles, IQ).
•	In-game: game-based behavioural performance analysis.
•	Post-game: game-based behavioural performance analysis.

Observed (qualitative): behavioural performance of students, professionals, players and/or facilitators, others; decisions, strategies, policies, emotions, conflicts, etc.
•	Pre-game: participatory observation, ethnographic methods.
•	In-game: video, audio, personal observation, ethnography, maps, figures, drawings, pictures, etc.
•	Post-game: participatory observation, ethnographic methods.

Observed (quantitative): biophysical–psychological responses, including stress (heart rate, perspiration).
•	Pre-game: participatory observation, network analysis, biophysical–psychological observation.
•	In-game: in-game tracking and logging; network analysis, data mining, biometric observation.
•	Post-game: in-game log-file analysis, network analysis.
Post-Game
• Game experience (e.g. engagement, fun
while playing the game) (Boyle et al.,
2012; Mayes & Cotton, 2001; Schuurink,
Houtkamp, & Toet, 2008.)
• Player satisfaction with: the game (e.g.
clarity, realism); user interaction (e.g. at-
tractiveness, ease of use, computer mal-
functions, support); the quality of the facil-
itator (e.g. supportive, player identification
with facilitator); interaction with other
students (e.g. player efforts, motivation);
identification of players with role; team en-
gagement (Olsen, Procci, & Bowers, 2011;
Reichlin et al., 2011) (See also Table 3 and
the discussion)
First-order learning (short-term, in-
dividual, participants):
Player learning satisfaction,
self-reported, self-perceived
learning (e.g. broad range of
items.)
Measured changes in knowl-
edge, attitudes, skills and behav-
iours (behavioural intentions.)
Second-order learning (medium-
term, long-term, collective, partici-
pants and non-participants):
Self-reported, case-based, re-
constructive: asking clients,
participants or other parties how
the results of the GBL have been
implemented.
Measured changes in team,
group or organisational char-
acteristics (e.g. safety, commit-
ment, performance.)
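Where constructs are built from multiple questionnaire items, their internal consistency can be checked before they are reused. The following minimal sketch computes Cronbach's alpha for a hypothetical 'engagement' construct; the item scores are toy data and the construct name is an assumption, not one of the validated scales cited above.

```python
# Minimal sketch: internal-consistency check (Cronbach's alpha) for a
# questionnaire construct built from several Likert items (toy data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

engagement_items = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
])
print(f"alpha = {cronbach_alpha(engagement_items):.2f}")
```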
Tooling for Stealth Assessment
and Evaluation
One of the advantages of using digital SG instead
of analogue simulation games for training and as-
sessment is that digital SG allow data to be gathered,
logged, saved and analysed unobtrusively, for
purposes of debriefing, assessment or research.
Stealth assessment (i.e. non-invasive, unobtrusive
assessment) could potentially increase the learning
efficacy of SG, given that much of the learning in
SG now remains relatively ‘implicit’ and ‘subjec-
tive’ (e.g. as noted in personal debriefings).
In the assessment of both players and trainers,
it continues to be relatively difficult to monitor
and keep track of what happens, to objectify the
observations and to compare them to other ses-
sions, as well as to provide authoritative feedback
on the information in order to enhance learning or
support a judgement. Stealth assessment in SG can
therefore serve several functions (Kickmeier-Rust
et al., 2009; Petersen & Bedek, 2012; Seitlinger,
Bedek, Kopeinik, & Albert, 2012; Shute & Kim,
2010; Shute et al., 2010, 2009; Shute, 2011;
Zapata-Rivera, VanWinkle, Doyle, Buteux, &
Bauer, 2009):
• Mirroring: providing a better factual ac-
count of what exactly happened during the
game (e.g. in terms of decisions, actions,
arousal, network centrality) and in which
sequence.
• Sense making: providing improved sup-
port for the facilitator, trainer or instruc-
tor in the interpretation of behaviour and
performance, as with the visual-dynamic
portrayal of what happened (e.g. moving
graphs, stills) or with comparison to other
players (or groups); objectification of player
performance (e.g. top or bottom 10% of all
teams that played this game.)
• Adaptation: ensuring improved adapta-
tion of the game to the level of the players,
or improved game design.
• Research: storing data to construct and/or
validate underlying scientific theories, ei-
ther domain-oriented or method-oriented.
• Data mining, crowd sourcing: similar to
(4), but explorative and at a larger scale.
As part of the research methodology dis-
cussed here, we are therefore developing tools for
mixed-method research. The objectives include
the following:
1. To develop a conceptual model for the stealth
assessment of individual and team behaviour
and performance in and around digital SG.
2. To incorporate stealth assessment into SG.
3. To construct feedback dashboards for
immediate-action review and game-based
learning.
4. To evaluate the results of stealth in-game
data with validated psychometric constructs
and tests.
5. To use the patterns of game behaviour and
performance to improve the game, the after-
action review and learning efficacy.
6. To validate the efficacy of stealth assessment
for training, assessment and research, and to
provide guidelines and recommendations.
Figure 5 presents an impression of a digital
tool that provides guidance and structure to data
collection and that can connect observed, reported
and logged data in and around game sessions.
The tool has been titled Q.E.D., which stands for
Quasi-Experimental Design as well as for Quod
Erat Demonstrandum.
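By way of illustration only, the sketch below shows one possible shape of unobtrusive ('stealth') in-game event logging. It assumes a hypothetical game loop that calls log_event(); it is not a description of the Q.E.D. tool or of any of the games discussed in this chapter.

```python
# Minimal sketch of unobtrusive in-game event logging to a session log file.
# Class, method and event names are illustrative assumptions.
import json
import time

class EventLogger:
    """Appends time-stamped game events to a session log for later analysis."""
    def __init__(self, session_id: str, path: str):
        self.session_id = session_id
        self.path = path

    def log_event(self, player_id: str, event: str, **payload):
        record = {
            "session_id": self.session_id,
            "player_id": player_id,
            "event": event,
            "timestamp": time.time(),
            "payload": payload,
        }
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

logger = EventLogger("session-042", "events.jsonl")
logger.log_event("p17", "decision", choice="evacuate", game_time=312.5)
logger.log_event("p17", "avoidable_mistake", description="ignored levee alarm")
```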
OPERATIONALISATION OF THE
RESEARCH MODEL AND
HYPOTHESES
The operationalisation of the generic conceptual
model (Figure 2) within the context of a dy-
namic, multi-stakeholder project can pose a true
challenge. There is a great variety of games,
players and learning contexts, and trade-offs
must be made with regard to time, resources
and the focus of the evaluation (see above).
Given that not everything can be included in
an evaluation, and given the possibility of con-
flict between different evaluation objectives, it
is important to identify the exact purpose of the
evaluation in order to define the proper type of
questions, which should subsequently be translated
into an operationalised model and hypotheses for
testing.
We classify the types of research questions and
research hypotheses that can guide GBL and SG
research into the following categories:
Figure 5. Q.E.D. tool
• Design-oriented research (artefact): ‘making it (better).’
◦ The validation of specific and generic game-based artefacts and events.
◦ The development and validation of design theories, methods and tools.
• Intervention-oriented research (e.g. learning, change, policymaking, management): ‘making it work.’
◦ The learning effectiveness/impact of game-based interventions.
◦ The transfer of game-based interventions to the real world.
• Domain-oriented research (e.g. healthcare, military, energy): ‘making it matter.’
◦ The effectiveness of using SG to understand complexity and dynamics in specific domains.
• Disciplinary research (e.g. methodology, ethics, explanatory and interpretative theories): ‘making it understandable.’
◦ The sociological, economic, political, cultural and other frames on SG.
◦ Theory construction on GBL and SG.
◦ Methodology: the design and validation of research methods and tools.
◦ Reflection and ethics.
Operationalising the
Evaluation Instrument
Depending upon the case at hand, it is now possible
to define and construct pre-game, in-game and
post-game instruments for measuring or observ-
ing relevant variables. This is accomplished by
selecting the constructs and items that are neces-
sary and relevant for the specific configuration of
game/client/setting/players/purpose.
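Purely as an illustration of this selection step, such an instrument definition could be represented as a simple mapping from measurement phase to the selected constructs and items. The phases, constructs and items below are hypothetical and will differ for every game/client/setting/players/purpose configuration.

```python
# Hypothetical instrument configuration for one specific game/client/setting/players/purpose combination.
instrument = {
    "pre_game": {
        "motivation": ["I look forward to playing this game.",
                       "I expect to learn something useful from this session."],
        "prior_knowledge": ["self_rating_domain_knowledge"],
    },
    "in_game": {
        "logged_behaviour": ["decisions", "errors", "completion_time"],
        "observed_teamwork": ["turn_taking", "conflict_episodes"],
    },
    "post_game": {
        "player_satisfaction": ["clarity", "realism", "facilitator_quality", "user_interaction"],
        "self_reported_learning": ["I understand the problem better than before the session."],
    },
}

# The concrete questionnaires and observation sheets for each phase can then be
# generated from this single definition.
for phase, constructs in instrument.items():
    print(phase, "->", ", ".join(constructs))
```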
Operationalising the Research Model
The operationalisation of constructs and items should proceed in parallel with the preliminary operationalisation of the research model that will be tested later. Operationalisation of the model
implies selecting and filtering from the conceptual
model (Figure 1). The elements and parts that
are relevant can and should be included. Figure
6 presents a basic example of an operationalised
conceptual model.
Testing the Model
First-order learning effects (see Figure 2) can be
established as changes between pre- and post-game
measurements, with or without a control group.
Second-order effects (see Figure 2; i.e. the transfer of learning to the performance of the system) are much more difficult to assess. Empirical data on the first-order effects are supportive, and they provide direction. Learning transfer at the individual or small-group level could possibly be assessed by using the same or similar research tools after some time. Learning transfer at the chain, network, organisational or system level is most likely to be assessed through reconstructive case studies with triangulated data gathering (e.g. KPIs, interviews). To our knowledge, very few research publications address the impact of serious gaming on organisations and systems.
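As a minimal sketch of how the first-order effects mentioned above could be established from pre- and post-game measurements with a control group, the example below compares pre/post gain scores between a game group and a control group. The variable names and the choice of t-tests are illustrative assumptions, not a prescribed procedure.

```python
import pandas as pd
from scipy import stats

# Hypothetical data: one row per participant with pre- and post-game knowledge scores.
df = pd.read_csv("pre_post_scores.csv")  # columns: participant_id, group ('game'/'control'), pre, post

# First-order learning effect as the pre/post gain per participant.
df["gain"] = df["post"] - df["pre"]

game = df.loc[df["group"] == "game", "gain"]
control = df.loc[df["group"] == "control", "gain"]

# Within-group change for the game group (paired test) and
# between-group comparison of the gains (Welch's t-test).
t_within, p_within = stats.ttest_rel(df.loc[df["group"] == "game", "post"],
                                     df.loc[df["group"] == "game", "pre"])
t_between, p_between = stats.ttest_ind(game, control, equal_var=False)

print(f"game group pre -> post: t = {t_within:.2f}, p = {p_within:.3f}")
print(f"game vs. control gains: t = {t_between:.2f}, p = {p_between:.3f}")
```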
In a few situations, measurements of change
and learning can be performed in the form of ob-
jective ‘tests’. Self-constructed items or constructs
for measuring attitude, knowledge, skills and be-
haviours cannot always be ‘tested’ or ‘validated’
in advance. Self-reporting or self-assessment
of change and learning is common; it is often
necessary in addition to other data and, in many
cases, it is sufficient. A thorough mixed-method evaluation study of the game Levee Patroller by Harteveld (2012) revealed high and significant correlations between observed, measured and self-reported learning effects.
Even when based upon self-reporting, however,
high-quality questionnaires that include items,
constructs and scales for comparative and longitu-
dinal measurements are not commonly available.
In due course, the number and quality of validated
constructs for SG will increase. Psychometric
measurement from behavioural sciences (e.g.
psychology, management) is largely unexplored
territory for SG research.
Data Reduction and Analysis
A final and important concern involves the reduc-
tion and analysis of data. As indicated above, we frequently use the same scales to measure ‘game design quality’ and ‘player satisfaction’, through such constructs as clarity, realism, facilitator quality and user interaction. Over the years, we have varied and adapted items, questions and constructs. Data reduction through factor analysis and the reliability analysis of scales increasingly allow us to select the influential and distinguishing items and to construct scales. Our present dataset now contains 960 variables concerning 12 games (Mayer, 2012; Mayer, Bekebrede, et al., 2013).
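A minimal sketch of such data reduction and reliability analysis is given below, assuming the questionnaire items are stored one column per item and one row per respondent. The item names, the number of factors and the use of scikit-learn's FactorAnalysis together with a hand-coded Cronbach's alpha are illustrative choices only.

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency reliability of the items forming one scale."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical item-level questionnaire data.
data = pd.read_csv("post_game_items.csv")

# Exploratory data reduction: which items load together on a smaller number of factors?
fa = FactorAnalysis(n_components=3, random_state=0).fit(data)
loadings = pd.DataFrame(fa.components_.T, index=data.columns,
                        columns=["factor_1", "factor_2", "factor_3"])
print(loadings.round(2))

# Reliability of one candidate scale, e.g. a 'clarity' construct made of three items.
print(f"Cronbach's alpha (clarity): {cronbach_alpha(data[['clarity_1', 'clarity_2', 'clarity_3']]):.2f}")
```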
The ultimate goal of comparative research is
to construct and test the efficacy of GBL and SG
through structural models (including structural
equation models), using validated or newly constructed psychometric scales for the broad range of constructs listed above. Although
we are progressing, we are still far from achieving
this ambition. Figure 7 presents an example of a
structural model that has recently been published
(Mayer, Warmelink, et al., 2012).
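As a deliberately simplified stand-in for a full structural equation model, the path structure of such a model can be explored with two linked regressions, as sketched below. The construct names and the ordinary-least-squares estimation are illustrative assumptions; a dedicated SEM package would be used for published analyses.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical construct scores per participant, e.g. scale means computed after the
# factor and reliability analyses sketched above.
scores = pd.read_csv("construct_scores.csv")  # columns: clarity, realism, engagement, learning

# Path 1: perceived game design quality -> engagement.
engagement_model = smf.ols("engagement ~ clarity + realism", data=scores).fit()

# Path 2: engagement (plus design quality) -> self-reported learning.
learning_model = smf.ols("learning ~ engagement + clarity + realism", data=scores).fit()

print(engagement_model.params.round(2))
print(learning_model.params.round(2))
```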
Figure 6. Operationalised model (example)
RESEARCH AND PROFESSIONAL
ETHICS
This section contains a final word about eth-
ics within the context of SG. The relationship
between ethics and games is viewed in terms of
the following:
1. The ethics involved in making, disseminating or playing games (e.g. of a violent or controversial nature).
2. Using SG to allow players to adopt particular codes of conduct or to reflect upon ethical dilemmas (GATE, 2011).
Very little attention has been paid to the
social-ethical role of SG designers, SG research-
ers, SG professionals or SG users. When games
are utilised in society for purposes of therapy,
innovation, persuasion or self-organisation, they
are inherently based upon specific value systems,
and they subtly exert power.
First, SG can be very coercive. It is a common
mistake to believe that SG participation is always
voluntary, with high intrinsic motivation due to
the supposedly engaging nature of the games.
In many cases, teachers or managers request or
demand that players participate in SG. Refusal
might result in failing a course or losing a promo-
tion. An organisation might expect employees to
fulfil job-related training requirements through an
online game, possibly assessing them according
to their performance.
Second, SG might not be as pleasurable as
many might assume; they might cause pain and
frustration. Participants might fail in front of
their colleagues. A great deal of attention is cur-
rently being paid to the pleasurable aspects of SG (e.g. engagement, fun, immersion, flow), while very little attention is being paid to emotions that accompany deep learning and change (e.g. ‘frustration’ and ‘pain’). Good SG should not only be ‘engaging’; they should also be painful (i.e. ‘hard fun’).
Figure 7. Example of a structural model (Source: Mayer, Warmelink, et al., 2012)
Third, SG might be used for manipulation or
questionable political purposes. A government
department recently asked us to design and run
a simulation game to improve stakeholder coor-
dination within the governmental immigration
chain. The key performance indicator was to be
an increase in the number of immigrants sent back
(after final judicial denial of a permit to stay) to
their countries of origin after the gaming sessions
(Frame I). We obviously refused. The root of the
problem was not inefficiency in the immigration
chain, but internal and external stakeholder dis-
agreement with government immigration policy
(Frame III).
Fourth, many SG are designed and used for
vulnerable target groups (e.g. young children with
ADHD, people with mental disorders or cancer,
unemployed people, immigrants). Additional re-
flection upon the ethics and potentially harmful
effects of the game-based intervention model is
needed, as is additional ‘informed consent’ when
doing game experiments with such participant-
players.
CONCLUSION
In this chapter, we formulate frames, requirements
and a conceptual research model that can be trans-
lated into quasi-experimental research designs and
operationalised into a model for evaluating specific
cases and contexts of GBL. We are aware of the
many assumptions we needed to make. We are also
aware of our inability to mention all popular SG
and cite additional important references. Given
the purpose of this book, we restricted our focus
to types of research associated with Frame I (i.e.
SG as a tool for learning). As stated previously,
research methodologies associated with Frames II,
III and IV are largely in need of elaboration within
the emerging science of SG, as are the network,
organisational and system levels of learning with
games and second-order effects. We wish to assure the reader that we have only begun.
REFERENCES
Adams, E. M., & Goffman, E. (1979). Frame analy-
sis: An essay on the organization of experience.
Philosophy and Phenomenological Research,
39(4), 601. doi:10.2307/2106908.
Adams, S. A. (2010). Use of serious health
games in health care: A review. Studies in Health
Technology and Informatics, 157, 160–166.
PMID:20543383.
Admiraal, W., Huizenga, J., Akkerman, S., &
Dam, G. T. (2011). The concept of flow in col-
laborative game-based learning. Computers in Hu-
man Behavior, 27(3), 1185–1194. doi:10.1016/j.
chb.2010.12.013.
Ainley, M., & Armatas, C. (2006). Motivational
perspectives on students’ responses to learning
in virtual learning environments. In J. Weiss, J.
Nolan, J. Hunsinger, & P. Trifonas (Eds.), The in-
ternational handbook of virtual learning environ-
ments (pp. 365–394). Dordrecht, The Netherlands:
Springer. doi:10.1007/978-1-4020-3803-7_15.
Annetta, L. A. (2006). Serious games: Incorpo-
rating video games in the classroom - Games
designed using sound pedagogy actively engage
the net generation in learning. EDUCAUSE, 29(3).
Argyris, C. (1977). Double loop learning in
organizations. Harvard Business Review, 55(5),
115–125.
Argyris, C. (1982). The executive mind and
double-loop learning. Organizational Dynamics,
11(2), 5–22. doi:10.1016/0090-2616(82)90002-X
PMID:10256769.
Argyris, C. (1995). Action science and organiza-
tional learning. Journal of Managerial Psychology,
10(6), 20–26. doi:10.1108/02683949510093849.
Argyris, C., & Schön, D. A. (1974). Theory in
practice: Increasing professional effectiveness.
San Francisco, CA: Jossey-Bass Publishers.
Ashton, M. C., & Lee, K. (2009). An investiga-
tion of personality types within the HEXACO
personality framework. Journal of Individual
Differences, 30(4), 181–187. doi:10.1027/1614-
0001.30.4.181.
Baba, D. M. (1993). Determinants of video game
performance. Advances in Psychology, 102,
57–74. doi:10.1016/S0166-4115(08)61465-X.
Babbie, E. (2007). The ethics and politics of so-
cial research. In E. Babbie (Ed.), The practice of
social research (pp. 60–83). London: Thomson
Wadsworth.
Barlett, C. P., Anderson, C. A., & Swing, E.
L. (2008). Video game effects—Confirmed,
suspected, and speculative: A review of the evi-
dence. Simulation & Gaming, 40(3), 377–403.
doi:10.1177/1046878108327539.
Barnaud, C., Promburom, T., Trebuil, G., &
Bousquet, F. (2007). An evolving simulation/
gaming process to facilitate adaptive watershed
management in northern mountainous Thai-
land. Simulation & Gaming, 38(3), 398–420.
doi:10.1177/1046878107300670.
Becker, K. (2008). The invention of good games:
Understanding learning design in commercial
video games. Calgary, Canada: University of
Calgary.
Bekebrede, G. (2010). Experiencing complexity:
A gaming approach for understanding infrastruc-
ture systems. TU Delft. Retrieved May 3, 2013,
from http://www.narcis.nl/publication/RecordID/
oai:tudelft. nl:uuid:dae75f36-4fb6-4a53-8711-
8aab42378878
Bekebrede, G., Warmelink, H. J. G., & Mayer,
I. S. (2011). Reviewing the need for gaming in
education to accommodate the net generation.
Computers & Education, 57(2), 1521–1529.
doi:10.1016/j.compedu.2011.02.010.
Benford, R. D., & Snow, D. A. (2000). Framing
processes and social movements: An overview and
assessment. Annual Review of Sociology, 26(1),
611–639. doi:10.1146/annurev.soc.26.1.611.
Berends, H., Boersma, K., & Weggeman, M.
(2003). The structuration of organizational
learning. Human Relations, 56(9), 1035–1056.
doi:10.1177/0018726703569001.
Beyer, J. L., & Larkin, R. P. (1978). Simulation
review: A review of instructional simulations for
human geography. Simulation & Gaming, 9(3),
339–352. doi:10.1177/104687817800900306.
Bezuijen, A. (2012). Teamplay the further devel-
opment of TeamUp, a teamwork focused serious
game. Delft, The Netherlands: TU Delft.
Björk, S., & Holopainen, J. (2005). Patterns in
game design. Boston: Charles River Media.
Bloom, B. S., Hastings, T. J., & Madaus, G. F.
(1972). Handbook on formative and summative
evaluation of student learning. Studies in Art
Education, 14(1), 68–72. doi:10.2307/1319918.
Blumberg, F. C. (2000). The effects of chil-
dren’s goals for learning on video game per-
formance. Journal of Applied Developmental
Psychology, 21(6), 641–653. doi:10.1016/S0193-
3973(00)00058-7.
Blunt, R. D. (2006). A causal-comparative explo-
ration of the relationship between game-based
learning and academic achievement: Teaching
management with video games. Minneapolis, MN:
Walden University.
Bogost, I. (2007). Persuasive games: The expres-
sive power of videogames. Cambridge, MA: MIT
Press.
Boyle, E. A., & Connolly, T. M. (2008). Games
for learning: Does gender make a difference? In
Proceedings of 2nd European Conference on
Games Based Learning, 69–76.
Boyle, E. A., Connolly, T. M., & Hainey, T. (2011).
The role of psychology in understanding the impact
of computer games. Entertainment Computing,
2(2), 69–74. doi:10.1016/j.entcom.2010.12.002.
Boyle, E. A., Connolly, T. M., Hainey, T., & Boyle,
J. M. (2012). Engagement in digital entertain-
ment games: A systematic review. Computers in
Human Behavior, 28(3), 771–780. doi:10.1016/j.
chb.2011.11.020.
Bremson, J. (2012). Using gaming simulation to
explore long range fuel and vehicle transitions.
University of California. Retrieved May 3, 2013,
from http://steps.ucdavis.edu/its_publications/
dissertations/Bremson_Joel.pdf
Brockmyer, J. H., Fox, C. M., Curtiss, K., Mc-
Broom, E., Burkhart, K. M., & Pidruzny, J. N.
(2009). The development of the game engage-
ment questionnaire: A measure of engagement
in video game-playing. Journal of Experimental
Social Psychology, 45(4), 624–634. doi:10.1016/j.
jesp.2009.02.016.
Brockner, J., & Higgins, E. T. (2001). Regula-
tory focus theory: Implications for the study
of emotions at work. Organizational Behavior
and Human Decision Processes, 86(1), 35–66.
doi:10.1006/obhd.2001.2972.
Brown, D. J., Ley, J., Evett, L., & Standen, P.
(2011). Can participating in games based learn-
ing improve mathematic skills in students with
intellectual disabilities? In Proceedings of 2011
IEEE 1st International Conference on Serious
Games and Applications for Health SeGAH (pp.
1–9). IEEE. doi:10.1109/SeGAH.2011.6165461
Brown, M., Hall, L. R., Holtzer, R., Brown, S.,
& Brown, N. (1997). Gender and video game
performance. Sex Roles, 36(11), 793–812.
doi:10.1023/A:1025631307585.
Calleja, G. (2007). Digital games as designed
experience: Reframing the concept of immersion.
Wellington, New Zealand: Victoria University of
Wellington.
Campbell, D. T., & Stanley, J. C. (1963). Ex-
perimental and quasi-experimental designs for
research on teaching. In N. L. Gage (Ed.), Hand-
book of research on teaching (pp. 171–246).
Chicago: Rand McNally.
Carmeli, A., Brueller, D., & Dutton, J. E. (2009).
Learning behaviours in the workplace? The role
of high quality interpersonal relationships and
psychological safety. Systems Research and Be-
havioral Science, 26, 81–98. doi:10.1002/sres.932.
Castronova, E. (2005). Synthetic worlds: The
business and culture of online games. Chicago:
University of Chicago Press.
Castronova, E., Williams, D., Ratan, R., &
Keegan, B. (2009). As real as real? Macro-
economic behavior in a large-scale virtual
world. New Media & Society, 11(5), 685–707.
doi:10.1177/1461444809105346.
Chandler, D., & Torbert, B. (2003). Condescend-
ing ethics and action research. Action Research,
1(2), 37–47.
Chertoff, D. B., Goldiez, B., & LaViola, J. J.
(2010). Virtual experience test: A virtual environ-
ment evaluation questionnaire. In Proceedings of
2010 IEEE Virtual Reality Conference VR (pp.
103–110). IEEE. doi:10.1109/VR.2010.5444804
Chong, D., & Druckman, J. N. (2007). Fram-
ing theory. Annual Review of Political Science,
10(1), 103–126. doi:10.1146/annurev.polis-
ci.10.072805.103054.
Connolly, T. M., Boyle, E. A., MacArthur, E.,
Hainey, T., & Boyle, J. M. (2012). A system-
atic literature review of empirical evidence on
computer games and serious games. Computers
& Education, 59(2), 661–686. doi:10.1016/j.
compedu.2012.03.004.
Connolly, T. M., Stansfield, M., & Hainey, T.
(2009). Towards the development of a games-based
learning evaluation framework: Games-based
learning advancements for multisensory human
computer interfaces: Techniques and effective practices. Academic Press. doi:10.4018/978-1-
60566-360-9.
€conomia. (n.d.). Serious game about EU financial
policy. European Central Bank (ECB). Retrieved
from http://www.ecb.int/ecb/educational/econo-
mia/html/index.en.html
Cook, T. D., & Campbell, D. T. (1979). Quasi-
experimentation: Design and analysis issues for
field settings. Boston: Houghlin Mifflin Company.
Cooper, S., Khatib, F., Treuille, A., Barbero,
J., Lee, J., & Beenen, M. etal. (2010). Predict-
ing protein structures with a multiplayer online
game. Nature, 466(7307), 756–760. doi:10.1038/
nature09304 PMID:20686574.
Copier, M. (2007). Beyond the magic circle:
A network perspective on role-play in on-
line games. Utrecht University. Retrieved
from http://igitur-archive.library.uu.nl/disserta-
tions/2007-0710-214621/index.htm
Coulthard, G. J. (2009). A review of the educational
use and learning effectiveness of simulations and
games. Retrieved May 3, 2013, from http://www.
coulthard.com/library/paper-simulation.html
Creswell, J. W. (2002). Research design: Qualita-
tive, quantitative, and mixed methods approaches.
Organizational Research Methods, 6, 246.
Csikszentmihalyi, M. (1991). Flow: The psychol-
ogy of optimal experience. New York: Harper-
Perennial.
De Caluwe, L., Hofstede, G. J., & Peters, V. A.
M. (2008). Why do games work: In search of the
active substance. Retrieved from http://www.
narcis.nl/publication/RecordID/oai:library.wur.
nl:wurpubs/368664
De Freitas, S., & Oliver, M. (2006). How can
exploratory learning with games and simulations
within the curriculum be most effectively evalu-
ated? Computers & Education, 46(3), 249–264.
doi:10.1016/j.compedu.2005.11.007.
De Freitas, S., Rebolledo-Mendez, G., Liarokapis,
F., Magoulas, G., & Poulovassilis, A. (2010).
Learning as immersive experiences: Using the
four-dimensional framework for designing and
evaluating immersive learning experiences in
a virtual world. British Journal of Educational
Technology, 41(1), 69–85. doi:10.1111/j.1467-
8535.2009.01024.x.
De Vaus, D. (2001). Research design in social
research. British Educational Research Journal,
28, 296.
TU Delft, Tygron Serious Gaming, & Port of
Rotterdam. (n.d.). SimPort MV2 serious game. Re-
trieved May 3, 2013, from http://www.simport.eu/
Deltares Deltabrain. (n.d.). Levee patroller. Delft,
The Netherlands. Deltares.
Djaouti, D. (2011). Serious game design - Con-
sidérations théoriques et techniques sur la créa-
tion de jeux vidéo à vocation utilitaire. (Thèse de
doctorat en informatique). Université de Toulouse
III, Toulouse, France.
Ducrot, R. (2009). Gaming across scale in
peri-urban water management: Contribution
from two experiences in Bolivia and Brazil.
International Journal of Sustainable De-
velopment World Ecology, 16(4), 240–252.
doi:10.1080/13504500903017260.
Duke, R. D., & Geurts, J. L. A. (2004). Policy
games for strategic management: Pathways into
the unknown. Amsterdam: Dutch University Press.
Edelenbos, J., & Van Buuren, A. (2005). The
learning evaluation: A theoretical and em-
pirical exploration. Evaluation Review, 29(6),
591–612. doi:10.1177/0193841X05276126
PMID:16244054.
Edmondson, A. C. (1999). Psychological
safety and learning behavior in work teams.
Administrative Science Quarterly, 44(2), 350.
doi:10.2307/2666999.
Egenfeldt-Nielsen, S. (2005). Beyond edutain-
ment: Exploring the educational potential of com-
puter games. Retrieved May 3, 2013, from http://
www.seriousgames.dk/downloads/egenfeldt.pdf
Egenfeldt-Nielsen, S. (2006). Overview of re-
search on the educational use of video games.
Digital Kompetanse, 1, 184–213.
Egenfeldt-Nielsen, S. (2006a). Can education and
psychology join forces: The clash of benign and
malign learning from computer games. Nordicom
Review, 26(2), 103–107.
Enochsson, L., Isaksson, B., Tour, R., Kjellin,
A., Hedman, L., & Wredmark, T. etal. (2004).
Visuospatial skills and computer game experi-
ence influence the performance of virtual en-
doscopy. Journal of Gastrointestinal Surgery
Official Journal of the Society for Surgery of the
Alimentary Tract, 8(7), 876–882. doi:10.1016/j.
gassur.2004.06.015 PMID:15531242.
Ermi, L., & Mäyrä, F. (2003). Power and control
of games: Children as the actors of game cul-
tures. Retrieved May 3, 2013, from http://www.
uta.fi/~tlilma/Ermi_Mayra_Power_and_Con-
trol_of_Games.pdf
Fernández-Aranda, F., Jiménez-Murcia, S., Sant-
amaría, J. J., Gunnard, K., Soto, A., & Kalapanidas,
E. etal. (2012). Video games as a complementary
therapy tool in mental disorders: PlayMancer, a Eu-
ropean multicentre study. Journal of Mental Health
(Abingdon, England), 21(4), 364–374. doi:10.3
109/09638237.2012.664302 PMID:22548300.
Ferris, G. R. (2005). Development and validation of
the political skill inventory. Journal of Management,
31(1), 126–152. doi:10.1177/0149206304271386.
Firestone, W. A. (1989). Educational policy as an
ecology of games. Educational Researcher, 18(7),
18–24. doi:10.3102/0013189X018007018.
Fisher, K. (1997). Locating frames in the discur-
sive universe. Sociological Research Online, 2(3),
1–41. Retrieved from http://www.socresonline.
org.uk/2/3/4.html doi:10.5153/sro.78.
Frasca, G. (2007). 12th September. Retrieved from
http://www.newsgaming.com/newsgames.htm
Garris, R., Ahlers, R., & Driskell, J. E. (2002).
Games, motivation, and learning: A research and
practice model. Simulation & Gaming, 33(4),
441–467. doi:10.1177/1046878102238607.
Gasnier, A. (2007). The patenting paradox: A
game-based approach to patent management.
Delft, The Netherlands: Eburon.
GATE. (2011). Help! Over de ontwikkeling van een
serious game als oefenmiddel voor burgemeesters
om zich voor te bereiden op een incident. Utrecht,
The Netherlands: Game Research for Training
and Entertainment.
Geurts, J. L. A., Duke, R. D., & Vermeulen, P.
A. M. (2007). Policy gaming for strategy and
change. Long Range Planning, 40(6), 535–558.
doi:10.1016/j.lrp.2007.07.004.
Giddens, A. (1988). Goffman as a systematic
social theorist. In P. Drew, & A. Wootton (Eds.),
Erving Goffman Exploring the interaction order
(pp. 250–279). New York: Polity Press.
Girard, C., Ecalle, J., & Magnan, A. (2012). Serious
games as new educational tools: How effective are
they? A meta-analysis of recent studies. Journal
of Computer Assisted Learning, 19(3), 207–219.
doi:10.1111/j.1365-2729.2012.00489.x.
Göbel, S. (2010). Definition serious games term –
Approach Examples Conclusion serious games
– Motivation, der mensch ist nur da ganz mensch
wo er spielt. Technology (Elmsford, N.Y.), 12.
Goffman, E. (1974). Frame analysis: An essay
on the organization of experience. New York:
Harper & Row.
Good, B. M., & Su, A. I. (2011). Games with a
scientific purpose. Genome Biology, 12(12), 135.
doi:10.1186/gb-2011-12-12-135
Gosen, J., & Washbush, J. (2004). A review of
scholarship on assessing experiential learning
effectiveness. Simulation & Gaming, 35(2),
270–293. doi:10.1177/1046878104263544.
Graafland, M., Schraagen, J. M., & Schijven, M.
P. (2012). Systematic review of serious games
for medical education and surgical skills training.
British Journal of Surgery, 99(10), 1322–1330.
doi:10.1002/bjs.8819 PMID:22961509.
Greenblat, C. S. (1973). Teaching with simulation
games: A review of claims and evidence. Teaching
Sociology, 1(1), 62–83. doi:10.2307/1317334.
Guay, F., Vallerand, R. J., & Blanchard, C. (2000).
On the assessment of situational intrinsic and
extrinsic motivation: The situational motivation
scale (SIMS). Motivation and Emotion, 24(3),
175–213. doi:10.1023/A:1005614228250.
Guba, E. G., & Lincoln, Y. S. (1989). Fourth
generation evaluation. Thousand Oaks, CA: Sage
Publications.
Hainey, T. (2010). Using games-based learning
to teach requirements collection and analysis
at tertiary education level. Retrieved May 3,
2013, from http://scholar. google.com/scholar
?hl=en&btnG=Search&q=intitle:Using+Gam
es-Based+Learning+to+Teach+ Requirements
+Collection+and+Analysis+at+Tertiary+Educ
ation+Level#1
Hainey, T., & Connolly, T. M. (2010). Evaluating
games-based learning. International Journal of
Virtual and Personal Learning Environments,
1(1), 57–71. doi:10.4018/jvple.2010091705.
Harper, J. D., Kaiser, S., Ebrahimi, K., Lam-
berton, G. R., Hadley, H. R., Ruckle, H. C., et
al. (2007). Prior video game exposure does not
enhance robotic surgical performance. Journal of
Endourology / Endourological Society, 21(10),
1207–1210. doi:10.1089/end.2007.9905
Hart, C. (2008). Critical discourse analysis and
metaphor: Toward a theoretical framework.
Critical Discourse Studies, 5(2), 91–106.
doi:10.1080/17405900801990058.
Harteveld, C. (2011). Triadic game design. Lon-
don: Springer. doi:10.1007/978-1-84996-157-8.
Harteveld, C. (2012). Making sense of virtual
risks, a quasi experimental investigation into
game-based training. Delft, The Netherlands:
IOS Press.
Hays, R. T. (2005). The effectiveness of instruc-
tional games: a literature review and discussion.
Orlando, FL: Naval War Center: Training Systems
Division.
Healseeker. (n.d.). Yulius academie. Retrieved
April 11, 2013, from http://www.yuliusacademie.
nl/nl/healseeker
Hofstede, G. (1986). Cultural differences in
teaching and learning. International Journal
of Intercultural Relations, 10(3), 301–320.
doi:10.1016/0147-1767(86)90015-5.
Hofstede, G. J., De Caluwe, L., & Peters, V.
A. M. (2010). Why simulation games work-
in search of the active substance: A synthe-
sis. Simulation & Gaming, 41(6), 824–843.
doi:10.1177/1046878110375596
Holsbrink-Engels, G. A. (1998). Computer-based
role playing for interpersonal skills training.
Enschede, The Netherlands: Universiteit Twente.
Houtkamp, J. (2012). Affective appraisal of virtual
environments. Retrieved from http://igitur-archive.
library.uu.nl/dissertations/2012-0620-200449/
UUindex.html
Huang, W.-H. (2011). Evaluating learners’ mo-
tivational and cognitive processing in an online
game-based learning environment. Computers in
Human Behavior, 27(2), 694–704. doi:10.1016/j.
chb.2010.07.021.
Hussaan, A. M. (2012). Generation of adaptive
pedagogical scenarios in serious games. Lyon,
France: University of Lyon.
Jackson, C. L., Colquitt, J., Wesson, M. J., & Zapa-
ta-Phelan, C. P. (2006). Psychological collectivism:
A measurement validation and linkage to group
member performance. The Journal of Applied
Psychology, 91(4), 884–899. doi:10.1037/0021-
9010.91.4.884 PMID:16834512.
Janssen, M., & Klievink, B. (2010). Gaming and
simulation for transforming and reengineering
government: Towards a research agenda. Trans-
forming Government People Process and Policy,
4(2), 132–137. doi:10.1108/17506161011047361.
Jantke, K. P. (2010). Toward a taxonomy of game
based learning. In Proceedings of IEEE Interna-
tional Conference on Progress in Informatics and
Computing, (pp. 858–862). IEEE. doi:10.1109/
PIC.2010.5687903
Jenson, J., & De Castell, S. (2010). Gender,
simulation, and gaming: Research review and
redirections. Simulation & Gaming, 41(1), 51–71.
doi:10.1177/1046878109353473.
Kankaanranta, M., & Neittaanmäki, P. (2008).
Design and use of serious games. Engineering,
37, 208. doi:10.1007/978-1-4020-9496-5.
Kato, P. M., Cole, S. W., Bradlyn, A. S., &
Pollock, B. H. (2008). A video game improves
behavioral outcomes in adolescents and young
adults with cancer: A randomized trial. Pediatrics,
122(2), e305–e317. doi:10.1542/peds.2007-3134
PMID:18676516.
Ke, F. (2009). A qualitative meta-analysis of
computer games as learning tools. Hershey, PA:
IGI Global.
Khatib, F., Cooper, S., Tyka, M. D., Xu, K.,
Makedon, I., & Popovic, Z. etal. (2011). Algo-
rithm discovery by protein folding game players.
Proceedings of the National Academy of Sciences
of the United States of America, 108(47), 1–5.
doi:10.1073/pnas.1115898108 PMID:22065763.
Kickmeier-Rust, M. D., Steiner, C. M., & Albert,
D. (2009). Non-invasive assessment and adaptive
interventions in learning games. In Proceedings of
Intelligent Networking and Collaborative Systems
2009 INCOS 09 International Conference (pp.
301–305). IEEE. doi:10.1109/INCOS.2009.30
Kim, R. B., & Kim, J. P. (2011). Creative economy
in Korea: A case of online game industry. Actual
Problems of Economics, 124(10), 435–442.
Kimmel, A. J. (1988). Ethics and values in applied
social research. Methods (San Diego, Calif.), 12,
160. doi:10.4135/9781412984096.
Kinzie, M. B., & Joseph, D. R. D. (2008). Gender
differences in game activity preferences of middle
school children: Implications for educational game
design. Educational Technology Research and
Development, 56(5–6), 643–663. doi:10.1007/
s11423-007-9076-z.
Knight, J. F., Carley, S., Tregunna, B., Jarvis, S.,
Smithies, R., & De Freitas, S. etal. (2010). Seri-
ous gaming technology in major incident triage
training: A pragmatic controlled trial. Resuscita-
tion, 81(9), 1175–1179. doi:10.1016/j.resuscita-
tion.2010.03.042 PMID:20732609.
Kortmann, R., Bekebrede, G., Van Daalen, C. E.,
Harteveld, C., Mayer, I. S., & Van Dierendonck,
D. (2012). Veerkracht—A game for servant-
leadership development. In Proceedings of the
43rd Annual Conference of the International
Simulation and Gaming Association. ISGA.
Krathwohl, D. R., Bloom, B. S., & Betram, B. M.
(1973). Taxonomy of educational objectives, the
classification of educational goals: Handbook
II: Affective domain. New York: David McKay
Co., Inc.
Kriz, W. C., & Hense, J. U. (2004). Evaluation of
the EU-project simgame in business education.
In W. C. Kriz, & T. Eberle (Eds.), Bridging the
gap: Transforming knowledge into action through
gaming and simulation (pp. 352–363). Munchen,
Germany: Sagsaga.
Kriz, W. C., & Hense, J. U. (2006). Theory-
oriented evaluation for the design of and research
in gaming and simulation. Simulation & Gaming,
37(2), 268–283. doi:10.1177/1046878106287950.
Kuit, M. (2002). Strategic behavior and regu-
latory styles in The Netherlands energy indus-
try. Retrieved May 3, 2013, from http://www.
narcis.nl/publication/RecordID/oai:tudelft.
nl:uuid:f940a515-f967-4cf3-9a96-07cb69a61ae5
Lawson, L. L., & Lawson, C. L. (2010). Video
game-based methodology for business re-
search. Simulation & Gaming, 41(3), 360–373.
doi:10.1177/1046878109334038.
Lee, J. (1999). Effectiveness of computer-based
instructional simulation: A meta analysis. Interna-
tional Journal of Instructional Media, 26(1), 71–
85. Retrieved May 3, 2013, from http://www.ques-
tia.com/googleScholar.qst?docId=5001238108
Lee, K., & Ashton, M. C. (2004). Psychometric
properties of the HEXACO personality inven-
tory. Multivariate Behavioral Research, 39(2),
329–358. doi:10.1207/s15327906mbr3902_8.
Leemkuil, H. (2006). Is it all in the game? Learner
support in an educational knowledge management
simulation game. Enschede, The Netherlands:
University of Twente. Retrieved May 3, 2013, from
http://users.gw.utwente.nl/leemkuil/PhDThesisL-
eemkuil2006.pdf
Leemkuil, H., de Jong, T., & Ootes, S. (2000).
Review of educational use of games and simula-
tions. Retrieved from http://kits.edte.utwente.nl/
documents/D1.pdf
Lempert, R. J., & Schwabe, W. (1993). Transition
to sustainable waste management, a simulation
gaming approach. Santa Monica, CA: Rand
Corporation.
Leyton-Brown, K., & Shoham, Y. (2008). Es-
sentials of game theory. In R. J. Brachman, & T.
Dietterich (Eds.), Political science. San Francisco,
CA: Morgan & Claypool Publishers.
Ma, Y., Williams, D., Prejean, L., & Richard,
C. (2007). A research agenda for developing
and implementing educational computer games:
Colloquium. British Journal of Educational
Technology, 38(3), 513–518. doi:10.1111/j.1467-
8535.2007.00714.x.
Mackenzie, N., & Knipe, S. (2006). Research
dilemmas: Paradigms, methods and methodology.
Issues in Educational Research, 16(2), 193–205.
Maier, F. H., & Grössler, A. (2000). What
are we talking about? A taxonomy of com-
puter simulations to support learning. Sys-
tem Dynamics Review, 16(2), 135–148.
doi:10.1002/1099-1727(200022)16:2<135::AID-
SDR193>3.0.CO;2-P.
Malone, T. W., & Lepper, M. (1987a). Intrinsic
motivation and instructional effectiveness in
computer-based education. In R. E. Snow, & M.
J. Farr (Eds.), Aptitude learning and instruction.
London: Lawrence Erlbaum Associates Publish-
ers.
Malone, T. W., & Lepper, M. (1987b). Making
learning fun: A taxonomy of intrinsic motivation
for learning. In R. E. Snow, & M. J. Farr (Eds.),
Aptitude, learning, and instruction: III: Cogni-
tive and affective process analysis (pp. 223–225).
Hillsdale, NJ: Lawrence Erlbaum Associates
Publishers.
Martin, A. J., & Jackson, S. (2008). Brief approach-
es to assessing task absorption and enhanced
subjective experience: Examining short and core
flow in diverse performance domains. Motiva-
tion and Emotion, 32(3), 141–157. doi:10.1007/
s11031-008-9094-0.
Mayer, I. S. (2009). The gaming of policy and the
politics of gaming: A review. Simulation & Gaming,
40(6), 825–862. doi:10.1177/1046878109346456.
Mayer, I. S. (2012). Towards a comprehensive
methodology for the research and evaluation of
serious games. In Proceedings VS-Games 2012
(Vol. 15, pp. 1–15). Genoa, Italy: Procedia Com-
puter Science. doi:10.1016/j.procs.2012.10.075
Mayer, I. S., Bekebrede, G., Harteveld, C.,
Warmelink, H. J. G., Zhou, Q., & Lo, J. etal.
(2013). The research and evaluation of serious
games: Towards a comprehensive methodol-
ogy. British Journal of Educational Technology.
doi:10.1111/bjet.12067.
Mayer, I. S., Carton, L., De Jong, M., Leijten,
M., & Dammers, E. (2004). Gaming the future
of an urban network. Futures, 36(3), 311–333.
doi:10.1016/S0016-3287(03)00159-9.
Mayer, I. S., & Veeneman, W. (Eds.). (2003).
Games in a world of infrastructures: Gaming
simulation for research, learning and interven-
tion. Eburon Publishers.
Mayer, I. S., Warmelink, H. J. G., & Bekebrede,
G. (2013). Learning in a game-based virtual en-
vironment: A comparative evaluation in higher
education. European Journal of Engineering
Education, 38(1), 85–106. doi:10.1080/030437
97.2012.742872.
Mayer, I. S., Zhou, Q., Lo, J., Abspoel, L., Keijser,
X., Olsen, E., et al. (2012). Integrated, ecosystem-
based marine spatial planning? First results from
international simulation-game experiment. In
Proceedings of Third International Engineering
Systems Symposium. Delft, The Netherlands:
IEEE.
Mayer, I. S., Zhou, Q., Lo, J., Abspoel, L., Keijser,
X., Olsen, E., et al. (2013). Integrated, ecosystem-
based marine spatial planning: Design and results
of a game-based quasi-experiment. Ocean and
Coastal Management, 82, 7–26. doi:dx.doi.
org/10.1016/j.ocecoaman.2013.04.006
Mayer, R. E. (2005). Cognitive theory of multime-
dia learning. In R. E. Mayer (Ed.), The Cambridge
handbook of multimedia learning (pp. 31–48).
Cambridge, MA: Cambridge University Press.
doi:10.1017/CBO9780511816819.004.
Mayes, D. K., & Cotton, J. E. (2001). Measur-
ing engagement in video games: A question-
naire. In Proceedings of the Human Factors and
Ergonomics Society 45th Annual Meeting (pp.
692–696). Retrieved May 3, 2013, from http://
www.scopus.com/inward/record.url?eid=2-
s2.0-0442310948&partnerID=40&md5=b56e
3e59c17f58d4d21a35d7755b9f6a
McGonigal, J. E. (2011). Reality is broken: Why
games make us better and how they can change
the world. New York: The Penguin Press.
Meadows, D. L. (1999). Learning to be simple: My
odyssey with games. Simulation & Gaming, 30(3),
342–351. doi:10.1177/104687819903000310.
Meijer, S. A. (2009). The organisation of
transactions: Studying supply networks using
gaming simulation. Retrieved from http://www.
narcis.nl/publication/RecordID/oai:library.wur.
nl:wurpubs/376893
Meijer, S. A., Mayer, I. S., Van Luipen, J., &
Weitenberg, N. (2011). Gaming rail cargo man-
agement: Exploring and validating alternative
modes of organization. Simulation & Gaming,
43(1), 85–101. doi:10.1177/1046878110382161.
Meyer, B. (2010). Comparative studies in game-
based language learning: A discussion of meth-
odology. In Proceedings of the 9th European
Conference on Elearning, (pp. 362–369). IEEE.
Mortagy, Y., & Boghikian-Whitby, S. (2010). A
longitudinal comparative study of student percep-
tions in online education. Interdisciplinary Jour-
nal of Elearning and Learning Objects, 6, 23–44.
Nieborg, D. B. (2004). America’s army: More than
a game. [Munich, Germany: SAGA.]. Proceedings
of ISAGA, 2004, 883–891.
Nieborg, D. B. (2011). Triple-A the political
economy of the blockbuster video game. Amster-
dam: Universiteit van Amsterdam.
Noy, A., Raban, D. R., & Ravid, G. (2006).
Testing social theories in computer-mediated
communication through gaming and simula-
tion. Simulation & Gaming, 37(2), 174–194.
doi:10.1177/1046878105286184.
Nygaard, C., Courtney, N., & Leigh, E. (Eds.).
(2012). Simulations, games and role play in
university education. Faringdon, Oxfordshire, UK:
Libri Publishing.
Olsen, T., Procci, K., & Bowers, C. (2011). Seri-
ous games usability testing: How to ensure proper
usability, playability, and effectiveness. In A. Mar-
cus (Ed.), Design User Experience and Usability
Theory Methods Tools and Practice Proceedings
First International Conference DUXU 2011.
doi:10.1007/978-3-642-21708-1
Ordeshook, P. C. (1986). Game theo-
ry and political theory: An introduction.
Retrieved from http://books.google.com/
books?id=faytzOKtquMC&pgis=1
Oslin, J. L., Mitchell, S. A., & Griffin, L. L. (1998).
The game performance assessment instrument
(GPAI), some concerns and solutions for further
development. Journal of Teaching in Physical
Education, 17(2), 220–240.
Papastergiou, M. (2009). Digital game-based
learning in high school computer science educa-
tion: Impact on educational effectiveness and
student motivation. Computers & Education,
52(1), 1–12. doi:10.1016/j.compedu.2008.06.004.
Petersen, S. A., & Bedek, M. A. (2012). Challenges
and opportunities in evaluating learning in seri-
ous games: A look at behavioural aspects. In M.
Ma, M. Fradinho, J. B. Hauge, H. Duin, & K.-D.
Thoben (Eds.), Serious Games Development and
Applications - Third International Conference,
SGDA 2012 (LNCS), (vol. 7528). Berlin: Springer.
Pfister, R. (2011). Gender effects in gaming
research: A case for regression residuals? Cy-
berpsychology Behavior and Social Networking,
14(10), 603–607. doi:10.1089/cyber.2010.0547
PMID:21486141.
Poverty Is Not A Game (PING). (n.d.). Serious
game on poverty awareness. Retrieved from http://
www.povertyisnotagame.com/?lang=en
Preschl, B., Wagner, B., Forstmeier, S., & Mae-
rcker, A. (2011). E-health interventions for de-
pression, anxiety disorder, dementia, and other
disorders in older adults: A review. Cyber Therapy
and Rehabilitation, 4(3), 371–385.
Raessens, J. (2006). Playful identities, or the
ludification of culture. Games and Culture, 1(1),
52–57. doi:10.1177/1555412005281779.
Raessens, J. (2009). Homo ludens 2.0. Metropolis
M, (5), 64–69, 85–88.
Randel, J. M., Morris, B. A., Wetzelf, C. D., &
Whitehill, B. V. (1992). The effectiveness of
games for educational purposes: A review of the
research. Simulation & Gaming, 25, 261–276.
doi:10.1177/1046878192233001.
Raphael, C., Bachen, C., Lynn, K. M., Baldwin-
Philippi, J., & McKee, K. A. (2009). Games for
civic learning: A conceptual framework and agen-
da for research and design. Games and Culture,
5(2), 199–235. doi:10.1177/1555412009354728.
Rebolledo-Mendez, G., Avramides, K., & De
Freitas, S. (2009). Societal impact of a serious
game on raising public awareness: The case of
FloodSim. Methodology, 148(2), 15–22. doi:10.1145/1581073.1581076.
Reichlin, L., Mani, N., McArthur, K., Harris, A.
M., Rajan, N., & Dacso, C. C. (2011). Assessing
the acceptability and usability of an interactive
serious game in aiding treatment decisions for
patients with localized prostate cancer. Journal of
Medical Internet Research, 13(1), e4. doi:10.2196/
jmir.1519 PMID:21239374.
Ritterfeld, U., Cody, M. J., & Vorderer, P.
(2009). Serious games: Mechanisms and ef-
fects. Retrieved from http://books.google.com/
books?id=GwPf7tbO5mgC&pgis=1
Rockwell, G. M., & Kee, K. (2011). The leisure of
serious games: A dialogue. Game Studies, 11(2).
Salomon, M., & Soudoplatoff, S. (2010). Why
virtual-world economies matter. Journal of Virtual
Worlds Research, 2(4), 14.
Sanchez, A., Cannon-Bowers, J., & Bowers, C.
(2010). Establishing a science of game based
learning. In J. A. Cannon-Bowers, & C. Bowers
(Eds.), Serious game design and development:
Technologies for training and learning (pp. 290–
304). Hershey, PA: IGI Global. doi:10.4018/978-
1-61520-739-8.ch016.
Sawyer, B. (2008). Serious games taxonomy.
Sawyer, Ben Smith, Peter.
Scharpf, F. W. (1997). Games real actors play?
Actor-centered institutionalism in policy research.
In P. Sabatier (Ed.), Theoretical lenses on public
policy. Boulder, CO: Westview Press.
Scheufele, D. A., & Iyengar, S. (2002). The state of framing research:
A call for new directions. In K. Kenski, & K. H.
Jamieson (Eds.), The Oxford handbook of political
communication theories (pp. 1–27). Oxford, UK:
Oxford University Press.
Schneider, D. K. (2005). Research design for
educational technologists. Geneva: AACE.
Schrage, M. (2000). Serious play: How the world’s
best companies simulate to innovate. Boston:
Harvard Business School Press.
Schuurink, E., Houtkamp, J., & Toet, A. (2008).
Engagement and EMG in serious gaming: experi-
menting with sound and dynamics in the levee
patroller training game. In P. Markopoulos, B. de
Ruyter, W. Ijsselsteijn, & D. Rowland (Eds.), Fun
and games: Second international conference, (pp.
139-149). Eindhoven, The Netherlands: Springer
Verlag. doi:10.1007/978-3-540-88322-7_14
Seitlinger, P., Bedek, M. A., Kopeinik, S., &
Albert, D. (2012). Evaluating the validity of a
non-invasive assessment procedure. In M. Ma,
M. Fradinho, J. B. Hauge, H. Duin, & K.-D.
Thoben (Eds.), Serious games development. Re-
trieved from http://www.springerlink.com/index/
K66751M360805G62.pdf
Senge, P. M. (1990). The fifth discipline, the art
and practice of the learning organization. New
York: Doubleday.
Senge, P. M., Roberts, C., & Ross, R. B. (1994).
The fifth discipline fieldbook, strategies and tools
for building a learning organization. Garden City,
NJ: Doubleday.
Shaffer, D. W. (2006). Epistemic frames for
epistemic games. Computers & Education, 46(3),
223–234. doi:10.1016/j.compedu.2005.11.003.
Shaw, A. (2010). What is video game culture? Cul-
tural studies and game studies. Games and Culture,
5(4), 403–424. doi:10.1177/1555412009360414.
Shubik, M. (1999). Political economy, oligopoly
and experimental games: The selected essays of
Martin Shubik volume one. Cheltenham, UK:
Elgar.
Shute, V. J. (2011). Stealth assessment in
computer-based games to support learning. In S.
Tobias, & J. D. Fletcher (Eds.), Computer games
and instruction (pp. 503–524). Charlotte, NC:
Information Age Publishers.
Shute, V. J., & Kim, Y. J. (2010). Formative as-
sessment: Opportunities and challenges. Journal of
Language Teaching and Research, 1(6), 838–841.
doi:10.4304/jltr.1.6.838-841.
Shute, V. J., Masduki, I., & Donmez, O. (2010).
Conceptual framework for modeling, assessing
and supporting competencies within game envi-
ronments. Cognition and Learning, 8(2), 137–161.
Shute, V. J., Ventura, M., Bauer, M., & Zapata-
Rivera, D. (2009). Melding the power of serious
games and embedded assessment to monitor and
foster learning? Flow and grow. Serious Games
Mechanisms and Effects, 1(1), 1–33.
Simon, H. A. (1991). Bounded rationality and
organizational learning. Organization Science,
2(1), 125–134. doi:10.1287/orsc.2.1.125.
Squire, K. (2002). Cultural framing of computer/
video games. The International Journal of Com-
puter Game Research, 2(1).
Squire, K. (2004). Replaying history: Learning
world history through playing civilization III.
Bloomington, IN: Indiana University.
Steinkuehler, C. A. (2005). Cognition & learn-
ing in massively multiplayer online games: A
critical approach. Madison, WI: University of
Wisconsin-Madison.
Steunenberg, B., Schmidtchen, D., & Koboldt, C.
(1999). Strategic power in the European Union:
Evaluating the distribution of power in policy
games. Journal of Theoretical Politics, 11(3),
339–366. doi:10.1177/0951692899011003005.
Sweeney, L. B., & Meadows, D. L. (2001). The
systems thinking playbook. Durham, NC: Pegasus
Communication.
Szturm, T., Betker, A. L., Moussavi, Z., Desai, A.,
& Goodman, V. (2011). Effects of an interactive
computer game exercise regimen on balance im-
pairment in frail community-dwelling older adults:
A randomized controlled trial. Physical Therapy,
91(10), 1449–1462. doi:10.2522/ptj.20090205
PMID:21799138.
Tallir, I. B., Lenoir, M., Valcke, M., & Musch, E.
(2007). Do alternative instructional approaches
result in different game performance learning
outcomes? Authentic assessment in varying
game conditions. International Journal of Sport
Psychology, 38(3), 263–282.
Taylor, J. L. (1971). Instructional planning sys-
tems, a gaming-simulation approach to urban
problems. London: Cambridge University Press.
doi:10.1017/CBO9780511720789.
Trepte, S., & Reinecke, L. (2011). The pleasures
of success: Game-related efficacy experiences
as a mediator between player performance and
game enjoyment. Cyberpsychology Behavior and
Social Networking, 14(9), 555–557. doi:10.1089/
cyber.2010.0358 PMID:21342012.
Tykhonov, D., Jonker, C., & Meijer, S. A. (2008).
Agent-based simulation of the trust and tracing
game for supply chains and networks, & social
simulation. Retrieved May 3, 2013, from http://
jasss.soc.surrey.ac.uk/11/3/1.html
Van der Spek, E. D. (2011). Experiments
in serious game design: A cognitive approach.
Retrieved May 3, 2013, from http://www.narcis.
nl/publication/RecordID/oai:dspace. library.
uu.nl:1874/211480
Van der Spek, E. D., Wouters, P., & Van Oosten-
dorp, H. (2011). Code red: Triage or cognition-
based design rules enhancing decisionmaking
training in a game environment. British Journal
of Educational Technology, 42(3), 441–455.
doi:10.1111/j.1467-8535.2009.01021.x.
Van Houten, S. P. A. (2007, November 6). A
suite for developing and using business games:
Supporting supply chain business games in a
distributed context. Retrieved from http://www.
narcis.nl/publication/RecordID/oai:tudelft.
nl:uuid:006a4dfd-8aba-47e5-8a15-c481eace87e1
van Ruijven, T. (n.d.). Virtual emergency man-
agement training. Delft, The Netherlands: Delft
University of Technology.
Van Staalduinen, J. P. (2012). Gamers on games
and gaming: Implications for Educational game
design. Delft, The Netherlands: TU Delft.
Varoufakis, Y. (2008). Game theory: Can it unify
the social sciences? Organization Studies, 29(8-
9), 1255–1277. doi:10.1177/0170840608094779.
Vartiainen, P. (2000). Evaluation methods and
comparative study. Retrieved May 3, 2013,
from http://cjpe.ca/distribution/20001012_var-
tiainen_pirkko.pdf
Verdaasdonk, E. G. G., Dankelman, J., Schi-
jven, M. P., Lange, J. F., Wentink, M., &
Stassen, L. P. S. (2009). Serious gaming and
voluntary laparoscopic skills training: A multi-
center study. Official Journal of the Society for
Minimally Invasive Therapy, 18(4), 232–238.
doi:10.1080/13645700903054046.
Virtual Reality at Ford Motor Company. (n.d.).
Retrieved April 11, 2013, from http://www.you-
tube.com/watch?v=zmeR-u-DioE
Weiss, C. H. (1997). Theory-based evaluation:
Past, present, and future. New Directions for
Evaluation, (76): 41–55. doi:10.1002/ev.1086.
Wenzler, I. (2008). The ten commandments for
translating simulation results into real-life per-
formance. Simulation & Gaming, 40(1), 98–109.
doi:10.1177/1046878107308077.
Whitlock, L. A., McLaughlin, A. C., & Allaire,
J. C. (2012). Individual differences in response
to cognitive training: Using a multi-modal, at-
tentionally demanding game-based intervention
for older adults. Computers in Human Behavior,
28, 1091–1096. doi:10.1016/j.chb.2012.01.012.
Wittgenstein, L. (1953). Philosophical investiga-
tions. Malden, MA: Blackwell.
Wolfe, J., & Box, T. M. (1988). Team co-
hesion effects on business game perfor-
mance. Simulation & Gaming, 19(1), 82–98.
doi:10.1177/003755008801900105.
Wood, R. T. A., Griffiths, M. D., & Eatough, V.
(2004). Online data collection from video game
players: Methodological issues. Cyberpsychology
& Behavior, 7(5), 511–518. doi:10.1089/
cpb.2004.7.511 PMID:15667045.
Wright, J. D., & Marsden, P. V. (Eds.). (2010).
Handbook of survey research (2nd ed.). Bingley,
UK: Emerald Group Publishing.
Wu, W.-H., Hsiao, H.-C., Wu, P.-L., Lin, C.-
H., & Huang, S.-H. (2011). Investigating the
learning-theory foundations of game-based
learning: A meta-analysis. Journal of Computer
Assisted Learning, 28. doi:10.1111/j.1365-
2729.2011.00437.x.
Young, M. F., Slota, S., Cutter, A. B., Jalette, G.,
Mullin, G., & Lai, B. etal. (2012). Our princess is
in another castle: A review of trends in serious gam-
ing for education. Review of Educational Research,
82(1), 61–89. doi:10.3102/0034654312436980.
Zapata-Rivera, D., VanWinkle, W., Doyle, B.,
Buteux, A., & Bauer, M. (2009). Combining
learning and assessment in assessment-based
gaming environments: A case study from a
New York City school. Interactive Technol-
ogy and Smart Education, 6(3), 173–188.
doi:10.1108/17415650911005384.
Zichermann, G., & Linder, J. (2010). Game-based
marketing: Inspire customer loyalty through
rewards, challenges, and contests. Hoboken, NJ:
Wiley.
KEY TERMS AND DEFINITIONS
Assessment: Evaluation with the specific intention to establish or judge the performance, suitability and/or cost-effectiveness of something or someone in the light of past, present or future objectives, tasks, functions or other properties.
Evaluation: Applied research with the specific
intention of determining the ‘value’ of something
or someone in the light of past, present or future
objectives, tasks, function or other properties.
Frame: An instrument for defining reality
rather than describing reality.
Frame Analysis: The study of how people understand an issue and of how this understanding changes over time.
Learning: Adaptive changes of a system in
response to internal incommensurability and/or
external pressure.
Method: Means, manner or procedure for
systematic inquiry or action.
Methodology: The rationale and principles
upon which systematic inquiry is founded.
Quasi Experimental Design: A form of experimental research in which, for practical, functional and/or ethical reasons, some requirements for experimental control (such as randomisation, a control group or laboratory conditions) cannot or should not be met.
Research Design: Description of the setup of
systematic inquiry, including study type, research
questions, hypotheses, operationalization, data
collection and analysis.
Serious Game: A non-definable but common reference to the many different forms in which (the principles, technology or elements of) games are utilised for society, business and politics.
... Many scholars write about learning as an experiential process, linked to adaptive management, emphasizing learning by doing. Mayer et al. (2014), being one of the two defining learning explicitly, refer to it as "adaptive changes of a system in response to internal incommensurability and/or external pressure" (Mayer et al. 2014: p. 393). Furlan et al. (2018), for example, state that public authorities, instead of creating new and expensive policy cycles, may use lessons learned to adjust existing ones (Furlan et al. 2018). ...
... The majority of the reviewed literature refer to the ways by which learning can be facilitated and fostered, the biggest part even states the "how" explicitly, and a wide variety of tools and methods are mentioned. For example, Mayer et al. (2013Mayer et al. ( , 2014 and Keijser et al. (2018) focus on the use of serious games, more explicitly the Maritime Spatial Planning Challenge, as a tool for policy-oriented learning. They indicate how simulation gaming (SG) can help participants exploring the complexity of MSP (Mayer et al. 2013) and let them "thinking and talking" about the interrelations among different marine uses and objectives (Keijser et al. 2018). ...
... The Ministry of Infrastructure and Water Management kick-started the development of the MSP Challenge in 2011 (also described/referred to in the reviewed literature, e.g. Mayer et al. 2013Mayer et al. , 2014Keijser et al. 2018). Since then, three types of serious games have been developed: a role-playing game, a simulation game, and a board game. ...
Article
Full-text available
Both policy-makers and scholars acknowledge and emphasize the need for learning in maritime spatial planning (MSP). However, few explain why learning is important. As such, it remains a vague and understudied process and is taken for granted and assumed to be and do “only good” which might hinder an in-depth assessment of the effectiveness of learning in policy-making. In this paper, we investigate whether, and if so in what way, explicit attention is given to learning in MSP. In this way, we try to unpack a (plausible) “learning paradox” and gain more insight into the different conceptualizations of learning in MSP. We use seven dimensions to examine learning in MSP by conducting a literature review of scientific MSP literature and a case study, which analyzes learning in the Dutch MSP process. The literature review and case study point to a “learning paradox” in MSP, showing both similarities and differences. The common lack of attention for risk and ambiguities is particularly problematic, while the existing clarity about who (should) learn and how can be seen as opportunities to gain insights in learning in MSP. Overall, we argue that acknowledging the paradox is paramount to improve the effectiveness of learning processes in MSP.
... Кроме того, можно приветствовать работы известных ученых, Natalia S. по применению технологий CLIL в биологии [7], Сойле Д. [8], Ball [9], Воигдопјоп J. [10], Мауг I. [11], Zaharias Р. [12] и др. ...
... The aim of the study is to improve the scientific and methodological system of subject-language integrated biology teaching and to test its effectiveness in general education schools, evaluation by pedagogical experiment. The study involved students of 8,9,10 grades of secondary school named after. Makarenko, S. Sas-Tube, Tulkubassky district of Turkestan region.The selected classes were divided into experimental and control groups. ...
Article
The article discusses important issues of the education system, knowledge of the international language, including English. This is a necessity arising from the requirements of modernity. It is necessary to teach students in three languages. There is a need to improve the language competence of both students and teachers. The aim of the study is to improve the scientific and methodological system of subject-language integrated biology teaching and to test its effectiveness in general education schools, evaluation by pedagogical experiment. The study involved students of 8,9,10 grades of secondary school named after. Makarenko, S. Sas-Tube, Tulkubassky district of Turkestan region.The selected classes were divided into experimental and control groups. With the help of test control, their initial level of knowledge in English was determined. The results of the experimental work once again show that teaching natural sciences in English is one of the effective methods, one can observe an increase in learning outcomes. With the help of lexico-grammatical testing, an approximate homogeneous composition of control and experimental groups was determined, therefore, the formation of communicative competence based on the theory of speech activity was studied. Initially, the results of two groups (control and experimental) were indicative of English language proficiency. Secondly, the methods used to increase the motivation of students have increased the speech skills of the experimental group and the levels of formation of integrated subject-language competence. The results of the study can be used in the formation of cognitive activity of students in secondary and higher educational institutions, improving the quality of the educational process and in institutes of advanced professional education of biology teachers and biology specialists, secondary schools, colleges.
... Traditional programs and teaching formats need to be adapted with both new content (imparting new competences) and new learning strategies (imparting competences with new formats). Gamification and simulation games in particular have been proposed as one meaningful way to impart competences [19][20][21]. Simulation games not only train competences in a risk-free environment [12] but can also convey the complexities and oftentimes opaque structures in which public organizations act [11]. ...
Chapter
The digital transformation of the public sector is progressing, but often not at the desired pace. Here, the digitalization skills of public officials are one important resource for coping with the demanding digital shift and the rise of e-government. As these competences are still lacking, alternative educational approaches are needed. Simulation games are promising and flexible methods, although they are not widely used in the public sector. In response, we have been developing a corresponding simulation game platform for about two years. In addition to sharing the artifact, this manuscript provides a set of design principles that help to create, and facilitate the adoption of, related platforms for the public sector.
... Interactive imagery therapy in VR emphasized engaging patients with built-in games, which provided more sustained VR content for prolonged participation (Mayer et al., 2014). The VR software module 'Enchanted Forest' was applied in two studies: during gynaecological surgery, and pre- and post-operatively in cardiac surgery, to help evoke relaxation, deep breathing and pain distraction (Luis Mosso et al., 2017; Vázquez et al., 2017). ...
Article
Full-text available
Aims: To evaluate the different types of virtual reality (VR) therapy received by adult patients undergoing surgical procedures in acute care settings and the outcome measures used, as well as to highlight the acceptability and feasibility of VR approaches among patients and healthcare workers. Design: Whittemore and Knafl's integrative review method guided the analysis. Data sources: Searches were conducted in ScienceDirect, ProQuest, Wiley Online Library, Medline, PsycINFO, PubMed and Google Scholar from 2000 to June 2021. Review methods: A systematic search of articles published in English was carried out using electronic databases and hand-searched references. The keyword search targeted primary qualitative and quantitative studies that utilized VR therapy in surgical care settings. Results: Eighteen articles were reviewed, which reported the use of two main strategies: guided and interactive imagery therapy. The findings identified: (i) patient-clinical outcome measures, including the use of analgesics, vital signs, functional capacity and length of hospital stay; and (ii) patient-reported experience measures, including pain, anxiety and satisfaction level. Comfort, age, knowledge and attitude were key factors influencing the acceptability of VR among the patients, whereas cost-effectiveness and infection control were the two main factors affecting the feasibility of use among the healthcare workers. Conclusion: VR therapy demonstrated potential improvements in both the patient-clinical outcomes and the patient-reported experiences of those undergoing surgical procedures. However, the findings were inconsistent, and further research is required to explore and establish the effectiveness of using VR in acute care settings. Impact: VR distraction has been increasingly used as a non-pharmacological method for managing pain, easing anxiety and optimizing other associated outcomes in patients undergoing surgical procedures. It is essential to examine the effectiveness of VR therapy on adult patients' outcomes in acute surgical care settings, as well as its acceptability and feasibility of use.
... Despite the globally growing interest in digital game-based learning and the significant efforts devoted to researching and evaluating serious games, considerable weaknesses remain, including a lack of comprehensive frameworks for comparative evaluation: it is possible to evaluate a single title, but problems arise when dealing with more than one, since it is very difficult to assess all the characteristics and the relative level of learning they allow for. While some game-based learning models have been developed in the literature (Mayer et al. 2014), they do not specifically tackle the evaluation of the learning impact produced in the learner by playing (serious) games. Although many methodologies have been elaborated in recent years (Catalano, Luccini, Mortara 2014), this remains one of the most important challenges researchers have to deal with. ...
Article
Full-text available
In the last decades, digital technologies have pervaded every aspect of the production of archaeological knowledge, and they have been used extensively to communicate the past. This contribution analyses the potential and benefits of serious games, as they appear to be a promising tool for engaging users in active learning of cultural contents, for attracting new audiences and for promoting knowledge and awareness of archaeological heritage. Moreover, the need for multidisciplinary collaboration between archaeologists and developers, and the necessity of assessment studies on learning levels to improve their effectiveness, will be highlighted.
... Traditional programs and teaching formats need to be adapted with both new content (imparting new competences) and new learning strategies (imparting competences with new formats). Gamification and simulation games in particular have been proposed as one meaningful way to impart competences [19][20][21]. Simulation games not only train competences in a risk-free environment [12] but can also convey the complexities and oftentimes opaque structures in which public organizations act [11]. ...
Conference Paper
The digital transformation of the public sector is progressing, but often not at the desired pace. Here, the digitalization skills of public officials are one important resource for coping with the demanding digital shift and the rise of e-government. As these competences are still lacking, alternative educational approaches are needed. Simulation games are promising and flexible methods, although they are not widely used in the public sector. In response, we have been developing a corresponding simulation game platform for about two years. In addition to sharing the artifact, this manuscript provides a set of design principles that help to create, and facilitate the adoption of, related platforms for the public sector.
... To overcome these limitations, the approach of "stealth assessment" [43] emerged, focusing on how psycho-cognitive states can be assessed in an ecological, non-intrusive and unbiased way. Studies under this paradigm record subjects' performance during a serious game, and conclusions about individual competencies are then drawn from the data [44,45]. In the field of RT, the Bechara gambling task (BGT) [46] and the balloon analogue risk task (BART) [34] can be considered the most widely used measures that aim to assess RT under this methodology. ...
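As a concrete illustration of this kind of performance-based measure, the short Python sketch below simulates the basic logic of the balloon analogue risk task. The number of balloons, the explosion range and the simple fixed-strategy "participant" are invented for illustration only; real BART administrations and scoring conventions differ in their details.

import random

def run_bart(n_balloons=30, max_pumps=64, target_pumps=20, seed=1):
    # Simulate a participant who aims for roughly `target_pumps` pumps per balloon.
    random.seed(seed)
    earnings = 0
    pumps_on_saved = []                      # pumps on balloons that did not explode
    for _ in range(n_balloons):
        explosion_point = random.randint(1, max_pumps)   # hidden threshold for this balloon
        pumps = min(max_pumps, random.randint(target_pumps - 5, target_pumps + 5))
        if pumps >= explosion_point:
            continue                         # balloon exploded: points lost, trial not scored
        earnings += pumps                    # one point banked per pump on a saved balloon
        pumps_on_saved.append(pumps)
    # "Adjusted" BART score: mean pumps on balloons that were cashed out
    risk_score = sum(pumps_on_saved) / len(pumps_on_saved) if pumps_on_saved else 0.0
    return earnings, risk_score

earnings, risk_score = run_bart()
print(f"Total points banked: {earnings}, adjusted risk-taking score: {risk_score:.1f}")

The "adjusted" score (mean pumps on non-exploded balloons) is the behavioural index usually taken as the risk-taking measure; a participant who pumps more aggressively earns more per saved balloon but loses more balloons to explosions.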
Article
Full-text available
Risk taking (RT) measurement constitutes a challenge for researchers and practitioners and has been addressed from different perspectives. Personality traits and temperamental aspects such as sensation seeking and impulsivity influence the individual's approach to RT, prompting risk-seeking or risk-averse behaviors. Virtual reality has emerged as a suitable tool for RT measurement, since it enables the exposure of a person to realistic risks, allowing embodied interactions, the application of stealth assessment techniques and real-time physiological measurement. In this article, we present the assessment on decision making in risk environments (AEMIN) tool, an enhanced version of the spheres and shield maze task, a previous tool developed by the authors. The main aim of this article is to study whether it is possible to discriminate between participants with high versus low scores on measures of personality, sensation seeking and impulsivity, through their behaviors and physiological responses while playing AEMIN. Applying machine learning methods to the dataset, we explored (a) whether these data make it possible to discriminate between the two populations on each variable, and (b) which parameters better discriminate between the two populations on each variable. The results support the use of AEMIN as an ecological assessment tool for measuring RT, since it brings to light behaviors that allow the subjects to be classified as high or low on risk-related psychological constructs. Regarding physiological measures, galvanic skin response seems to be less salient in prediction models.
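The two questions in this abstract map onto a standard supervised-learning workflow. The sketch below is a minimal illustration of that workflow in Python, not the AEMIN authors' actual pipeline: the feature names, the synthetic data and the choice of a random forest are all assumptions made for the example.

# Minimal sketch: (a) can behavioural/physiological features separate high vs. low scorers,
# and (b) which features discriminate best. Data and feature names are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 120  # hypothetical participants

# Invented features: in-game behaviour plus physiology (e.g. galvanic skin response)
X = np.column_stack([
    rng.normal(size=n),        # time spent near hazards
    rng.normal(size=n),        # number of risky shortcuts taken
    rng.normal(size=n),        # mean heart-rate change
    rng.normal(size=n) * 0.1,  # GSR peaks (deliberately weak signal in this toy data)
])
# Hypothetical label: high (1) vs. low (0) sensation seeking, driven mainly by behaviour
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# (a) Can the features discriminate the two populations at all?
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"Cross-validated accuracy: {acc:.2f}")

# (b) Which parameters discriminate best?
clf.fit(X, y)
for name, imp in zip(["hazard time", "risky shortcuts", "heart rate", "GSR peaks"],
                     clf.feature_importances_):
    print(f"{name:16s} importance = {imp:.2f}")

Cross-validated accuracy addresses question (a), while the feature importances address question (b); in this toy setup the deliberately weak GSR feature receives low importance, loosely mirroring the abstract's observation that galvanic skin response was less salient in the prediction models.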
Conference Paper
Full-text available
The use of serious games for educating people and transferring organizational content is increasing globally. In this context, it is necessary to assess a game's intervention to ensure that the defined goals are achieved. Using a scoping review methodology, the current research introduces, for the first time, the concept of impact assessment (IA) to measure the consequences of a game's intervention on the player. In addition, the current research presents a framework for the impact assessment of educational games. The research findings inform researchers, policymakers, developers and other people engaged in the world of serious games about the notions and insights of IA during and after game design.
Article
Coastal flooding is among the most significant disasters that threaten littoral populations in today's world. Climate change and global warming increase this risk and require implementing efficient strategies to anticipate the hazard and reduce human and material damage. This paper describes a generic platform called LittoSIM-GEN (a coupling of agent-based and hydrodynamic models), which aims to involve stakeholders in a participatory simulation of land use management to minimize the risks of coastal submersion. An application to the southwest coast of France (Rochefort) demonstrates the features of this platform. We propose several recommendations to improve the generic conception and implementation of models dedicated to participatory simulation.
Article
Full-text available
The aim of this study is to reveal teacher and student views on the effects of game-based science teaching and learning experiences, applied in the fourth-grade primary school science course, on academic achievement and attitude towards the course, and on their contributions to the learning process. The research was carried out over nine weeks in the 2016-2017 academic year with two classes, an experimental and a control group, in a primary school in Bilecik. The study uses a mixed-methods design in which quantitative and qualitative research designs were combined. Quantitative data were obtained from an "academic achievement test" and a "science course attitude scale". To obtain qualitative data, students kept journals throughout the process, and interviews with the teacher and students were conducted during and at the end of the process. According to the results, after the intervention a significant difference in academic achievement and attitude towards the course was found between the experimental and control groups, in favour of the experimental group. In addition, five weeks after the experimental intervention the "academic achievement test" was administered again as a retention test, and a significant difference in favour of the experimental group was again obtained. The qualitative findings show that the students learned actively and experientially while having fun, took on tasks and responsibilities, and developed a positive attitude towards the course. Based on these results, the importance and necessity of employing game-based teaching in lessons can be argued. Extended Abstract: The wide range of individual differences among students is one of the biggest challenges faced by educators in the teaching and learning process today. Students come to educational environments at different levels of readiness in terms of interest, need, motivation, ability and learning style. However, it can be said that play is one of the basic needs of individuals, no matter how much their characteristics differ. Especially for children, it is stated in the literature that play is an extremely important need, like bread and water. Play has a distinct importance in almost every period of life, especially in childhood. Play, which is a serious occupation for children, is also a source of entertainment and learning. Children play at every age, in every society and in every part of the world. Play is as crucial a need in children's lives as eating and breathing. Children always play with great pleasure and willingness (Aksoy and Çiftçi, 2014, Küçükali, 2015, Sığırtmaç, 2011). Play is also a way of learning about the world. It plays a mediating role between experience and insight/understanding (Rieber and Noah, 2008). Game-based learning can increase students' academic achievement and interest in courses by providing them with interesting and motivating learning experiences. Game-based learning activities can also provide students with rich learning experiences and ensure effective, productive and permanent learning. Similarly, game-based learning contributes significantly to the acquisition of skills such as problem solving, collaborative work, social communication and interaction, and taking responsibility for learning, by meeting students' interests, needs and expectations (Chang et al., 2012, Liu et al., 2011, Meluso et al., 2012).
When we look at the literature, numerous studies can be found that investigate the effects of the game-based learning approach (All, 2016, Chen et al., 2016, Hamari et al., 2016, Kaya and Elgün, 2015, Liu and Chen, 2013, Musselman, 2014, Yien et al., 2011). For example, Kayabaşı and Akbaş (2017) concluded in their study that the play-based teaching method was effective in raising the academic achievement of the students both within the group and between the groups. In addition to these studies, the literature shows that game-based teaching, which has been implemented at various stages of education and in various disciplines, has not been studied extensively in the field of the natural sciences. Moreover, the studies in this area have mainly been conducted with quantitative methods, and studies carried out with mixed methods are extremely limited. In this regard, it is anticipated that this study will differ from other studies in these respects and will also provide significant contributions to the literature. It cannot be denied that students in Turkey achieve low scores on national and international examinations in the field of the natural sciences. Despite the Ministry of Education's development of the science education curriculum to contemporary standards in 2005, 2012 and 2017, in line with progressivist and reconstructionist educational philosophies and the constructivist approach, students' failures in the field of science give rise to the idea that there are some problems in science education in Turkey. Not only in Turkey, but also in Europe, there is no specific support policy for students with low levels of achievement in science education. In schools in major countries, nationwide initiatives have been launched to fight against low levels of achievement (Eurydice, 2011). Given the above considerations, it is important to evaluate these failures in the field of science, to take individual differences into account in science teaching, and to employ alternative approaches. In this respect, it is believed that the play-based science learning activities discussed in the research will increase students' interest in the natural sciences and their academic achievement, and will enable them to make significant gains on the way to becoming science-literate individuals. The aim of this study is to determine the effects of the Game-Based Science Education applied in the primary school 4th grade science course on the students' academic achievements, attitudes towards the course, and on the teaching and learning process. In this regard, the study sought answers to the following questions: 1. What is the effect of the Game-Based Science Education applied in the primary school 4th grade science course on the students' academic achievements and permanent learning? 2. What is the effect of the Game-Based Science Education applied in the primary school 4th grade science course on the students' attitudes towards the course? 3. What is the contribution of the Game-Based Science Education applied in the primary school 4th grade science course to the teaching and learning process? The study was conducted in the 2016-2017 academic year, for ten weeks, with two classes, divided into experimental and control groups, in an elementary school in the Central District of the Province of Bilecik, Turkey. The research uses a mixed methodology, in which quantitative and qualitative research designs were used together.
Quantitative data were obtained using the "academic achievement test", developed by the researcher in line with expert opinions, and the "natural sciences course attitude scale" developed by Yasar and Anagun (2008). In order to obtain qualitative data, interviews with the form teachers and the students in the experimental group were carried out. According to the results of the study, after the play-based learning activities applied in the "Microscopic Living Creatures and Our Environment" and "Simple Electric Circuits" course units, a significant difference in academic achievement and attitudes towards the course was found between the students in the experimental and control groups, in favor of the experimental group. According to the qualitative findings of the research, the students achieved effective and permanent learning, improved their thinking and problem-solving skills, enjoyed the learning process and developed a positive attitude towards the course. In addition, the students participated actively during the process and carried out collaborative work based on social interaction, communication, cooperation and solidarity. The results of this study are similar to the findings of many studies in the literature. In conclusion, it can be said, based on the findings of this study, that game-based science learning experiences make important contributions to the cognitive, emotional and social skills of the students. The aim of this study was to determine the effects of the Game-Based Science Education applied in the primary school 4th grade science course on the students' academic achievements, attitudes towards the course, and on the teaching and learning process. According to the results of the research, the game-based science learning experiences applied in the experimental group for ten weeks increased the students' academic achievement, permanent learning and attitude towards the lesson relative to the control group. When the literature is examined, it is found that the game-based learning approach increases students' academic achievements, motivation and attitudes towards lessons (Giannakkos, 2013, Hwang et al., 2015; Kayabaşı and Akbaş, 2017; Ke, 2008; Mayo, 2009; Rosas et al., 2003, Schaaf, 2012, Virvou et al., 2005, Yang, 2012).
Article
Full-text available
Educational simulations and games provide learners with engaging instructional activities and successfully promote learning effectiveness. Although the face validity of this initial statement seems reasonable to many teachers, students, designers, and researchers, both the semantics and implied evidence are somewhat problematic. First, what constitutes a simulation? a game? a microworld? And, how do these variations on a theme differ from one another when it comes to instructional design and implementation? In the literature reviewed from the past 40 years, the first point of business is often, if not always, to define these troublesome terms, a trend that we shall continue at the outset of this paper. Second, the opening sentence implies that simulations and games are both successfully "engaging" and "effective" strategies for learning. While this statement may be generally accepted by many scholars, the two hypotheses ("engaging" and "effective") have been the subject of many debates, reviews, and research studies across numerous disciplines. Unfortunately, most of the field's research has lacked rigorous designs, appropriate controls, and external validity to be able to extrapolate and generalize the results so convincingly. The purpose of this paper is to synthesize the literature on educational simulations and games in order to guide future research, in the hopes of providing reliable, valid, and empirical evidence regarding student engagement and learning effectiveness.
Book
Formal political theory seeks to develop formal, mathematical models of political and economic processes. This book attempts to integrate the last twenty years of development in this field. Professor Ordeshook uses the modern developments in the theory of games (decision making with multiple, interactive decision makers) as the basis for the synthesis. Topics covered include models of elections and of committee processes, the demand and supply of public goods, and surveys of game theory and social-choice theory. Game Theory and Political Theory is designed as a textbook for graduate courses in formal political theory and political economy.
Chapter
Using video games to train and educate is a notion that is gaining traction among gamers, parents, and serious educators alike. Unfortunately, to date there have been few rigorous studies to determine whether games can be effective learning tools. Given their inherent features, the authors feel certain that games can teach, and they are interested instead in addressing the question of how best to design games that will optimize learning. To accomplish this goal, the authors offer a simple framework for organizing variables and then discuss findings from psychology and education as a basis to formulate a research agenda for game-based training. In doing so, they hope to stimulate researchers to conduct appropriately controlled experiments that will begin to provide insight into how various features affect motivation and learning. In this way, a true science of educational games can be formed.
Article
In this special edition on virtual-world goods and trade, we are pleased to present articles from a global cohort of contributors covering a wide range of issues. Some of our writers, such as Edward Castronova, Julian Dibbell or KZero's Nic Mitham, will be well known to you as distinguished leaders in the field, but it is equally our pleasure to introduce exciting new voices. Here you will find pieces written by academics, practitioners, journalists, a documentary filmmaker and perhaps the youngest contributor to JVWR yet, Eli Kosminksy, who attends high school in upstate New York. We would also point out that this issue extends its format to include Anthony Gilmore's pictorial story, Julian Dibbell's audio interview, and Lori Landay's machinima. In real life, most contributors live in the US, the UK and Europe, and we, the editors, are based in Australia and France. We express warm thanks to the team at the University of Texas, especially to Jeremiah Spence, our editor-in-chief, for his guidance throughout this process. We begin with our own thought piece, which is designed to contextualise the deeper contents herein by way of plotting the virtual goods path and placing some historical signposts along the way. Mandy and Serge
Book
The two-volume set LNCS 6769 + LNCS 6770 constitutes the proceedings of the First International Conference on Design, User Experience, and Usability, DUXU 2011, held in Orlando, FL, USA in July 2011 in the framework of the 14th International Conference on Human-Computer Interaction, HCII 2011, incorporating 12 thematically similar conferences. A total of 4039 contributions were submitted to HCII 2011, of which 1318 papers were accepted for publication. The 154 contributions included in the DUXU proceedings were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on DUXU theory, methods and tools; DUXU guidelines and standards; novel DUXU: devices and their user interfaces; DUXU in industry; DUXU in the mobile and vehicle context; DUXU in the Web environment; DUXU and ubiquitous interaction/appearance; DUXU in the development and usage lifecycle; DUXU evaluation; and DUXU beyond usability: culture, branding, and emotions.
Book
During the last few years, a new area of the creative media industry, namely Serious Games, has started to emerge around the world. The term serious games has become more popular, for example, in the fields of education, business, welfare and safety. Despite this, there has been no single definition of serious games. A key question, what the concept itself means, has remained unresolved, though most have agreed on a definition that serious games are games or game-like interactive systems developed with game technology and design principles for a primary purpose other than pure entertainment. In this book, serious games are understood as games which aim at providing an engaging, self-reinforcing context in which to motivate and educate the players. Serious games can be of any genre, use any game technology, and be developed for any platform. They can be entertaining, but usually they teach the user something. The central aim of serious games is to raise quality of life and well-being. As part of the interactive media industry, the serious games field focuses on designing and using digital games for real-life purposes and for the everyday life of citizens in information societies. The field of serious games focuses on such areas as education, business, welfare, military, traffic, safety, travelling and tourism.
Chapter
Drawing on a grounded theory approach and a qualitative meta-analysis, this chapter intends to systematically review and synthesize the theories, methods, and findings of both qualitative and quantitative inquiries into computer-based instructional games. A major purpose of this literature review and meta-analysis is to inform policy and practice based on existing studies. Four major recurring themes concerning the effectiveness of computer-based instructional games have emerged from a comparative analysis of 89 instructional gaming studies and are discussed with the support of exemplar research. The chapter will assist practitioners and policymakers in understanding the "best practices" and key factors of computer game-based learning programs.