The authors define the requirements and a conceptual model for comparative evaluation research of simulation games and serious games (SGs) in a learning context. A first operationalisation of the model was used to comparatively evaluate a suite of 14 SGs on varying topics, played between 2004 and 2009 in 13 institutes of higher education in the Netherlands. The research questions were: what is the perceived learning effectiveness of the games, and what factors explain it? How can we comparatively evaluate games for learning? Data were gathered through pre- and post-game questionnaires among 1000 students, yielding 500 usable datasets and 230 complete datasets for analysis (factor analysis, scaling, t-tests and correlation analysis), resulting in an exploratory structural model. The findings are discussed and a number of propositions for further research are formulated. The analysis concludes that the students' motivation and attitudes towards game-based learning before the game, their actual enjoyment, their efforts during the game, and the quality of the facilitator/teacher correlate most strongly with their learning satisfaction. The degree to which the experiences during the game were translated back into the underlying theories significantly determines the students' learning satisfaction. The quality of the virtual game environment mattered much less. The authors reflect upon the general methodology used and offer suggestions for further research and development.
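The analysis chain mentioned above (t-tests and correlation analysis on questionnaire scores) can be sketched on synthetic data. The variable names, scale values, and coefficients below are purely illustrative assumptions, not the study's instrument or results:

```python
# Hypothetical sketch of a pre/post questionnaire analysis: a paired t-test
# and correlations with learning satisfaction. All data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 230  # number of complete datasets, as reported in the abstract

# Synthetic Likert-style scores (illustrative constructs, not the real items).
motivation = rng.normal(3.5, 0.6, n)
enjoyment = 0.5 * motivation + rng.normal(1.8, 0.5, n)
satisfaction = 0.4 * motivation + 0.4 * enjoyment + rng.normal(0.8, 0.4, n)

# Paired t-test: pre- vs post-game attitude for the same respondents.
pre = rng.normal(3.2, 0.7, n)
post = pre + rng.normal(0.15, 0.5, n)
t, p = stats.ttest_rel(post, pre)

# Correlation analysis: which factors track learning satisfaction?
r_mot, _ = stats.pearsonr(motivation, satisfaction)
r_enj, _ = stats.pearsonr(enjoyment, satisfaction)
print(f"paired t={t:.2f} (p={p:.3f}); r(motivation)={r_mot:.2f}, r(enjoyment)={r_enj:.2f}")
```

A full replication would also include the factor analysis and scaling steps (e.g. via `sklearn.decomposition.FactorAnalysis`) to build the scales before correlating them.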
The After-Action Review (AAR) in Virtual Reality (VR) training for police provides new opportunities to enhance learning. We investigated whether the perspectives displayed in the AAR (bird’s eye & police officer, bird’s eye & suspect, or bird’s eye alone) and the line-of-fire feature impacted the officers’ learning efficacy. A 3 x 2 ANOVA revealed a significant main effect of AAR perspective. Post hoc pairwise comparisons showed that using a bird’s eye view in combination with the suspect perspective elicited significantly greater learning efficacy than using a bird’s eye view alone. Using the line-of-fire feature did not influence learning efficacy. Our findings show that the use of the suspect perspective during the AAR in VR training can support the learning efficacy of police officers.

Practitioner summary: VR systems possess After-Action Review tools that provide objective performance feedback. This study found that reviewing a VR police training scenario from the bird’s eye view in combination with the suspect perspective enhanced police officers’ learning efficacy. Designing and applying the After-Action Review effectively can improve learning efficacy in VR.
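The reported 3 x 2 design (three AAR perspectives by two line-of-fire conditions) and its post hoc comparisons can be sketched as follows. The group means, cell sizes, and scores are invented for illustration and do not reproduce the study's data or its exact analysis:

```python
# Simplified sketch of the perspective main effect and Bonferroni-corrected
# post hoc pairwise comparisons, on synthetic "learning efficacy" scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 20  # hypothetical sample size per cell

# 3 AAR perspectives x 2 line-of-fire conditions; invented cell means.
perspectives = ["bird+officer", "bird+suspect", "bird_only"]
means = {"bird+officer": 3.6, "bird+suspect": 3.9, "bird_only": 3.3}

# Collapse over line of fire (the abstract reports no line-of-fire effect),
# giving 2 cells of n observations per perspective.
groups = {pers: rng.normal(means[pers], 0.5, 2 * n) for pers in perspectives}

# Main effect of perspective.
F, p = stats.f_oneway(*groups.values())
print(f"perspective main effect: F={F:.2f}, p={p:.4f}")

# Post hoc pairwise comparisons with a Bonferroni correction (3 pairs).
pairs = [("bird+suspect", "bird_only"),
         ("bird+suspect", "bird+officer"),
         ("bird+officer", "bird_only")]
for a, b in pairs:
    t, p_raw = stats.ttest_ind(groups[a], groups[b])
    print(f"{a} vs {b}: t={t:.2f}, p_adj={min(1.0, 3 * p_raw):.4f}")
```

A faithful two-way ANOVA with the interaction term would typically be fitted with `statsmodels` (`ols` plus `anova_lm`); the one-way collapse here only illustrates the perspective main effect.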
From the article: "The educational domain is currently witnessing the emergence of learning analytics – a form of data analytics within educational institutes. Implementing learning analytics tools, however, is not a trivial process. This research-in-progress focuses on the experimental implementation of a learning analytics tool in the virtual learning environment and educational processes of a case organization – a major Dutch university of applied sciences. The experiment is performed in two phases: the first phase led to insights into the dynamics associated with implementing such a tool in a practical setting. The second – yet to be conducted – phase will provide insights into the use of pedagogical interventions based on learning analytics. In the first phase, several technical issues emerged, as well as the need to include more data (sources) in order to get a more complete picture of actual learning behavior. Moreover, self-selection bias is identified as a potential threat to future learning analytics endeavors when data collection and analysis require learners to opt in."
The objective of DIGIREAL-XL is to build a Research, Development & Innovation (RD&I) Center (SPRONG GROUP, level 4) on Digital Realities (DR) for Societal-Economic Impact. DR are intelligent, interactive, and immersive digital environments that seamlessly integrate Data, Artificial Intelligence/Machine Learning, Modelling-Simulation, and Visualization by using Game and Media Technologies (Game platforms/VR/AR/MR). Examples of these DR disruptive innovations can be seen in many domains, such as in the entertainment and service industries (Digital Humans); in the entertainment, leisure, learning, and culture domain (Virtual Museums and Music Festivals); and within the decision-making and spatial planning domain (Digital Twins). There are many well-recognized innovations in each of the enabling technologies (Data, AI, VR/AR). However, DIGIREAL-XL goes beyond these disconnected state-of-the-art developments and technologies in its focus on DR as an integrated socio-technical concept. This requires pre-commercial, interdisciplinary RD&I in cross-sectoral and inter-organizational networks. There is a need for integrating theories, methodologies, smart tools, and cross-disciplinary field labs for the effective and efficient design and production of DR. In doing so, DIGIREAL-XL addresses the challenges formulated under the KIA-Enabling Technologies / Key Methodologies for sectoral and societal transformation. BUas (lead partner) and FONTYS built a SPRONG group level 4 based on four pillars: RD&I-Program, Field Labs, Lab-Infrastructure, and Organizational Excellence Program. This provides a solid foundation to initiate and execute challenging, externally funded RD&I projects with partners in SPRONG stage one ('21-'25) and beyond (until '29). DIGIREAL-XL is organized in a coherent set of Work Packages with clear objectives, tasks, deliverables, and milestones.
The SPRONG group is well-positioned within the emerging MINDLABS Interactive Technologies ecosystem and strengthens the regional (North-Brabant) digitalization agenda. Field labs on DR operate with support and co-funding from network organizations such as Digishape and Chronosphere, as well as public, private, and societal organizations.
The PhD research by Joris Weijdom studies the impact of collective embodied design techniques in collaborative mixed-reality environments (CMREs) in art and engineering design practice and education. He aims to stimulate invention and innovation from an early stage of the collective design process. Joris combines theory and practice from the performing arts, human-computer interaction, and engineering to develop CMRE configurations, strategies for their creative implementation, and an embodied immersive learning pedagogy for students and professionals.

This lecture was given at the Transmedia Arts seminar of the Mahindra Humanities Center of Harvard University. In this lecture, Joris Weijdom discusses critical concepts that concern mixed-reality design in the performing arts, such as embodiment, presence, and immersion. He introduces examples from his practice and from interdisciplinary projects of other artists.

About the research
Multiple research areas now support the idea that embodiment is an underpinning of cognition, suggesting new discovery and learning approaches through full-body engagement with the virtual environment. Furthermore, improvisation and immediate reflection on the experience itself, common creative strategies in artist training and practice, are central when inventing something new. In this research, a new embodied design method, entitled Performative Prototyping, has been developed to enable interdisciplinary collective design processes in CMREs and to offer a vocabulary of multiple perspectives for reflecting on their outcomes.

Studies also find that engineering education values creativity in design processes but often disregards the potential of full-body improvisation in generating and refining ideas. Conversely, artists lack the technical know-how to utilize mixed-reality technologies in their design process.
This know-how from multiple disciplines is thus combined and explored in this research, connecting concepts and discourse from human-computer interaction and media and performance studies. This research is a collaboration of the University of Twente, Utrecht University, and HKU University of the Arts Utrecht, and is partly financed by the Dutch Research Council (NWO).

Mixed-reality experiences merge real and virtual environments, in which physical and digital spaces, objects, and actors co-exist and interact in real time. Collaborative Mixed-Reality Environments, or CMREs, enable creative design and learning processes through full-body interaction with spatial manifestations of mediated ideas and concepts, as live-puppeteered or automated real-time computer-generated content. They employ large-scale projection mapping techniques, motion capture, augmented- and virtual-reality technologies, and networked real-time 3D environments in various interconnected configurations.

This keynote was given at the IETM Plenary meeting in Amsterdam for more than 500 theatre and performing arts professionals. It addresses the following questions in a roller coaster ride of thought-provoking ideas and examples from the world of technology, media, and theatre: What do current developments like Mixed Reality, Transmedia, and the Internet of Things mean for telling stories and creating theatrical experiences? How do we design performances on multiple "stages" and relate to our audiences when they become co-creators?

Contact: joris.weijdom@hku.nl

This research is part of the professorship Performative Processes.