The RAAK MKB project Geautomatiseerd Game Design (Automated Game Design), which originally ran from 1 March 2013 through 31 May 2015, was extended by three months to make use of project resources that had until then remained unspent, for the purpose of (1) developing course and training materials, (2) evaluating these course and training materials, and (3) refining, sustaining, and disseminating the results of the project. This report describes the activities and results of this extension period, which ran from 1 October through 31 December 2015.
Project number: 2012-20-43M. Funding period: 1 March 2013 – 31 May 2015, plus an extension from 1 October through 31 December 2015.
Video game designers iteratively improve player experience by play testing game software and adjusting its design. Deciding how to improve gameplay is difficult and time-consuming because designers lack an effective means for exploring decision alternatives and modifying a game's mechanics. We aim to improve designer productivity and game quality by providing tools that speed up the game design process. In particular, we wish to learn how patterns encoding common game design knowledge can help to improve design tools. Micro-Machinations (MM) is a language and software library that enables game designers to modify a game's mechanics at run-time. We propose a pattern-based approach for leveraging high-level design knowledge and facilitating the game design process with a game design assistant. We present the Mechanics Pattern Language (MPL) for encoding common MM structures and design intent, and a Mechanics Design Assistant (MeDeA) for analyzing, explaining and understanding existing mechanics, and for generating, filtering, exploring and applying design alternatives that modify mechanics. We implement MPL and MeDeA using the meta-programming language Rascal, and evaluate them by modifying the mechanics of a prototype of Johnny Jetstream, a 2D shooter developed at IC3D Media.
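To make the described workflow concrete, the following is a minimal sketch, in Python rather than the project's actual Rascal implementation, of how a pattern-based design assistant in the spirit of MeDeA might generate and filter design alternatives for a small mechanics model. All names here (Mechanics, Flow, drain_pattern, and so on) are hypothetical illustrations and do not correspond to the actual MPL or MeDeA API.

```python
# Hypothetical sketch of a pattern-based mechanics design assistant,
# loosely inspired by the MPL/MeDeA workflow summarized above.
# None of these names are the actual MPL or MeDeA API.
from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    source: str  # pool that resources drain from
    target: str  # pool that resources flow into
    rate: int    # units moved per simulation step


@dataclass
class Mechanics:
    pools: dict  # pool name -> current resource amount
    flows: list  # Flow edges between pools


def step(m: Mechanics) -> None:
    """Advance the mechanics one tick: move resources along each flow."""
    for f in m.flows:
        moved = min(f.rate, m.pools[f.source])
        m.pools[f.source] -= moved
        m.pools[f.target] += moved


# A "pattern" here is simply a function that proposes a modified copy of
# the mechanics; MeDeA's patterns additionally carry design intent that
# can be used to explain a proposal to the designer.
def drain_pattern(m: Mechanics, pool: str, rate: int) -> Mechanics:
    """Add a sink that steadily drains `pool` (e.g., ammo consumption)."""
    pools = dict(m.pools)
    pools.setdefault("sink", 0)
    return Mechanics(pools, m.flows + [Flow(pool, "sink", rate)])


def generate_alternatives(m: Mechanics) -> list:
    """Apply the pattern at every pool and rate, yielding candidates."""
    return [drain_pattern(m, p, r) for p in m.pools for r in (1, 2)]


def satisfies_intent(m: Mechanics, pool: str, ticks: int) -> bool:
    """Filter by design intent: `pool` must survive `ticks` steps."""
    sim = Mechanics(dict(m.pools), list(m.flows))
    for _ in range(ticks):
        step(sim)
    return sim.pools[pool] > 0


if __name__ == "__main__":
    base = Mechanics({"health": 10, "ammo": 5}, [])
    candidates = generate_alternatives(base)
    viable = [c for c in candidates if satisfies_intent(c, "health", 8)]
    print(f"{len(viable)} of {len(candidates)} alternatives keep health alive")
```

The generate-then-filter structure is the point of the sketch: patterns enumerate plausible modifications, and an explicit, checkable statement of design intent prunes them, so the designer explores a small set of viable alternatives instead of editing mechanics by hand.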
The rising rate of preprints and publications, combined with persistent inadequate reporting practices and problems with study design and execution, has strained the traditional peer review system. Automated screening tools could potentially enhance peer review by helping authors, journal editors, and reviewers to identify beneficial practices and common problems in preprints or submitted manuscripts. Tools can screen many papers quickly, and may be particularly helpful in assessing compliance with journal policies and with straightforward items in reporting guidelines. However, existing tools cannot understand or interpret a paper in the context of the scientific literature, nor can they yet determine whether the methods used are suitable to answer the research question, or whether the data support the authors' conclusions. Editors and peer reviewers remain essential for assessing journal fit and the overall quality of a paper, including the experimental design, the soundness of the study's conclusions, and its potential impact and innovation. Automated screening tools cannot replace peer review, but they may aid authors, reviewers, and editors in improving scientific papers. Strategies for the responsible use of automated tools in peer review may include setting performance criteria for tools, transparently reporting tool performance and use, and training users to interpret the tools' reports.