In this paper, we present a framework for gamified motor learning using a serious game and high-fidelity motion capture sensors. Our implementation features an Inertial Measurement Unit (IMU) and a set of force plates to obtain real-time, high-frequency measurements of patients' core movements and centre-of-pressure displacement during physical rehabilitation sessions. These signals enable two mechanisms: a) a game avatar controlled through the patient's motor skills and b) a rich data stream for post-game motor performance analysis. Our main contribution is a fine-grained processing pipeline for the sensor signals, enabling a reliable and accurate mapping between patient movements, in-game avatar controls and overall motor performance. Moreover, we discuss the potential of this framework for personalised therapeutic sessions and present a pilot study conducted in that direction.
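To make the sensor-to-avatar mapping stage concrete, below is a minimal illustrative sketch of one way such a stage could be structured. The abstract does not specify the actual pipeline, so the sensor values, filter, and normalisation ranges here are all assumptions, and the function names are hypothetical placeholders rather than the paper's API.

```python
# Illustrative sketch only: maps smoothed IMU trunk roll and force-plate
# centre-of-pressure (CoP) displacement to avatar controls. The input values
# and the +/-30 deg / +/-100 mm comfort ranges are assumed, not from the paper.
from dataclasses import dataclass


@dataclass
class AvatarCommand:
    lean: float   # left/right lean in [-1, 1], derived from trunk roll
    shift: float  # forward/back shift in [-1, 1], derived from CoP displacement


def ema(prev: float, new: float, alpha: float = 0.2) -> float:
    """Exponential moving average, used here as a simple low-pass filter."""
    return alpha * new + (1 - alpha) * prev


def map_sensors_to_avatar(roll_deg: float, cop_y_mm: float,
                          state: dict) -> AvatarCommand:
    # Smooth the raw signals to suppress sensor jitter before mapping.
    state["roll"] = ema(state.get("roll", 0.0), roll_deg)
    state["cop"] = ema(state.get("cop", 0.0), cop_y_mm)
    # Normalise to [-1, 1] against assumed comfortable ranges of motion.
    lean = max(-1.0, min(1.0, state["roll"] / 30.0))
    shift = max(-1.0, min(1.0, state["cop"] / 100.0))
    return AvatarCommand(lean=lean, shift=shift)


state: dict = {}
# Example tick: 12 degrees of trunk roll, 45 mm backward CoP excursion.
print(map_sensors_to_avatar(12.0, -45.0, state))
```

The exponential moving average stands in for whatever filtering the real pipeline applies; any low-pass scheme that trades a little latency for stability would fill the same role, and the same smoothed stream can be logged for the post-game performance analysis the abstract describes.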
We present methodological recommendations for the online evaluation of avatars for text-to-sign translation, focusing on the structure, format and length of the questionnaire, as well as methods for eliciting and faithfully transcribing responses.
Imagine: you are on holiday in France. You are standing at a bakery with 15 people queuing behind you. You feel the time pressure, and as a result the words fail you: you cannot get your message across properly. For people with aphasia, this is everyday reality.
In this project, the AGM R&D team developed and refined the use of a facial scanning rig. The rig is a physical device comprising multiple cameras and lights mounted on scaffolding around a 'scanning volume': an area in which objects are placed before being photographed from multiple angles. The object is typically a person's head, but it can be anything of approximately that size. Software compares the photographs to create a digital 3D recreation, a process called photogrammetry. The 3D model is then processed by further pieces of software and eventually becomes a face that can be animated inside Unreal Engine, a popular piece of game development software made by Epic Games. The project was funded by Epic's 'Megagrant' system, and the focus of the work is on streamlining and automating the processing pipeline and on improving the quality of the resulting output. Additional work has been done on skin shaders (simulating the quality of real skin in digital form) and on the use of AI to create or recreate lifelike hairstyles.

The R&D work has significantly reduced processing time and improved the quality of the facial scans, has produced a system that has benefitted the educational offering of BUas, and has attracted collaborators from the commercial entertainment and simulation industries. This work complements and extends previous work done on the VIBE project, where the focus was on creating lifelike human avatars for the medical industry.
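As a rough illustration of what "streamlining and automating the processing pipeline" can look like, here is a minimal sketch of a batch driver that pushes each capture session through reconstruction and clean-up stages. The summary does not name the team's actual tools, so the command-line tool "photogrammetry-cli", its subcommands and flags, and the directory layout are all hypothetical stand-ins.

```python
# Illustrative sketch only: "photogrammetry-cli" and its arguments are
# hypothetical placeholders, not the team's actual software or pipeline.
import subprocess
from pathlib import Path


def process_scan(session_dir: Path, output_dir: Path) -> Path:
    """Run one scan session through reconstruction and export stages."""
    output_dir.mkdir(parents=True, exist_ok=True)
    mesh = output_dir / "head_mesh.obj"
    rigged = output_dir / "head_rigged.fbx"
    # Stage 1: multi-view reconstruction from the rig's photographs.
    subprocess.run(["photogrammetry-cli", "reconstruct",
                    "--images", str(session_dir), "--out", str(mesh)],
                   check=True)
    # Stage 2: clean-up/retopology so the mesh is animation-ready
    # before import into Unreal Engine.
    subprocess.run(["photogrammetry-cli", "retopo",
                    "--in", str(mesh), "--out", str(rigged)],
                   check=True)
    return rigged


if __name__ == "__main__":
    # Batch over all capture sessions, replacing manual per-scan steps.
    for session in sorted(Path("captures").iterdir()):
        if session.is_dir():
            process_scan(session, Path("processed") / session.name)
```

The point of the sketch is the structure, not the tooling: once each stage is scriptable, whole batches of scans can be processed unattended, which is where the reported savings in processing time would come from.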