This position paper proposes a method for evaluating interaction with light in a mixed reality setup. Current processes for designing and testing new forms of user interaction for controlling lighting are lengthy and, in practice, end up restricted to testing only a small number of possible interactions. Beyond overcoming this restriction, the advantages of a simulated environment lie in the fact that such an environment is fully controllable and adaptable to the researchers' needs. Finally, we sketch potential challenges of using a mixed reality setup for evaluating interaction with light.
Many studies have suggested that personal practical knowledge is essential for professional development. Recently, there has been growing recognition of the importance of teacher educators’ personal practical knowledge of ‘language’ for the development of student learning. However, the need for teacher educators to first understand their own language-oriented development in content-based classroom interaction has received far less emphasis. The current intervention study investigates how eleven experienced teacher educators understand their language-oriented development through the control of task difficulty, small-group instruction and directed response questioning. Data were examined through content analysis and constant comparison analysis. The results showed that the intervention affected the educators’ language-oriented development, which in turn affected their awareness and the decisions they made to improve their methods of initiation and response during classroom interaction. The results call for more concrete ways to expand teacher educators’ practical knowledge of language in order to further develop and enhance their language-oriented teaching performance in content-based classroom interaction.
Over the past forty years, the use of process models in practice has grown extensively. Until twenty years ago, remarkably little was known about the factors that contribute to the human understandability of process models in practice. Since then, research has indeed been conducted on this important topic, for example by creating modelling guidelines. Unfortunately, the suggested guidelines often fail to achieve the desired effects, because they are not grounded in actual experimental findings. This raises the need for knowledge about which kinds of process model visualisation are perceived as understandable, in order to improve the understanding of different stakeholders. The objective of this study is therefore to answer the question: How can process models be visually enhanced so that they facilitate a common understanding among different stakeholders? To this end, five sub-research questions (SRQs) will be discussed, covering three studies. By combining social psychology and process modelling, we can work towards a more human-centred and empirically grounded solution for enhancing stakeholders' understanding of process models through visualisation.
The Netherlands records approximately 220,000 occupational accidents per year, with 60 fatalities. Every employer is therefore legally required to organise in-company emergency response (BHV), including BHV training. Nevertheless, only one third of all companies map their occupational risks through a Risk Inventory & Evaluation (RI&E), and the share of employees involved in occupational accidents remains high. Continuous innovation therefore takes place to optimise BHV training, among other things by means of Virtual Reality (VR). VR is not new, but it has matured and become more affordable. VR makes it possible to develop safe, realistic BHV emergency simulations in which trainees feel genuinely present. Despite the growth of VR-based BHV training, little research has been done into the effect of VR in BHV training, and the results are contradictory. In addition, new technological developments make it possible to measure gaze behaviour in VR by means of eye tracking. During a BHV training session, eye tracking can measure how an instruction is followed, whether trainees are distracted, and whether they perceive important elements (hazards and solutions) during the simulation. However, a BHV training that combines VR with eye tracking (interactions) does not yet exist. In this project, a prototype is being developed in which eye tracking is integrated into a VR BHV training developed in 2021, in which emergency situations such as an office fire are simulated (the BHVR application). The prototype will be tested in an experiment, in order to partially answer the question of to what extent, and in what way, eye tracking in VR adds value to (RI&E) BHV training. The project thereby connects to the mission-driven innovation policy ‘De Veiligheidsprofessional’ and supports SMEs, which often lack the resources and knowledge for research into the effectiveness of innovative technologies in education and training.
Among other things, the project will deliver a prototype, a production report and a research article, and it is open to new participants for writing a larger proposal on the application and effect of VR and eye tracking in BHV training.
Events play an increasingly important role in our society. Whereas events were mainly considered entertainment in the past, their social function is becoming more and more apparent, in particular in the field of social bonding and in creating a feeling of solidarity. During an event, visitors identify with a theme or topic and interact with each other about it. Thanks to social media, they can continue these interactions online, which leads to a hybrid network of individuals sharing the same interests. Eventually, this may lead to the formation of new communities that communicate with each other both online and offline. However, it is not yet clear how exactly these new communities are created. This PhD research studies the online and offline interaction rituals of various events and online communities. Through interviews and participant observation at events such as Redhead Days and the Elfia fantasy event, the processes that result in forming communities at and around events are mapped out.
Partner: Tilburg University
Chatbots are being used at an increasing rate, for instance for simple Q&A conversations, flight reservations, online shopping and news aggregation. However, users expect to be served as effectively and reliably as they were with human-operated systems, and are unforgiving once a system fails to understand them, engage them or show them human empathy. This problem is more prominent when the technology is used in domains such as health care, where empathy and the ability to give emotional support are essential during interaction with the person. Empathy, however, is a uniquely human skill, and conversational agents such as chatbots cannot yet express empathy in nuanced ways that account for its complex nature and quality. This project focuses on designing emotionally supportive conversational agents within the mental health domain. We take a user-centred co-creation approach, focusing on the mental health problems of sexual assault victims. This group is chosen specifically because of the high rate of sexual assault incidents and their lifelong destructive effects on victims, and because, although early intervention and treatment are necessary to prevent future mental health problems, these incidents largely go unreported due to the stigma attached to sexual assault. At the same time, research shows that people feel more comfortable talking to chatbots about intimate topics, since they feel no fear of judgement. We believe an emotionally supportive and empathic chatbot, specifically designed to encourage self-disclosure among sexual assault victims, could help those who remain silent out of fear of negative evaluation and empower them to better process their experience and take the necessary steps towards treatment early on.