In this study, we test the immersive character of an interactive content narrative developed for Microsoft HoloLens 2 mixed reality glasses in a dining context. We use the retrospective think aloud protocol (RTAP) and galvanic skin response (GSR) to explore the different types of immersion that interactive content narratives can create. Leaning on the core dimensions of the experience economy, we expand the current understanding of the role of immersion through the integration of four immersive experience facilitators. The study revealed that these facilitators occur simultaneously and can be enhanced through interactive content narrative design. Perceived novelty and curiosity were identified as key determinants of keeping consumers engaged in the immersive experience and with its content. The study verifies the combination of galvanic skin response and retrospective think aloud protocol as a suitable approach to measuring emotional engagement potential when interpreting consumers' recollections of immersive experiences.
Background and objectives
Before the Covid-19 pandemic, important social studies had already indicated the severe negative effects associated with high-rise developments. During the Covid-19 pandemic, citizens were confronted with their neighbourhoods' insufficient restorative capacity to maintain their health and well-being. New methods are urgently required to analyse and learn from existing high-density developments, to prevent a repetition of past mistakes and to catalyse the salutary effects of architecture in new developments.

Process and methods (for empirical research)
The Sensing Streetscapes research investigated the potential of emerging biometric technologies to examine the effects of commonly applied urban design principles in six western cities. In one outdoor and four laboratory tests, eye-tracking technology combined with sound recording and galvanic skin response captured subjects' (un)conscious attention patterns and arousal levels when viewing streets at eye level. Triangulation with other techniques, such as mouse tracking to record participants' appreciation and expert panels from spatial design practice, showed the positive and negative impact of stimuli.

Main results (or main arguments in the case of critical reviews)
The preliminary results provide a dynamic understanding of urban experience and how it is affected by the presence or absence of design principles. The results suggest that streets with high levels of detail and variety may contribute to a high level of engagement with the built environment. They also show that traffic is likely an important factor in causing stress and diminishing the restorative capacity that society seeks.

Implications for research and practice/policy | Importance and originality of the contribution
The research led to the development of a Dynamic User Experience Assessment (D-UXA) tool that supports researchers and designers in understanding the impact of design decisions on users' experience, spatial perception and (walking) behaviour. D-UXA enables a human-centred analysis and is designed to fill the gap between traditional empirical methods and aspirations for an evidence-based promotion of human health and well-being in (high-density) urban developments.
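The abstract does not specify how arousal levels were derived from the raw GSR traces. A common approach in the literature is to count skin-conductance response (SCR) peaks per stimulus after smoothing the signal. The sketch below is a minimal, illustrative peak counter; the smoothing window and the 0.05 µS rise threshold are assumptions for illustration, not values from the study.

```python
def moving_average(xs, w=5):
    """Simple centred smoothing to suppress sensor noise."""
    half = w // 2
    out = []
    for i in range(len(xs)):
        window = xs[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out


def count_scr_peaks(signal, min_rise=0.05):
    """Count skin-conductance responses: local maxima in the smoothed
    signal whose rise from the preceding trough exceeds min_rise
    (in microsiemens). More peaks per minute ~ higher arousal."""
    smoothed = moving_average(signal)
    peaks = 0
    trough = smoothed[0]
    for prev, cur, nxt in zip(smoothed, smoothed[1:], smoothed[2:]):
        trough = min(trough, cur)
        if prev < cur > nxt and cur - trough >= min_rise:
            peaks += 1
            trough = cur  # reset so the decay of this response is ignored
    return peaks
```

Dedicated toolboxes use far more careful SCR decomposition; this sketch only conveys the idea of turning a continuous conductance trace into a discrete arousal count.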
Learning is all about feedback. Runners, for example, use apps like RunKeeper. Research shows that such apps enhance engagement and results. And people think it is fun. The essence is that the behaviour of the runner is tracked and communicated back to the runner in a dashboard. We wondered if you could achieve the same positive effect with a dashboard for study behaviour. For students. And what should you measure, track and communicate? We wondered if we could translate the Quantified Self movement into a Quantified Student. So, together with students, professors and companies, we started designing and building Quantified Student apps: apps that measure all kinds of study-behaviour-related data, such as time on campus, time online, sleep, exercise, galvanic skin response, study results and so on. We developed tools to create study information and prototyped the apps with groups of students. At the same time we created a Big Data lake and did a lot of privacy research. The big difference between the Quantified Student program and learning analytics is that we only present the data to the student. It is his/her data! It is his/her decision to act on it or not. The Quantified Student apps are designed as a Big Mother, never a Big Brother. The project has just started, but we have already designed, created and learned a lot:

1. We designed and built four groups of prototypes of study-behaviour apps:
   a. apps that measure sleep and exercise and compare them to study results, like MyRhytm;
   b. apps that measure study hours and compare them to study results, like Nomi;
   c. apps that measure group behaviour and signal problems, like Groupmotion;
   d. apps that measure time on campus and compare it with peers, like Workhorse;
2. We researched student physiology to see if we could find a personal Cup-A-Soup moment (meaning: can we tell from a student's biometrics when his/her concentration levels dip?);
3. We created a Big Data lake with student data and open data and are looking for correlation and causality there. We have already found some interesting patterns.

In doing so we learned a lot. We learned that it is often hard to acquire the right data. It is hard to create an app or a solution that presents the data in the right way, as actionable information. We learned that health trackers are still very imprecise. We learned about (and solved some of) the challenges surrounding privacy. Next year (2017) we will scale the most promising prototype, measure the effects, start a new research project and continue working on our data lake. Things will be interesting, and we will blog about it on www.quantifiedstudent.nl.