In this study, we test the immersive character of an interactive content narrative developed for Microsoft HoloLens 2 mixed reality glasses in a dining context. We use the retrospective think aloud protocol (RTAP) and galvanic skin response (GSR) to explore the different types of immersion that interactive content narratives can create. Leaning on the core dimensions of the experience economy, we expand the current understanding of the role of immersion through the integration of four immersive experience facilitators. The study revealed that these facilitators occur simultaneously and can be enhanced through interactive content narrative design. Perceived novelty and curiosity were identified as key determinants in keeping consumers engaged with the immersive experience and its content. The study verifies the combination of GSR and RTAP as a suitable approach for measuring emotional engagement potential when interpreting consumers’ recollections of immersive experiences.
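As a hypothetical illustration of how GSR data of this kind is commonly analysed (not the study's actual pipeline), skin conductance responses can be detected as peaks in the conductance trace that rise above a minimum prominence. The sampling rate, threshold, and synthetic signal below are all assumptions for the sketch:

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic skin-conductance trace (hypothetical): a slowly drifting
# baseline plus two Gaussian bumps standing in for event-related SCRs.
fs = 4  # assumed 4 Hz sampling rate
t = np.arange(0, 60, 1 / fs)
signal = 2.0 + 0.01 * t  # tonic level with slow drift (microsiemens)
for onset in (15, 40):
    signal = signal + 0.5 * np.exp(-((t - onset) ** 2) / 4)

# Detect SCRs as local maxima with a minimum prominence of 0.05 uS,
# a common style of amplitude criterion for phasic responses.
peaks, props = find_peaks(signal, prominence=0.05)
scr_times = t[peaks]
print(scr_times)
```

The count and timing of such responses can then be related, as in the study, to what participants reported in the retrospective think aloud interviews.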
Learning is all about feedback. Runners, for example, use apps like RunKeeper. Research shows that such apps enhance engagement and results, and people find them fun. The essence is that the runner's behaviour is tracked and communicated back to the runner in a dashboard. We wondered whether we could achieve the same positive effect with a dashboard for study behaviour, aimed at students, and what we should measure, track and communicate. Could we translate the Quantified Self movement into a Quantified Student? So, together with students, professors and companies, we started designing and building Quantified Student apps: apps that measure all kinds of study-behaviour-related data, such as time on campus, time online, sleep, exercise, galvanic skin response and study results. We developed tools to create study information and prototyped the apps with groups of students. At the same time, we created a big data lake and did extensive privacy research. The big difference between the Quantified Student Program and learning analytics is that we present the data only to the student. It is his or her data, and it is his or her decision whether to act on it. The Quantified Student apps are designed as a Big Mother, never a Big Brother. The project has only just started, but we have already designed, created and learned a lot:
1. We designed and built four groups of prototype study-behaviour apps:
   a. apps that measure sleep and exercise and compare them to study results, like MyRhytm;
   b. apps that measure study hours and compare them to study results, like Nomi;
   c. apps that measure group behaviour and signal problems, like Groupmotion;
   d. apps that measure time on campus and compare it with peers, like Workhorse.
2. We researched students' physiology to see if we could find their personal Cup-A-Soup-Moment (meaning, can we tell from their biometrics when concentration levels dip?).
3. We created a big data lake with student data and open data and are looking for correlation and causality there. We have already found some interesting patterns.
In doing so we learned a lot. We learned that it is often hard to acquire the right data. It is hard to create an app or solution that presents the data in the right way, as actionable information. We learned that health trackers are still very imprecise. We learned about (and solved some of) the challenges surrounding privacy. Next year (2017) we will scale the most promising prototype, measure its effects, start a new research project and continue working on our data lake. Things will be interesting, and we will blog about it on www.quantifiedstudent.nl.
Recent advancements in mobile sensing and wearable technologies create new opportunities to improve our understanding of how people experience their environment. This understanding can inform urban design decisions. Currently, an important urban design issue is the adaptation of infrastructure to increasing cycle and e-bike use. Using data collected from 12 cyclists on a cycle highway between two municipalities in The Netherlands, we coupled location and wearable emotion data at a high spatiotemporal resolution to model and examine relationships between cyclists' emotional arousal (operationalized as skin conductance responses) and visual stimuli from the environment (operationalized as extent of visible land cover type). We specifically took a within-participants multilevel modeling approach to determine relationships between different types of viewable land cover area and emotional arousal, while controlling for speed, direction, distance to roads, and directional change. Surprisingly, our model suggests ride segments with views of larger natural, recreational, agricultural, and forested areas were more emotionally arousing for participants. Conversely, segments with views of larger developed areas were less arousing. The presented methodological framework, spatial-emotional analyses, and findings from multilevel modeling provide new opportunities for spatial, data-driven approaches to portable sensing and urban planning research. Furthermore, our findings have implications for design of infrastructure to optimize cycling experiences.
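The within-participants multilevel modeling approach described above can be sketched as a random-intercept model, for example with statsmodels. The data, variable names, and coefficients below are simulated assumptions; the study's actual predictors and controls differ:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for cyclist in range(12):                # 12 cyclists, as in the study
    intercept = rng.normal(0, 0.3)       # cyclist-level random intercept
    for _ in range(40):                  # assumed 40 ride segments each
        natural = rng.uniform(0, 1)      # share of view: natural cover
        developed = rng.uniform(0, 1 - natural)  # share: developed cover
        speed = rng.normal(20, 3)        # control variable (km/h)
        # Simulated arousal mirroring the reported direction of effects:
        # more visible natural area -> higher arousal, developed -> lower.
        arousal = (1.0 + 0.8 * natural - 0.5 * developed
                   + 0.02 * speed + intercept + rng.normal(0, 0.2))
        rows.append(dict(cyclist=cyclist, natural=natural,
                         developed=developed, speed=speed, arousal=arousal))
df = pd.DataFrame(rows)

# Random-intercept mixed model: fixed effects for land cover and speed,
# with observations grouped within cyclists.
model = smf.mixedlm("arousal ~ natural + developed + speed",
                    df, groups=df["cyclist"])
result = model.fit()
print(result.params[["natural", "developed"]])
```

Grouping by cyclist is what makes the analysis within-participants: each rider serves as their own baseline, so between-rider differences in tonic skin conductance do not masquerade as environmental effects.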