The user’s experience with a recommender system is significantly shaped by the dynamics of user-algorithm interactions. These interactions are often evaluated through interaction qualities, such as controllability, trust, and autonomy, to gauge their impact. As part of our effort to systematically categorize these evaluations, we explored the suitability of the interaction qualities framework proposed by Lenz, Diefenbach, and Hassenzahl. During this examination, we uncovered four challenges within the framework itself, and an additional external challenge. In studies examining the interaction between user control options and interaction qualities, interdependencies between concepts, inconsistent terminology, and the entity perspective (is it a user’s trust or a system’s trustworthiness?) often hinder a systematic inventory of the findings. Additionally, our discussion underscored the crucial role of the decision context in evaluating the relation between algorithmic affordances and interaction qualities. We propose dimensions of decision contexts (such as ‘reversibility of the decision’ or ‘time pressure’) that could aid in establishing a systematic three-way relationship between context attributes, attributes of user control mechanisms, and experiential goals; as such, they warrant further research. In sum, while the interaction qualities framework serves as a foundational structure for organizing research on evaluating the impact of algorithmic affordances, challenges related to interdependencies and context-specific influences remain. These challenges necessitate further investigation and subsequent refinement and expansion of the framework.
Introduction: Sensor-feedback systems can be used to support people after stroke during independent practice of gait. The main aim of the study was to describe the user-centred approach to (re)design the user interface of the sensor feedback system “Stappy” for people after stroke, and to share the deliverables and key observations from this process. Methods: The user-centred approach was structured around four phases (the discovery, definition, development and delivery phase) which were fundamental to the design process. Fifteen participants with cognitive and/or physical limitations participated (10 women, 2/3 older than 65). Prototypes were evaluated in multiple test rounds, consisting of 2–7 individual test sessions. Results: Seven deliverables were created: a list of design requirements, a persona, a user flow, a low-, medium- and high-fidelity prototype and the character “Stappy”. The first six deliverables were necessary tools to design the user interface, whereas the character was a solution resulting from this design process. Key observations related to “readability and contrast of visual information”, “understanding and remembering information” and “physical limitations” were confirmed by the design process, and “empathy” was additionally derived from it. Conclusions: The study offers a structured methodology resulting in deliverables and key observations, which can be used to (re)design meaningful user interfaces for people after stroke. Additionally, the study provides a technique that may promote “empathy” through the creation of the character Stappy. The description may provide guidance for health care professionals, researchers or designers in future user interface design projects in which existing products are redesigned for people after stroke.
Our study introduces an open general-purpose platform for the embodiment of conversational AI systems. Conversational User-interface Based Embodiment (CUBE) is designed to streamline the integration of embodied solutions into text-based dialog managers, providing flexibility for customization depending on the specific use case and application. CUBE is responsible for naturally interacting with users by listening, observing, and responding to them. A detailed account of the design and implementation of the solution is provided, as well as a thorough examination of how it can be integrated by developers and AI dialogue manager integrators. Interviews with developers gave insight into the advantages of such systems and identified key areas requiring further research among the current challenges in achieving natural interaction between the user and the embodiment. CUBE bridges some of these gaps by providing controls to further develop natural non-verbal communication.
Alcohol Use Disorder (AUD) involves uncontrollable drinking despite negative consequences, a challenge amplified at festivals. ARise is a project using Augmented Reality (AR) to prevent AUD by helping festival visitors refuse alcohol and other substances. Based on the first Augmented Reality Exposure Therapy (ARET) for clinical AUD treatment, ARise uses a smartphone app with AR glasses to project virtual humans that tempt visitors to drink alcohol. Users interact in a safe and personalized way with these virtual humans through phone, voice, and gesture interactions. The project gathers festival feedback on user experience, awareness, usability, and potential expansion to other substances.
Societal issue: Helping treat addiction and stimulating social inclusion.
Benefit to society: More people, fewer patients: decreased health costs and increased inclusion and social happiness.
Collaborative partners: Novadic-Kentron, Thalamusa
Alcohol use disorder (AUD) is a pattern of alcohol use that involves having trouble controlling drinking behaviour, even when it causes health issues (addiction) or problems functioning in daily (social and professional) life. Moreover, festivals are a common place where large crowds of festival-goers experience challenges refusing or controlling alcohol and substance use. Studies have shown that interventions at festivals are still very problematic. ARise is the first project that aims to help prevent AUD at festivals using Augmented Reality (AR) as a tool to help people, particularly festival visitors, to say no to alcohol (and other substances). ARise is based on the first Augmented Reality Exposure Therapy (ARET) in the world, which we developed for clinical treatment of AUD. It is a smartphone-driven AR application in which (potential) visitors are confronted with virtual humans that try to seduce the user into accepting an alcoholic beverage. These virtual humans are projected into the real physical context (of a festival), using innovative AR glasses. Using intuitive phone, voice and gesture interactions, it allows users to personalize the safe experience by choosing different drinks and virtual humans with different looks and levels of realism. ARET has been successfully developed and tested on (former) AUD patients within a clinical setting. Research with patients and healthcare specialists revealed the wish to further develop ARET as a prevention tool to reach people before they are diagnosed with AUD and to extend the application to other substances (smoking and pills). In this project, festival visitors will experience ARise and provide feedback on the following topics: (a) experience, (b) awareness and confidence to refuse alcohol drinks, (c) intention to use ARise, (d) usability & efficiency (the level of realism needed), and (e) ideas on how to extend ARise with new substances.
The focus of the research is ‘Automated Analysis of Human Performance Data’. The three interconnected main components are (i) Human Performance, (ii) Monitoring Human Performance, and (iii) Automated Data Analysis. Human Performance is both the process and result of the person interacting with context to engage in tasks, whereas the performance range is determined by the interaction between the person and the context. Cheap and reliable wearable sensors allow for gathering large amounts of data, which is very useful for understanding, and possibly predicting, the performance of the user. Given the amount of data generated by such sensors, manual analysis becomes infeasible; tools should be devised for performing automated analysis looking for patterns, features, and anomalies. Such tools can help transform wearable sensors into reliable high resolution devices and help experts analyse wearable sensor data in the context of human performance, and use it for diagnosis and intervention purposes. Shyr and Spisic describe Automated Data Analysis as follows: automated data analysis provides a systematic process of inspecting, cleaning, transforming, and modelling data with the goal of discovering useful information, suggesting conclusions and supporting decision making for further analysis. Their philosophy is to do the tedious part of the work automatically, and allow experts to focus on performing their research and applying their domain knowledge. However, automated data analysis means that the system has to teach itself to interpret interim results and do iterations. Knuth stated: “Science is knowledge which we understand so well that we can teach it to a computer; and if we don’t fully understand something, it is an art to deal with it.” [Knuth, 1974]. The knowledge on Human Performance and its Monitoring is to be ‘taught’ to the system.
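The inspect-clean-transform-model loop described above can be illustrated with a minimal sketch. The toy heart-rate stream, the thresholds, and the function names below are illustrative assumptions for this example, not part of any published pipeline:

```python
# Hypothetical sketch: automated cleaning and anomaly detection on a toy
# stream of wearable heart-rate samples (beats per minute). All data and
# thresholds are illustrative assumptions.
from statistics import mean, stdev

def clean(samples, lo=30, hi=220):
    # Inspect and clean: drop physiologically impossible readings.
    return [s for s in samples if lo <= s <= hi]

def anomalies(samples, z=2.0):
    # Model: flag samples more than z standard deviations from the mean.
    m, sd = mean(samples), stdev(samples)
    return [s for s in samples if abs(s - m) > z * sd]

raw = [72, 75, 0, 74, 73, 160, 71, 999, 76, 74]
cleaned = clean(raw)          # drops the sensor glitches 0 and 999
flagged = anomalies(cleaned)  # flags the 160 bpm spike for expert review
```

In this spirit, the tedious inspection and cleaning steps run automatically, while the flagged anomalies are handed to a domain expert for interpretation.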
To be able to construct automated analysis systems, an overview of the essential processes and components of these systems is needed. As Knuth put it: “Since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.”