Schön describes the way a designer engages with their materials as a “conversation”. In clothing design this typically involves tangible and situated actions such as draping, ripping, and cutting—actions that evoke responses from the fabric at hand. Dynamic fabrics—surface-changing fabrics that combine digital and physical states—are still novel fashion-design materials. When working with the digital, intangible qualities of these fabrics, how does a dialogue unfold for designers accustomed to working physically with fabrics? In this paper we examine the design process of Phem, a collection of garments that use dynamic fabrics functioning similarly to augmented reality. We reflect upon the improvisations required to sustain a productive dialogue with the digital forms of these materials. We conclude with a discussion that proposes revisiting Schön’s notion of a conversation in the context of digital forms, drawing on Ingold’s perspectives on making to inform this inquiry.
Post-partum hemorrhage is a medical emergency that occurs during childbirth and, in extreme cases, can be life-threatening. It is the leading cause of maternal mortality worldwide. High-quality training of medical staff can contribute to early diagnosis and help prevent escalation into more serious cases. Healthcare education uses manikin-based simulators to train obstetricians for various childbirth scenarios before they train on real patients. However, these medical simulators lack key features for portraying important symptoms and are incapable of communicating with the trainees. The authors present a digital embodiment agent that improves on the current state of the art, providing a specification of the requirements as well as an extensive design and development approach. This digital embodiment allows educators to respond and role-play as the patient in real time and can easily be integrated with existing training procedures. This research was performed in collaboration with medical experts and makes a new contribution to medical training by bringing digital humans and the representation of affective interfaces to the field of healthcare.
This PhD project focuses on Conversational Agents and their role in service delivery in the public domain. Automated forms of communication are becoming increasingly common, raising questions about relationship building, trust, patterns of service use, and data ethics.

Goal: This interdisciplinary study critically examines how citizens' interactions with Conversational Agents shape trust in public organizations.

Results:
- Insights into current and prior research on trust and Conversational Agents through a systematic literature review
- Identification of 'trust markers' in user interactions with bots
- Insights into citizens' perceptions of and responses to different degrees of anthropomorphism in CA design
- Understanding of the role of Conversational Agents in the citizen journey

Duration: 1 January 2023 - 1 January 2027

Approach: Four studies will be conducted, aligned with dimensions of trust. These studies address concepts of trust, identify 'trust markers' in human-bot dialogues, run experiments on human-bot relationships, and examine the role of CAs in the citizen journey through digital services.

Graduation project (chatbots and voice assistants): During the Bots of Trust (BOT) research project there are several opportunities for students to collaborate on a related question, such as chatbots and/or voice assistants and how these shape trust in different sectors.
Chatbots are being used at an increasing rate, for instance for simple Q&A conversations, flight reservations, online shopping, and news aggregation. However, users expect to be served as effectively and reliably as they would be by human-based systems, and are unforgiving once the system fails to understand them, engage them, or show them human empathy. This problem is more prominent when the technology is used in domains such as health care, where empathy and the ability to give emotional support are essential during interaction with the person. Empathy, however, is a uniquely human skill, and conversational agents such as chatbots cannot yet express empathy in nuanced ways that account for its complex nature and quality. This project focuses on designing emotionally supportive conversational agents within the mental health domain. We take a user-centered co-creation approach and focus on the mental health problems of sexual assault victims. This group is chosen specifically because of the high rate of sexual assault incidents, their lifelong destructive effects on victims, and the fact that, although early intervention and treatment are necessary to prevent future mental health problems, these incidents largely go unreported due to the stigma attached to sexual assault. At the same time, research shows that people feel more comfortable talking to chatbots about intimate topics because they feel no fear of judgment. We think an emotionally supportive and empathic chatbot, specifically designed to encourage self-disclosure among sexual assault victims, could help those who remain silent for fear of negative evaluation and empower them to process their experience better and take the necessary steps towards treatment early on.