This project was a collaboration with UMC Utrecht and the Center for Sexual Assault (CSG) in the Netherlands, exploring the use of a chatbot to support victims of sexual assault in self-disclosure and seeking professional help.
Through co-creation sessions and interviews with experts at UMC and CSG, we identified key barriers victims face: stigma, fear of judgment, shame, guilt, and emotional overwhelm. Many victims feel alone, misunderstood, and unheard, and avoid discussing their trauma because it feels like opening Pandora's box. Although their experience disrupts daily life, causing flashbacks, anxiety, sleep disorders, and strained relationships, they often suppress it.
We also found that even when victims contact helplines, they struggle to verbalize their experiences; in particular, describing the physical aspects in explicit language disgusts them. One therapist shared: "Our biggest challenge is helping those who call and say: 'I need to talk, but I can't,' then cry and hang up."
Next, in collaboration with our partners at the Conversation Design Institute and the chatbot companies, we developed a prototype for guided self-disclosure. Instead of prompting victims to describe their experience openly, this chatbot presents the stories of others, allowing individuals to recognize their own experience among them. This approach is less daunting: it lets individuals unfold their experience through fragments of others' stories, offering words as building blocks for describing what happened to them. It also empowers them by showing that they are not alone and that others have gone through the same.
We tested this prototype with the therapists, who found it more inviting and accessible than direct questioning. They noted: “Learning through someone else’s story makes the topic more approachable and less threatening than being asked: ‘Tell me what happened.’”
We believe this approach has the potential to become a real-world model. The Center for Sexual Assault hopes to integrate the chatbot into its website, extending its reach and supporting victims in their journey toward seeking help.
Chatbots are being used at an increasing rate, for instance for simple Q&A conversations, flight reservations, online shopping, and news aggregation. However, users expect to be served as effectively and reliably as with human-operated systems and are unforgiving when the system fails to understand them, engage them, or show them human empathy. This problem is more prominent when the technology is used in domains such as health care, where empathy and the ability to give emotional support are essential during interaction with the person. Empathy, however, is a uniquely human skill, and conversational agents such as chatbots cannot yet express it in the nuanced ways its complex nature and quality require.
This project focuses on designing emotionally supportive conversational agents within the mental health domain. We take a user-centered co-creation approach focused on the mental health problems of sexual assault victims. This group was chosen specifically because of the high rate of sexual assault incidents, their lifelong destructive effects on victims, and the fact that, although early intervention and treatment are necessary to prevent future mental health problems, these incidents largely go unreported due to the stigma attached to sexual assault. At the same time, research shows that people feel more comfortable talking to chatbots about intimate topics, since they feel no fear of judgment. We believe that an emotionally supportive and empathic chatbot specifically designed to encourage self-disclosure among sexual assault victims could help those who remain silent for fear of negative evaluation, empowering them to process their experience and take the necessary steps toward treatment early on.