In this series on psychological foundations from an applied psychological perspective, we focus this time on shame, guilt and sin. What shame, guilt and sin have in common is that all three are strong motivators for behavioural change, yet we can respond to them in very different ways. In this article it is argued that original sin is a variant of guilt and misfortune that seems unfair: how can a newborn child be blamed for being born in an unbalanced world? Denial is a common reaction to original sin, even though this form of guilt plays a major role in our technological ostrich policy.
From January 2011 until December 2012, forty Family Group Conferences (FGCs) will be studied in the public mental health care (PMHC) setting in the province of Groningen, the Netherlands. The research should answer whether FGCs are valuable for clients in PMHC as a means to generate social support, to prevent coercion and to elevate the work of professionals. The present study reports on two case studies in which shame and fear of rejection are identified as the main reasons for clients to avoid contact with their social network, resulting in isolated and marginalised living circumstances. Shame, on the other hand, is also a powerful engine that keeps clients from relapsing into marginalised circumstances they would have to feel ashamed of again. An FGC offers a forum where clients can discuss their shameful feelings with their social network; it generates support and helps to break through vicious circles of marginalisation and social isolation. The findings of these case studies confirm an assumption from a previous study that a limited or broken social network is not a contraindication, but rather a reason for organising FGCs.
Chatbots are being used at an increasing rate, for instance for simple Q&A conversations, flight reservations, online shopping and news aggregation. However, users expect to be served as effectively and reliably as they were by human-based systems, and they are unforgiving once the system fails to understand them, engage them or show them human empathy. This problem is more prominent when the technology is used in domains such as health care, where empathy and the ability to give emotional support are essential during interaction with the person. Empathy, however, is a uniquely human skill, and conversational agents such as chatbots cannot yet express empathy in nuanced ways that account for its complex nature and quality. This project focuses on designing emotionally supportive conversational agents within the mental health domain. We take a user-centered co-creation approach and focus on the mental health problems of sexual assault victims. This group is chosen specifically because of the high rate of sexual assault incidents, their lifelong destructive effects on victims, and the fact that, although early intervention and treatment are necessary to prevent future mental health problems, these incidents largely go unreported due to the stigma attached to sexual assault. At the same time, research shows that people feel more comfortable talking to chatbots about intimate topics, since they feel no fear of judgment. We think an emotionally supportive and empathic chatbot specifically designed to encourage self-disclosure among sexual assault victims could help those who remain silent for fear of negative evaluation and empower them to process their experience better and take the necessary steps towards treatment early on.
In this project, we explore how healthcare providers and the creative industry can collaborate to develop effective digital mental health interventions, particularly for survivors of sexual assault. Sexual assault victims face significant barriers to seeking professional help, including shame, self-blame, and fear of judgment. With over 100,000 cases reported annually in the Netherlands, the need for accessible, stigma-free support is urgent. Digital interventions, such as chatbots, offer a promising solution by providing a safe, confidential, and cost-effective space for victims to share their experiences before seeking professional care. However, existing commercial AI chatbots remain unsuitable for complex mental health support. While widely used for general health inquiries and basic therapy, they lack the human qualities essential for empathetic conversations. Additionally, training AI for this sensitive context is challenging because of the limited availability of caregiver-patient conversation data. A key concern raised by professionals worldwide is the risk of AI-driven chatbots being misused as substitutes for therapy. Without proper safeguards, they may offer inappropriate responses and potentially harm users. This highlights the urgent need for strict design guidelines, robust safety measures, and comprehensive oversight in AI-based mental health solutions. To address these challenges, this project brings together experts from healthcare and design, in particular conversation designers, to explore the power of design in developing a trustworthy, user-centered chatbot experience tailored to survivors' needs. Through an iterative process of research, co-creation, prototyping, and evaluation, we aim to integrate safe and effective digital support into mental healthcare. Our overarching goal is to bridge the gap between digital healthcare and the creative sector, fostering long-term collaboration. By combining clinical expertise with design innovation, we seek to develop personalized tools that ethically and effectively support individuals with mental health problems.