Explainable Artificial Intelligence (XAI) aims to provide insights into the inner workings and the outputs of AI systems. Recently, there has been growing recognition that explainability is inherently human-centric, tied to how people perceive explanations. Despite this, there is no consensus in the research community on whether user evaluation is crucial in XAI, and if so, what exactly needs to be evaluated and how. This systematic literature review addresses this gap by providing a detailed overview of the current state of affairs in human-centered XAI evaluation. We reviewed 73 papers across various domains where XAI was evaluated with users. These studies assessed what makes an explanation “good” from a user’s perspective, i.e., what makes an explanation meaningful to a user of an AI system. We identified 30 components of meaningful explanations that were evaluated in the reviewed papers and categorized them into a taxonomy of human-centered XAI evaluation, based on: (a) the contextualized quality of the explanation, (b) the contribution of the explanation to human-AI interaction, and (c) the contribution of the explanation to human-AI performance. Our analysis also revealed a lack of standardization in the methodologies applied in XAI user studies, with only 19 of the 73 papers applying an evaluation framework used by at least one other study in the sample. These inconsistencies hinder cross-study comparisons and broader insights. Our findings contribute to understanding what makes explanations meaningful to users and how to measure this, guiding the XAI community toward a more unified approach in human-centered explainability.
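As an illustration of how a taxonomy of this kind could be organized, the minimal sketch below groups evaluation components under the three top-level categories named in the abstract. Only those three category names come from the abstract; the example component names and the `measure` field are hypothetical placeholders, not the 30 components identified in the review.

```python
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    """The three top-level categories of the human-centered XAI evaluation taxonomy."""
    CONTEXTUALIZED_QUALITY = "contextualized quality of the explanation"
    HUMAN_AI_INTERACTION = "contribution to human-AI interaction"
    HUMAN_AI_PERFORMANCE = "contribution to human-AI performance"


@dataclass
class Component:
    """One evaluated component of a 'meaningful' explanation (names here are illustrative)."""
    name: str
    category: Category
    # How a study might operationalize the component (hypothetical field for illustration).
    measure: str = "5-point Likert scale"


# Hypothetical example components -- placeholders, not the 30 components from the review.
taxonomy = [
    Component("clarity", Category.CONTEXTUALIZED_QUALITY),
    Component("trust", Category.HUMAN_AI_INTERACTION),
    Component("decision accuracy", Category.HUMAN_AI_PERFORMANCE, measure="task score"),
]

# Group components by category, e.g. for a summary table in a review.
by_category: dict[Category, list[str]] = {}
for component in taxonomy:
    by_category.setdefault(component.category, []).append(component.name)

for category, names in by_category.items():
    print(f"{category.value}: {', '.join(names)}")
```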
Laurence Alpay, Harmen Bijwaard and Rob Doms contributed to this book; see chapter 7, p. 159. In this chapter we examine what 'putting people first' means in the development of technology for healthcare and wellbeing. Due to the ageing population, the care and welfare sector will soon need more professionals, but they are unlikely to be available in sufficient numbers because of budgetary constraints and staff shortages. Technology can offer a solution here by taking over tasks or making them easier.
Home care patients often use many medications and are prone to drug-related problems (DRPs). In managing problems related to drug use, home care could add to the multidisciplinary expertise of general practitioners (GPs) and pharmacists. The home care observation of medication-related problems by home care employees (HOME) instrument is paper-based and assists home care workers in reporting potential DRPs. To facilitate multiprofessional consultation, a digital report of DRPs from the HOME-instrument, together with digital monitoring and consulting of DRPs between home care, general practices, and pharmacies, is desired. The objective of this study was to develop an electronic HOME system (eHOME), a mobile version of the HOME-instrument that includes a monitoring and a consulting system for primary care.
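To make the intended reporting-and-consulting workflow concrete, the sketch below models how a potential DRP observed by a home care worker could be recorded digitally and routed to the GP and pharmacist for consultation. It is an illustration under assumed field and function names, not the actual eHOME design or data model.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Status(Enum):
    REPORTED = "reported"          # observed and recorded by a home care worker
    UNDER_REVIEW = "under review"  # shared with the GP and pharmacist for consultation
    RESOLVED = "resolved"          # agreed action carried out


@dataclass
class DrpReport:
    """A digital report of a potential drug-related problem (hypothetical structure)."""
    patient_id: str
    observation: str          # e.g. "possible dizziness after new medication"
    reported_by: str          # home care worker
    reported_on: date
    status: Status = Status.REPORTED
    consult_notes: list[str] = field(default_factory=list)


def send_for_consultation(report: DrpReport) -> None:
    """Route the report to the GP and pharmacist (the monitoring/consulting step)."""
    report.status = Status.UNDER_REVIEW


def add_consult_note(report: DrpReport, author: str, note: str) -> None:
    """Record a response from one of the consulted professionals."""
    report.consult_notes.append(f"{author}: {note}")


# Example: a home care worker reports a potential DRP and the GP responds.
report = DrpReport("patient-042", "possible dizziness after new medication",
                   reported_by="home care worker", reported_on=date.today())
send_for_consultation(report)
add_consult_note(report, "GP", "will review dosage at next visit")
print(report.status.value, report.consult_notes)
```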
Children with an autism spectrum disorder (ASD) often find it difficult to put themselves in someone else's position and struggle with social interaction. Treatment of children with ASD therefore focuses on training these social skills. However, social skills training has too little effect. The problem with current social skills training is, on the one hand, the lack of motivation among children with ASD to keep up the training and, on the other hand, the limited transfer of what is learned in the training to daily life. Care professionals conclude that the training formats need to be adapted, which requires an innovative perspective; they explicitly want to explore the use of digital applications. To increase the effect of social skills training, this project develops a so-called Behaviour Change Support System (BCSS). This BCSS will consist of a number of (digital) applications that together form a logically coherent whole, fitting the goals and methodological frameworks that professionals use in social skills training. The applications should form a set of interventions that can be offered in a tailored way, aimed at important and frequently needed skills in social interaction. In addition to developing the BCSS, the project also focuses on sharing the knowledge generated during its development process. The project is a collaboration between the research groups Zorg voor Jeugd, Zorg & Innovatie in de Psychiatrie and iHuman (NHL Hogeschool), the research group User-Centered Design (Hanzehogeschool) and the research group ICT innovatie in de Zorg (Windesheim). It also collaborates with providers of child and adolescent psychiatry in the northern Netherlands (Accare, Kinnik and GGZ Drenthe), various primary and secondary schools in the northern Netherlands, the RGOc, the RUG and the Kenniscentrum Kinder- en Jeugdpsychiatrie. The digital applications are developed by 8Dgames.
Chatbots are being used at an increasing rate, for instance for simple Q&A conversations, flight reservations, online shopping and news aggregation. However, users expect to be served as effectively and reliably as they were with human-based systems, and are unforgiving once the system fails to understand them, engage them or show them human empathy. This problem is more prominent when the technology is used in domains such as health care, where empathy and the ability to give emotional support are essential during interaction with the person. Empathy, however, is a uniquely human skill, and conversational agents such as chatbots cannot yet express empathy in nuanced ways that account for its complex nature and quality. This project focuses on designing emotionally supportive conversational agents within the mental health domain. We take a user-centered co-creation approach and focus on the mental health problems of sexual assault victims. This group is chosen specifically because of the high rate of sexual assault incidents, their lifelong destructive effects on victims, and the fact that, although early intervention and treatment are necessary to prevent future mental health problems, these incidents largely go unreported due to the stigma attached to sexual assault. At the same time, research shows that people feel more comfortable talking to chatbots about intimate topics because they feel no fear of judgment. We think an emotionally supportive and empathic chatbot, specifically designed to encourage self-disclosure among sexual assault victims, could help those who remain silent for fear of negative evaluation and empower them to process their experience better and take the necessary steps towards treatment early on.
In this project, we explore how healthcare providers and the creative industry can collaborate to develop effective digital mental health interventions, particularly for survivors of sexual assault. Sexual assault victims face significant barriers to seeking professional help, including shame, self-blame, and fear of judgment. With over 100,000 cases reported annually in the Netherlands, the need for accessible, stigma-free support is urgent. Digital interventions, such as chatbots, offer a promising solution by providing a safe, confidential, and cost-effective space for victims to share their experiences before seeking professional care. However, existing commercial AI chatbots remain unsuitable for complex mental health support. While widely used for general health inquiries and basic therapy, they lack the human qualities essential for empathetic conversations. Additionally, training AI for this sensitive context is challenging due to limited caregiver-patient conversation data. A key concern raised by professionals worldwide is the risk of AI-driven chatbots being misused as substitutes for therapy. Without proper safeguards, they may offer inappropriate responses, potentially harming users. This highlights the urgent need for strict design guidelines, robust safety measures, and comprehensive oversight in AI-based mental health solutions. To address these challenges, this project brings together experts from healthcare and design, especially conversation designers, to explore the power of design in developing a trustworthy, user-centered chatbot experience tailored to survivors' needs. Through an iterative process of research, co-creation, prototyping, and evaluation, we aim to integrate safe and effective digital support into mental healthcare. Our overarching goal is to bridge the gap between digital healthcare and the creative sector, fostering long-term collaboration. By combining clinical expertise with design innovation, we seek to develop personalized tools that ethically and effectively support individuals with mental health problems.