Youth care is under increasing pressure, with rising demand, longer waiting lists, and growing staff shortages. In the Netherlands, one in seven children and adolescents is currently receiving youth care. At the same time, professionals face high workloads, burnout risks, and significant administrative burdens. This combination threatens both the accessibility and quality of care, leading to escalating problems for young people and families.
Artificial intelligence (AI) offers promising opportunities to relieve these pressures by supporting professionals in their daily work. However, many AI initiatives in youth care fail to move beyond the pilot stage because of barriers such as a lack of user acceptance, ethical concerns, limited professional ownership, and insufficient integration into daily practice. Empirical research on how AI can be responsibly and sustainably embedded in youth care remains scarce.
This PD project aims to develop practice-based insights and strategies that strengthen the acceptance and long-term adoption of AI in youth care, in ways that support professional practice and contribute to appropriate care. The focus lies not on the technology itself, but on how professionals can work with AI within complex, high-pressure contexts.
The research follows a cyclical, participatory approach that combines three complementary implementation frameworks: the Implementation Guide (Kaptein), the Consolidated Framework for Implementation Research (CFIR; Damschroder), and the NASSS-CAT framework (Greenhalgh). Three case studies serve as the core learning environments: (1) a speech-to-text AI tool that supports clinical documentation, (2) Microsoft 365 Copilot for organization-wide adoption in support teams, and (3) an AI chatbot for parents in high-conflict divorces.
Throughout the project, professionals, clients, ethics experts, and organizational stakeholders collaborate to explore the practical, ethical, and organizational conditions under which AI can responsibly strengthen youth care services.
Status: to be started
Project number: PD.PD.PD03.016