Background: The aim of this study is to validate a newly developed inventory of nurses' sources of self-efficacy. We test the validity of a five-dimensional model of sources of self-efficacy, which we contrast with the traditional four-dimensional model based on Bandura's theoretical concepts. Methods: Confirmatory factor analysis was used to develop the new self-efficacy measure. Model fit was evaluated using commonly recommended goodness-of-fit indices, including the χ² test of model fit, the Root Mean Square Error of Approximation (RMSEA), the Tucker-Lewis Index (TLI), the Standardized Root Mean Square Residual (SRMR) and the Bayesian Information Criterion (BIC). Results: All 22 items of the newly developed five-factor sources-of-self-efficacy measure had substantial factor loadings (range .40-.80). Structural equation modeling showed that the five-factor model is favoured over the four-factor model. Conclusions and implications: The results show that differentiating the vicarious experience source into a peer-based and an expert-based source better reflects how nursing students develop self-efficacy beliefs. This has implications for clinical learning environments: better, differentiated use of self-efficacy sources can stimulate the professional development of nursing students.
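Among the fit indices listed above, the RMSEA is derived directly from the model χ², its degrees of freedom and the sample size. As a hedged sketch (the standard formula, not the authors' analysis code, with hypothetical input values):

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation, computed from the model
    chi-square, its degrees of freedom and the sample size (standard formula)."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical values for illustration only (not taken from the study):
print(round(rmsea(chi2=350.0, df=199, n=300), 3))
```

Values at or below roughly .06 are conventionally read as good fit; when χ² does not exceed the degrees of freedom, the index is truncated at zero.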
Background: After hospitalisation for cardiac disease, older patients are at high risk of readmission and death. Objective: The Cardiac Care Bridge (CCB) transitional care programme evaluated the impact of combining case management, disease management and home-based cardiac rehabilitation (CR) on hospital readmission and mortality. Design: Single-blind, randomised clinical trial. Setting: The trial was conducted in six hospitals in the Netherlands between June 2017 and March 2020. Community-based nurses and physical therapists continued care post-discharge. Subjects: Cardiac patients ≥ 70 years were eligible if they were at high risk of functional loss or if they had had an unplanned hospital admission in the previous 6 months. Methods: The intervention group received a comprehensive geriatric assessment-based integrated care plan, a face-to-face handover with the community nurse before discharge and follow-up home visits. The community nurse collaborated with a pharmacist, and participants received home-based CR from a physical therapist. The primary composite outcome was first all-cause unplanned readmission or mortality at 6 months. Results: In total, 306 participants were included. Mean age was 82.4 years (standard deviation 6.3), 58% had heart failure and 92% were acutely hospitalised. 67% of the intervention key elements were delivered. The composite outcome incidence was 54.2% (83/153) in the intervention group and 47.7% (73/153) in the control group (risk difference 6.5% [95% confidence interval, CI −4.7 to 18%], risk ratio 1.14 [95% CI 0.91–1.42], P = 0.253). The study was discontinued prematurely due to implementation activities in usual care. Conclusion: In high-risk older cardiac patients, the CCB programme did not reduce hospital readmission or mortality within 6 months. Trial registration: Netherlands Trial Register 6,316, https://www.trialregister.nl/trial/6169
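The reported effect estimates can be reproduced from the raw event counts. A minimal sketch using the standard Wald interval on the log risk-ratio scale (an illustration, not the trial's actual analysis code):

```python
import math

def risk_ratio_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """Risk ratio of group 1 vs group 2 with a Wald 95% CI on the log scale.
    a/n1 and b/n2 are events/total in each group."""
    p1, p2 = a / n1, b / n2
    rr = p1 / p2
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Intervention (83/153) vs control (73/153), as reported above:
rr, lo, hi = risk_ratio_ci(83, 153, 73, 153)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # matches the reported 1.14 (0.91-1.42)
```

Because the interval comfortably spans 1, the point estimate above 1 is consistent with the trial's null conclusion.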
Background: Home care professionals regularly observe drug-related problems during home care provision. Problems related to the process of medication therapy may involve discrepancies between the medication prescriptions in the hospital discharge letter and the medication administration record list (MARL), or insufficient drug delivery. The objective of this study is to determine the potential clinical consequences of medication process problems observed by home care professionals, since those consequences have not been assessed before. Methods: A retrospective descriptive study was performed. An expert panel assessed the clinical consequences of medication process problems reported by home care professionals during routine care (May 2016 until May 2017) using the eHOME system, a digital system developed to assist in the reporting and monitoring of drug-related problems. Using a three-point scale, the expert panel assessed the potential clinical consequences of these medication process problems among older home care patients (aged 65 years and over). Results: In total, 309 medication process problems in 120 of 451 patients were assessed for potential discomfort or clinical deterioration. The problems involved medication discrepancies (n = 188; percentages relative to this subgroup): a new prescription not listed on the MARL (n = 69, 36.7%); medication stopped by the prescriber but still listed on the MARL (n = 43, 22.9%); a discrepant time of intake (n = 25, 13.3%), frequency (n = 24, 12.8%) or dose (n = 21, 11.2%); therapeutic duplication listed on the MARL (n = 5, 2.6%); and discrepant information on the route of administration (n = 1, 0.5%). The remaining problems were an undelivered MARL (n = 103, 33.3% of all 309 problems), undelivered medication (n = 16, 5.2%) and excessive medication delivery (n = 2, 0.7%). Furthermore, 180 (58.2%) of the 309 medication process problems were assessed as having the potential for moderate or severe discomfort or clinical deterioration in patients.
Conclusions: The majority of medication process problems may result in patient discomfort or clinical deterioration.
Alcohol use disorder (AUD) is a major problem. In the USA alone there are 15 million people with an AUD, and more than 950,000 Dutch people drink excessively. Worldwide, 3-8% of all deaths and 5% of all illnesses and injuries are attributable to AUD. Addiction care faces challenges: for example, more than half of AUD patients relapse within a year of treatment. One solution is cue exposure therapy (CET), in which clients are exposed to triggers (objects, people and environments) that arouse craving. Virtual reality exposure therapy (VRET) is used to present these triggers in a realistic, safe and personalised way, training coping skills to counteract alcohol cravings. The effectiveness of VRET has been (clinically) proven. However, the advent of AR technologies raises the question of exploring the possibilities of augmented reality exposure therapy (ARET). ARET enjoys the same benefits as VRET (such as a realistic, safe experience), but because AR integrates virtual components into the real environment, with the client's own body visible, it presumably evokes a different type of experience, which may increase the ecological validity of CET in treatment. In addition, ARET is cheaper to develop (fewer virtual elements), and clients and clinics have easier access to AR (via smartphone or tablet). Moreover, new AR glasses are being developed, which resolve disadvantages such as a smartphone screen that is too small. Despite demand from practitioners, ARET has never been developed and researched for addiction. In this project, the first ARET prototype for AUD is developed for use in the treatment of alcohol addiction. The prototype is built on volumetric-captured digital humans and made accessible for AR glasses, tablets and smartphones. It will be based on RECOVRY, a VRET for AUD developed by the consortium.
A prototype test among (ex-)AUD clients will provide insight into needs and points for improvement from the perspectives of patients and care providers, and into the effect of ARET compared with VRET.
Huntington’s disease (HD) and various spinocerebellar ataxias (SCA) are autosomal dominantly inherited neurodegenerative disorders caused by a CAG repeat expansion in the disease-related gene [1]. The impact of HD and SCA on families and individuals is enormous and far-reaching, as patients typically display first symptoms during midlife. HD is characterized by unwanted choreatic movements, behavioural and psychiatric disturbances and dementia. SCAs are mainly characterized by ataxia, but also by other symptoms including cognitive deficits, similarly affecting quality of life and leading to disability. These problems worsen as the disease progresses, until affected individuals are no longer able to work, drive or care for themselves. This places an enormous burden on families and caregivers; patients require intensive nursing home care as the disease progresses, and lifespan is reduced. Although the clinical and pathological phenotypes are distinct for each CAG repeat expansion disorder, similar molecular mechanisms are thought to underlie the effect of expanded CAG repeats in the different genes. The predicted Age of Onset (AO) for HD, SCA1 and SCA3 (and five other CAG-repeat diseases) is based on the polyQ expansion, but CAG/polyQ length explains AO for only about 50% (see figure below). Large variation in AO is observed, especially in the most common range between 40 and 50 repeats [11,12].
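The CAG-length dependence of expected onset can be illustrated with a published parametric model for HD. The sketch below uses coefficients quoted from the Langbehn et al. (2004) survival model; treat them as an assumption here, not as part of this project's own modelling, and note that the model captures only the mean trend, leaving roughly half of the AO variance unexplained:

```python
import math

def expected_onset_age(cag: int) -> float:
    """Expected HD age of onset (years) as a function of CAG repeat length.
    Coefficients quoted from the Langbehn et al. (2004) parametric survival
    model; an assumption for illustration, not this project's own model."""
    return 21.54 + math.exp(9.556 - 0.146 * cag)

# Expected onset falls steeply across the common 40-50 repeat range:
for cag in (40, 45, 50):
    print(cag, round(expected_onset_age(cag), 1))
```

The steep slope in exactly the 40-50 range is why small measurement or biological differences translate into decade-scale differences in predicted onset, motivating the search for additional patient-related modifiers.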
Large differences in onset, especially in the range of 40-50 CAGs, not only imply that current individual AO predictions are imprecise (affecting important life decisions that patients need to make, and hampering the assessment of potential onset-delaying interventions), but also offer hope that (patient-related) factors exist that can delay the onset of disease. To address both points, we need a better model based on patient-derived cells: one that generates parameters that not only mirror the CAG-repeat-length dependency of these diseases, but that also better predict inter-patient variation in disease susceptibility and in the effectiveness of interventions. To this end, we will use a staggered project design, as explained in 5.1, in which we first determine which cellular and molecular determinants (referred to as landscapes) in isogenic iPSC models are associated with increased CAG repeat lengths, using deep-learning algorithms (DLA) (WP1). For this, we will use a well-characterized control cell line in which we modify the CAG repeat length in the endogenous Ataxin-1, Ataxin-3 and Huntingtin genes from wild-type repeats to intermediate, adult-onset and juvenile polyQ repeats. We will next expand the model with cells from three (SCA1, SCA3 and HD) existing and new cohorts of early-onset, adult-onset and late-onset/intermediate-repeat patients for whom, besides accurate AO information, clinical parameters (MRI scans, liquor markers, etc.) will be (made) available. These will be used for validation and to fine-tune the molecular landscapes (again using DLA) towards the best prediction of individual patient-related clinical markers and AO (WP3).
The same models and (most relevant) landscapes will also be used to evaluate novel mutant-protein-lowering strategies as they emerge from WP4. This overall development of landscape prediction is an iterative process that involves (a) data processing (WP5); (b) unsupervised data exploration and dimensionality reduction to find patterns in the data and create “labels” for similarity; and (c) development of supervised Deep Learning (DL) models for landscape prediction based on the labels from the previous step. Each iteration starts with data that is generated and deployed according to FAIR principles, and the developed deep-learning system will be instrumental in connecting these WPs. Insights into algorithm sensitivity from the predictive models will form the basis for discussion with field experts on the distinctions and their phenotypic consequences. While full development of accurate diagnostics might go beyond the timespan of the five-year project, ideally our final landscapes can be used for new genetic counselling: when somebody tests positive for the gene, can we take his or her cells, feed them into the generated cell-based model, and better predict the AO and severity? While this will answer questions from clinicians and patient communities, it will also generate new ones, which is why we will study the ethical implications of such improved diagnostics in advance (WP6).
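Steps (b) and (c) of this loop can be sketched with generic off-the-shelf components. The snippet below is a minimal illustration on synthetic data, with PCA, k-means and logistic regression standing in for the project's actual dimensionality-reduction and deep-learning models (all names and sizes are placeholders):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))  # placeholder for per-cell molecular features

# (b) unsupervised exploration: reduce dimensionality, then derive
# similarity "labels" by clustering in the reduced space
Z = PCA(n_components=5, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)

# (c) supervised model trained on the derived labels
# (a simple classifier standing in for the deep-learning landscape predictor)
clf = LogisticRegression(max_iter=1000).fit(X, labels)
score = clf.score(X, labels)  # training accuracy on the derived labels
print(score)
```

In the project itself, each iteration would replace the synthetic matrix with freshly processed FAIR data (WP5) and the classifier with the DLA, then feed model-sensitivity insights back to the field experts.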
Over a million people in the Netherlands have type 2 diabetes (T2D), which is strongly related to overweight, and many more are at risk. A carbohydrate-rich diet and insufficient physical activity play a crucial role in these developments. Preventing T2D is essential, because the condition is associated with a reduced quality of life, high healthcare costs and premature death due to cardiovascular disease. The hormone insulin plays a major role here: it lowers the blood glucose concentration by promoting glucose uptake into body cells. If the body is persistently exposed to an excess of glucose, it initially keeps the blood glucose concentration within the normal range by releasing higher concentrations of insulin into the blood, a condition described as “prediabetes”. Over a period of several years, this compensatory mechanism eventually fails: the blood glucose concentration rises, resulting in T2D. In current healthcare practice, T2D is diagnosed only on the basis of elevated blood glucose concentrations, which is insufficient to identify people who have prediabetes and are at risk of developing T2D. Although increased insulin concentrations at normal glucose concentrations offer an opportunity for early identification and screening of people with prediabetes, there is a lack of effective and reliable methods and devices to measure insulin concentrations adequately. An integrated approach has been chosen to identify people at risk, using a prediabetes screening method based on insulin detection. Users and other stakeholders will be involved in the development and implementation process from the start of the project. A portable and easy-to-use demonstrator will be realised, based on rapid lateral flow tests (LFTs), which can measure insulin in clinically relevant samples (serum/blood) quickly and reliably.
Furthermore, in collaboration with healthcare professionals, we will investigate how this screening method can be implemented in practice to contribute to a healthier lifestyle and prevent T2D.
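As background on why measuring insulin adds value beyond glucose alone: elevated fasting insulin at normal glucose raises indices such as HOMA-IR, a standard estimate of insulin resistance. A minimal sketch of the standard formula (illustrative values only; not part of the project's demonstrator):

```python
def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """Homeostatic Model Assessment of Insulin Resistance:
    fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5 (standard formula)."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

# Illustrative values only: normal fasting glucose with elevated fasting insulin
print(round(homa_ir(5.5, 18.0), 2))  # 4.4
```

A person with normal glucose but high fasting insulin scores well above commonly used HOMA-IR cut-offs, which is exactly the prediabetic profile a glucose-only diagnosis misses and an insulin LFT could flag.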