BACKGROUND: An early return to normal intake and early mobilization enhance postoperative recovery. However, one in six surgical patients is undernourished during hospitalization, and approximately half of all patients eat 50% or less of the food provided to them. We assessed the use of newly introduced breakfast buffets in two wards for gastrointestinal and oncological surgery and determined the impact on postoperative protein and energy intake. METHODS: A prospective pilot cohort study was conducted to assess the impact of the introduction of breakfast buffets in two surgical wards. Adult patients could choose between an attractive breakfast buffet and the regular bedside breakfast service. Primary outcomes were protein and energy intake during breakfast. We asked patients to record the type of breakfast service and their breakfast intake in a diary over a seven-day period. Prognostic factors were included in a multivariable regression analysis. RESULTS: A total of 77 patients were included. The median percentage of buffet use per patient during the seven-day study period was 50% (IQR 0-83). Mean protein intake was 14.7 g (SD 8.4) and mean energy intake 332.3 kcal (SD 156.9). Predictors of higher protein intake were use of the breakfast buffet (β = 0.06, p = 0.01) and patient weight (β = 0.13, p = 0.01). Use of the breakfast buffet was associated with higher energy intake (β = 1.00, p = 0.02), whereas higher Delirium Observation Scale scores were associated with lower energy intake (β = -246.29, p = 0.02). CONCLUSION: Introduction of a breakfast buffet on a surgical ward was associated with higher protein and energy intake, and it could be a promising approach to optimizing such intake in surgical patients. Large, prospective, and preferably randomized studies should confirm these findings.
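As an illustration of the multivariable regression described above, the sketch below fits linear models of protein and energy intake in Python. The file name and column names (buffet_pct, weight_kg, dos_score, protein_g, energy_kcal) are hypothetical, and the study's full covariate set is not listed in the abstract.

```python
# Minimal sketch of the multivariable linear regression described above.
# Column names and the input file are hypothetical stand-ins.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("breakfast_diary.csv")  # hypothetical per-patient data

# Protein intake modelled on buffet use (% of study days) and patient weight
protein_model = smf.ols("protein_g ~ buffet_pct + weight_kg", data=df).fit()

# Energy intake modelled on buffet use and Delirium Observation Scale score
energy_model = smf.ols("energy_kcal ~ buffet_pct + dos_score", data=df).fit()

print(protein_model.summary())
print(energy_model.summary())
```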
DOCUMENT
Objectives: To identify factors associated with kinesiophobia (fear of movement) after cardiac hospitalisation and to assess the impact of kinesiophobia on cardiac rehabilitation (CR) initiation. Design: Prospective cohort study. Setting: Academic Medical Centre, Department of Cardiology. Participants: Cardiac patients were recruited at hospital discharge. In total, 149 patients (78.5% male) with a median age of 65 years were included, of whom 82 (59%) were referred for CR. Primary and secondary outcome measures: We assessed kinesiophobia with the Tampa Scale for Kinesiophobia (TSK). For this study, the total score was used (range 13–52). We assessed baseline factors (demographics, cardiac disease history, and questionnaire data on anxiety, biopsychosocial complexity and self-efficacy) associated with kinesiophobia using linear regression with backward elimination; the standardised beta (β) was reported. Prospectively, the impact of kinesiophobia on the probability of CR initiation in the first 3 months after hospital discharge (subsample referred for CR) was assessed with logistic regression; the OR was reported. Results: Moderate or severe levels of kinesiophobia were found in 22.8% of patients. In the total sample, kinesiophobia was associated with cardiac anxiety (β=0.33, 95% CI: 0.19 to 0.48), social complexity (β=0.23, 95% CI: 0.06 to 0.39) and higher education (β=−0.18, 95% CI: −0.34 to −0.02). In those referred for CR, kinesiophobia was negatively associated with self-efficacy (β=−0.29, 95% CI: −0.47 to −0.12) and positively with cardiac anxiety (β=0.43, 95% CI: 0.24 to 0.62). Kinesiophobia decreased the probability of CR initiation (OR per point on the TSK, range 13–52: 0.92, 95% CI: 0.85 to 0.99). Conclusion: In patients hospitalised for cardiovascular disease, kinesiophobia is associated with cardiac anxiety, social complexity, educational level and self-efficacy. Kinesiophobia decreased the likelihood of CR initiation by 8% per point on the TSK.
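A short worked check of the per-point odds ratio reported above: an OR of 0.92 per TSK point corresponds to the quoted 8% decrease in the odds of CR initiation per point, and it compounds multiplicatively across the 13–52 scale.

```python
# Worked check of the per-point odds ratio quoted above (OR = 0.92).
# Each one-point increase on the TSK multiplies the odds of CR initiation
# by 0.92, i.e. roughly an 8% decrease per point (1 - 0.92 = 0.08).
or_per_point = 0.92

print(f"decrease per TSK point: {(1 - or_per_point):.0%}")       # 8%

# Compounded over the full TSK range (13-52, i.e. 39 points), the implied
# odds ratio between the lowest and highest possible scores would be:
print(f"OR across full range: {or_per_point ** (52 - 13):.3f}")  # ~0.039
```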
DOCUMENT
Background: Substance use disorders (SUDs) are prevalent in the general population, tend to follow a chronic course, are associated with many individual and social problems, and often have their onset in adolescence. However, the knowledge base from prospective population surveys and treatment-outcome studies on the course of SUD in adolescents is limited at best. The present study aims to fill this gap and focuses on a subgroup that is particularly at risk for chronicity: adolescents in addiction treatment. We will investigate the rate of persistent SUD and its predictors longitudinally from adolescence to young adulthood among youth with DSM-5 SUD, from the start of their addiction treatment to 2 and 4 years after treatment entry. In addition to SUD, we will investigate the course of comorbid mental disorders, social functioning, and quality of life and their association with SUD over time. Methods/design: In a naturalistic, multi-center prospective cohort design, we will include youths (n = 420) who consecutively enter addiction treatment at ten participating organizations in the Netherlands. Inclusion is prestratified by treatment organization to ensure a nationally representative sample. Eligible youths are 16 to 22 years old and seek help for a primary DSM-5 cannabis, alcohol, cocaine or amphetamine use disorder. Assessments focus on lifetime and current substance use and SUD, non-SUD mental disorders, family history, life events, social functioning, treatment history, quality of life, chronic stress indicators (hair cortisol), and neuropsychological tests (computerized executive function tasks), and are conducted at baseline, at the end of treatment, and 2 and 4 years post-baseline. Baseline data and treatment data (type, intensity, duration) will be used to predict the outcome: persistence of or desistance from SUD. Discussion: There are remarkably few prospective studies worldwide that have investigated the course of SUD in adolescents in addiction treatment for longer than 1 year. We are confident that the Youth in Transition study will further our understanding of determinants and consequences of persistent SUD among high-risk adolescents during the critical transition from adolescence to young adulthood. Trial registration: The Netherlands National Trial Register, trial NL7928. Date of registration: January 17, 2019.
DOCUMENT
Background: Effective telemonitoring is possible through the repeated collection of electronic patient-reported outcome measures (ePROMs) in patients with chronic diseases. Low adherence to telemonitoring may reduce its effectiveness, but it is unknown which factors are associated with adherence to telemonitoring by ePROMs. The objective was to identify factors associated with adherence to telemonitoring by ePROMs in patients with chronic diseases. Methods: A systematic literature search was conducted in PubMed, Embase, PsycINFO and the Cochrane Library up to 8 June 2021. Eligibility criteria were: (1) interventional and cohort studies, (2) patients with a chronic disease, (3) repetitive ePROMs being used for telemonitoring, and (4) the study quantitatively investigating factors associated with adherence to telemonitoring by ePROMs. The Cochrane risk-of-bias tool and the Risk Of Bias In Non-randomised Studies of Interventions (ROBINS-I) tool were used to assess the risk of bias. An evidence synthesis was performed, assigning a strong, moderate, weak, inconclusive or inconsistent level of evidence to each result. Results: Five studies were included: one randomized controlled trial, two prospective uncontrolled studies and two retrospective cohort studies. A total of 15 factors potentially associated with adherence to telemonitoring by ePROMs were identified, predominantly in studies of low quality. We found moderate-level evidence that sex is not associated with adherence. Some studies reported associations between the remaining factors and adherence, but the overall results were inconsistent or inconclusive. Conclusions: None of the 15 studied factors had conclusive evidence of an association with adherence; for sex, there was moderate-level evidence of no association. The results were conflicting or inconclusive, mainly due to the small number and low quality of the included studies. To optimize adherence to telemonitoring with ePROMs, mixed-method studies are needed.
DOCUMENT
INTRODUCTION: While prone positioning (PP) has been shown to improve survival in patients with moderate to severe acute respiratory distress syndrome (ARDS), the rate of application of PP in clinical practice still appears low. AIM: This study aimed to determine the prevalence of use of PP in ARDS patients (primary endpoint), the physiological effects of PP, and the reasons for not using it (secondary endpoints). METHODS: The APRONET study was a prospective international 1-day prevalence study performed four times, in April, July, and October 2016 and in January 2017. On each study day, investigators in each participating ICU screened every patient. For patients with ARDS, use of PP, gas exchange, ventilator settings and plateau pressure (Pplat) were recorded before and at the end of the PP session. Complications of PP and reasons for not using PP were also documented. Values are presented as median (1st-3rd quartiles). RESULTS: Over the study period, 6723 patients were screened in 141 ICUs from 20 countries (77% of the ICUs were European), of whom 735 had ARDS and were analyzed. Overall, 101 ARDS patients (13.7%) had at least one session of PP, with no differences among the four study days. The rate of PP use was 5.9% (11/187), 10.3% (41/399) and 32.9% (49/149) in mild, moderate and severe ARDS, respectively (P = 0.0001). The duration of the first PP session was 18 (16-23) hours. Measured with the patient in the supine position before and at the end of the first PP session, PaO2/FIO2 increased from 101 (76-136) to 171 (118-220) mmHg (P = 0.0001), driving pressure decreased from 14 (11-17) to 13 (10-16) cmH2O (P = 0.001), and Pplat decreased from 26 (23-29) to 25 (23-28) cmH2O (P = 0.04). The most prevalent reason for not using PP (64.3%) was that hypoxemia was not considered sufficiently severe. Complications were reported in 12 patients (11.9%) in whom PP was used (pressure sores in five, hypoxemia in two, endotracheal tube-related complications in two, ocular complications in two, and a transient increase in intracranial pressure in one). CONCLUSIONS: This prospective international prevalence study found that PP was used in 32.9% of patients with severe ARDS and was associated with a low complication rate, a significant increase in oxygenation and a significant decrease in driving pressure.
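For readers who want to reproduce the severity comparison from the reported counts, the sketch below tabulates the PP rates and applies a chi-square test. The abstract does not state which test yielded P = 0.0001, so this is only one plausible reconstruction.

```python
# Hedged reconstruction of the severity comparison reported above.
# The abstract gives PP use of 11/187 (mild), 41/399 (moderate) and
# 49/149 (severe); it does not name the test behind P = 0.0001, so a
# chi-square test on these counts is an assumption, not the paper's method.
from scipy.stats import chi2_contingency

counts = {
    "mild":     (11, 187),
    "moderate": (41, 399),
    "severe":   (49, 149),
}

table = [[used, total - used] for used, total in counts.values()]
chi2, p, dof, _ = chi2_contingency(table)

for group, (used, total) in counts.items():
    print(f"{group:9s} {used}/{total} = {used / total:.1%}")
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```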
DOCUMENT
OBJECTIVE: To examine how a healthy lifestyle is related to life expectancy free of major chronic diseases. DESIGN: Prospective cohort study. SETTING AND PARTICIPANTS: The Nurses' Health Study (1980-2014; n=73 196) and the Health Professionals Follow-Up Study (1986-2014; n=38 366). MAIN EXPOSURES: Five low-risk lifestyle factors: never smoking, body mass index 18.5-24.9, moderate to vigorous physical activity (≥30 minutes/day), moderate alcohol intake (women: 5-15 g/day; men: 5-30 g/day), and a higher diet quality score (upper 40%). MAIN OUTCOME: Life expectancy free of diabetes, cardiovascular diseases, and cancer. RESULTS: At age 50, life expectancy free of diabetes, cardiovascular diseases, and cancer was 23.7 years (95% confidence interval 22.6 to 24.7) for women who adopted no low-risk lifestyle factors, in contrast to 34.4 years (33.1 to 35.5) for women who adopted four or five low-risk factors. Among men at age 50, the corresponding disease-free life expectancy was 23.5 years (22.3 to 24.7) with no low-risk lifestyle factors and 31.1 years (29.5 to 32.5) with four or five. For men who currently smoked heavily (≥15 cigarettes/day) and for obese men and women (body mass index ≥30), disease-free life expectancy accounted for the lowest proportion (≤75%) of total life expectancy at age 50. CONCLUSION: Adherence to a healthy lifestyle at mid-life is associated with a longer life expectancy free of major chronic diseases.
DOCUMENT
Introduction: Retrospective studies suggest that rapid initiation of treatment results in a better prognosis for patients in the emergency department. However, the actual medication administration time may differ from the time documented in the electronic health record. In this study, the difference between the observed medication administration time and the documented time was investigated. Patient and nurse characteristics were also tested for associations with the observed time differences. Methods: In this prospective study, emergency nurses were followed by observers for a total of 3 months. Patient inclusion was divided over 2 time periods. The difference between the observed medication administration time and the corresponding electronic health record documentation time was measured. The association between patient/nurse characteristics and the difference between medication administration and documentation times was tested with a Spearman correlation or biserial correlation test. Results: In the 34 observed patients, the median difference between administration and documentation times was 6.0 minutes (interquartile range 2.0-16.0). In 9 patients (26.5%), the actual time of medication administration differed by more than 15 minutes from the electronic health record documentation time. Higher temperature, lower oxygen saturation, oxygen dependency, and a higher Modified Early Warning Score were all correlated with a larger difference between administration and documentation times. Discussion: A difference between the administration and documentation times of medication in the emergency department may be common, especially for more acute patients. This could partly bias previously reported time-to-treatment measurements from retrospective research designs, which should be kept in mind when outcomes of retrospective time-to-treatment studies are evaluated.
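A minimal sketch of the correlation analysis described above, assuming hypothetical column names and input file: Spearman's rho for continuous characteristics and a point-biserial correlation for dichotomous ones such as oxygen dependency (the abstract says "biserial"; scipy offers the point-biserial variant).

```python
# Minimal sketch of the correlation tests described above; the file and
# column names (mews_score, oxygen_dependent, time_diff_min) are
# hypothetical stand-ins for the observer dataset.
import pandas as pd
from scipy.stats import spearmanr, pointbiserialr

df = pd.read_csv("observations.csv")  # hypothetical observer data

# Continuous characteristic vs administration-documentation gap (minutes)
rho, p = spearmanr(df["mews_score"], df["time_diff_min"])
print(f"MEWS vs time difference: rho = {rho:.2f}, p = {p:.3f}")

# Dichotomous characteristic (0/1) vs the same gap
r_pb, p = pointbiserialr(df["oxygen_dependent"], df["time_diff_min"])
print(f"Oxygen dependency vs time difference: r = {r_pb:.2f}, p = {p:.3f}")
```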
DOCUMENT
This study investigated potential mental risk factors (coping, perfectionism, and self-regulation) for substantial injuries in contemporary dance students using a prospective cohort design, as high-quality studies focusing on mental risk factors for dance injuries are lacking. Student characteristics (age, sex, BMI, educational program, and history of injury) and psychological constructs (coping, perfectionism, and self-regulation) were assessed using the Performing artist and Athlete Health Monitor (PAHM), a web-based system. Substantial injuries were measured monthly with the Oslo Sports Trauma Research Center (OSTRC) Questionnaire on Health Problems as part of the PAHM system. Univariate and multivariate logistic regression analyses were conducted to test the associations between the potential risk factors (i.e., student characteristics and psychological constructs) and substantial injuries. Ninety-nine students were included in the analyses. During the academic year 2016/2017, 48 students (48.5%) reported at least one substantial injury. Of all factors included, coping skills (OR: 0.91; 95% CI: 0.84–0.98), age (OR: 0.67; 95% CI: 0.46–0.98), and BMI (OR: 1.38; 95% CI: 1.05–1.80) were significantly associated with substantial injuries in the multivariate analysis. The model explained 24% of the variance in substantial injuries. Further prospective research into mental risk factors for dance injuries with larger sample sizes is needed to develop preventive strategies. Yet, dance schools could consider including coping skills training in injury prevention programs and, perhaps, giving special attention to younger dancers and those with a higher BMI through transitional programs to assist them in managing the stress they experience throughout their (academic) careers.
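The multivariate model described above can be sketched as a logistic regression whose coefficients are exponentiated into odds ratios with 95% CIs. Variable names and the data file are hypothetical; the actual analysis may have included further covariates before elimination.

```python
# Minimal sketch of the multivariate logistic regression described above;
# the file and column names are hypothetical stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pahm_season.csv")  # hypothetical PAHM export

model = smf.logit(
    "substantial_injury ~ coping_skills + age + bmi", data=df
).fit()

# Odds ratios and 95% CIs come from exponentiating the coefficients
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```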
DOCUMENT
For deep partial-thickness burns, no consensus on the optimal treatment has been reached, owing to conflicting study outcomes based on low-quality evidence. Treatment options in high- and middle-income countries include conservative treatment with delayed excision and grafting if needed, and early excision and grafting. Most studies on the timing of surgery focus on survival rather than quality of life. This study protocol describes a study that aims to compare long-term scar quality, clinical outcomes, and patient-reported outcomes between the two treatment options. A multicentre prospective study will be conducted in the three Dutch burn centres (Rotterdam, Beverwijk, and Groningen). All adult patients with acute deep partial-thickness burns, as determined by healing potential on Laser Doppler Imaging, are eligible for inclusion. During a nine-month baseline period, standard practice will be monitored: conservative treatment with dressings and topical agents, and excision and grafting of residual defects if needed 14–21 days post-burn. During the subsequent nine months, early surgery will be advocated, involving excision and grafting in the first week to ten days post-burn. The primary outcome compared between the two groups is long-term scar quality, assessed with the Patient and Observer Scar Assessment Scale 3.0 twelve months after discharge. Secondary outcomes include clinical outcomes and patient-reported outcomes such as quality of life and return to work. The aim of the study is to assess long-term scar quality in deep partial-thickness burns after conservative treatment with delayed excision and grafting if needed, compared with early excision and grafting, adding to the ongoing debate on the optimal treatment of these burns. The broad range of studied outcomes will be used to develop a decision aid for deep partial-thickness burns, to fully inform patients at the point of consent to surgery and support optimal person-centred care.
DOCUMENT
Background: The Clinical Frailty Scale (CFS) is frequently used to measure frailty in critically ill adults. There is wide variation in the approach to analysing the relationship between the CFS score and mortality after admission to the ICU. This study aimed to evaluate the influence of the modelling approach on the association between the CFS score and short-term mortality, and to quantify the prognostic value of frailty in this context. Methods: We analysed data from two multicentre prospective cohort studies that enrolled intensive care unit patients ≥ 80 years old in 26 countries. The primary outcome was mortality within 30 days of admission to the ICU. Logistic regression models for both ICU and 30-day mortality included the CFS score as either a categorical, continuous or dichotomous variable and were adjusted for the patient's age, sex, reason for admission to the ICU, and admission Sequential Organ Failure Assessment score. Results: The median age in the sample of 7487 consecutive patients was 84 years (IQR 81–87). The fraction of new prognostic information from frailty for 30-day mortality was highest, at 9%, when the CFS score was treated either as a categorical variable using all original levels of frailty or as a nonlinear continuous variable (p < 0.001). The relationship between the CFS score and mortality was nonlinear (p < 0.01). Conclusion: Knowledge of a patient's frailty status adds a substantial amount of new prognostic information at the moment of admission to the ICU. Arbitrary simplification of the CFS score into fewer groups than originally intended leads to a loss of information and should be avoided.
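The three CFS coding strategies compared above can be sketched as alternative logistic regression formulas. The data layout below is hypothetical, and the fraction-of-new-information calculation shown is one common likelihood-ratio-based approximation; the paper's exact computation may differ.

```python
# Sketch of the three CFS coding strategies compared above; the file,
# column names and frailty cut-off are hypothetical stand-ins.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vip_cohorts.csv")  # hypothetical pooled cohort data
base = "age + sex + admission_reason + sofa"

models = {
    "categorical": f"died_30d ~ C(cfs) + {base}",       # all original levels
    "continuous":  f"died_30d ~ cfs + {base}",          # linear score only
    "dichotomous": f"died_30d ~ I(cfs >= 5) + {base}",  # frail yes/no
}

null = smf.logit(f"died_30d ~ {base}", data=df).fit(disp=0)
for name, formula in models.items():
    full = smf.logit(formula, data=df).fit(disp=0)
    # LR chi2 gained by adding frailty, as a share of the full model's
    # LR chi2 against an intercept-only model (one possible definition
    # of the "fraction of new prognostic information")
    gain = 2 * (full.llf - null.llf)
    frac = gain / (2 * (full.llf - full.llnull))
    print(f"{name:11s} fraction of new information ~ {frac:.2f}")
```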
DOCUMENT