Abstract: Disability is associated with lower quality of life and premature death in older people. Prevention and intervention targeting older people living with a disability are therefore important. Frailty can be considered a major predictor of disability. In this study, we aimed to develop nomograms with items of the Tilburg Frailty Indicator (TFI) as predictors, using cross-sectional and longitudinal data (follow-up of five and nine years), focusing on the prediction of total disability, disability in activities of daily living (ADL), and disability in instrumental activities of daily living (IADL). At baseline, 479 Dutch community-dwelling people aged 75 years and older participated. They completed a questionnaire that included the TFI and the Groningen Activity Restriction Scale to assess the three disability variables. We showed that the TFI items were assigned different numbers of points, especially over time; not every item was therefore equally important in predicting disability. ‘Difficulty in walking’ and ‘unexplained weight loss’ appeared to be important predictors of disability, and healthcare professionals should focus on these two items to prevent disability. We also conclude that the points given to frailty items differed between total, ADL, and IADL disability, and also differed by years of follow-up. Creating one nomogram that does justice to all of this seems impossible.
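To make the nomogram idea concrete, the following is a minimal sketch, assuming (as is common for nomograms) that the points are derived from logistic regression coefficients fitted on dichotomized TFI items. The item names and data below are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: deriving nomogram points from a logistic regression,
# assuming binary TFI items predict a dichotomous disability outcome.
# Item names and data are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
items = ["difficulty_walking", "unexplained_weight_loss", "poor_balance"]
X = rng.integers(0, 2, size=(479, len(items)))   # 0/1 item scores
y = rng.integers(0, 2, size=479)                 # disability yes/no

model = LogisticRegression().fit(X, y)

# Nomogram convention: the strongest predictor's coefficient maps to
# 100 points; every other item's points scale proportionally. Items
# with few points contribute little to the predicted risk.
coefs = np.abs(model.coef_[0])
points = 100 * coefs / coefs.max()
for name, p in zip(items, points):
    print(f"{name}: {p:.0f} points")
```

Under this convention, the finding that points differ per outcome and per follow-up length corresponds to fitting a separate model, and hence a separate points table, for each combination.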
DOCUMENT
Background: Modern modeling techniques may potentially provide more accurate predictions of dichotomous outcomes than classical techniques. Objective: In this study, we aimed to examine the predictive performance of eight modeling techniques for predicting mortality from frailty. Methods: We performed a longitudinal study with a 7-year follow-up. The sample consisted of 479 Dutch community-dwelling people, aged 75 years and older. Frailty was assessed with the Tilburg Frailty Indicator (TFI), a self-report questionnaire. This questionnaire consists of eight physical, four psychological, and three social frailty components. The municipality of Roosendaal, a city in the Netherlands, provided the mortality dates. We compared modern modeling techniques, such as support vector machine (SVM), neural network (NN), random forest, and least absolute shrinkage and selection operator (LASSO), with classical techniques, such as logistic regression, two Bayesian networks, and recursive partitioning (RP). The area under the receiver operating characteristic curve (AUROC) indicated the performance of the models. The models were validated using bootstrapping. Results: We found that the NN model had the best validated performance (AUROC=0.812), followed by the SVM model (AUROC=0.705). The other models had validated AUROC values below 0.700. The RP model had the lowest validated AUROC (0.605). The NN model had the highest optimism (0.156). The predictor variable “difficulty in walking” was important for all models. Conclusions: Because of the high optimism of the NN model, we prefer the SVM model for predicting mortality among community-dwelling older people using the TFI, with the addition of the “gender” and “age” variables. External validation is a necessary step before applying the prediction models in a new setting.
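The following sketch illustrates the kind of comparison described here: fitting several classifiers and estimating an optimism-corrected AUROC via bootstrapping (in the spirit of Harrell's optimism method, which the abstract's "optimism" values suggest). The data are synthetic stand-ins, not the TFI cohort, and hyperparameters are illustrative.

```python
# Sketch: compare classifiers on frailty-style binary items and report
# bootstrap optimism-corrected AUROC. Synthetic data, not the TFI data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(479, 15)).astype(float)      # 15 TFI components
y = (X[:, 0] + rng.normal(0, 1, 479) > 0.8).astype(int)   # mortality proxy

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "svm": SVC(probability=True),
    "nn": MLPClassifier(max_iter=2000),
    "random_forest": RandomForestClassifier(n_estimators=200),
}

def optimism_corrected_auc(model, X, y, n_boot=50):
    """Apparent AUC minus mean bootstrap optimism."""
    apparent = roc_auc_score(y, model.fit(X, y).predict_proba(X)[:, 1])
    optimism = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))      # bootstrap resample
        Xb, yb = X[idx], y[idx]
        if len(np.unique(yb)) < 2:
            continue                               # need both classes
        m = model.fit(Xb, yb)
        auc_boot = roc_auc_score(yb, m.predict_proba(Xb)[:, 1])
        auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)       # overfit penalty
    return apparent - np.mean(optimism)

for name, model in models.items():
    print(name, round(optimism_corrected_auc(model, X, y), 3))
```

A large gap between apparent and corrected AUROC (high optimism) signals overfitting, which is why the abstract prefers the SVM over the NN despite the NN's higher validated AUROC.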
DOCUMENT
Individuals with autism increasingly enroll in universities, but little is known about predictors of their success. This study developed predictive models for the academic success of autistic bachelor students (N=101) in comparison to students with other health conditions (N=2,465) and students with no health conditions (N=25,077). We applied propensity score weighting to balance the comparison groups. The research showed that autistic students’ academic success was predictable, and these predictions were more accurate than predictions of their peers’ success. For first-year success, study choice issues were the most important predictors (parallel program and application timing). Issues with participation in pre-education (missing grades in pre-educational records) and delays at the beginning of autistic students’ studies (reflected in age) were the most influential predictors of second-year success and of delays in the second and final year of the bachelor’s program. In addition, academic performance (average grades) was the strongest predictor of degree completion in 3 years. These insights can enable universities to develop tailored support for autistic students. Using early warning signals from administrative data, institutions can lower dropout risk and increase degree completion for autistic students.
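As a concrete illustration of the weighting step, here is a minimal sketch of inverse-probability weighting from estimated propensity scores, assuming group membership is modeled from background covariates before fitting the outcome model. All variable names and data are hypothetical, not the study's administrative records.

```python
# Sketch: propensity score weighting to balance two student groups,
# then a weighted predictive model of academic success. Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 2000
covariates = rng.normal(size=(n, 4))   # e.g., age, prior grades (hypothetical)
group = rng.integers(0, 2, size=n)     # 1 = autistic, 0 = comparison
success = rng.integers(0, 2, size=n)   # first-year success yes/no

# Step 1: propensity of group membership given covariates.
ps = LogisticRegression().fit(covariates, group).predict_proba(covariates)[:, 1]

# Step 2: inverse-probability weights (stabilized weights are a common variant).
weights = np.where(group == 1, 1 / ps, 1 / (1 - ps))

# Step 3: weighted predictive model of academic success.
outcome_model = LogisticRegression().fit(covariates, success, sample_weight=weights)
print(outcome_model.coef_)
```

The weights up-weight students who are atypical for their group, so that the two groups resemble each other on the measured covariates when the outcome model is fitted.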
DOCUMENT
The focus of the research is 'Automated Analysis of Human Performance Data'. The three interconnected main components are (i) Human Performance, (ii) Monitoring Human Performance, and (iii) Automated Data Analysis. Human Performance is both the process and result of the person interacting with context to engage in tasks, whereas the performance range is determined by the interaction between the person and the context. Cheap and reliable wearable sensors allow for gathering large amounts of data, which is very useful for understanding, and possibly predicting, the performance of the user. Given the amount of data generated by such sensors, manual analysis becomes infeasible; tools should be devised for performing automated analysis looking for patterns, features, and anomalies. Such tools can help transform wearable sensors into reliable high-resolution devices, help experts analyse wearable sensor data in the context of human performance, and support its use for diagnosis and intervention purposes. Shyr and Spisic describe automated data analysis as 'a systematic process of inspecting, cleaning, transforming, and modelling data with the goal of discovering useful information, suggesting conclusions and supporting decision making for further analysis'. Their philosophy is to do the tedious part of the work automatically and allow experts to focus on performing their research and applying their domain knowledge. However, automated data analysis means that the system has to teach itself to interpret interim results and do iterations. Knuth stated: 'Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it' [Knuth, 1974]. As he also put it, 'since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something' [Knuth, 1974]. The knowledge on Human Performance and its Monitoring is thus to be 'taught' to the system. To be able to construct automated analysis systems, an overview of the essential processes and components of these systems is needed.
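To ground the idea of automated analysis of wearable sensor data, here is a toy sketch of one such step: summarizing a raw stream with sliding-window features and flagging anomalous windows without manual inspection. The signal is synthetic, and IsolationForest stands in for whatever pattern and anomaly detectors such a system would actually chain together.

```python
# Toy sketch: automated anomaly scan over a synthetic wearable signal.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
# Synthetic heart-rate stream: slow oscillation plus noise.
heart_rate = 70 + 5 * np.sin(np.linspace(0, 20, 5000)) + rng.normal(0, 1, 5000)
heart_rate[2500:2520] += 40                       # injected anomaly

# Sliding-window features (mean, std) summarize the raw stream.
window = 50
feats = np.array([
    (heart_rate[i:i + window].mean(), heart_rate[i:i + window].std())
    for i in range(0, len(heart_rate) - window, window)
])

flags = IsolationForest(random_state=0).fit_predict(feats)   # -1 = anomaly
print("anomalous windows:", np.where(flags == -1)[0])
```

An automated pipeline would iterate on such interim results, e.g., re-windowing or re-featurizing flagged regions, which is exactly the self-directed iteration the paragraph above describes.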
DOCUMENT
Organ-on-a-chip technology holds great promise to revolutionize pharmaceutical drug discovery and development, which is currently a tremendously expensive and inefficient process. It will enable faster, cheaper, physiologically relevant, and more reliable (standardized) assays for biomedical science and drug testing. In particular, it is anticipated that organ-on-a-chip technology can substantially replace animal drug testing with far better models based on true human cells. Despite this great potential and progress in the field, the technology still lacks the standardized protocols and robust chip devices that are needed to bring this potential to fruition. Of particular interest is heart-on-a-chip for drug and cardiotoxicity screening: there is presently no preclinical test system that predicts the most important features of cardiac safety accurately and cost-effectively. The main goal of this project is to fabricate standardized, robust, generic heart-on-a-chip demonstrator devices that will be validated and further optimized to generate new physiologically relevant models for studying cardiotoxicity in vitro. To achieve this goal, various aspects will be considered, including (i) the search for alternative chip materials to replace PDMS, (ii) inner chip surface modification and treatment (chemistry and topology), (iii) achieving 2D/3D cardiomyocyte (long-term) cell culture and cellular alignment within the chip device, (iv) the possibility of integrating in-line sensors in the devices, and, finally, (v) the overall chip design. The resulting standardized heart-on-a-chip technology is intended for adoption by the pharmaceutical industry. The proposed project offers a unique opportunity for the Netherlands, and Twente in particular, which has relevant expertise, potential, and future perspective in this field: it hosts world-leading companies pioneering various core aspects of the technology relevant for organs-on-chips, combined with two world-leading research institutes within the University of Twente.
DOCUMENT
This project helps architects and engineers validate their strategies and methods, respectively, toward a sustainable design practice. The aim is to develop prototype intelligent tools that forecast the carbon footprint of a building during the initial design process, given visual representations of the space layout. Predicting carbon emissions (both embodied and operational) in the early stages of architectural design can have a long-lasting impact on the carbon footprint of a building. In current design practice, emission measures are considered only at the final phase of the design process, once major parameters of the space configuration such as volume, compactness, envelope, and materials are fixed. This late assessment is due to the costly and inefficient interaction between the architect and the consultant. This proposal offers a method to automate the exchange between the designer and the engineer using a computer vision tool that reads architectural drawings and estimates the carbon emission at each design iteration. The tool can be used directly by the designer to track the effect of every design choice on the emission score. In turn, the engineering firm can adapt the tool to calculate the emission of a future building directly from visual models such as shared Revit documents. Because the building representation is predominantly visual at the early design stages, computer vision is a promising technology for inferring, from architectural drawings, the visual attributes needed to calculate the carbon footprint of the building. Collecting data for training and evaluating the computer vision model and machine learning framework is the main challenge of the project. Our consortium provides the required resources and expertise to develop trustworthy data for predicting emission scores directly from architectural drawings.
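As a minimal sketch of the envisioned tool, the following shows a small convolutional network that regresses an emission score from a rasterized floor plan. The architecture, image size, unit, and data are illustrative assumptions only; the project's actual model and training data are not specified here.

```python
# Sketch: CNN regressing a carbon-emission score from a floor-plan image.
# Architecture and data are illustrative assumptions, not the project's model.
import torch
import torch.nn as nn

class EmissionRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global summary of the drawing
        )
        self.head = nn.Linear(32, 1)          # e.g., kgCO2e per m^2 (assumed unit)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = EmissionRegressor()
plans = torch.randn(8, 1, 256, 256)           # batch of grayscale drawings
scores = model(plans)
print(scores.shape)                           # torch.Size([8, 1])
```

In the workflow described above, such a model would be re-run at each design iteration, giving the designer an immediate emission estimate long before the consultant's final assessment.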