Automated Analysis of Human Performance Data could help to understand, and possibly predict, the performance of a human. To inform future research and enable Automated Analysis of Human Performance Data, a systematic mapping study (scoping study) of the state-of-the-art knowledge was performed on three interconnected components: (i) Human Performance, (ii) Monitoring Human Performance, and (iii) Automated Data Analysis. Using the systematic method of Kitchenham and Charters for performing the systematic mapping study, a comprehensive search for studies was conducted and the studies were categorised using a qualitative method. This systematic mapping review extends the philosophy of Shyr and Spisic, and of Knuth, and represents the state-of-the-art knowledge on Human Performance, Monitoring Human Performance, and Automated Data Analysis.
DOCUMENT
Abstract: Background: Chronic obstructive pulmonary disease (COPD) and asthma have a high prevalence and disease burden. Blended self-management interventions, which combine eHealth with face-to-face interventions, can help reduce the disease burden. Objective: This systematic review and meta-analysis aims to examine the effectiveness of blended self-management interventions on health-related effectiveness and process outcomes for people with COPD or asthma. Methods: PubMed, Web of Science, Cochrane Library, Emcare, and Embase were searched in December 2018 and updated in November 2020. Study quality was assessed using the Cochrane risk of bias (ROB) 2 tool and the Grading of Recommendations, Assessment, Development, and Evaluation. Results: A total of 15 COPD and 7 asthma randomized controlled trials were included in this study. The meta-analysis of COPD studies found that the blended intervention showed a small improvement in exercise capacity (standardized mean difference [SMD] 0.48; 95% CI 0.10-0.85) and a significant improvement in the quality of life (QoL; SMD 0.81; 95% CI 0.11-1.51). Blended intervention also reduced the admission rate (risk ratio [RR] 0.61; 95% CI 0.38-0.97). In the COPD systematic review, regarding the exacerbation frequency, both studies found that the intervention reduced exacerbation frequency (RR 0.38; 95% CI 0.26-0.56). A large effect was found on BMI (d=0.81; 95% CI 0.25-1.34); however, the effect was inconclusive because only 1 study was included. Regarding medication adherence, 2 of 3 studies found a moderate effect (d=0.73; 95% CI 0.50-0.96), and 1 study reported a mixed effect. Regarding self-management ability, 1 study reported a large effect (d=1.15; 95% CI 0.66-1.62), and another study reported no effect. No effect was found on other process outcomes.
The meta-analysis of asthma studies found that the blended intervention led to a small improvement in lung function (SMD 0.40; 95% CI 0.18-0.62) and QoL (SMD 0.36; 95% CI 0.21-0.50) and a moderate improvement in asthma control (SMD 0.67; 95% CI 0.40-0.93). A large effect was found on BMI (d=1.42; 95% CI 0.28-2.42) and exercise capacity (d=1.50; 95% CI 0.35-2.50); however, only 1 study was included per outcome. There was no effect on other outcomes. Furthermore, the majority of the 22 studies showed some concerns about the ROB, and the quality of evidence varied. Conclusions: In patients with COPD, the blended self-management interventions had mixed effects on health-related outcomes, with the strongest evidence found for exercise capacity, QoL, and admission rate. Furthermore, the review suggested that the interventions resulted in small effects on lung function and QoL and a moderate effect on asthma control in patients with asthma. There is some evidence for the effectiveness of blended self-management interventions for patients with COPD and asthma; however, more research is needed. Trial Registration: PROSPERO International Prospective Register of Systematic Reviews CRD42019119894; https://www.crd.york.ac.uk/prospero/display_record.php?RecordID=119894
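The pooled effect sizes above are standardized mean differences. As an illustration only (not the review's actual computation, whose exact estimator is not stated here), an SMD such as Cohen's d with a pooled standard deviation, together with an approximate large-sample 95% CI, can be sketched as follows; all numbers in the usage example are hypothetical:

```python
import math

def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d with pooled SD, plus an approximate 95% CI.

    m1/sd1/n1: mean, SD, and sample size of the intervention group;
    m2/sd2/n2: the same for the control group (hypothetical inputs).
    """
    # Pooled standard deviation across the two groups
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled
    # Large-sample approximation of the standard error of d
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical example: two groups of 30, means 60 vs 55, common SD 10
d, ci = standardized_mean_difference(60, 10, 30, 55, 10, 30)
```

With these made-up inputs the pooled SD is 10, so d = 0.5; the CI straddles zero because the groups are small.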
DOCUMENT
Although reengineering is strategically advantageous for organisations in order to keep functional and sustainable, safety must remain a priority and respective efforts need to be maintained. This paper suggests the combination of soft systems methodology (SSM) and Pareto analysis in the scope of safety management performance evaluation, and presents the results of a survey, which was conducted in order to assess the effectiveness, efficacy and ethicality of the individual components of an organisation's safety program. The research employed quantitative and qualitative data and ensured a broad representation of functional managers and safety professionals, who collectively hold the responsibility for planning, implementing and monitoring safety practices. The results showed that SSM can support the assessment of safety management performance by revealing weaknesses of safety initiatives, and Pareto analysis can underwrite the prioritisation of the remedies required. The specific methodology might be adopted by any organisation that requires a deep evaluation of its safety management performance, seeks to uncover the mechanisms that affect such performance, and, under limited resources, needs to focus on the most influential deficiencies.
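The Pareto prioritisation step described above can be sketched minimally: rank deficiencies by how often they occur and keep the smallest set that accounts for most (classically ~80%) of the total. This is an illustrative toy, not the paper's implementation; the deficiency names and counts below are hypothetical:

```python
def pareto_vital_few(issues, threshold=0.8):
    """Return the smallest set of issues ('the vital few') whose
    cumulative share of the total count reaches the threshold."""
    total = sum(issues.values())
    vital, cumulative = [], 0.0
    # Walk issues from most to least frequent, accumulating their share
    for name, count in sorted(issues.items(), key=lambda kv: kv[1], reverse=True):
        vital.append(name)
        cumulative += count / total
        if cumulative >= threshold:
            break
    return vital

# Hypothetical safety-deficiency counts from a survey
deficiencies = {"training": 40, "reporting": 25, "audits": 20,
                "signage": 10, "PPE": 5}
priorities = pareto_vital_few(deficiencies)
```

Here the three most frequent deficiencies already cover 85% of the total, so remediation effort would be focused there first.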
DOCUMENT
Due to the existing pressure for a more rational use of water, many public managers and industries have to rethink and adapt their processes towards a more circular approach. Such pressure is even more critical in the Rio Doce region, Minas Gerais, due to the large environmental accident that occurred in 2015. Cenibra (a pulp mill) is an example of such industries, as it is situated in the river basin and has a water-demanding process. The current proposal is meant as an academic and engineering study to propose possible solutions to decrease the total water consumption of the mill and, thus, decrease the total stress on the Rio Doce basin. The work will be divided into three work packages, namely: (i) evaluation (modelling) of the mill process and water balance; (ii) application and operation of a pilot-scale wastewater treatment plant; (iii) analysis of the impacts caused by the improvement of the process. The second work package will also be conducted in parallel with a lab-scale setup in the Netherlands to allow fast adjustments and a broader evaluation of the setup/process performance. The actions will focus on reducing the mill's total water consumption by 20%.
DOCUMENT
The focus of the research is 'Automated Analysis of Human Performance Data'. The three interconnected main components are (i) Human Performance, (ii) Monitoring Human Performance, and (iii) Automated Data Analysis. Human Performance is both the process and the result of a person interacting with a context to engage in tasks, whereas the performance range is determined by the interaction between the person and the context. Cheap and reliable wearable sensors allow for gathering large amounts of data, which is very useful for understanding, and possibly predicting, the performance of the user. Given the amount of data generated by such sensors, manual analysis becomes infeasible; tools should be devised for performing automated analysis that looks for patterns, features, and anomalies. Such tools can help transform wearable sensors into reliable high-resolution devices and help experts analyse wearable sensor data in the context of human performance, and use it for diagnosis and intervention purposes. Shyr and Spisic describe Automated Data Analysis as follows: automated data analysis provides a systematic process of inspecting, cleaning, transforming, and modelling data with the goal of discovering useful information, suggesting conclusions, and supporting decision making for further analysis. Their philosophy is to do the tedious part of the work automatically and allow experts to focus on performing their research and applying their domain knowledge. However, automated data analysis means that the system has to teach itself to interpret interim results and do iterations. Knuth stated: 'Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it' [Knuth, 1974]. The knowledge on Human Performance and its Monitoring is to be 'taught' to the system.
To be able to construct automated analysis systems, an overview of the essential processes and components of these systems is needed. As Knuth put it: 'Since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.'
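The inspect-clean-transform-model cycle attributed to Shyr and Spisic above can be illustrated with a deliberately small sketch: clean a stream of wearable-sensor readings, standardise it, and flag anomalies. This is a toy under stated assumptions (z-score thresholding on a flat list of numeric samples), not a proposed system design:

```python
import statistics

def automated_analysis(samples, z_threshold=3.0):
    """Toy automated-analysis pass over wearable-sensor readings:
    inspect/clean (drop missing values), transform (z-scores),
    and model/flag (report indices of anomalous readings)."""
    # Inspect & clean: discard missing readings
    clean = [x for x in samples if x is not None]
    # Transform: standardise readings to z-scores
    mean = statistics.fmean(clean)
    sd = statistics.stdev(clean)
    z_scores = [(x - mean) / sd for x in clean]
    # Model/flag: indices whose |z| exceeds the threshold
    return [i for i, z in enumerate(z_scores) if abs(z) > z_threshold]

# Hypothetical heart-rate stream: twenty normal readings and one spike
anomalies = automated_analysis([70] * 20 + [200])
```

In a real system this flagging step would feed back into further iterations and expert review, per the philosophy quoted above; the point here is only the shape of the pipeline.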
DOCUMENT
Today, embedded devices such as banking/transportation cards, car keys, and mobile phones use cryptographic techniques to protect personal information and communication. Such devices are increasingly becoming the targets of attacks trying to capture the underlying secret information, e.g., cryptographic keys. Attacks not targeting the cryptographic algorithm but its implementation are especially devastating, and the best-known examples are so-called side-channel and fault injection attacks. Such attacks, often jointly termed physical (implementation) attacks, are difficult to preclude, and if the key (or other data) is recovered, the device is useless. To mitigate such attacks, security evaluators use the same techniques as attackers and look for possible weaknesses in order to "fix" them before deployment. Unfortunately, the attackers' resourcefulness on the one hand, and the usually short amount of time the security evaluators have (and the human error factor) on the other, make this not a fair race. Consequently, researchers are looking into possible ways of making security evaluations more reliable and faster. To that end, machine learning techniques have proved to be a viable candidate, although the challenge is far from solved. Our project aims at the development of automatic frameworks able to assess various potential side-channel and fault injection threats coming from diverse sources. Such systems will give security evaluators, and above all companies producing chips for security applications, an option to find potential weaknesses early and to assess the trade-off between making the product more secure and making the product more implementation-friendly. To this end, we plan to use machine learning techniques coupled with novel techniques not explored before for side-channel and fault analysis. In addition, we will design new techniques specially tailored to improve the performance of this evaluation process.
Our research fills the gap between what is known in academia on physical attacks and what is needed in industry to prevent such attacks. In the end, once our frameworks become operational, they could also be a useful tool for mitigating other types of threats, such as ransomware or rootkits.