In the case of a major cyber incident, organizations usually rely on external providers of Cyber Incident Response (CIR) services. CIR consultants operate in a dynamic and constantly changing environment in which they must actively engage in information management and problem solving while adapting to complex circumstances. In this challenging environment, CIR consultants need to make critical decisions about what to advise clients impacted by a major cyber incident. Despite its relevance, CIR decision making is an understudied topic. The objective of this preliminary investigation is therefore to understand which decision-making strategies experienced CIR consultants use during challenging incidents and to offer suggestions for training and decision aiding. A general understanding of operational decision making under pressure, uncertainty, and high stakes was established by reviewing the body of knowledge known as Naturalistic Decision Making (NDM). The general conclusion of NDM research is that experts usually make adequate decisions based on (fast) recognition of the situation and application of the most obvious (default) response pattern that has worked in similar situations in the past. In exceptional situations, however, this kind of recognition-primed decision making results in suboptimal decisions, as experts are likely to miss conflicting cues once the situation has been quickly recognized under pressure. Understanding the default response pattern, and the rare occasions in which it can be ineffective, is therefore key to improving and aiding cyber incident response decision making. We therefore interviewed six experienced CIR consultants and used the critical decision method (CDM) to learn how they made decisions under challenging conditions. The main conclusion is that the default response pattern for CIR consultants during cyber breaches is to reduce uncertainty as much as possible by gathering and investigating data, and thus to delay decision making about eradication until the investigation is completed. According to the respondents, this strategy usually works well and provides the most assurance that the threat actor can be completely removed from the network. However, the majority of respondents could recall at least one case in which this strategy (in hindsight) resulted in unnecessary theft of data or damage. Interestingly, this finding is strikingly different from other operational decision-making domains, such as the military, police, and fire service, in which there is a general tendency to act rapidly instead of searching for more information. The main advice is that training and decision aiding of (novice) cyber incident responders should be aimed at the following: (a) make cyber incident responders aware of how recognition-primed decision making works; (b) discuss the default response strategy that typically works well in several scenarios; (c) explain the exception and how it can be recognized; (d) provide alternative response strategies that work better in exceptional situations.
Objective: To construct the underlying value structure of shared decision making (SDM) models. Method: We included previously identified SDM models (n = 40) and 15 additional ones. Using a thematic analysis, we coded the data using Schwartz’s value theory to define values in SDM and to investigate value relations. Results: We identified and defined eight values and developed three themes based on their relations: shared control, a safe and supportive environment, and decisions tailored to patients. We constructed a value structure based on the value relations and themes: the interplay of healthcare professionals’ (HCPs’) and patients’ skills [Achievement], support for the patient [Benevolence], and a good relationship between HCP and patient [Security] all facilitate patients’ autonomy [Self-Direction]. These values enable a more balanced relationship between HCP and patient and tailored decision making [Universalism]. Conclusion: SDM can be realized through an interplay of values. The values Benevolence and Security deserve more explicit attention and may especially increase vulnerable patients’ Self-Direction. Practice implications: This value structure enables a comparison of the values underlying SDM with those of specific populations, facilitating the incorporation of patients’ values into treatment decision making. It may also inform the development of SDM measures, interventions, and education programs, and support HCPs in practice.
During the past two decades, the implementation and adoption of information technology have increased rapidly. As a consequence, the way businesses operate has changed dramatically. For example, the amount of data has grown exponentially, and companies are looking for ways to use this data to add value to their business. This has implications for the manner in which (financial) governance needs to be organized. The main purpose of this study is to obtain insight into the changing role of controllers in adding value to the business by means of data analytics. To answer the research question, a literature study was first performed to establish a theoretical foundation concerning data analytics and its potential use. Second, nineteen interviews were conducted with controllers, data scientists, and academics in the financial domain. Third, a focus group with experts was organized in which additional data were gathered. Based on the literature study and the participants’ responses, it is clear that the challenge of the data explosion consists of converting data into information, knowledge, and meaningful insights to support decision-making processes. Performing data analyses enables the controller to support rational decision making and so complement the intuitive decision making of (senior) management. In this way, the controller has the opportunity to take the lead in the provision of information within an organization. However, controllers need more advanced data science and statistics competencies to provide management with effective analyses. Specifically, we found that an important statistical skill is the visualization and communication of statistical analyses, which controllers need in order to grow in their role as business partner.
Developing a framework that integrates Advanced Language Models into the qualitative research process. Qualitative research, vital for understanding complex phenomena, is often limited by labour-intensive data collection, transcription, and analysis processes. This hinders scalability, accessibility, and efficiency in both academic and industry contexts. As a result, insights are often delayed or incomplete, impacting decision-making, policy development, and innovation. The lack of tools to enhance accuracy and reduce human error exacerbates these challenges, particularly for projects requiring large datasets or quick iterations. Addressing these inefficiencies through AI-driven solutions like AIDA can empower researchers, enhance outcomes, and make qualitative research more inclusive, impactful, and efficient. The AIDA project enhances qualitative research by integrating AI technologies to streamline transcription, coding, and analysis processes. This innovation enables researchers to analyse larger datasets with greater efficiency and accuracy, providing faster and more comprehensive insights. By reducing manual effort and human error, AIDA empowers organisations to make informed decisions and implement evidence-based policies more effectively. Its scalability supports diverse societal and industry applications, from healthcare to market research, fostering innovation and addressing complex challenges. Ultimately, AIDA contributes to improving research quality, accessibility, and societal relevance, driving advancements across multiple sectors.
National Forestry Commission (SBB) and National Park De Biesbosch; subcontractor through NRIT. National parks with large flows of visitors have to manage these flows carefully. Methods of data collection and analysis can help support decision making. The case of the Biesbosch National Park is used to find innovative ways to map flows of yachts, the most important component of water traffic, and to create a model that allows the estimation of changes in yachting patterns resulting from policy measures. Recent policies aimed at building additional waterways, nature development areas, and recreational concentrations in the park to manage the demands of recreation and nature conservation offer a good opportunity to apply this model. With a geographical information system (GIS), data obtained from aerial photographs and satellite images can be analyzed. The method of space syntax is used to determine and visualize characteristics of the network of leisure routes in the park and to evaluate impacts resulting from expected changes in the network that accompany the restructuring of waterways.
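To make the network idea concrete, here is a minimal, purely illustrative Python sketch (not the study's actual GIS workflow): it models a few hypothetical waterway segments as a graph and uses closeness centrality as a rough stand-in for a space-syntax integration measure, comparing a baseline against a scenario in which one connection is closed. All node names and edges are invented.

```python
# Illustrative sketch only: a toy space-syntax-style integration measure
# for a waterway network, modelled as a graph of route segments.
import networkx as nx

# Hypothetical network of waterway segments; edges are navigable connections.
waterways = nx.Graph()
waterways.add_edges_from([
    ("marina", "main_channel"),
    ("main_channel", "creek_north"),
    ("main_channel", "creek_south"),
    ("creek_north", "nature_area"),   # connection a policy measure might close
    ("creek_south", "recreation_zone"),
])

# Closeness centrality approximates space-syntax "integration":
# well-integrated segments are reachable in few steps from everywhere else.
baseline = nx.closeness_centrality(waterways)

# Evaluate a policy scenario: closing the connection into the nature area.
scenario = waterways.copy()
scenario.remove_edge("creek_north", "nature_area")
after = nx.closeness_centrality(scenario)

for node in waterways.nodes:
    print(f"{node:16s} baseline={baseline[node]:.2f} scenario={after[node]:.2f}")
```

Comparing the two centrality profiles gives a simple, graph-based indication of how a restructuring measure could shift which parts of the route network attract the most traffic; the actual study works with GIS data rather than a hand-built graph.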
The focus of the research is 'Automated Analysis of Human Performance Data'. The three interconnected main components are (i) Human Performance, (ii) Monitoring Human Performance, and (iii) Automated Data Analysis. Human Performance is both the process and result of the person interacting with context to engage in tasks, whereas the performance range is determined by the interaction between the person and the context. Cheap and reliable wearable sensors allow for gathering large amounts of data, which is very useful for understanding, and possibly predicting, the performance of the user. Given the amount of data generated by such sensors, manual analysis becomes infeasible; tools should be devised for performing automated analysis that looks for patterns, features, and anomalies. Such tools can help transform wearable sensors into reliable high-resolution devices, help experts analyse wearable sensor data in the context of human performance, and use it for diagnosis and intervention purposes. Shyr and Spisic describe automated data analysis as a systematic process of inspecting, cleaning, transforming, and modelling data with the goal of discovering useful information, suggesting conclusions, and supporting decision making for further analysis. Their philosophy is to do the tedious part of the work automatically and allow experts to focus on performing their research and applying their domain knowledge. However, automated data analysis means that the system has to teach itself to interpret interim results and perform iterations. Knuth stated: 'Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it' [Knuth, 1974]. The knowledge on Human Performance and its Monitoring is to be 'taught' to the system. To be able to construct automated analysis systems, an overview of the essential processes and components of these systems is needed. As Knuth also put it: since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.
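As a purely illustrative companion to the Shyr and Spisic definition, the following Python sketch (not the tooling developed in this research) runs a small inspect-clean-transform-model cycle over hypothetical wearable heart-rate samples and flags anomalies with a rolling z-score; the column names, thresholds, and injected spike are all assumptions made for the example.

```python
# Minimal sketch of an automated inspect -> clean -> transform -> model
# pipeline over hypothetical wearable heart-rate samples.
import numpy as np
import pandas as pd

def analyse(samples: pd.DataFrame, window: int = 60, threshold: float = 3.0) -> pd.DataFrame:
    """samples: DataFrame with 'timestamp' and 'heart_rate' columns (assumed schema)."""
    df = samples.copy()

    # Inspect & clean: drop duplicate timestamps and implausible readings.
    df = df.drop_duplicates(subset="timestamp").sort_values("timestamp")
    df = df[(df["heart_rate"] > 30) & (df["heart_rate"] < 220)]

    # Transform: rolling statistics give a local notion of "normal" performance.
    rolling = df["heart_rate"].rolling(window, min_periods=10)
    df["z"] = (df["heart_rate"] - rolling.mean()) / rolling.std()

    # Model: mark samples that deviate strongly from the local baseline.
    df["anomaly"] = df["z"].abs() > threshold
    return df

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = pd.DataFrame({
        "timestamp": pd.date_range("2024-01-01", periods=600, freq="s"),
        "heart_rate": rng.normal(75, 5, 600),
    })
    data.loc[300, "heart_rate"] = 180  # injected spike to be detected
    result = analyse(data)
    print(result[result["anomaly"]][["timestamp", "heart_rate"]])
```

The point of the sketch is only to show the division of labour the abstract describes: the tedious inspection, cleaning, and flagging is automated, while the expert interprets the flagged episodes in the context of human performance.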