Living labs are complex multi-stakeholder collaborations that often employ a user-centred and design-driven methodology to foster innovation. Conventional management tools fall short in evaluating them. However, some methods and tools dedicated to living labs' special characteristics and goals have already been developed, though most are still in their testing phase. These tools are not easily accessible and can only be found in extensive research reports, which are difficult to dissect. Therefore, this paper reviews seven evaluation methods and tools specially developed for living labs. Each section of this paper is structured in the following manner: the tool’s introduction (1), who uses the tool (2), and how it should be used (3). While the first set of tools, namely “ENoLL 20 Indicators”, “SISCODE Self-assessment”, and “SCIROCCO Exchange Tool”, assesses a living lab as an organisation and dives deeper into its organisational activities and complex context, the second set of methods and tools, “FormIT” and “Living Lab Markers”, evaluates living labs’ methodologies: the process they use to arrive at innovations. The paper's final section presents the “CheRRIes Monitoring and Evaluation Tool” and the “TALIA Indicator for Benchmarking Service for Regions”, which assess the regional impact made by living labs. As every living lab differs in its maturity (as an organisation and in its methodology) and in the scope of impact it wants to make, the most crucial decision when evaluating is determining the focus of the assessment. This overview provides a first orientation on worked-out methods and on possible indicators to use. It also concludes that the existing tools are quite managerial in their method and aesthetics, and calls for designers and social scientists to develop more playful, engaging and (possibly) learning-oriented tools for evaluating living labs in the future.
Analyzing historical decision-related data can help support actual operational decision-making processes. Decision mining can be employed for such analysis. This paper proposes the Decision Discovery Framework (DDF), designed to develop, adapt, or select a decision discovery algorithm by outlining specific guidelines for input data usage, classifier handling, and decision model representation. The framework incorporates the Decision Model and Notation (DMN) standard for enhanced comprehensibility and normalization to simplify decision tables. The framework’s efficacy was tested by adapting the C4.5 algorithm into the DM45 algorithm. The proposed adaptations include (1) using a decision log as input, (2) ensuring an unpruned decision tree, (3) generating DMN, and (4) normalizing the decision table. Future research can focus on supporting practitioners in modeling decisions, ensuring their decision-making is compliant, and suggesting improvements to the modeled decisions. Another future research direction is to explore the ability to process unstructured data as input for the discovery of decisions.
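The DM45 algorithm itself is not reproduced in this abstract, but the adaptations it names can be illustrated with a small, hypothetical sketch: an unpruned decision tree is learned from a toy decision log, and each root-to-leaf path can then be read as one row of a DMN-style decision table. The data, feature names, and use of scikit-learn are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch only: a toy "decision log" (case attributes plus the
# recorded decision outcome) is used to grow an unpruned decision tree,
# mirroring adaptations (1) and (2) described above.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical decision log: [age, has_contract] -> recorded decision.
X = [[25, 0], [40, 1], [35, 0], [60, 1], [30, 1], [55, 0]]
y = ["reject", "accept", "reject", "accept", "accept", "reject"]

# scikit-learn trees are grown fully unless depth/leaf limits are set,
# so omitting those limits keeps the tree unpruned.
tree = DecisionTreeClassifier(criterion="entropy", random_state=0)
tree.fit(X, y)

# Each root-to-leaf path corresponds to one rule (one decision-table row).
print(export_text(tree, feature_names=["age", "has_contract"]))
```

Turning these paths into an actual DMN decision table, and normalizing that table, would be the counterparts of adaptations (3) and (4).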
The focus of this project is on improving the resilience of hospitality Small and Medium Enterprises (SMEs) by enabling them to take advantage of digitalization tools, and data analytics in particular. Hospitality SMEs play an important role in their local community but are vulnerable to shifts in demand. Due to a lack of resources (time, finance, and sometimes knowledge), they do not have sufficient access to the data analytics tools that are typically available to larger organizations. The purpose of this project is therefore to develop a prototype infrastructure or ecosystem showcasing how Dutch hospitality SMEs can develop their data analytic capability in such a way that they increase their resilience to shifts in demand. The one-year exploration period will be used to assess the feasibility of such an infrastructure and will address technological aspects (e.g. the kind of technological platform), process aspects (e.g. prerequisites for collaboration such as confidentiality and safety of data), knowledge aspects (e.g. what knowledge of data analytics SMEs need and through what medium), and organizational aspects (what kind of cooperation form is necessary and how it should be financed).

Societal issue
In the Netherlands, hospitality SMEs such as hotels play an important role in local communities, providing employment opportunities and supporting local social activities and sports teams, financially or otherwise (Panteia, 2023). Nevertheless, due to their high fixed cost / low variable cost business model, hospitality SMEs are vulnerable to shifts in consumer demand (Kokkinou, Mitas, et al., 2023; Koninklijke Horeca Nederland, 2023). This risk could be partially mitigated by using data analytics to gain visibility over demand and make data-driven decisions regarding the allocation of marketing resources, pricing, procurement, etc.
However, this requires investments in technology, processes, and training that are oftentimes (financially) inaccessible to these small SMEs.

Benefit for society
The proposed study touches upon several key enabling technologies. First, the key enabling technology of participation and co-creation lies at the center of this proposal. The premise is that regional hospitality SMEs can achieve more by combining their knowledge and resources. The proposed project therefore aims to give diverse stakeholders the means and opportunity to collaborate, learn from each other, and work together on a prototype collaboration. The proposed study thereby also contributes to developing knowledge with and for entrepreneurs and to the digitalization of the tourism and hospitality sector.

Collaborative partners
HZ University of Applied Sciences, Hotel Hulst, Hotel/Restaurant de Belgische Loodsensociëteit, Hotel Zilt, DM Hotels, Hotel Charley's, Juyo Analytics, Impuls Zeeland.
Developing a framework that integrates Advanced Language Models into the qualitative research process.

Qualitative research, vital for understanding complex phenomena, is often limited by labour-intensive data collection, transcription, and analysis processes. This hinders scalability, accessibility, and efficiency in both academic and industry contexts. As a result, insights are often delayed or incomplete, impacting decision-making, policy development, and innovation. The lack of tools to enhance accuracy and reduce human error exacerbates these challenges, particularly for projects requiring large datasets or quick iterations. Addressing these inefficiencies through AI-driven solutions like AIDA can empower researchers, enhance outcomes, and make qualitative research more inclusive, impactful, and efficient.

The AIDA project enhances qualitative research by integrating AI technologies to streamline transcription, coding, and analysis processes. This innovation enables researchers to analyse larger datasets with greater efficiency and accuracy, providing faster and more comprehensive insights. By reducing manual effort and human error, AIDA empowers organisations to make informed decisions and implement evidence-based policies more effectively. Its scalability supports diverse societal and industry applications, from healthcare to market research, fostering innovation and addressing complex challenges. Ultimately, AIDA contributes to improving research quality, accessibility, and societal relevance, driving advancements across multiple sectors.
The focus of the research is 'Automated Analysis of Human Performance Data'. The three interconnected main components are (i) Human Performance, (ii) Monitoring Human Performance, and (iii) Automated Data Analysis. Human Performance is both the process and result of a person interacting with a context to engage in tasks, where the performance range is determined by the interaction between the person and the context. Cheap and reliable wearable sensors allow for gathering large amounts of data, which is very useful for understanding, and possibly predicting, the performance of the user. Given the amount of data generated by such sensors, manual analysis becomes infeasible; tools should be devised for performing automated analysis that looks for patterns, features, and anomalies. Such tools can help transform wearable sensors into reliable high-resolution devices, help experts analyse wearable sensor data in the context of human performance, and use it for diagnosis and intervention purposes. Shyr and Spisic describe automated data analysis as a systematic process of inspecting, cleaning, transforming, and modelling data with the goal of discovering useful information, suggesting conclusions, and supporting decision making for further analysis. Their philosophy is to do the tedious part of the work automatically and allow experts to focus on performing their research and applying their domain knowledge. However, automated data analysis means that the system has to teach itself to interpret interim results and perform iterations. Knuth stated: 'Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it' [Knuth, 1974]. The knowledge on Human Performance and its Monitoring is to be 'taught' to the system.
To be able to construct automated analysis systems, an overview of the essential processes and components of these systems is needed. As Knuth put it: 'Since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.'
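The inspect/clean/transform/model loop described above can be sketched in miniature on a wearable-style sensor stream. This is a minimal illustration, not the research system: the readings, function name, and z-score threshold are all assumptions made for the example.

```python
# Minimal sketch of an automated analysis step: clean a sensor stream,
# compute summary statistics, and flag anomalous readings by z-score.
import statistics

def detect_anomalies(samples, z_threshold=3.0):
    """Return the readings whose z-score exceeds the threshold."""
    # Clean: drop missing readings (e.g. dropped sensor packets).
    cleaned = [s for s in samples if s is not None]
    # Transform: compute the mean and standard deviation of the stream.
    mean = statistics.fmean(cleaned)
    sd = statistics.stdev(cleaned)
    # Model: mark readings far from the mean as anomalies.
    return [s for s in cleaned if sd and abs(s - mean) / sd > z_threshold]

# Heart-rate-like readings with one gap (None) and one obvious outlier.
readings = [72, 74, 71, None, 73, 75, 180, 74, 72]
print(detect_anomalies(readings, z_threshold=2.0))  # flags the 180 spike
```

A production pipeline would of course iterate: the flagged anomalies feed back into the next inspection round, which is exactly the self-interpreting loop the passage attributes to automated data analysis.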