Computers are promising tools for providing educational experiences that meet individual learning needs. Delivering on this promise in practice is challenging, however, particularly when automated feedback is essential and the learning activity extends beyond traditional methods such as writing or solving mathematics problems. We hypothesize that interactive knowledge representations can address this challenge. Knowledge representations differ markedly from concept maps: where the latter use nodes (concepts) and arcs (links between concepts), a knowledge representation is based on an ontology that facilitates automated reasoning. By directing this reasoning towards interaction with learners for the benefit of learning, a new class of educational instruments emerges. In this contribution, we present three projects that use an interactive knowledge representation as their foundation. DynaLearn supports learners in acquiring systems thinking skills. Minds-On helps learners deepen their understanding of phenomena while performing experiments. Interactive Concept Cartoons engage learners in a science-based discussion about controversial topics. Each of these approaches was developed iteratively in collaboration with teachers and tested in real classrooms, resulting in a suite of lessons available online. Evaluation studies involving pre-/post-tests and action-log data show that learners are well able to work with these educational instruments, which thus enable a semi-automated approach to constructive learning.
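The distinction drawn above can be made concrete: unlike a concept map, whose arcs are only labels, an ontology-based representation fixes the meaning of its relations, so a machine can reason over them. The following is a minimal illustrative sketch (hypothetical, not the DynaLearn implementation) of a qualitative model in which signed causal influences let the system derive how one quantity changes from another:

```python
# Hypothetical sketch of a qualitative knowledge representation: quantities
# with qualitative derivatives (-1 decreasing, 0 steady, +1 increasing) and
# signed causal influences that support an automated reasoning step.
from dataclasses import dataclass, field

@dataclass
class Quantity:
    name: str
    derivative: int = 0  # -1, 0, or +1

@dataclass
class Model:
    quantities: dict = field(default_factory=dict)
    influences: list = field(default_factory=list)  # (source, target, sign)

    def add_quantity(self, name, derivative=0):
        self.quantities[name] = Quantity(name, derivative)

    def add_influence(self, source, target, sign):
        self.influences.append((source, target, sign))

    def propagate(self):
        # One reasoning step: each quantity's derivative follows the signed
        # sum of its incoming influences, clipped to the range [-1, +1].
        new = {}
        for q in self.quantities.values():
            incoming = [sign * self.quantities[s].derivative
                        for s, t, sign in self.influences if t == q.name]
            new[q.name] = max(-1, min(1, sum(incoming))) if incoming else q.derivative
        for name, d in new.items():
            self.quantities[name].derivative = d

m = Model()
m.add_quantity("rainfall", derivative=+1)   # rainfall is increasing
m.add_quantity("river_level")
m.add_influence("rainfall", "river_level", +1)
m.propagate()
print(m.quantities["river_level"].derivative)  # -> 1 (river level rises)
```

A concept map could store the arc "rainfall affects river level", but only the signed, ontologically typed relation allows the simulation step that learners can inspect and get feedback on.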
We investigate how interactive representations can be used to support learners in learning about the circular motion of celestial bodies. We present the representation and accompanying lesson we developed, and report on their effect.
We developed a lesson where students construct a qualitative representation to learn how clock genes are regulated. Qualitative representations provide a non-numerical description of system behavior, focusing on causal relationships and system states. They align with human reasoning about system dynamics and serve as valuable learning tools, both for understanding domain-specific systems and for developing broader systems thinking skills.

The lesson, designed for upper secondary and higher education, is implemented in the DynaLearn software at Level 4, where students can model feedback loops. Students construct the representation step by step, guided by a structured workbook and built-in support functions within the software. At each step, they run simulations to examine system behavior and reflect on the results through workbook questions. To ensure scientific accuracy, the representation and workbook were evaluated by domain experts.

The lesson begins with modeling how increasing BMAL:CLOCK activity enhances the transcription of PER and CRY genes through binding to the E-box. Next, students explore how mRNA production and degradation, two opposing processes, regulate mRNA levels. This is followed by modeling translation at the ribosomes, where PER and CRY proteins are synthesized and subsequently degraded, again illustrating competing regulatory processes. Students then model how PER and CRY proteins form a complex that translocates to the nucleus, inhibiting CLOCK:BMAL binding and establishing a negative feedback loop. Finally, they extend their understanding by exploring how CLOCK:BMAL also regulates the AVP gene, linking clock genes to broader physiological processes.
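The causal chain students build can be summarized compactly. A standard check on such a loop, sketched below in Python (illustrative only; the quantity names follow the lesson, but this is not the DynaLearn model), is to multiply the signs of the influences along the cycle: a net negative product confirms a negative feedback loop.

```python
# Signed causal edges of the clock-gene loop described in the lesson.
# +1 = promotes, -1 = inhibits. Names follow the lesson text; the code
# itself is an illustrative sketch, not the actual DynaLearn model.
edges = [
    ("CLOCK:BMAL", "PER/CRY mRNA", +1),       # E-box binding promotes transcription
    ("PER/CRY mRNA", "PER/CRY protein", +1),  # translation at the ribosomes
    ("PER/CRY protein", "PER:CRY complex", +1),  # complex formation
    ("PER:CRY complex", "CLOCK:BMAL", -1),    # nuclear inhibition of CLOCK:BMAL binding
]

def loop_sign(edges):
    # Multiply the signs around the cycle; -1 means negative feedback.
    sign = 1
    for _, _, s in edges:
        sign *= s
    return sign

print("negative feedback" if loop_sign(edges) == -1 else "positive feedback")
```

The single inhibitory edge makes the product of signs negative, which is exactly the oscillation-sustaining negative feedback loop students discover through the simulations.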
The IMPULS-2020 project DIGIREAL (BUas, 2021) aims to significantly strengthen BUas' Research and Development (R&D) on Digital Realities for the benefit of innovation in our sectoral industries. The project will furthermore help BUas position itself in the emerging innovation ecosystems on Human Interaction, AI, and Interactive Technologies. The pandemic has had a tremendous negative impact on BUas' industrial research sectors: Tourism, Leisure and Events, Hospitality and Facility, Built Environment, and Logistics. Our partner industries are in great need of innovative responses to the crisis. Data and AI, combined with interactive and immersive technologies (games, VR/AR), can provide a partial solution, in line with the key enabling technologies of the Smart Industry agenda. DIGIREAL builds upon our well-established expertise and capacity in entertainment and serious games and digital media (VR/AR). It furthermore strengthens our initial plans to venture into Data and Applied AI. Digital Realities offer great opportunities for sectoral industry research and innovation, such as experience measurement in Leisure and Hospitality, data-driven decision-making for (sustainable) tourism, geo-data simulations for Logistics, and Digital Twins for Spatial Planning. Although BUas already has successful R&D projects in these areas, the synergy can and should be significantly improved. We propose a coherent one-year Impuls-funded package to develop (in 2021):

1. a multi-year R&D program on Digital Realities, leading to
2. strategic R&D proposals, in particular a SPRONG/sleuteltechnologie proposal;
3. partnerships in the regional and national innovation ecosystem, in particular Mind Labs and Data Development Lab (DDL);
4. a shared Digital Realities Lab infrastructure, in particular hardware/software/peopleware for Augmented and Mixed Reality;
5. leadership, support, and operational capacity to achieve and support the above.
The proposal presents a work program and management structure, with external partners in an advisory role.
"Speak the Future" presents a novel test case at the intersection of scientific innovation and public engagement. Leveraging the power of real-time AI image generation, the project empowers festival participants to verbally describe their visions for a sustainable and regenerative future. These descriptions are instantly transformed into captivating imagery using SDXL Turbo, fostering collective engagement and tangible visualisation of abstract sustainability concepts. This unique interplay of speech recognition, AI, and projection technology breaks new ground in public engagement methods. The project offers valuable insights into public perceptions and aspirations for sustainability, as well as into the effectiveness of AI-powered visualisation and regenerative applications of AI. Ultimately, it will serve as a springboard for PhD research aiming to understand how AI can serve as a vehicle for crafting regenerative futures. By employing real-time AI image generation, the project directly tests its effectiveness in fostering public engagement with sustainable futures. Analysing participant interaction and feedback sheds light on how AI-powered visualisation tools can enhance comprehension and engagement. Furthermore, the project fosters public understanding and appreciation of research. The interactive and accessible nature of "Speak the Future" demystifies the research process, showcasing its relevance and impact on everyday life. Moreover, by directly involving the public in co-creating visual representations of their aspirations, the project builds an emotional connection and sense of ownership, potentially leading to continued engagement and action beyond the festival setting. "Speak the Future" promises to be a groundbreaking initiative, bridging the gap between scientific innovation and public engagement in sustainability discourse.
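The core of such a pipeline (spoken vision in, image out) can be sketched with the Hugging Face diffusers library, which the SDXL Turbo model card documents. The sketch below is an assumption-laden illustration, not the project's actual code; in particular, the prompt template and function names are invented for this example.

```python
# Illustrative sketch of a transcript-to-image step with SDXL Turbo.
# The prompt template and function names are hypothetical; only the
# diffusers API calls follow the documented SDXL Turbo usage.

def build_prompt(transcript: str) -> str:
    # Frame the participant's spoken vision as an image prompt
    # (this template is an assumption, not the project's wording).
    return ("A vivid illustration of a sustainable, regenerative future: "
            + transcript)

def generate(transcript: str):
    # SDXL Turbo via diffusers: a single denoising step keeps latency
    # near real time, and guidance is disabled as the model expects.
    import torch
    from diffusers import AutoPipelineForText2Image

    pipe = AutoPipelineForText2Image.from_pretrained(
        "stabilityai/sdxl-turbo", torch_dtype=torch.float16
    ).to("cuda")
    result = pipe(prompt=build_prompt(transcript),
                  num_inference_steps=1, guidance_scale=0.0)
    return result.images[0]  # PIL image, ready for projection

print(build_prompt("cities full of rooftop gardens"))
```

In a festival setting, a speech-recognition front end would supply the transcript, and the returned image would be handed to the projection system; the one-step, guidance-free configuration is what makes SDXL Turbo suitable for the instant feedback the project describes.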
By harnessing the power of AI for collective visualisation, the project not only gathers valuable data for researchers but also empowers the public to envision and work towards a brighter, more sustainable future.
The project proposal focuses on Virtual Humans (VHs) emerging as a Key Enabling Technology (KET) for societal prosperity. VHs (or embodied, digital, intelligent agents) are highly realistic, highly interactive digital representations of humans in entertainment or serious applications. The best-known examples, beyond video games and virtual media productions, are virtual influencers, virtual instructors, virtual news readers, and virtual doctors/patients in health care or therapy. It is increasingly difficult for academic and applied researchers, let alone for users and policymakers, to keep up with the technological developments, societal uses, and risks of VHs. Due to its expertise in game technology, immersive media, and applied AI, BUas is one of the leading partners of the regional Virtual Humans Research, Development and Innovation (RDI) agenda. MindLabs coordinates this agenda, with BUas, Fontys UAS, and Tilburg University as principal partners. The multidisciplinary RDI agenda integrates design and engineering research, use-case applications and evaluation, as well as ethics and critical societal reflection. This regional Virtual Humans agenda, however, is not yet linked to the EU RDI agenda, and collaboration on Virtual Humans RDI is not yet well established in EU institutions and networks. The aims of this project are to 1) strengthen our European knowledge position on VHs by joining and building networks to find out what the research and innovation agenda on VHs looks like; 2) conduct one or more experimental studies on empathic interaction between real and virtual humans to develop a multidisciplinary R&D agenda (pilot title: 'Virtual Humans – Real Emotions'); and 3) develop the ideas, content, and partnerships for strong EU-funded RDI proposals. In the VESPER project, we partner with researchers and knowledge institutes Humboldt University and the University of Bremen in Germany, and Howest in Belgium.