For long flights, cruise is the longest phase and the one in which the most fuel is consumed. An in-cruise optimization method has been implemented to compute the optimal trajectory that reduces the flight cost. A three-dimensional grid has been created that couples the lateral and vertical navigation profiles. Using a dynamic analysis of the wind, the aircraft can perform a horizontal deviation or change altitude via step climbs to reduce fuel consumption. As the number of waypoints and possible step climbs increases, the number of candidate flight trajectories grows exponentially; thus, a genetic algorithm has been implemented to reduce the total number of calculated trajectories compared with an exhaustive search. The aircraft model has been obtained from a performance database, which is currently used in the commercial flight management system studied in this paper. An average flight cost reduction of 5% has been obtained.
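The abstract does not spell out the trajectory encoding or the genetic operators, so the following is only a minimal Python sketch of the idea: each candidate trajectory assigns one lateral-offset/flight-level pair per waypoint on the 3D grid, and the cost function is a hypothetical placeholder for the fuel and time cost that would come from the performance database and the wind forecast.

```python
# Illustrative sketch only: a minimal genetic algorithm that picks, for each
# cruise waypoint, a lateral offset and a flight level from a 3D grid.
# Grid sizes, GA parameters, and the cost model are hypothetical placeholders,
# not the ones used with the FMS performance database in the paper.
import random

N_WAYPOINTS = 10                      # waypoints along the cruise segment
LATERAL_OFFSETS = [-2, -1, 0, 1, 2]   # grid indices for horizontal deviation
FLIGHT_LEVELS = [340, 360, 380, 400]  # candidate cruise altitudes (FL)

def flight_cost(trajectory):
    """Hypothetical stand-in for the fuel/time cost derived from the
    performance database and the wind forecast."""
    cost = 0.0
    for i, (offset, fl) in enumerate(trajectory):
        wind_penalty = abs(offset - (i % 3 - 1))   # fake wind pattern
        altitude_penalty = (400 - fl) * 0.01       # fake fuel burn vs FL
        cost += 1.0 + 0.2 * wind_penalty + altitude_penalty
    # penalize each step climb (altitude change between waypoints)
    cost += 0.3 * sum(1 for a, b in zip(trajectory, trajectory[1:]) if a[1] != b[1])
    return cost

def random_trajectory():
    return [(random.choice(LATERAL_OFFSETS), random.choice(FLIGHT_LEVELS))
            for _ in range(N_WAYPOINTS)]

def crossover(parent_a, parent_b):
    cut = random.randrange(1, N_WAYPOINTS)
    return parent_a[:cut] + parent_b[cut:]

def mutate(trajectory, rate=0.1):
    return [(random.choice(LATERAL_OFFSETS), random.choice(FLIGHT_LEVELS))
            if random.random() < rate else gene for gene in trajectory]

def optimise(pop_size=50, generations=100):
    population = [random_trajectory() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=flight_cost)
        parents = population[: pop_size // 2]          # elitist selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=flight_cost)

if __name__ == "__main__":
    best = optimise()
    print("best cost:", round(flight_cost(best), 2))
```

The point of the sketch is only that the GA evaluates a small population per generation instead of enumerating every combination of lateral deviation and step climb, which is what keeps the number of calculated trajectories far below an exhaustive search.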
In flexible education, recommender systems that support course selection are considered a viable means of helping students make informed course choices, especially where curricula offer greater flexibility. However, these recommender systems present both potential benefits and looming risks, such as overdependence on technology, biased recommendations, and privacy issues. User control mechanisms in recommender interfaces (or algorithmic affordances) might offer options to address those risks, but they have not yet been systematically studied. This paper presents the outcomes of a design session conducted during the INTERACT23 workshop on Algorithmic Affordances in Recommender Interfaces. The design session yielded insights into how the design of an interface, and specifically the algorithmic affordances in these interfaces, may address the ethical risks and dilemmas of using a recommender in such an impactful context by potentially vulnerable users. Through design and reflection, we discovered a host of design ideas for the interface of a course recommender in flexible education that can serve as conversation starters for practitioners implementing flexible education. More research is needed to explore these design directions and to gain insight into how they can help to approximate more ethically operating recommender systems.
Privacy concerns can make camera-based object classification unsuitable for robot navigation. To address this problem, we propose a novel object classification system that uses only a 2D-LiDAR sensor on mobile robots. The proposed system enables semantic understanding of the environment by applying the YOLOv8n model to classify objects such as tables, chairs, cupboards, walls, and door frames using only data captured by a 2D-LiDAR sensor. The experimental results show that the resulting YOLOv8n model achieved an accuracy of 83.7% in real-time classification running on a Raspberry Pi 5, despite lower accuracy when classifying door frames and walls. This validates our proposed approach as a privacy-friendly alternative to camera-based methods and shows that it can run on small onboard computers on mobile robots.
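The abstract does not describe how the 2D-LiDAR data are fed to YOLOv8n; a common approach is to rasterize each scan into a bird's-eye-view image. The sketch below illustrates that idea with the Ultralytics API; the grid resolution, the synthetic scan, and the custom weights file lidar_yolov8n.pt are assumptions for illustration, not details from the paper.

```python
# Minimal sketch (not the authors' pipeline): rasterize a 2D-LiDAR scan into a
# bird's-eye-view image and run a YOLOv8n model on it with the Ultralytics API.
import numpy as np
from ultralytics import YOLO

def scan_to_image(ranges, angles, size=640, max_range=8.0):
    """Project polar LiDAR returns onto a square grayscale occupancy image."""
    img = np.zeros((size, size), dtype=np.uint8)
    valid = np.isfinite(ranges) & (ranges < max_range)
    x = ranges[valid] * np.cos(angles[valid])
    y = ranges[valid] * np.sin(angles[valid])
    # map metric coordinates (sensor at the centre) to pixel indices
    px = ((x / max_range + 1.0) * 0.5 * (size - 1)).astype(int)
    py = ((y / max_range + 1.0) * 0.5 * (size - 1)).astype(int)
    img[py, px] = 255
    return np.stack([img] * 3, axis=-1)   # YOLO expects a 3-channel image

# Hypothetical scan: 360 beams, one per degree
angles = np.deg2rad(np.arange(360, dtype=np.float32))
ranges = np.random.uniform(0.5, 8.0, size=360).astype(np.float32)

model = YOLO("lidar_yolov8n.pt")           # assumed custom-trained weights
results = model.predict(scan_to_image(ranges, angles), verbose=False)
for box in results[0].boxes:
    print(results[0].names[int(box.cls)], float(box.conf))
```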
Drones have seen enormous growth in 2024 in terms of the relevant technologies and applications, such as smart agriculture, transportation, inspection, logistics, surveillance, and interaction. Commercial solutions for deploying drones in different workplaces have therefore become a crucial demand for companies. Warehouses are one of the most promising industrial domains in which to use drones to automate operations such as inventory scanning, transporting goods to the delivery lines, and on-demand area monitoring. On the other hand, deploying drones (or even mobile robots) in such a challenging environment requires accurate state estimation of position and orientation to enable autonomous navigation, because GPS signals are not available in warehouses: closed-sky areas obstruct them and structures deflect them. Vision-based positioning systems are the most promising technique for reliable position estimation in indoor environments, because they use low-cost sensors (cameras), exploit dense environmental features, and can operate both indoors and outdoors. This proposal therefore aims to address a crucial question for industrial applications, together with our industrial partners, by exploring the limitations of and developing solutions towards robust state estimation of drones in challenging environments such as warehouses and greenhouses. The results of this project will serve as the baseline for developing other navigation technologies towards fully autonomous deployment of drones, such as mapping, localization, docking, and maneuvering, in order to safely deploy drones in GPS-denied areas.
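As an illustration of the kind of vision-based positioning referred to above (and not a pipeline prescribed by the proposal), the sketch below estimates the relative camera pose between two consecutive frames from matched ORB features using OpenCV; the camera intrinsics K are a hypothetical placeholder.

```python
# Minimal sketch of feature-based visual positioning between two consecutive
# camera frames: ORB features, brute-force matching, essential-matrix
# decomposition. Illustration only; not the proposal's method.
import cv2
import numpy as np

K = np.array([[600.0, 0.0, 320.0],     # assumed pinhole intrinsics
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])

def relative_pose(frame_prev, frame_curr):
    """Estimate rotation R and unit-scale translation t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(frame_prev, None)
    kp2, des2 = orb.detectAndCompute(frame_curr, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t

# Usage (with two grayscale frames from an onboard camera):
# R, t = relative_pose(cv2.imread("frame_000.png", 0), cv2.imread("frame_001.png", 0))
```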
Deploying robots from indoor to outdoor environments (and vice versa) with stable and accurate localization is very important for companies that want to secure their use in industrial applications such as delivering harvested fruit from plantations, deploying/docking, navigating under solar panels, passing through tunnels/underpasses, and parking in garages. This is because of sudden changes in operational conditions, such as receiving high/low-quality satellite signals, a changing field of view, varying lighting conditions, and different velocities. We observed these limitations, especially in indoor-outdoor transitions, after conducting several projects with companies and obtaining inaccurate localization with individual Robot Operating System (ROS2) modules. As commercial solutions for IO-transitions are rare, AlFusIOn is a ROS2-based framework that aims to fuse different sensing and data-interpretation techniques (LiDAR, camera, IMU, GNSS-RTK, wheel odometry, visual odometry) to guarantee the redundancy and accuracy of the localization system. Moreover, maps will be integrated to strengthen performance and ensure safety by providing geometrical information about the transitioning structures. Furthermore, deep learning will be used to understand the operational conditions by labeling indoor and outdoor areas. This information will be encoded in maps to provide robots with the expected operational conditions in advance, beyond the current sensing state. This self-awareness capability will be incorporated into the fusion process to control and switch between the localization techniques and achieve accurate and smooth IO-transitions; e.g., GNSS-RTK will be deactivated during the transition. Given the urgent and unique demand for accurate and continuous IO-transitions towards fully autonomous navigation/transportation, Saxion University and the proposal's partners are determined to design a commercial, modular, industry-oriented localization system with robust performance, self-awareness of its localization capabilities, and less human interference. Furthermore, AlFusIOn will collaborate intensively with MAPS (a RAAK-PRO proposal by HAN University) to achieve accurate localization in outdoor environments.
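The abstract does not describe the switching mechanism itself; the sketch below is a minimal ROS 2 (rclpy) supervisor showing how a map- or learning-derived area label could gate GNSS-RTK out of the fusion input during a transition. All topic names and message types are assumptions, not the AlFusIOn design.

```python
# Minimal rclpy sketch (assumptions only, not the AlFusIOn implementation):
# a supervisor node that listens to an indoor/outdoor label and forwards the
# GNSS-RTK stream to the fusion filter only when the robot is outdoors.
# Topic names ("/area_label", "/gnss/fix", "/fusion/gnss") are hypothetical.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from sensor_msgs.msg import NavSatFix

class LocalizationSupervisor(Node):
    def __init__(self):
        super().__init__("localization_supervisor")
        self.use_gnss = True
        # map-derived or learned label: "indoor", "outdoor", or "transition"
        self.create_subscription(String, "/area_label", self.on_label, 10)
        self.create_subscription(NavSatFix, "/gnss/fix", self.on_gnss, 10)
        self.gnss_pub = self.create_publisher(NavSatFix, "/fusion/gnss", 10)

    def on_label(self, msg: String):
        # deactivate GNSS-RTK while transitioning or indoors
        self.use_gnss = (msg.data == "outdoor")
        self.get_logger().info(f"area={msg.data}, gnss_fused={self.use_gnss}")

    def on_gnss(self, msg: NavSatFix):
        if self.use_gnss:
            self.gnss_pub.publish(msg)   # forward to the fusion filter

def main():
    rclpy.init()
    rclpy.spin(LocalizationSupervisor())
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```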
Although drones are being used in an ever-growing number of civilian applications for good purposes, malicious drones are also increasingly being deployed to cause harm. Ordinary off-the-shelf drones are capable of penetrating heavily secured areas and causing devastating damage there. They are cheap, precise, and can cover ever greater distances. Malicious drones pose a major threat to national security. In this KIEM project we investigate to what extent it is possible to develop drones that can keep an uncontrolled environment (airspace) safe fully autonomously. Counter-drones must be able to detect and neutralize kamikaze drones. Existing systems are not yet sufficiently capable of neutralizing malicious drones in time. The Dutch Ministry of Defence, the National Police, and the prison system urgently need systems that can detect and neutralize malicious drones. A few (European) systems currently exist with which drones can be detected, using among other things radio-frequency signals (feeling), optical and radar technology (seeing), and acoustic systems (hearing). None of these systems is the 'silver bullet' for countering malicious drones, especially small, low-flying ones. A feasibility study will establish the state of the art of current counter-drone technologies and identify the technology domains in which the consortium can add value to the development of effective counter-drones. In the coming years, Saxion and its partners will focus on key technologies such as Human Robotic Interaction, Perception, Navigation, Systems Development, Mechatronics, and Cognition: technologies that recur in counter-drones but will also be developed further for other application areas. The project consists of four phases: a study of current counter-drone technologies (IST), a study of desired/future counter-drone technologies (SOLL), a gap analysis (TOR), and an environmental analysis to determine what research is already taking place elsewhere in Europe. In addition, a network will be developed to enable counter-drone development.