Twirre is a new architecture for mini-UAV platforms designed for autonomous flight in both GPS-enabled and GPS-deprived applications. The architecture consists of low-cost hardware and software components, with high-level control software enabling autonomous operation. Exchanging or upgrading hardware components is straightforward, which makes the architecture a strong starting point for building low-cost autonomous mini-UAVs for a variety of applications. Experiments with an implementation of the architecture are in development, and preliminary results demonstrate accurate indoor navigation.
Twirre V2 is the evolution of an architecture for mini-UAV platforms that allows automated operation in both GPS-enabled and GPS-deprived applications. This second version separates mission logic, sensor data processing and high-level control, resulting in software components that are reusable across multiple applications. The concept of a Local Positioning System (LPS) is introduced which, using sensor fusion, would aid or automate the flying process in the way GPS currently does. To this end, new sensors have been added to the architecture, and a generic sensor interface together with missions for landing and for following a line have been implemented. V2 introduces a modular software design and couples new hardware, demonstrating its extensibility and adaptability.
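The generic sensor interface described above could be sketched as follows. This is a minimal illustration, not the actual Twirre V2 API: the class names, the read() contract and the ultrasound example are all assumptions made for the sketch.

```python
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Hypothetical generic sensor interface in the spirit of Twirre V2:
    every sensor exposes the same read() contract, so mission logic and
    high-level control never depend on concrete hardware."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest measurement as named values."""

class UltrasoundAltimeter(Sensor):
    """Example concrete sensor (illustrative, not from the paper)."""

    def __init__(self, driver):
        self._driver = driver  # low-level hardware handle (assumed)

    def read(self) -> dict:
        return {"altitude_m": self._driver.range_m()}
```

Because missions only see the Sensor contract, swapping or upgrading a hardware component reduces to providing another subclass, which is the extensibility the architecture aims for.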
This paper describes the concept of a new algorithm to control an Unmanned Aerial System (UAS) for accurate autonomous indoor flight. Inside a greenhouse, Global Positioning System (GPS) signals are neither reliable nor accurate enough. As an alternative, Ultra Wide Band (UWB) is used for localization. The UWB noise is compensated for by combining it with the delta-position signal from a novel optical-flow algorithm through a Kalman Filter (KF). The end result is an accurate and stable position signal with low noise and low drift.
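The fusion idea can be illustrated with a one-dimensional Kalman filter: the optical-flow displacement drives the prediction step (low noise, but it drifts), while the absolute UWB fix drives the correction step (noisy, but drift-free). This is a minimal sketch under assumed noise variances, not the paper's implementation.

```python
def kf_fuse(uwb_positions, flow_deltas, uwb_var=0.09, flow_var=0.0004, x0=0.0):
    """1-D Kalman filter fusing noisy absolute UWB fixes with drifting
    optical-flow displacements. Variances are illustrative assumptions."""
    x, p = x0, 1.0  # state estimate and its variance
    track = []
    for z, dx in zip(uwb_positions, flow_deltas):
        # Predict: integrate the optical-flow displacement
        x += dx
        p += flow_var
        # Update: correct with the absolute (but noisy) UWB position
        k = p / (p + uwb_var)  # Kalman gain
        x += k * (z - x)
        p *= (1.0 - k)
        track.append(x)
    return track
```

Because the flow deltas carry the high-frequency motion and the UWB fixes anchor the long-term position, the fused estimate keeps the low noise of the former and the low drift of the latter.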
Drones have seen enormous growth in both the underlying technologies and their applications, such as smart agriculture, transportation, inspection, logistics, surveillance and interaction. Commercial solutions for deploying drones in different workplaces have therefore become a crucial demand for companies. Warehouses are one of the most promising industrial domains for using drones to automate operations such as inventory scanning, transporting goods to the delivery lines and on-demand area monitoring. On the other hand, deploying drones (or even mobile robots) in such a challenging environment requires accurate state estimation, in terms of position and orientation, to enable autonomous navigation. This is because GPS signals are not available in warehouses, due to obstruction by the closed-sky areas and signal deflection by structures. Vision-based positioning systems are the most promising techniques for reliable position estimation in indoor environments, because they use low-cost sensors (cameras), exploit dense environmental features and can operate both indoors and outdoors. This proposal therefore aims to address a crucial question for industrial applications together with our industrial partners: to explore the limitations of, and develop solutions towards, robust state estimation of drones in challenging environments such as warehouses and greenhouses. The results of this project will serve as the baseline for developing further navigation technologies towards fully autonomous deployment of drones, such as mapping, localization, docking and maneuvering, to safely deploy drones in GPS-denied areas.
Deploying robots from indoor to outdoor environments (and vice versa) with stable and accurate localization is very important for companies that want to secure their use in industrial applications such as delivering harvested fruits from plantations, deploying/docking, navigating under solar panels, passing through tunnels and underpasses, and parking in garages. The difficulty lies in sudden changes in operational conditions: receiving high- or low-quality satellite signals, a changing field of view, varying lighting conditions and different velocities. We observed these limitations, especially during indoor-outdoor transitions, after conducting several projects with companies and obtaining inaccurate localization using individual Robot Operating System (ROS 2) modules. As commercial solutions for IO-transitions are rare, AlFusIOn is a ROS 2-based framework that aims to fuse different sensing and data-interpretation techniques (LiDAR, camera, IMU, GNSS-RTK, wheel odometry, visual odometry) to guarantee the redundancy and accuracy of the localization system. Moreover, maps will be integrated to strengthen performance and ensure safety by providing geometrical information about the transitioning structures. Furthermore, deep learning will be utilized to understand the operational conditions by labeling indoor and outdoor areas. This information will be encoded in maps to provide robots with the expected operational conditions in advance, beyond the current sensing state. This self-awareness capability will then be incorporated into the fusion process to control and switch between the localization techniques to achieve accurate and smooth IO-transitions; for example, GNSS-RTK will be deactivated during the transition.
Given the urgent and unique demand for accurate and continuous IO-transitions towards fully autonomous navigation and transportation, Saxion University and the proposal's partners are determined to design a commercial, modular, industry-oriented localization system with robust performance, self-awareness of its localization capabilities and minimal human interference. Furthermore, AlFusIOn will collaborate intensively with MAPS (a RAAKPRO proposal by HAN University) to achieve accurate localization in outdoor environments.
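The map-driven switching described for AlFusIOn can be sketched as a simple rule that selects which localization sources stay active per labeled zone. The zone labels, source names and blocking rules below are illustrative assumptions, not the framework's actual configuration.

```python
def select_sources(zone_label, sources):
    """Hypothetical source-switching rule driven by map-encoded zone labels:
    GNSS-RTK is dropped in 'transition' and 'indoor' zones (where satellite
    signals degrade), while redundant odometry/LiDAR sources stay active.
    Labels and source names are assumptions for this sketch."""
    blocked = {
        "outdoor": set(),
        "transition": {"gnss_rtk"},
        "indoor": {"gnss_rtk"},
    }
    return [s for s in sources if s not in blocked.get(zone_label, set())]
```

Encoding the rule in the map rather than in live signal-quality checks is what gives the robot the "in advance" awareness the proposal describes: the switch can happen before the GNSS solution actually degrades.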
Autonomous Guided Vehicles (AGVs) are nowadays widely used in sectors such as agriculture, logistics and healthcare. The tasks AGVs perform largely involve transporting goods indoors and therefore require precise and robust localization. Indoor localization is a key technology, as it plays a fundamental role in a wide range of application areas. To date, no generally applicable technique is available, and it is necessary to equip the environment with a custom-made localization system, which is expensive, time-consuming and inflexible. A promising technique is Magnetic Simultaneous Localisation And Mapping (MagSLAM). This technique relies on the distortion of the earth's magnetic field caused by the many ferromagnetic structures present indoors. These distortions are specific to the location within the building and can therefore be used as a source of information. This approach offers a number of fundamental advantages over camera-, radio- or tag-based localization systems. The goal of this KIEM project is to investigate to what extent the magnetic field can be used as an information provider to help solve the localization problem for AGVs. The main research question is: "How can we mature the MagSLAM technology and integrate it into an AGV system?" This takes into account challenges such as calibration, fusion of sensor data (e.g. odometry) and robustness against large induction currents (e.g. from motors and charging circuits). In the coming years, Saxion and its partners will focus on the key technologies for robotics, namely perception, navigation, cognition and artificial intelligence, all of which form an integral part of this KIEM project.
The project will consist of four phases: first, an inventory of current MagSLAM algorithms and AGV positioning systems (IST), then a study of system and user requirements (SOLL), and finally an analysis of how to mature and integrate the technology (GAP).
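The core MagSLAM idea of using location-specific magnetic distortions as fingerprints can be illustrated with a toy matcher: each map cell stores a magnetic field magnitude, and a measurement is scored against every cell. The map, the field values and the Gaussian scoring are illustrative assumptions; a real MagSLAM system would fuse such fingerprint likelihoods with odometry, for instance in a particle filter.

```python
import math

def locate(mag_map, measured, sigma=2.0):
    """Toy magnetic fingerprinting: score each map cell by how well its
    stored field magnitude (in microtesla) matches the measurement, using a
    Gaussian likelihood, and return the most likely cell. Values and the
    map layout are illustrative assumptions."""
    best_cell, best_w = None, -1.0
    for cell, field in mag_map.items():
        w = math.exp(-((field - measured) ** 2) / (2 * sigma ** 2))
        if w > best_w:
            best_cell, best_w = cell, w
    return best_cell
```

A single magnitude reading is usually ambiguous across a building, which is why fusing consecutive readings with odometry, one of the challenges the project names, is essential for a usable system.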