Autonomous driving on public roads requires precise localization, within a range of a few centimeters. Even the best localization systems based on GNSS cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and other structures. Recent works have shown the advantage of using maps as a precise, robust, and reliable basis for localization. Typical approaches use the set of current readings from the vehicle's sensors to estimate its position on the map. The approach presented in this paper exploits a short-range visual lane marking detector and a dead reckoning system to construct a registry of the lane markings detected behind the vehicle over the last 240 m driven. This information is used to search the map for the most similar section and thereby determine the vehicle's localization in the map reference frame. Additional filtering is applied to obtain a more robust localization estimate. The accuracy obtained is sufficiently high to allow autonomous driving on a narrow road. The system uses a low-cost sensor architecture, and the algorithm is light enough to run on a low-power embedded platform.
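The map search described above can be pictured as a sliding-window comparison between the 240 m registry and the map. The sketch below is an illustrative assumption, not the paper's actual algorithm: it represents both the registry and the map as 1-D arrays of lateral lane-marking offsets sampled at a fixed longitudinal spacing, and returns the map index where the sum-of-squared-differences error is smallest. The function name `best_map_offset` and this data representation are hypothetical.

```python
import numpy as np

def best_map_offset(registry, map_profile):
    """Slide the registry of lateral lane-marking offsets along the map
    profile and return the start index with the smallest SSD error.
    Both inputs are 1-D arrays sampled at the same longitudinal spacing
    (a simplified, hypothetical representation of the real data)."""
    n = len(registry)
    errors = [np.sum((map_profile[i:i + n] - registry) ** 2)
              for i in range(len(map_profile) - n + 1)]
    return int(np.argmin(errors))

# Toy usage: the registry is an exact copy of the map section at index 2,
# so that section is recovered as the best match.
profile = np.array([0.0, 0.1, 0.3, 0.6, 0.5, 0.2, 0.0, -0.1])
registry = np.array([0.3, 0.6, 0.5])
```

In practice a robust metric and a filtered (e.g., recursive) estimate would replace the raw argmin, which is what the abstract's "additional filtering" step alludes to.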
DOCUMENT
The importance of specific professions for the realization of human rights is increasingly recognized. Journalists, teachers, and civil servants are all considered to play a role because their work affects individual rights. This is also the case for social workers. The connection between social work and human rights is evident in the large body of literature explaining how human rights relate to social work. At the same time, there is growing attention to human rights localization. These fields of knowledge are related: social workers are local professionals, and if they start applying human rights in their work, this may influence human rights localization. This article contributes to existing debates on human rights localization by reflecting on the potential role of social workers in local human rights efforts in the Netherlands. Since human rights localization in general, and the application of human rights in social work in particular, are recent phenomena in the Netherlands, this provides a useful case study for a qualitative analysis of whether and how social workers can be regarded as actors in human rights localization. By connecting the different actors that are said to play a role in human rights localization to proposed forms of human rights application by social workers, this article identifies three possible roles for social workers in human rights localization: as human rights translators, as human rights advocates, and as human rights practitioners.
MULTIFILE
Autonomous driving on public roads requires precise localization, within a range of a few centimeters. Even the best current precise localization systems based on the Global Navigation Satellite System (GNSS) cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and other structures. Laser range finders and stereo vision have been successfully used for obstacle detection, mapping, and localization in the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDAR) sensors are very expensive, and stereo vision requires powerful dedicated hardware to process the camera data. In this context, this article presents a low-cost sensor architecture and a data fusion algorithm capable of autonomous driving on narrow two-way roads. Our approach exploits a combination of a short-range visual lane marking detector and a dead reckoning system to build a long and precise perception of the lane markings behind the vehicle. This information is used to localize the vehicle on a map that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system in a real autonomous driving scenario.
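The dead reckoning component that chains short-range detections into one long rear-facing perception can be sketched with a minimal kinematic model. This is an illustrative assumption, not the article's actual implementation: `dead_reckon` advances a planar pose (x, y, heading) from speed and yaw rate, and `to_vehicle_frame` re-expresses a previously stored marking point in the current vehicle frame; both function names are hypothetical.

```python
import math

def dead_reckon(pose, v, yaw_rate, dt):
    """Advance the vehicle pose (x, y, heading) over one time step with a
    simple unicycle model driven by speed and yaw rate - the basic step
    by which dead reckoning accumulates short-range lane-marking
    detections into a common frame."""
    x, y, th = pose
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += yaw_rate * dt
    return (x, y, th)

def to_vehicle_frame(pose, point):
    """Express a world-frame marking point in the current vehicle frame
    (inverse translation, then inverse rotation)."""
    x, y, th = pose
    dx, dy = point[0] - x, point[1] - y
    c, s = math.cos(-th), math.sin(-th)
    return (c * dx - s * dy, s * dx + c * dy)
```

Each new short-range detection would be stored in the world frame at the pose where it was seen; `to_vehicle_frame` then lets older detections be replayed behind the vehicle as it moves on.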
DOCUMENT
Drones have seen enormous growth in recent years in terms of the underlying technologies and their applications, such as smart agriculture, transportation, inspection, logistics, surveillance, and interaction. Therefore, commercial solutions for deploying drones in different working environments have become a crucial demand for companies. Warehouses are one of the most promising industrial domains for using drones to automate operations such as inventory scanning, transporting goods to the delivery lines, and on-demand area monitoring. On the other hand, deploying drones (or even mobile robots) in such challenging environments requires accurate state estimation, in terms of position and orientation, to enable autonomous navigation. This is because GPS signals are not available in warehouses, due to obstruction by the closed-sky structure and signal deflection by surrounding structures. Vision-based positioning systems are among the most promising techniques for reliable position estimation in indoor environments, because they use low-cost sensors (cameras), exploit dense environmental features, and can operate in both indoor and outdoor areas. Therefore, this proposal aims to address a crucial question for industrial applications, together with our industrial partners, by exploring the limitations of and developing solutions towards robust state estimation of drones in challenging environments such as warehouses and greenhouses. The results of this project will serve as the baseline for developing further navigation technologies towards fully autonomous deployment of drones, such as mapping, localization, docking, and maneuvering, to safely deploy drones in GPS-denied areas.
Deploying robots from indoor to outdoor environments (and vice versa) with stable and accurate localization is very important for companies seeking to use them in industrial applications such as delivering harvested fruit from plantations, deploying/docking, navigating under solar panels, passing through tunnels and underpasses, and parking in garages. This is because of sudden changes in operational conditions, such as receiving high- or low-quality satellite signals, a changing field of view, varying lighting conditions, and different velocities. We observed these limitations, especially in indoor-outdoor transitions, after conducting several projects with companies and obtaining inaccurate localization using individual Robot Operating System (ROS 2) modules. As commercial solutions for IO-transitions are rare, AlFusIOn is a ROS 2-based framework that aims to fuse different sensing and data-interpretation techniques (LiDAR, camera, IMU, GNSS-RTK, wheel odometry, visual odometry) to guarantee the redundancy and accuracy of the localization system. Moreover, maps will be integrated to strengthen performance and ensure safety by providing geometrical information about the transitioning structures. Furthermore, deep learning will be used to understand the operational conditions by labeling indoor and outdoor areas. This information will be encoded in maps to provide robots with the expected operational conditions in advance, beyond the current sensing state. This self-awareness capability will then be incorporated into the fusion process to control and switch between the localization techniques to achieve accurate and smooth IO-transitions; e.g., GNSS-RTK will be deactivated during the transition.
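The switching behavior described above can be pictured as zone-dependent weighted fusion. The sketch below is a simplified assumption about how AlFusIOn-style switching might look, not the framework's actual design: the function name `fuse_localization`, the zone labels, and the per-zone weights are all illustrative, and only the rule that GNSS-RTK is deactivated outside open-sky conditions comes from the text.

```python
def fuse_localization(estimates, zone):
    """Zone-dependent weighted fusion of position estimates (a sketch).
    `estimates` maps a source name to an (x, y) position; each source is
    weighted by an illustrative per-zone confidence, with unreliable
    sources zeroed out (e.g., GNSS-RTK indoors and during transition)."""
    weights = {
        "indoor":     {"lidar": 0.5, "visual_odom": 0.3, "wheel_odom": 0.2, "gnss_rtk": 0.0},
        "transition": {"lidar": 0.4, "visual_odom": 0.3, "wheel_odom": 0.3, "gnss_rtk": 0.0},
        "outdoor":    {"lidar": 0.2, "visual_odom": 0.1, "wheel_odom": 0.1, "gnss_rtk": 0.6},
    }[zone]
    wx = wy = wsum = 0.0
    for src, (x, y) in estimates.items():
        w = weights.get(src, 0.0)
        wx += w * x
        wy += w * y
        wsum += w
    return (wx / wsum, wy / wsum)
```

In the proposed framework, the zone label would come from the deep-learning classifier encoded in the map, so the weights could be adjusted before the robot actually crosses the transition.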
Given the urgent and unique demand for accurate and continuous IO-transitions towards fully autonomous navigation and transportation, Saxion University and the proposal's partners are determined to design a commercial, modular, industry-grade localization system with robust performance, self-awareness of its localization capabilities, and reduced human interference. Furthermore, AlFusIOn will collaborate intensively with MAPS (a RAAKPRO project proposed by HAN University) to achieve accurate localization in outdoor environments.
The CARTS (Collaborative Aerial Robotic Team for Safety and Security) project aims to improve autonomous firefighting operations through a collaborative drone system. The system combines a sensing drone optimized for patrolling and fire detection with an action drone equipped for fire suppression. While current urban safety operations rely on manually operated drones that face significant limitations in speed, accessibility, and coordination, CARTS addresses these challenges by creating a system that enhances operational efficiency with minimal human intervention, building on previous research from the IFFS drone project. This feasibility study focuses on developing effective coordination between the sensing and action drones, implementing fire detection and localization algorithms, and establishing parameters for autonomous flight planning. Through this collaborative drone approach, we aim to significantly improve both fire detection and suppression capabilities. A critical aspect of the project involves ensuring reliable and safe operation under various environmental conditions. The study explores the potential of a sensing drone with detection capabilities while investigating coordination mechanisms between the sensing and action drones. We will examine autonomous flight planning approaches and test initial prototypes in controlled environments to assess technical feasibility and safety considerations. If successful, this exploratory work will provide valuable insights for future research into autonomous collaborative drone systems, currently focused on firefighting, and could lead to larger follow-up projects expanding the concept to other safety and security applications.