Accurate localization enables autonomous robots to make effective decisions within their operating environment. Various methods have been developed to address this challenge, encompassing traditional techniques, fiducial markers, and machine learning approaches. This work proposes a deep-learning solution employing Convolutional Neural Networks (CNNs) to tackle the localization problem, specifically in the context of the RobotAtFactory 4.0 competition. The proposed approach leverages transfer learning from the pre-trained VGG16 model to capitalize on its existing knowledge. The approach was validated in a simulated scenario. The experimental results demonstrated errors on the millimeter scale and response times of a few milliseconds. Notably, the presented approach offers several advantages, including a model size that remains constant regardless of the number of training images and the elimination of the need to know the absolute positions of the fiducial markers.
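The transfer-learning setup described in the abstract can be sketched as follows. This is a minimal, hedged illustration only: the input resolution, head layer sizes, and the (x, y, theta) pose output are assumptions, not details taken from the paper, and `weights=None` is used here purely so the sketch runs offline (real transfer learning would load `weights="imagenet"`).

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Pre-trained VGG16 convolutional base; weights=None keeps this sketch
# runnable offline -- in practice weights="imagenet" provides the
# transferred knowledge mentioned in the abstract.
backbone = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))
backbone.trainable = False  # freeze the convolutional base

# Small regression head on top of the frozen features (sizes are assumed).
model = models.Sequential([
    backbone,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(3),  # regress a pose, e.g. (x, y, theta)
])
model.compile(optimizer="adam", loss="mse")

# A single dummy image yields one 3-component pose estimate.
pose = model.predict(np.zeros((1, 224, 224, 3)), verbose=0)
print(pose.shape)
```

Because only the head is trained, the model size stays fixed no matter how many training images are used, which matches the advantage claimed in the abstract.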
DOCUMENT
Localization is a crucial skill in mobile robotics because a robot needs to make sound navigation decisions to complete its mission. Many approaches to localization exist, and artificial intelligence can be an interesting alternative to traditional techniques based on model calculations. This work proposes a machine learning approach to solve the localization problem in the RobotAtFactory 4.0 competition. The idea is to obtain the relative pose of an onboard camera with respect to fiducial markers (ArUco markers) and then estimate the robot pose with machine learning. The approaches were validated in simulation. Several algorithms were tested, and the best results were obtained with a Random Forest Regressor, with errors on the millimeter scale. The proposed solution matches the accuracy of the analytical approach to the localization problem in the RobotAtFactory 4.0 scenario, with the advantage of not requiring explicit knowledge of the exact positions of the fiducial markers, as the analytical approach does.
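The pipeline described above (marker-relative camera pose in, absolute robot pose out) can be sketched with scikit-learn. This is an illustrative sketch under stated assumptions: the feature layout (marker id plus relative translation and yaw) and the synthetic mapping below are invented for demonstration; in the real system the features would come from ArUco pose estimation (e.g. OpenCV's `cv2.aruco` module).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Assumed feature vector per observation: (marker_id, tx, ty, yaw),
# i.e. the camera pose relative to a detected fiducial marker.
X = rng.uniform(-1.0, 1.0, size=(500, 4))

# Synthetic stand-in for the true robot pose (x, y, theta); a real
# dataset would pair each observation with the ground-truth pose.
y = np.column_stack([
    X[:, 1] + 0.5 * X[:, 0],
    X[:, 2] - 0.3 * X[:, 0],
    X[:, 3],
])

# The regressor learns the marker-relative-pose -> robot-pose mapping,
# so the absolute marker positions never need to be known explicitly.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

pred = model.predict(X[:1])
print(pred.shape)
```

Note the design point this illustrates: because the regressor is trained end-to-end from observations to poses, the marker map used by the analytical approach is implicit in the training data rather than supplied by hand.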
DOCUMENT
To understand how transitions across different thermal zones in a building affect the thermal perception of occupants, the current work examines occupant feedback in two work environments: nursing staff in hospital wards and workers in an office. Both studies used a mix of subjective surveys and objective measurements. A total of 96 responses were collected from the hospital wards and 142 from the office. The thermal environment in the hospital wards was perceived as slightly warm on the ASHRAE thermal sensation scale (mean TSV = 1.2), while the office workers rated their environment close to neutral (mean TSV = 0.15). The results also show that when transitions spanned temperature differences within 2 °C, thermal perception was not affected by the magnitude of the difference, as reflected in occupant thermal sensation and thermal comfort/thermal acceptability votes. This implies that any effect of temperature steps on thermal perception within these boundaries was extremely short-lived. These findings go towards establishing the feasibility of heterogeneous indoor thermal environments and thermal zoning of workspaces for human comfort.
DOCUMENT
Drones have seen enormous growth in recent years, both in the underlying technologies and in applications such as smart agriculture, transportation, inspection, logistics, surveillance and interaction. Commercial solutions for deploying drones in different workplaces have therefore become a crucial demand for companies. Warehouses are among the most promising industrial domains for drones, which can automate operations such as inventory scanning, transporting goods to the delivery lines, and on-demand area monitoring. On the other hand, deploying drones (or even mobile robots) in such challenging environments requires accurate state estimation of position and orientation to enable autonomous navigation, because GPS signals are unavailable in warehouses: the closed roof obstructs them and surrounding structures deflect them. Vision-based positioning systems are the most promising techniques for reliable position estimation in indoor environments, because they use low-cost sensors (cameras), exploit dense environmental features, and can operate both indoors and outdoors. This proposal therefore aims, together with our industrial partners, to address a crucial question for industrial applications: exploring the limitations of, and developing solutions towards, robust state estimation of drones in challenging environments such as warehouses and greenhouses. The results of this project will serve as the baseline for developing further navigation technologies, such as mapping, localization, docking and maneuvering, towards the fully autonomous and safe deployment of drones in GPS-denied areas.
Due to the exponential growth of e-commerce, automated inventory management is crucial for, among other things, keeping information up to date. There have been recent developments in using drones equipped with RGB cameras for scanning and counting inventory in warehouses. Thanks to their reach, agility and speed, drones can accelerate the inventory process and keep it current. To benefit from this drone technology, warehouse owners and inventory service providers are actively exploring ways to maximize its utilization by extending its capability in long-term autonomy, collaboration, and operation at night and on weekends. This feasibility study investigates the possibility of developing a robust, reliable and resilient group of aerial robots with long-term autonomy as part of effectively automating the warehouse inventory system, to gain a competitive advantage in a highly dynamic and competitive market. To that end, the main research question is: "Which technologies need to be further developed to enable collaborative drones with long-term autonomy to conduct warehouse inventory at night and on weekends?" This research focuses on user requirement analysis, complete system architecting including functional decomposition, concept development, technology selection, development of a proof-of-concept demonstrator, and the compilation of follow-up projects.