
Jian-Xun Wu, Yuan-Pao Hsu

Abstract

This work implements the Real-Time Appearance-Based Mapping (RTAB-Map) algorithm on an unmanned aerial vehicle (UAV) to perform an indoor localization task. RTAB-Map, based on an RGB-D camera, estimates the camera trajectory, odometry, and a local map from the feature points matched between adjacent images, yielding globally consistent map information and camera locations. However, when a camera is used for simultaneous localization and mapping (SLAM), images are prone to blurring under the fast motion of the vehicle on which the camera is mounted, or the overlapping area of two consecutive image frames may be too small, causing feature matching to fail. This study therefore incorporates an inertial measurement unit (IMU) to provide odometry data and mitigate the problem. Once communication between the experimental drone and the ground station is established, the ground station collects the sensory data from the drone and builds a map of the indoor environment with the RTAB-Map method. Based on this map, the system plans a path and sends waypoints to test the localization of the drone. Simulation and experimental results show that average trajectory errors are within ±5 cm.
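The ±5 cm figure above can be understood as an average trajectory error: the mean Euclidean distance between estimated and ground-truth positions at corresponding waypoints. A minimal sketch of that metric follows; the function name and the sample waypoints are hypothetical illustrations, not data from the paper.

```python
import numpy as np

def average_trajectory_error(estimated, ground_truth):
    """Mean Euclidean distance (metres) between corresponding
    estimated and ground-truth trajectory points."""
    est = np.asarray(estimated, dtype=float)
    gt = np.asarray(ground_truth, dtype=float)
    # Per-point position error, then averaged over the trajectory.
    return float(np.mean(np.linalg.norm(est - gt, axis=1)))

# Hypothetical 2-D waypoints (metres): the estimate is offset by 3 cm in x.
gt  = [[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]]
est = [[0.03, 0.0], [1.03, 0.0], [1.03, 1.0]]
err = average_trajectory_error(est, gt)
print(err)  # 0.03 m, i.e. within the reported ±5 cm bound
```

In practice the estimated trajectory would come from the RTAB-Map pose output and the ground truth from a motion-capture system or surveyed waypoints, aligned and time-synchronized before comparison.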



Section: Advances in Grounded and Aerial Unmanned Robots