Authors: Yerai Berenguer; Luis Payá; Adrián Peidró and Oscar Reinoso
Affiliation: Miguel Hernández University, Spain
Keyword(s): SLAM, Global Appearance, Omnidirectional Images, Vision Systems, Image Description, Loop Closure
Related Ontology Subjects/Areas/Topics: Image Processing; Informatics in Control, Automation and Robotics; Mobile Robots and Autonomous Systems; Robotics and Automation; Vision, Recognition and Reconstruction
Abstract:
This work presents a SLAM algorithm to estimate the position and orientation of a mobile robot while simultaneously creating a map of the environment. It uses only the visual information provided by a catadioptric system mounted on the robot, formed by a camera pointing towards a convex mirror. This system provides the robot with omnidirectional images whose field of view covers 360 degrees around the camera-mirror axis. Each omnidirectional scene acquired by the robot is described using global appearance descriptors. Thanks to their compactness, these descriptors permit running the algorithm in real time. The method consists of three steps. First, the robot estimates its pose (position and orientation) and creates a new node in the map, which is formed by interconnected nodes. Second, it detects loop closures between the new node and the existing nodes of the map. Finally, the map is refined using an optimization algorithm and the detected loop closures. Two sets of images, captured in two real environments while the robot traversed two paths, have been used to test the method. The results of the experiments show its effectiveness.
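As a rough illustration of the three-step pipeline described in the abstract (pose estimation and node creation, loop-closure detection, map refinement), the sketch below builds a node graph from global appearance descriptors. The descriptor (a plain intensity histogram), the distance threshold, and the synthetic odometry poses are illustrative assumptions, not the descriptor or parameters used in the paper.

```python
import numpy as np

def global_descriptor(image, bins=64):
    """Toy global-appearance descriptor: a normalized intensity
    histogram of the whole omnidirectional image (an assumption,
    not the descriptor proposed in the paper)."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, 255))
    hist = hist.astype(float)
    return hist / (hist.sum() + 1e-9)

class PoseGraph:
    def __init__(self):
        self.nodes = []   # (pose, descriptor) per node
        self.edges = []   # (i, j) index pairs: odometry links or loop closures

    def add_node(self, pose, descriptor):
        """Step 1: create a new node and connect it to the previous one."""
        idx = len(self.nodes)
        self.nodes.append((pose, descriptor))
        if idx > 0:
            self.edges.append((idx - 1, idx))
        return idx

    def detect_loop_closures(self, idx, threshold=0.05, skip_recent=10):
        """Step 2: compare the new node's descriptor with older nodes and
        add an edge whenever the appearance distance is small enough."""
        _, desc = self.nodes[idx]
        for j, (_, other) in enumerate(self.nodes[:max(0, idx - skip_recent)]):
            if np.linalg.norm(desc - other) < threshold:
                self.edges.append((j, idx))

# Usage with synthetic data (poses from odometry, random stand-in images):
graph = PoseGraph()
rng = np.random.default_rng(0)
for step in range(20):
    image = rng.integers(0, 256, size=(64, 256))   # stand-in omnidirectional image
    pose = np.array([step * 0.5, 0.0, 0.0])        # x, y, heading from odometry
    idx = graph.add_node(pose, global_descriptor(image))
    graph.detect_loop_closures(idx)
# Step 3 (not shown): once loop-closure edges exist, the node poses would be
# refined with a pose-graph optimizer, e.g. least-squares over the edge constraints.
```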