Authors:
María Flores 1; David Valiente 2; Juan José Cabrera 1; Oscar Reinoso 1 and Luis Payá 1
Affiliations:
1 Department of Systems Engineering and Automation, Miguel Hernandez University, Elche, Spain
2 Department of Communications Engineering, Miguel Hernandez University, Elche, Spain
Keyword(s):
Dual Fisheye Images, 360-degree View, Stitching Process.
Abstract:
360-degree views are beneficial in robotic tasks because they provide a compact view of the whole scenario. Among the different vision systems that can generate such an image, we use a back-to-back pair of fisheye lens cameras by Garmin (VIRB 360). The objectives of this work are twofold: generating a high-quality 360-degree view using different algorithms and performing an analytic evaluation. To provide a consistent evaluation and comparison of the algorithms, we propose an automatic method that measures the similarity of the overlapping area of the generated views with respect to a reference image, in terms of a global descriptor. These descriptors are obtained from one of the layers of a Convolutional Neural Network. The study reveals that an accurate stitching process can be achieved when a high number of feature points are detected and uniformly distributed over the overlapping area. In this case, the 360-degree view generated by the algorithm that employs the camera model provides more efficient stitching than the algorithm that considers the angular fisheye projection. This outcome demonstrates the adverse effects of the fisheye projection, which presents high distortion in the top and bottom parts of the image. Likewise, both algorithms have also been compared with the view generated by the camera itself.
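The abstract describes comparing the overlapping area of each generated view with a reference image through a global descriptor taken from a CNN layer. The following is a minimal sketch of that kind of comparison; the choice of backbone (a pretrained AlexNet from torchvision), the layer used, and the cosine-similarity metric are illustrative assumptions, since the abstract does not specify them.

# Sketch: CNN-layer global descriptors for comparing overlap regions.
# Backbone, layer, and similarity metric are assumptions for illustration.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained backbone; the convolutional feature maps are flattened
# into a single global descriptor vector.
backbone = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def global_descriptor(image_path: str) -> torch.Tensor:
    """Return a flattened descriptor from the convolutional layers."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        feats = backbone.features(img)  # activations of the conv block
    return torch.flatten(feats, start_dim=1)

def descriptor_similarity(path_a: str, path_b: str) -> float:
    """Cosine similarity between descriptors of two overlap-area crops."""
    d_a = global_descriptor(path_a)
    d_b = global_descriptor(path_b)
    return torch.nn.functional.cosine_similarity(d_a, d_b).item()

# Hypothetical usage: compare the overlapping region of a stitched
# 360-degree view against the same region in a reference image.
# score = descriptor_similarity("overlap_stitched.png", "overlap_reference.png")

A higher similarity score would indicate that the stitched overlap region is closer to the reference, which is the kind of automatic, descriptor-based evaluation the abstract refers to.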