Abstract
Visual navigation is a challenging problem in robotics, and many navigation approaches rely on localizing a mobile robot within its environment. The snapshot model, a biologically inspired account of how insects return home, instead uses a simple algorithm that compares snapshot images taken at the current position and at the destination, avoiding a complex localization process. Here, we propose a new homing navigation method based on a moment measure that characterizes the snapshot image efficiently. The method takes range values or pixel values of the surrounding landmarks and defines a moment measure that evaluates the environmental features, or landmark distribution; this measure forms a convex landscape over robot positions in the environment. By descending this landscape, the mobile robot can return home successfully, and either range sensors or image sensors suffice to provide the landscape information. Our experimental results demonstrate that the method is effective even in real environments.
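To make the landscape-descent idea concrete, the following is a minimal Python sketch. Because the paper's exact moment measure is not defined in this abstract, the sketch substitutes a simple stand-in: the mean squared difference between the landmark ranges seen at the current position and those recorded at home, assuming known landmark correspondence. This scalar forms a landscape whose minimum lies at home, and the robot greedily descends it. All names, values, and the measure itself are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def landscape_value(position, landmarks, home_ranges):
    """Scalar landscape: mean squared difference between the ranges seen
    at 'position' and the ranges recorded at home. Zero exactly at home.
    (Stand-in for the paper's moment measure, which is not given here.)"""
    ranges = np.linalg.norm(landmarks - position, axis=1)
    return np.mean((ranges - home_ranges) ** 2)

def homing_step(position, landmarks, home_ranges, step=0.1):
    """Greedy descent: probe a ring of candidate moves and take the one
    with the lowest landscape value (staying put is also allowed)."""
    angles = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
    candidates = [position + step * np.array([np.cos(a), np.sin(a)]) for a in angles]
    candidates.append(position)
    values = [landscape_value(c, landmarks, home_ranges) for c in candidates]
    return candidates[int(np.argmin(values))]

# Toy run: record the home "snapshot" of landmark ranges, then home from afar.
landmarks = np.array([[0.0, 5.0], [5.0, 0.0], [-4.0, -3.0], [3.0, 4.0]])
home = np.array([0.0, 0.0])
home_ranges = np.linalg.norm(landmarks - home, axis=1)

pos = np.array([2.5, 2.0])
for _ in range(100):
    pos = homing_step(pos, landmarks, home_ranges)
print("final position:", np.round(pos, 2))  # should end up close to (0, 0)
```

In the paper's setting, the same descent would be driven by the proposed moment measure computed from range or image snapshots rather than by this per-landmark range comparison.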
Acknowledgement
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MEST) (No. 2014R1A2A1A11053839).
Copyright information
© 2016 Springer International Publishing Switzerland
About this paper
Cite this paper
Lee, C., Kim, D. (2016). A Moment Measure Model of Landmarks for Local Homing Navigation. In: Tuci, E., Giagkos, A., Wilson, M., Hallam, J. (eds) From Animals to Animats 14. SAB 2016. Lecture Notes in Computer Science, vol. 9825. Springer, Cham. https://doi.org/10.1007/978-3-319-43488-9_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-43487-2
Online ISBN: 978-3-319-43488-9