Implementation of wide-field integration of optic flow for autonomous quadrotor navigation

Published in: Autonomous Robots

An Erratum to this article was published on 12 September 2009

Abstract

Insects are capable of robust visual navigation in complex environments using efficient information extraction and processing approaches. This paper presents an implementation of insect-inspired visual navigation that uses spatial decompositions of the instantaneous optic flow to extract local proximity information. The approach is demonstrated in a corridor environment on an autonomous quadrotor micro-air-vehicle (MAV), where all of the sensing and processing, including altitude, attitude, and outer-loop control, is performed on-board. The resulting methodology has the advantages of computational speed and simplicity, and is therefore consistent with the stringent size, weight, and power requirements of MAVs.
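As an illustration of the spatial-decomposition idea mentioned in the abstract, the sketch below projects a ring of azimuthal optic-flow measurements onto low-order Fourier harmonics, whose coefficients serve as compact relative-proximity and heading cues of the kind used for wide-field integration. This is a minimal sketch under assumed conventions (a single azimuthal sensor ring, cosine/sine weighting functions, and the hypothetical `wfi_coefficients` helper); it is not the implementation described in the paper.

```python
import numpy as np

# Minimal sketch of wide-field integration (WFI) of optic flow.
# Assumption: optic flow Q(gamma) is sampled on a single azimuthal ring,
# gamma in [0, 2*pi); the paper's sensor layout and weighting functions
# may differ.

def wfi_coefficients(flow, n_harmonics=2):
    """Project a 1-D azimuthal optic-flow pattern onto low-order
    Fourier harmonics (a simple form of spatial decomposition)."""
    n = flow.size
    gamma = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    coeffs = {"a0": flow.mean()}
    for k in range(1, n_harmonics + 1):
        # Discrete inner products <Q, cos(k*gamma)> and <Q, sin(k*gamma)>
        # approximated as sums over the sensor ring.
        coeffs[f"a{k}"] = 2.0 / n * np.sum(flow * np.cos(k * gamma))
        coeffs[f"b{k}"] = 2.0 / n * np.sum(flow * np.sin(k * gamma))
    return coeffs

if __name__ == "__main__":
    # Synthetic flow pattern loosely mimicking a vehicle offset toward one
    # wall of a corridor; the low-order coefficients capture the asymmetry.
    gamma = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
    flow = 1.0 + 0.4 * np.sin(gamma) + 0.2 * np.cos(2.0 * gamma)
    print(wfi_coefficients(flow))
```

In a feedback loop, such coefficients could feed simple proportional controllers for centering and speed regulation, which is the general appeal of this class of method: a handful of inner products replaces explicit scene reconstruction.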



Author information

Corresponding author

Correspondence to Joseph Conroy.

Additional information

An erratum to this article can be found at http://dx.doi.org/10.1007/s10514-009-9154-7


About this article

Cite this article

Conroy, J., Gremillion, G., Ranganathan, B. et al. Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Auton Robot 27, 189–198 (2009). https://doi.org/10.1007/s10514-009-9140-0


Keywords

Navigation