Abstract
The human-machine interface is an essential part of an intelligent robotic system: it is the channel through which a human interacts with the robot, and in a tele-robotics environment it can be extended with considerably richer functionality. In this paper, we propose a human-machine interface with augmented reality for a network-based mobile robot. Human motions such as head or finger movements carry meaningful information, which makes them useful as system input. We synchronize the user's head motion with the camera motion of the mobile robot using visual information, so that the user can monitor the robot's environment through the robot's own eyesight. Gesture recognition is then used to control the mobile robot. In the implemented framework, the user can observe what happens in the robot's environment from the robot's point of view and control the robot easily and intuitively by gesture.
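The abstract describes two coupled control paths: the user's head orientation drives the remote camera's pan/tilt so the view tracks the operator's gaze, and recognized hand gestures are translated into motion commands for the mobile robot. The following Python sketch is not the authors' implementation; the message format, gesture labels, angle ranges, and network endpoint are all assumptions introduced only to make the two paths concrete.

```python
# Minimal sketch (hypothetical, not from the paper): head pose drives the remote
# camera's pan/tilt, and recognized gestures map to mobile-robot motion commands.
import json
import socket
from dataclasses import dataclass


@dataclass
class HeadPose:
    yaw_deg: float    # left/right rotation of the user's head
    pitch_deg: float  # up/down rotation of the user's head


# Hypothetical mapping from recognized gestures to robot motion commands.
GESTURE_COMMANDS = {
    "open_palm": "stop",
    "point_forward": "move_forward",
    "swipe_left": "turn_left",
    "swipe_right": "turn_right",
}


def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))


class TeleoperationClient:
    """Sends camera pan/tilt and motion commands to a networked mobile robot."""

    def __init__(self, host: str, port: int) -> None:
        self.sock = socket.create_connection((host, port))

    def sync_camera(self, pose: HeadPose) -> None:
        # Clamp to the (assumed) mechanical range of the robot's pan/tilt unit.
        msg = {
            "type": "camera",
            "pan": clamp(pose.yaw_deg, -90.0, 90.0),
            "tilt": clamp(pose.pitch_deg, -30.0, 30.0),
        }
        self.sock.sendall((json.dumps(msg) + "\n").encode())

    def send_gesture(self, gesture: str) -> None:
        command = GESTURE_COMMANDS.get(gesture)
        if command is not None:
            msg = {"type": "motion", "command": command}
            self.sock.sendall((json.dumps(msg) + "\n").encode())


# Example usage (assumes a robot server listening at 192.168.0.10:9000):
# client = TeleoperationClient("192.168.0.10", 9000)
# client.sync_camera(HeadPose(yaw_deg=15.0, pitch_deg=-5.0))
# client.send_gesture("point_forward")
```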
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Lee, HD., Lee, HG., Kim, JH., Park, MC., Park, GT. (2007). Human Machine Interface with Augmented Reality for the Network Based Mobile Robot. In: Apolloni, B., Howlett, R.J., Jain, L. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2007. Lecture Notes in Computer Science, vol 4694. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74829-8_8
DOI: https://doi.org/10.1007/978-3-540-74829-8_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-74828-1
Online ISBN: 978-3-540-74829-8