Abstract
In 3D object manipulation and virtual-space navigation tasks, an efficient zoom operation is essential. The common approach combines mouse and keyboard input, which requires users to spend considerable time practicing before they become familiar with the operation. This paper presents two methods that recognize zoom operations by sensing the user's pull and push movements. The user simply holds a camera in hand; when the hand is pulled toward or pushed away from the body, our approach senses the change in proximity and translates it into a zoom operation. Through user studies, we compared the accuracy of the two methods and analyzed the factors that affect their performance. The results show that our methods operate in real time with high accuracy.
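The references suggest the camera motion is sensed via feature tracking (Shi–Tomasi features with a pyramidal Lucas–Kanade tracker). As a rough illustration of how pull/push proximity can be turned into a zoom factor, the sketch below measures how tracked feature points spread apart (camera pulled closer, zoom in) or contract (camera pushed away, zoom out) between frames. This is a minimal sketch of the general idea, not the authors' implementation; the function names and the `dead_zone` threshold are hypothetical.

```python
def feature_spread(points):
    """Mean distance of tracked feature points from their centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in points) / len(points)


def zoom_factor(prev_points, curr_points, dead_zone=0.02):
    """Ratio of feature spreads between frames, read as a scale change.

    A ratio > 1 means the features moved apart (camera pulled closer: zoom in);
    a ratio < 1 means they contracted (camera pushed away: zoom out).
    Ratios within the (hypothetical) dead zone are treated as jitter.
    """
    ratio = feature_spread(curr_points) / feature_spread(prev_points)
    if abs(ratio - 1.0) < dead_zone:
        return 1.0  # ignore tiny frame-to-frame noise
    return ratio
```

For example, if four tracked corners move from a 2x2 square to a 4x4 square around the same center, the spread doubles and `zoom_factor` returns 2.0, i.e. a zoom-in gesture.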
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Fan, M., Shi, Y. (2009). Pull and Push: Proximity-Aware User Interface for Navigating in 3D Space Using a Handheld Camera. In: Jacko, J.A. (eds) Human-Computer Interaction. Ambient, Ubiquitous and Intelligent Interaction. HCI 2009. Lecture Notes in Computer Science, vol 5612. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02580-8_15
Print ISBN: 978-3-642-02579-2
Online ISBN: 978-3-642-02580-8