Abstract
This paper proposes the “Substitute Interface,” which utilizes the flat surfaces of everyday objects as part of an ad hoc mobile device. The substitute interface is established by combining wearable devices: a head-mounted display with a camera, and a ring-type microphone. The camera recognizes which object the user intends to employ. When the user picks up and taps an object, such as a notebook, a virtual display is overlaid on the object, and the user can operate the ad hoc mobile device as if the object were part of the device. The display size can be changed easily by selecting a larger object. The user’s pointing/selection actions are recognized by combining input from the camera and the ring-type microphone. We first investigate usage scenarios of tablet devices and then create a prototype that can operate as a tablet device. Experiments on the prototype confirm that the proposal functions as intended.
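The abstract describes recognizing a pointing/selection action by combining the camera (which tracks the fingertip position on the virtual display) with the ring-type microphone (which detects the tap sound). A minimal sketch of this sensor-fusion idea is shown below; the data structures, function names, and the 50 ms pairing window are illustrative assumptions, not the paper’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class TapEvent:
    t: float  # timestamp of tap sound from the ring microphone (s)

@dataclass
class FingertipSample:
    t: float  # camera frame timestamp (s)
    x: float  # fingertip position in virtual-display coordinates
    y: float

def fuse_selections(taps, fingertips, max_dt=0.05):
    """Pair each microphone tap with the nearest-in-time camera
    fingertip sample to produce a touch point on the virtual display.
    Taps with no fingertip sample within max_dt seconds are discarded.
    (Illustrative sketch; max_dt=0.05 is an assumed tolerance.)"""
    touches = []
    for tap in taps:
        nearest = min(fingertips, key=lambda f: abs(f.t - tap.t), default=None)
        if nearest is not None and abs(nearest.t - tap.t) <= max_dt:
            touches.append((nearest.x, nearest.y))
    return touches
```

For example, a tap sound at t = 1.00 s paired with a fingertip sample at t = 0.98 s yields a touch at that sample’s coordinates, while a tap with no fingertip sample within the window is ignored as a false positive (e.g., ambient noise).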
Acknowledgments
We thank Yoshiyuki Sukedai of PieCake, Inc., for the system development of this project. Discussions with members of NTT DOCOMO, Inc., Research Laboratories were also very helpful.
Nakanishi, M., Horikoshi, T. Intuitive substitute interface. Pers Ubiquit Comput 17, 1797–1805 (2013). https://doi.org/10.1007/s00779-013-0651-5