
Intuitive substitute interface

  • Original Article
Personal and Ubiquitous Computing

Abstract

This paper proposes the “Substitute Interface,” which utilizes the flat surfaces of everyday objects as part of an ad hoc mobile device. The substitute interface is established by combining wearable devices: a head-mounted display with a camera and a ring-type microphone. The camera recognizes which object the user intends to employ. When the user picks up and taps the object, such as a notebook, a virtual display is overlaid on it, and the user can operate the ad hoc mobile device as if the object were part of the device. Display size can be changed simply by selecting a larger object. The user’s pointing/selection actions are recognized by combining the camera and the ring-type microphone. We first investigate typical usage scenarios of tablet devices and then create a prototype that operates as a tablet device. Experiments on the prototype confirm that the proposed interface functions as intended.
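The abstract describes registering a tap by fusing two cues: the camera sees the fingertip on the tracked object's surface, and the ring-type microphone hears the impact. The following is a minimal illustrative sketch of that fusion idea, not the authors' implementation; all function names, thresholds, and coordinate conventions here are hypothetical.

```python
# Hypothetical sketch of camera + ring-microphone tap fusion.
# A tap is registered only when an audio impact and an on-surface
# fingertip position coincide; either cue alone is ignored.

TAP_AMPLITUDE_THRESHOLD = 0.6  # assumed normalized mic amplitude for an impact


def detect_tap(mic_amplitude, fingertip_xy, surface_rect):
    """Return the tap position if the microphone hears an impact while
    the camera sees the fingertip inside the tracked object's surface.

    surface_rect is (left, top, right, bottom) in image coordinates.
    """
    x, y = fingertip_xy
    left, top, right, bottom = surface_rect
    on_surface = left <= x <= right and top <= y <= bottom
    if mic_amplitude >= TAP_AMPLITUDE_THRESHOLD and on_surface:
        return (x, y)  # would then be mapped into virtual-display coordinates
    return None


# Fingertip on the notebook plus an audible impact -> tap registered.
print(detect_tap(0.8, (120, 80), (0, 0, 300, 200)))  # (120, 80)
# Same position but no impact sound -> no tap.
print(detect_tap(0.2, (120, 80), (0, 0, 300, 200)))  # None
```

Requiring both cues is what lets the system reject hover gestures (seen but not heard) and ambient noise (heard but not seen on the surface).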




Acknowledgments

We thank Yoshiyuki Sukedai from PieCake, Inc., for system development of this project. Several discussions with members of NTT DOCOMO, Inc., Research Laboratories were also very helpful.

Author information


Corresponding author

Correspondence to Mikiko Nakanishi.


About this article

Cite this article

Nakanishi, M., Horikoshi, T. Intuitive substitute interface. Pers Ubiquit Comput 17, 1797–1805 (2013). https://doi.org/10.1007/s00779-013-0651-5

