
Bare Hand Interface for Interaction in the Video See-Through HMD Based Wearable AR Environment

  • Conference paper
Entertainment Computing - ICEC 2006

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4161)

Abstract

In this paper, we propose a natural and intuitive bare-hand interface for a wearable augmented reality environment using a video see-through HMD. The proposed method automatically learns the color distribution of the hand through template matching and tracks the hand with the Mean Shift algorithm under a dynamic background and a moving camera. Furthermore, even when users are not wearing gloves, the hand can be segmented from the arm by applying a distance transform and using the radius of the palm. Fingertip points are extracted by convex hull processing with a constraint based on the palm radius, so users do not need to attach fiducial markers to their fingertips. We also implemented several applications to demonstrate the usefulness of the proposed algorithm. For example, “AR-Memo” helps the user take memos in the real environment with a virtual pen augmented on the user’s finger, and the saved memo can later be viewed augmented on the user’s palm while moving around anywhere. Finally, we measured performance and conducted usability studies.

This research was supported by the UCN Project, the MIC 21C Frontier R&D Program in Korea, and GIST ICRC & CTRC.


References

  1. ARToolKit, http://www.hitl.washington.edu/artoolkit/

  2. Hu, M.K.: Visual pattern recognition by moment invariants. IEEE Transactions on Information Theory IT-8, 179–187 (1962)

  3. Shapiro, L., Stockman, G.: Computer Vision, pp. 196–197. Prentice-Hall, Englewood Cliffs (2001)

  4. Bradski, G.R.: Computer Vision Face Tracking for Use in a Perceptual User Interface. Microcomputer Research Lab, Intel Corporation, Santa Clara

  5. Avis, D., Toussaint, G.T.: An optimal algorithm for determining the visibility of a polygon from an edge. IEEE Trans. Comp. C-30, 910–914 (1981)

  6. Daeyang i-visor DH-4400VP, http://www.mpcclub.ru/index.php?action=product&id=3340

  7. Open Source Computer Vision Library, http://www.intel.com/research/mrl/research/opencv

  8. Smith, R., Piekarski, W., Wigley, G.: Hand Tracking For Low Powered Mobile AR User Interfaces. In: 6th Australasian User Interface Conference, Newcastle, NSW (January 2005)

  9. Manresa, C., et al.: Hand Tracking and Gesture Recognition for Human-Computer Interaction. Electronic Letters on Computer Vision and Image Analysis 5(3), 96–104 (2005)

Copyright information

© 2006 IFIP International Federation for Information Processing

About this paper

Cite this paper

Ha, T., Woo, W. (2006). Bare Hand Interface for Interaction in the Video See-Through HMD Based Wearable AR Environment. In: Harper, R., Rauterberg, M., Combetto, M. (eds) Entertainment Computing - ICEC 2006. ICEC 2006. Lecture Notes in Computer Science, vol 4161. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11872320_48

  • DOI: https://doi.org/10.1007/11872320_48

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-45259-1

  • Online ISBN: 978-3-540-45261-4

  • eBook Packages: Computer Science (R0)
