
DOI: 10.1145/3131542.3140267

HoloLens is more than air Tap: natural and intuitive interaction with holograms

Published: 22 October 2017

Abstract

Augmented Reality (AR) is becoming increasingly popular, and many applications across multiple domains are being developed for AR hardware such as the Microsoft HoloLens and similar Head-Mounted Displays (HMDs). Most AR applications visualize information that was not visible before and enable interaction with this information through voice input, gaze tracking, and gestures. However, to remain consistent across all applications running on an AR device, the gestures available to developers are very limited; in our use case, the Microsoft HoloLens, these are just an Air Tap and a Manipulation gesture. While this is convenient for users, who only have to learn a small, defined set of gestures, it is not always easy for developers to create a natural interaction experience, because which gestures are considered natural depends on the scenario. In this paper, we use an additional sensor, a Microsoft Kinect, to allow users to interact naturally and intuitively with holographic content displayed on a HoloLens. As a proof of concept, we present five examples of natural-interaction gestures for the HoloLens.
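
To make the described setup more concrete, the following minimal sketch shows one way such an architecture could be wired up: a companion process on a PC reads skeleton joints from an external depth sensor such as a Kinect (SDK access not shown), classifies a gesture with simple rules, and streams the result to the HoloLens application over the network. This is not the authors' implementation; the gesture names, thresholds, JSON wire format, and target address are assumptions chosen only for illustration.

```python
# Purely illustrative sketch (not taken from the paper): a desktop-side
# "gesture server" that takes skeleton joints from an external tracker,
# applies a tiny rule-based classifier, and forwards recognized gestures
# to a listener in the HoloLens app over UDP.

import json
import math
import socket
from dataclasses import dataclass
from typing import Optional

HOLOLENS_ADDR = ("192.168.0.42", 9000)  # assumed IP/port of the HoloLens listener


@dataclass
class Joint:
    """A tracked skeleton joint position in meters (tracker coordinate frame)."""
    x: float
    y: float
    z: float


def distance(a: Joint, b: Joint) -> float:
    """Euclidean distance between two joints."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)


def classify_gesture(hand_tip: Joint, thumb: Joint, hand: Joint, head: Joint) -> Optional[str]:
    """Tiny rule-based stand-in for real gesture recognition."""
    if distance(hand_tip, thumb) < 0.03:  # fingertip and thumb close together -> pinch/grab
        return "grab"
    if hand.y > head.y:                   # hand raised above the head -> raise-hand gesture
        return "raise_hand"
    return None


def send_gesture(sock: socket.socket, gesture: str) -> None:
    """Serialize a gesture event as JSON and push it to the HoloLens app."""
    payload = json.dumps({"type": "gesture", "name": gesture}).encode("utf-8")
    sock.sendto(payload, HOLOLENS_ADDR)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Dummy joint data standing in for one live skeleton frame from the tracker.
    frame = {
        "hand_tip": Joint(0.10, 1.20, 1.00),
        "thumb":    Joint(0.11, 1.21, 1.01),
        "hand":     Joint(0.10, 1.18, 1.00),
        "head":     Joint(0.00, 1.60, 1.00),
    }
    gesture = classify_gesture(frame["hand_tip"], frame["thumb"], frame["hand"], frame["head"])
    if gesture is not None:
        send_gesture(sock, gesture)
```

On the HoloLens side, the application would listen on the corresponding port and map each incoming gesture event to a manipulation of the holographic content, complementing the built-in Air Tap and Manipulation gestures.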






Published In

IoT '17: Proceedings of the Seventh International Conference on the Internet of Things
October 2017
211 pages
ISBN: 9781450353182
DOI: 10.1145/3131542
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 22 October 2017


Author Tags

  1. augmented reality
  2. gesture interfaces
  3. head-mounted displays
  4. natural interaction

Qualifiers

  • Poster

Conference

IoT '17

Acceptance Rates

Overall Acceptance Rate: 28 of 84 submissions, 33%



Cited By

  • (2023) Evaluating Interaction Challenges of Head-Mounted Device-Based Augmented Reality Applications for First-Time Users at Museums and Exhibitions. Culture and Computing, 150-163. DOI: 10.1007/978-3-031-34732-0_11. Online publication date: 9-Jul-2023.
  • (2022) Web XR User Interface Research: Design 3D Layout Framework in Static Websites. Applied Sciences, 12(11), 5600. DOI: 10.3390/app12115600. Online publication date: 31-May-2022.
  • (2022) Interacting With New York City Data by HoloLens Through Remote Rendering. IEEE Consumer Electronics Magazine, 11(5), 64-72. DOI: 10.1109/MCE.2022.3165961. Online publication date: 1-Sep-2022.
  • (2022) Learning and Teaching Fluid Dynamics using Augmented and Mixed Reality. 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 865-869. DOI: 10.1109/ISMAR-Adjunct57072.2022.00186. Online publication date: Oct-2022.
  • (2021) GesturAR: An Authoring System for Creating Freehand Interactive Augmented Reality Applications. The 34th Annual ACM Symposium on User Interface Software and Technology, 552-567. DOI: 10.1145/3472749.3474769. Online publication date: 10-Oct-2021.
  • (2021) A comparison of natural user interface and graphical user interface for narrative in HMD-based augmented reality. Multimedia Tools and Applications. DOI: 10.1007/s11042-021-11723-0. Online publication date: 30-Dec-2021.
  • (2021) Surgical navigation system for brachytherapy based on mixed reality using a novel stereo registration method. Virtual Reality. DOI: 10.1007/s10055-021-00503-8. Online publication date: 15-Feb-2021.
  • (2020) Reshaping Human Intention on Human-Machine Interaction by Using Holograms. Konya Journal of Engineering Sciences, 8, 1-8. DOI: 10.36306/konjes.822126. Online publication date: 31-Dec-2020.
  • (2020) A Remote Sharing Method Using Mixed Reality for 3D Physical Objects That Enables High-Speed Point Cloud Segmentation and Receiver's Object Manipulation. Journal of Environmental Engineering (Transactions of AIJ), 85(778), 1017-1026. DOI: 10.3130/aije.85.1017. Online publication date: 2020.
  • (2020) Exploring mixed reality based on self-efficacy and motivation of users. Research in Learning Technology, 28. DOI: 10.25304/rlt.v28.2331. Online publication date: 21-Feb-2020.
