DOI: 10.1145/3586182.3615792

LensTouch: Touch Input on Lens Surfaces of Smart Glasses

Published: 29 October 2023

Abstract

Smart glasses are promising devices that allow the user to experience augmented reality (AR) and to watch movies as if on a big screen. Because their design focuses primarily on their function as output devices, their input functionality is limited. We propose LensTouch, which enlarges the input vocabulary by treating touches on the lenses of smart glasses as inputs. The user can place a finger on the lens while viewing both the finger and the displayed image. An experiment shows that, depending on the setting, the user can select targets quickly and/or accurately.

Supplemental Material

MP4 File: Demo Video
ZIP File: Supplemental File


Cited By

  • (2024) TeethFa: Real-Time, Hand-Free Teeth Gestures Interaction Using Fabric Sensors. IEEE Internet of Things Journal 11(21), 35223–35237. https://doi.org/10.1109/JIOT.2024.3434657. Online publication date: 1 Nov 2024.



    Published In

    UIST '23 Adjunct: Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology
    October 2023
    424 pages
    ISBN:9798400700965
    DOI:10.1145/3586182
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. augmented reality
    2. smart glasses
    3. touch input

    Qualifiers

    • Demonstration
    • Research
    • Refereed limited

    Conference

    UIST '23

    Acceptance Rates

    Overall Acceptance Rate 355 of 1,733 submissions, 20%

    Article Metrics

    • Downloads (last 12 months): 135
    • Downloads (last 6 weeks): 8
    Reflects downloads up to 25 Nov 2024

