DOI: 10.1145/3174910.3174953

Demonstration

Automated Data Gathering and Training Tool for Personalized "Itchy Nose"

Published: 06 February 2018

Abstract

In "Itchy Nose" we proposed a sensing technique that detects finger movements on the nose to support subtle and discreet interaction. It uses electrooculography (EOG) sensors embedded in the frame of a pair of eyeglasses to gather data and a machine-learning classifier to distinguish the different gestures. Here we further propose an automated training and visualization tool for this classifier. The tool guides the user to perform each gesture with the proper timing and records the sensor data; it then automatically extracts the ground-truth segments and trains a machine-learning classifier on them. With this tool, we can quickly create a trained classifier that is personalized for the user and test various gestures.
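The abstract does not give implementation details, but the loop it describes (prompt the user at known times, cut ground-truth windows out of the recorded EOG signal around each prompt, extract features, train a personalized classifier) can be sketched roughly as below. The sampling rate, window length, feature set, and the nearest-centroid classifier are all assumptions for illustration, not the authors' actual choices:

```python
import numpy as np

def segment_ground_truth(signal, prompt_times, fs, win_s=1.0):
    """Cut a fixed-length window after each prompt; the known prompt
    time is what lets the tool label ground truth automatically."""
    n = int(win_s * fs)
    return np.stack([signal[int(t * fs):int(t * fs) + n] for t in prompt_times])

def features(windows):
    """Simple per-window statistics (mean, std, peak-to-peak)."""
    return np.stack([windows.mean(1), windows.std(1),
                     windows.max(1) - windows.min(1)], axis=1)

class NearestCentroid:
    """Tiny stand-in classifier; the paper does not specify its model."""
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(0) for c in self.labels_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None] - self.centroids_[None], axis=2)
        return self.labels_[d.argmin(1)]

# Synthetic training session: the tool prompts the user every 2 s and
# records one EOG channel (hypothetical 100 Hz sampling rate).
rng = np.random.default_rng(0)
fs = 100
prompt_times = np.arange(0, 20, 2.0)      # 10 prompts
labels = np.tile(["flick", "rub"], 5)     # alternating gestures
signal = rng.normal(0, 0.1, 22 * fs)
for t, g in zip(prompt_times, labels):
    i = int(t * fs)
    if g == "flick":
        signal[i:i + 20] += 3.0           # short sharp deflection
    else:
        signal[i:i + 80] += np.sin(np.linspace(0, 6 * np.pi, 80))  # rubbing oscillation

X = features(segment_ground_truth(signal, prompt_times, fs))
clf = NearestCentroid().fit(X, labels)
print((clf.predict(X) == labels).mean())  # training accuracy on the auto-labeled session
```

Because the prompt times are generated by the tool itself, no manual annotation is needed; that is the point of the automated data-gathering step.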


Cited By

  • (2020) A Gesture Elicitation Study of Nose-Based Gestures. Sensors 20(24), 7118. Online publication date: 11-Dec-2020. DOI: 10.3390/s20247118
  • (2019) Head and Shoulders Gestures: Exploring User-Defined Gestures with Upper Body. Design, User Experience, and Usability: User Experience in Advanced Technological Environments, 192-213. Online publication date: 3-Jul-2019. DOI: 10.1007/978-3-030-23541-3_15



    Published In

    AH '18: Proceedings of the 9th Augmented Human International Conference
    February 2018
    229 pages
    ISBN:9781450354158
    DOI:10.1145/3174910
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. EOG
    2. Nose gesture
    3. Training tool
    4. Online classification
    5. Smart eyeglasses
    6. Smart eyewear
    7. Subtle interaction
    8. Wearable computer

    Qualifiers

    • Demonstration
    • Research
    • Refereed limited

    Conference

    AH2018: The 9th Augmented Human International Conference
    February 7-9, 2018
    Seoul, Republic of Korea

    Acceptance Rates

    Overall Acceptance Rate 121 of 306 submissions, 40%


    Article Metrics

    • Downloads (last 12 months): 5
    • Downloads (last 6 weeks): 0

    Reflects downloads up to 16 Nov 2024

