Conference Paper

KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning

Full Text URI

Document Type

Text/Conference Paper

Additional Information

Date

2019

Journal Title

Journal ISSN

Volume Title

Publisher

ACM

Abstract

While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input, but recognizing knuckle gestures robustly and accurately remains challenging. We present a method to differentiate between 17 finger and knuckle gestures based on a long short-term memory (LSTM) machine learning model. Furthermore, we introduce an open-source approach that is ready to deploy on commodity touch-based devices. The model was trained on a new dataset that we collected in a mobile interaction study with 18 participants. We show that our method achieves an accuracy of 86.8% in recognizing one of the 17 gestures and an accuracy of 94.6% in differentiating between finger and knuckle input. In our evaluation study, we validated our models and found that the LSTM gesture recognition achieved an accuracy of 88.6%. We show that KnuckleTouch can be used to improve input expressiveness and to provide shortcuts to frequently used functions.
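To make the approach described in the abstract concrete, the following is a minimal, illustrative sketch of an LSTM classifier over sequences of capacitive touch frames. The frame dimensions, sequence length, hidden size, and two-head design (17 gesture classes plus a finger/knuckle decision) are assumptions for illustration only and do not reproduce the authors' released implementation.

import torch
import torch.nn as nn

class KnuckleGestureLSTM(nn.Module):
    """Illustrative LSTM over flattened capacitive frames (not the paper's code)."""

    def __init__(self, frame_features=15 * 27, hidden_size=128, num_gestures=17):
        super().__init__()
        # The LSTM consumes one flattened capacitive frame per time step.
        self.lstm = nn.LSTM(frame_features, hidden_size, batch_first=True)
        # Two heads: one over the 17 gesture classes, one for finger vs. knuckle.
        self.gesture_head = nn.Linear(hidden_size, num_gestures)
        self.touch_type_head = nn.Linear(hidden_size, 2)

    def forward(self, frames):
        # frames: (batch, time, frame_features)
        _, (h_n, _) = self.lstm(frames)
        last_hidden = h_n[-1]  # final hidden state of the last LSTM layer
        return self.gesture_head(last_hidden), self.touch_type_head(last_hidden)

if __name__ == "__main__":
    model = KnuckleGestureLSTM()
    dummy = torch.randn(4, 50, 15 * 27)  # 4 sequences of 50 frames each (assumed shape)
    gesture_logits, touch_logits = model(dummy)
    print(gesture_logits.shape, touch_logits.shape)  # (4, 17), (4, 2)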

Description

Schweigert, Robin; Leusmann, Jan; Hagenmayer, Simon; Weiß, Maximilian; Le, Huy Viet; Mayer, Sven; Bulling, Andreas (2019): KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning. Mensch und Computer 2019 - Tagungsband. DOI: 10.1145/3340764.3340767. New York: ACM. MCI: Full Paper. Hamburg. 8.-11. September 2019

Citation

Tags