- research-article, October 2024
The Impact of Gaze and Hand Gesture Complexity on Gaze-Pinch Interaction Performances
UbiComp '24: Companion of the 2024 on ACM International Joint Conference on Pervasive and Ubiquitous Computing, Pages 622–626, https://doi.org/10.1145/3675094.3678990
This paper investigates user performance and gaze-hand coordination errors in gaze-pinch interaction across different task complexities involving gaze and hand gestures. We designed gaze-based single target pointing tasks with varying levels of gaze ...
- research-article, July 2024
GraspUI: Seamlessly Integrating Object-Centric Gestures within the Seven Phases of Grasping
DIS '24: Proceedings of the 2024 ACM Designing Interactive Systems Conference, Pages 1275–1289, https://doi.org/10.1145/3643834.3661551
Objects are indispensable tools in our daily lives. Recent research has demonstrated their potential to act as conduits for digital interactions with microgestures; however, the primary focus was on situations where the hand firmly grasps an object. We ...
- research-article, March 2024
Practical approaches to group-level multi-objective Bayesian optimization in interaction technique design
- Yi-Chi Liao,
- George B Mo,
- John J Dudley,
- Chun-Lien Cheng,
- Liwei Chan,
- Per Ola Kristensson,
- Antti Oulasvirta
Designing interaction techniques for end-users often involves exploring vast design spaces while balancing many objectives. Bayesian optimization offers a principled human-in-the-loop method for selecting designs for evaluation to efficiently explore such ...
- research-article, October 2023
How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces
ASSETS '23: Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, Article No.: 1, Pages 1–15, https://doi.org/10.1145/3597638.3608430
Always-on, upper-body input from sensors like accelerometers, infrared cameras, and electromyography holds promise to enable accessible gesture input for people with upper-body motor impairments. When these sensors are distributed across the person’s body,...
- Article, August 2023
Mapping Virtual Reality Controls to Inform Design of Accessible User Experiences
A lack of accessible controls remains a barrier to disabled users engaging in virtual reality experiences. This paper presents a modified cognitive walkthrough of 120 virtual reality applications to identify 2,284 pairs of operant and resultant ...
- research-article, April 2023
ThumbPitch: Enriching Thumb Interaction on Mobile Touchscreens using Deep Learning
OzCHI '22: Proceedings of the 34th Australian Conference on Human-Computer Interaction, Pages 58–66, https://doi.org/10.1145/3572921.3572925
Today touchscreens are one of the most common input devices for everyday ubiquitous interaction. Yet, capacitive touchscreens are limited in expressiveness; thus, a large body of work has focused on extending the input capabilities of touchscreens. One ...
- research-article, August 2022
Validity Research of Interactive Audio-visual English Learning Software in Language Input and Output
ICDEL '22: Proceedings of the 7th International Conference on Distance Education and Learning, Pages 1–5, https://doi.org/10.1145/3543321.3543322
An Interactive Audio-visual English Learning Software is developed based on the theories of language input and output as well as constructivism, which is applied in English speaking and listening teaching. Validity of the software is explored through ...
- research-article, February 2022
How Output Can Become Input: Using Corpus Linguistics tools to assess translation major students’ writing in English and collect models of written discourse
ICETC '21: Proceedings of the 13th International Conference on Education Technology and Computers, Pages 327–333, https://doi.org/10.1145/3498765.3498816
English writing composition is a mandatory course in Translation Studies at the tertiary level in China. This paper presents reflective steps taken in a 32-hour English composition course to 75 second-year Translation Major students at a university in ...
- research-article, May 2021
BezelGlide: Interacting with Graphs on Smartwatches with Minimal Screen Occlusion
CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Article No.: 501, Pages 1–13, https://doi.org/10.1145/3411764.3445201
We present BezelGlide, a novel suite of bezel interaction techniques, designed to minimize screen occlusion and ‘fat finger’ effects, when interacting with common graphs on smartwatches. To explore the design of BezelGlide, we conducted two user ...
- research-article, November 2020
Imprint-Based Input Techniques for Touch-Based Mobile Devices
MUM '20: Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia, Pages 32–41, https://doi.org/10.1145/3428361.3428393
Touchscreens translate touches of all kinds into 2D coordinates. This limits the input vocabulary and constrains effective interaction to touches by the fingertip. Previous tabletop research extended the input vocabulary with a myriad of promising ...
- research-article, September 2020
Prediction Stability: A New Metric for Quantitatively Evaluating DNN Outputs
GLSVLSI '20: Proceedings of the 2020 on Great Lakes Symposium on VLSI, Pages 537–542, https://doi.org/10.1145/3386263.3407600
In many realistic applications, the collected inputs of DNN face a big challenge: perturbations. Although the perturbations are imperceptible, they may cause incorrect prediction results. This paper proposes prediction stability to quantitatively ...
- short-paper, September 2020
Wearable magnetic field sensing for finger tracking
ISWC '20: Proceedings of the 2020 ACM International Symposium on Wearable Computers, Pages 63–67, https://doi.org/10.1145/3410531.3414304
Obtaining a signal useful for continuous pointing input is still an open problem for wearables. While magnetic field sensing is one promising approach, there are significant limitations. Our key contribution in this work is a simulation of a system that ...
- research-article, April 2020
Walk The Line: Leveraging Lateral Shifts of the Walking Path as an Input Modality for Head-Mounted Displays
CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Pages 1–15, https://doi.org/10.1145/3313831.3376852
Recent technological advances have made head-mounted displays (HMDs) smaller and untethered, fostering the vision of ubiquitous interaction in a digitally augmented physical world. Consequently, a major part of the interaction with such devices will ...
- research-article, September 2019
KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning
- Robin Schweigert,
- Jan Leusmann,
- Simon Hagenmayer,
- Maximilian Weiß,
- Huy Viet Le,
- Sven Mayer,
- Andreas Bulling
MuC '19: Proceedings of Mensch und Computer 2019, Pages 387–397, https://doi.org/10.1145/3340764.3340767
While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input but ...
- research-article, June 2019
Non-Rigid HCI: A Review of Deformable Interfaces and Input
DIS '19: Proceedings of the 2019 on Designing Interactive Systems Conference, Pages 885–906, https://doi.org/10.1145/3322276.3322347
Deformable interfaces are emerging in HCI and prototypes show potential for non-rigid interactions. Previous reviews looked at deformation as a material property of shape-changing interfaces and concentrated on output. As such, deformable input was under-...
- research-article, May 2019
Tool Extension in Human-Computer Interaction
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Paper No.: 568, Pages 1–11, https://doi.org/10.1145/3290605.3300798
Tool use extends people's representations of the immediately actionable space around them. Physical tools thereby become integrated in people's body schemas. We introduce a measure for tool extension in HCI by using a visual-tactile interference ...
- research-article, May 2019
On the Latency of USB-Connected Input Devices
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Paper No.: 420, Pages 1–12, https://doi.org/10.1145/3290605.3300650
We propose a method for accurately and precisely measuring the intrinsic latency of input devices and document measurements for 36 keyboards, mice and gamepads connected via USB. Our research shows that devices differ not only in average latency, but ...
- research-article, May 2019
Investigating the Effect of Orientation and Visual Style on Touchscreen Slider Performance
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Paper No.: 189, Pages 1–9, https://doi.org/10.1145/3290605.3300419
Sliders are one of the most fundamental components used in touchscreen user interfaces (UIs). When entering data using a slider, errors occur, e.g. due to visual perception, resulting in inputs not matching what the user intended. However, it is ...
- Work in Progress, March 2019
Extending Input Space of Tangible Dials and Sliders for Uncertain Input
TEI '19: Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, Pages 189–196, https://doi.org/10.1145/3294109.3300985
Uncertainty is common when working with data and becomes more important as processing big data gains attention. However, no standard tangible interface element exists for inputting uncertain data. In this article, we extend the input space of two ...