- abstract, April 2020
A Comparison of Surface and Motion User-Defined Gestures for Mobile Augmented Reality
CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Pages 1–8. https://doi.org/10.1145/3334480.3382883
Advancements in Augmented Reality (AR) technologies and processing power of mobile devices have created a surge in the number of mobile AR applications. Nevertheless, many AR applications have adopted surface gestures as the default method for ...
- research-article, December 2019
Measuring the effect of public vs private context on the kinematics of smartphone motion gestures
IHM '19: Proceedings of the 31st Conference on l'Interaction Homme-Machine, Article No.: 14, Pages 1–6. https://doi.org/10.1145/3366550.3372260
In this paper, the effect of social exposure on smartphone motion gestures is investigated. A within-subject repeated measures experiment was conducted where participants performed sets of motion gestures on a smartphone in both private and public ...
- short-paper, June 2017
Continuous tilting interaction techniques on mobile devices for controlling public displays
EICS '17: Proceedings of the ACM SIGCHI Symposium on Engineering Interactive Computing Systems, Pages 21–26. https://doi.org/10.1145/3102113.3102120
The use of mobile devices to interact with public or semi-public displays has been widely studied. Researchers have investigated the potential use of motion gestures in such settings to allow users to control the display without having to shift either ...
- research-article, December 2016
Mogeste: A Mobile Tool for In-Situ Motion Gesture Design
IndiaHCI '16: Proceedings of the 8th Indian Conference on Human-Computer Interaction, Pages 35–43. https://doi.org/10.1145/3014362.3014365
Motion gestures can be expressive, fast to access and perform, and facilitated by ubiquitous inertial sensors. However, implementing a gesture recognizer requires substantial programming and pattern recognition expertise. Although several graphical ...
- demonstration, September 2016
Mogeste: mobile tool for in-situ motion gesture design
UbiComp '16: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, Pages 345–348. https://doi.org/10.1145/2968219.2971395
We present Mogeste, a smartphone-based tool, to enable rapid, iterative, in-situ motion gesture design by interaction designers. It supports development of in-air gestural interaction with existing inertial sensors on commodity wearable and mobile ...
- Work in Progress, April 2015
Towards Accurate Automatic Segmentation of IMU-Tracked Motion Gestures
CHI EA '15: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, Pages 1337–1342. https://doi.org/10.1145/2702613.2732922
We present our ongoing research on automatic segmentation of motion gestures tracked by IMUs. We postulate that by recognizing gesture execution phases from motion data we may be able to auto-delimit user gesture entries. We demonstrate that ...
- research-article, October 2014
Detecting tapping motion on the side of mobile devices by probabilistically combining hand postures
UIST '14: Proceedings of the 27th annual ACM symposium on User interface software and technology, Pages 215–219. https://doi.org/10.1145/2642918.2647363
We contribute a novel method for detecting finger taps on the different sides of a smartphone, using the built-in motion sensors of the device. In particular, we discuss new features and algorithms that infer side taps by probabilistically combining ...
- research-article, June 2014
Leap gestures for TV: insights from an elicitation study
TVX '14: Proceedings of the ACM International Conference on Interactive Experiences for TV and Online Video, Pages 131–138. https://doi.org/10.1145/2602299.2602316
We present insights from a gesture elicitation study in the context of interacting with TV, during which 18 participants contributed and rated the execution difficulty and recall likeliness of free-hand gestures for 21 distinct TV tasks. Our study ...
- research-article, April 2014
Only for casual players?: investigating player differences in full-body game interaction
Chinese CHI '14: Proceedings of the Second International Symposium of Chinese CHI, Pages 57–65. https://doi.org/10.1145/2592235.2592244
Full-body motion gestures enable realistic and intuitive input in video games. However, little is known regarding how different kinds of players engage/disengage with full-body game interaction. In this paper, adopting a user-typing approach, we explore ...
- poster, April 2014
Using audio cues to support motion gesture interaction on mobile devices
CHI EA '14: CHI '14 Extended Abstracts on Human Factors in Computing Systems, Pages 1621–1626. https://doi.org/10.1145/2559206.2581236
Motion gestures are an underutilized input modality for mobile interaction, despite numerous potential advantages. Negulescu et al. found that the lack of feedback on attempted motion gestures made it difficult for participants to diagnose and correct ...
- research-article, April 2014
Jump and shoot!: prioritizing primary and alternative body gestures for intense gameplay
CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Pages 951–954. https://doi.org/10.1145/2556288.2557107
Motion gestures enable natural and intuitive input in video games. However, game gestures designed by developers may not always be the optimal gestures for players. A key challenge in designing appropriate game gestures lies in the interaction-intensive ...
- research-article, February 2014
Teaching motion gestures via recognizer feedback
IUI '14: Proceedings of the 19th international conference on Intelligent User Interfaces, Pages 73–82. https://doi.org/10.1145/2557500.2557521
When using motion gestures (3D movements of a mobile phone) as an input modality, one significant challenge is how to teach end users the movement parameters necessary to successfully issue a command. Is a simple video or image depicting movement of a ...
- research-article, September 2013
Interacting with a self-portrait camera using motion-based hand gestures
APCHI '13: Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction, Pages 93–101. https://doi.org/10.1145/2525194.2525206
Taking self-portraits with a digital camera is a popular way to present oneself through photography. Traditional techniques for taking self-portraits, such as use of self-timers or face detection, provide only a modest degree of interaction between the ...
- research-article, March 2013
Combining acceleration and gyroscope data for motion gesture recognition using classifiers with dimensionality constraints
IUI '13: Proceedings of the 2013 international conference on Intelligent user interfaces, Pages 173–178. https://doi.org/10.1145/2449396.2449419
Motivated by the addition of gyroscopes to a large number of new smart phones, we study the effects of combining accelerometer and gyroscope data on the recognition rate of motion gesture recognizers with dimensionality constraints. Using a large data ...
- research-article, September 2012
A recognition safety net: bi-level threshold recognition for mobile motion gestures
MobileHCI '12: Proceedings of the 14th international conference on Human-computer interaction with mobile devices and services, Pages 147–150. https://doi.org/10.1145/2371574.2371598
Designers of motion gestures for mobile devices face the difficult challenge of building a recognizer that can separate gestural input from motion noise. A threshold value is often used to classify motion and effectively balances the rates of false ...
- research-article, August 2012
User-defined surface+motion gestures for 3d manipulation of objects at a distance through a mobile device
APCHI '12: Proceedings of the 10th Asia Pacific Conference on Computer Human Interaction, Pages 299–308. https://doi.org/10.1145/2350046.2350098
One form of input for interacting with large shared surfaces is through mobile devices. These personal devices provide interactive displays as well as numerous sensors to effectuate gestures for input. We examine the possibility of using surface and ...
- research-article, May 2012
Tap, swipe, or move: attentional demands for distracted smartphone input
AVI '12: Proceedings of the International Working Conference on Advanced Visual Interfaces, Pages 173–180. https://doi.org/10.1145/2254556.2254589
Smartphones are frequently used in environments where the user is distracted by another task, for example by walking or by driving. While the typical interface for smartphones involves hardware and software buttons and surface gestures, researchers have ...
- research-article, May 2011
DoubleFlip: a motion gesture delimiter for mobile interaction
CHI '11: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Pages 2717–2720. https://doi.org/10.1145/1978942.1979341
To make motion gestures more widely adopted on mobile devices it is important that devices be able to distinguish between motion intended for mobile interaction and every-day motion. In this paper, we present DoubleFlip, a unique motion gesture designed ...
- research-article, May 2011
User-defined motion gestures for mobile interaction
CHI '11: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Pages 197–206. https://doi.org/10.1145/1978942.1978971
Modern smartphones contain sophisticated sensors to monitor three-dimensional movement of the device. These sensors permit devices to recognize motion gestures - deliberate movements of the device by end-users to invoke commands. However, little is ...
- poster, October 2010
DoubleFlip: a motion gesture delimiter for interaction
UIST '10: Adjunct proceedings of the 23rd annual ACM symposium on User interface software and technology, Pages 449–450. https://doi.org/10.1145/1866218.1866265
In order to use motion gestures with mobile devices it is imperative that the device be able to distinguish between input motion and everyday motion. In this abstract we present DoubleFlip, a unique motion gesture designed to act as an input delimiter ...