DOI: 10.1145/1731903.1731924

Enhancing input on and above the interactive surface with muscle sensing

Published: 23 November 2009

Abstract

Current interactive surfaces provide little or no information about which fingers are touching the surface, the amount of pressure exerted, or gestures that occur when not in contact with the surface. These limitations constrain the interaction vocabulary available to interactive surface systems. In our work, we extend the surface interaction space by using muscle sensing to provide complementary information about finger movement and posture. In this paper, we describe a novel system that combines muscle sensing with a multi-touch tabletop, and introduce a series of new interaction techniques enabled by this combination. We present observations from an initial system evaluation and discuss the limitations and challenges of utilizing muscle sensing for tabletop applications.
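The abstract describes augmenting tabletop touch events with forearm muscle (EMG) sensing to recover finger identity and exerted pressure. As a rough illustration only, and not the authors' implementation, the sketch below shows one way such a fusion step could look: every class name, probability, and threshold here is hypothetical.

```python
# Hypothetical sketch: annotating a tabletop touch contact with the finger
# identity and pressure estimated by an EMG classifier for the same moment.
# All names, values, and the confidence threshold are illustrative.

from dataclasses import dataclass
from typing import List, Optional

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

@dataclass
class EmgEstimate:
    """Output of a (hypothetical) EMG classifier for one time window."""
    finger_probs: List[float]  # one probability per entry in FINGERS
    pressure: float            # normalized 0..1 muscle-activation level

@dataclass
class TouchContact:
    """A single contact point reported by the multi-touch surface."""
    x: float
    y: float

@dataclass
class AnnotatedContact:
    x: float
    y: float
    finger: Optional[str]
    pressure: Optional[float]

def fuse(contact: TouchContact, emg: EmgEstimate,
         min_confidence: float = 0.6) -> AnnotatedContact:
    """Attach finger identity and pressure to a touch point when the EMG
    classifier is confident enough; otherwise leave them unknown."""
    best = max(range(len(FINGERS)), key=lambda i: emg.finger_probs[i])
    if emg.finger_probs[best] >= min_confidence:
        return AnnotatedContact(contact.x, contact.y,
                                FINGERS[best], emg.pressure)
    return AnnotatedContact(contact.x, contact.y, None, None)

if __name__ == "__main__":
    emg = EmgEstimate(finger_probs=[0.05, 0.75, 0.1, 0.05, 0.05], pressure=0.4)
    touch = TouchContact(x=512.0, y=300.0)
    print(fuse(touch, emg))
```

In practice the paper's system would also need to time-align the EMG window with the touch-down event and handle gestures made above the surface; this sketch covers only the per-contact annotation idea.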

Supplementary Material

JPG File (107.jpg)
WMV File (107.wmv)

Published In

ITS '09: Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces
November 2009
240 pages
ISBN: 9781605587332
DOI: 10.1145/1731903
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 November 2009

Author Tags

  1. EMG
  2. muscle sensing
  3. surface computing
  4. tabletops

Qualifiers

  • Research-article

Conference

ITS'09

Acceptance Rates

Overall Acceptance Rate 119 of 418 submissions, 28%

Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months): 14
  • Downloads (Last 6 weeks): 0
Reflects downloads up to 29 Nov 2024

Citations

Cited By

  • (2024) SpeciFingers. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(1), 1-28. https://doi.org/10.1145/3643559. Online publication date: 6-Mar-2024.
  • (2023) Visualizing Hand Force with Wearable Muscle Sensing for Enhanced Mixed Reality Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics, 29(11), 4611-4621. https://doi.org/10.1109/TVCG.2023.3320210. Online publication date: 1-Nov-2023.
  • (2023) Analysis and Considerations of the Controllability of EMG-Based Force Input. Human-Computer Interaction, 563-572. https://doi.org/10.1007/978-3-031-35596-7_36. Online publication date: 9-Jul-2023.
  • (2022) The effects of touchless interaction on usability and sense of presence in a virtual environment. Virtual Reality, 26(4), 1551-1571. https://doi.org/10.1007/s10055-022-00647-1. Online publication date: 23-Apr-2022.
  • (2021) A Multi-DoF Prosthetic Hand Finger Joint Controller for Wearable sEMG Sensors by Nonlinear Autoregressive Exogenous Model. Sensors, 21(8), 2576. https://doi.org/10.3390/s21082576. Online publication date: 7-Apr-2021.
  • (2021) Force-Based Foot Gesture Navigation in Virtual Reality. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, 1-3. https://doi.org/10.1145/3489849.3489945. Online publication date: 8-Dec-2021.
  • (2021) Identifying Contact Fingers on Touch Sensitive Surfaces by Ring-Based Vibratory Communication. The 34th Annual ACM Symposium on User Interface Software and Technology, 208-222. https://doi.org/10.1145/3472749.3474745. Online publication date: 10-Oct-2021.
  • (2021) TapID: Rapid Touch Interaction in Virtual Reality using Wearable Sensing. 2021 IEEE Virtual Reality and 3D User Interfaces (VR), 519-528. https://doi.org/10.1109/VR50410.2021.00076. Online publication date: Mar-2021.
  • (2020) DeepFisheye. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, 1132-1146. https://doi.org/10.1145/3379337.3415818. Online publication date: 20-Oct-2020.
  • (2020) MagTouch: Robust Finger Identification for a Smartwatch Using a Magnet Ring and a Built-in Magnetometer. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-13. https://doi.org/10.1145/3313831.3376234. Online publication date: 21-Apr-2020.
