DOI: 10.1145/2505483.2505487

A smart watch-based gesture recognition system for assisting people with visual impairments

Published: 22 October 2013

Abstract

Modern mobile devices provide many functions, and new ones are being added at a breakneck pace. Unfortunately, browsing the menu and accessing the functions of a mobile phone is not a trivial task for visually impaired users. People with low vision typically rely on screen readers and voice commands. However, depending on the situation, screen readers are not ideal because blind people may need their hearing for safety, and automatic recognition of voice commands is challenging in noisy environments. Novel smart watch technologies provide an interesting opportunity to design new forms of user interaction with mobile phones. We present our first work towards the realization of a system, based on the combination of a mobile phone and a smart watch for gesture control, for assisting people with low vision during daily life activities. More specifically, we propose a novel approach for gesture recognition which is based on global alignment kernels and is shown to be effective in the challenging scenario of user-independent recognition. This method is used to build a gesture-based user interaction module and is embedded into a system targeted at visually impaired users, which will also integrate several other modules. We present two of them: one for identifying wet floor signs, the other for automatic recognition of predefined logos.
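
The paper's full method is not reproduced on this page, but as a rough illustration of the global alignment kernel the abstract refers to (a soft variant of the dynamic time warping listed in the author tags), the following Python sketch computes the basic GAK recursion between two accelerometer sequences. The function name, the Gaussian local similarity, and the bandwidth parameter sigma are assumptions made for illustration rather than the authors' implementation; practical implementations typically work in log space to avoid numerical underflow.

    import numpy as np

    def global_alignment_kernel(x, y, sigma=1.0):
        """Basic global alignment kernel between two multivariate time series.

        x: (n, d) array, y: (m, d) array, e.g. windows of 3-axis accelerometer
        samples recorded on the smart watch. The recursion sums a Gaussian
        local similarity over all monotonic alignments of the two sequences.
        """
        n, m = len(x), len(y)
        # Pairwise squared Euclidean distances between samples of x and y.
        sq = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
        # Gaussian local similarity (the bandwidth sigma is a free parameter).
        kappa = np.exp(-sq / (2.0 * sigma ** 2))
        # M[i, j] accumulates the kernel value over all alignments of the
        # first i samples of x with the first j samples of y.
        M = np.zeros((n + 1, m + 1))
        M[0, 0] = 1.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                M[i, j] = kappa[i - 1, j - 1] * (
                    M[i - 1, j] + M[i, j - 1] + M[i - 1, j - 1]
                )
        return M[n, m]

    # Hypothetical usage: a precomputed kernel matrix over recorded gestures,
    # which could then be fed to a kernel classifier such as an SVM.
    # gestures = [np.random.randn(40, 3), np.random.randn(55, 3), ...]
    # K = np.array([[global_alignment_kernel(a, b) for b in gestures]
    #               for a in gestures])

A kernel classifier trained on such a matrix is one straightforward way to approach the user-independent recognition the abstract describes, although the paper's local kernel, normalization, and classifier may differ.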

Published In

IMMPD '13: Proceedings of the 3rd ACM international workshop on Interactive multimedia on mobile & portable devices
October 2013
50 pages
ISBN: 9781450323994
DOI: 10.1145/2505483
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. accelerometer-based gesture recognition
  2. dynamic time warping
  3. smart watch
  4. visual impairments

Qualifiers

  • Research-article

Conference

MM '13: ACM Multimedia Conference
October 22, 2013
Barcelona, Spain

Acceptance Rates

IMMPD '13 Paper Acceptance Rate: 7 of 14 submissions, 50%
Overall Acceptance Rate: 7 of 14 submissions, 50%
