Research Article
DOI: 10.1145/2984511.2984582

ViBand: High-Fidelity Bio-Acoustic Sensing Using Commodity Smartwatch Accelerometers

Published: 16 October 2016

Abstract

Smartwatches and wearables are unique in that they reside on the body, presenting great potential for always-available input and interaction. Their position on the wrist makes them ideal for capturing bio-acoustic signals. We developed a custom smartwatch kernel that boosts the sampling rate of a smartwatch's existing accelerometer to 4 kHz. Using this new source of high-fidelity data, we uncovered a wide range of applications. For example, we can use bio-acoustic data to classify hand gestures such as flicks, claps, scratches, and taps, which combine with on-device motion tracking to create an expressive set of input modalities. Bio-acoustic sensing can also detect the vibrations of grasped mechanical or motor-powered objects, enabling passive object recognition that can augment everyday experiences with context-aware functionality. Finally, we can generate structured vibrations using a transducer, and show that data can be transmitted through the human body. Overall, our contributions unlock user interface techniques that previously relied on special-purpose and/or cumbersome instrumentation, making such interactions considerably more feasible for inclusion in future consumer devices.
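The abstract sketches a pipeline: sample the watch's accelerometer at 4 kHz, then treat the stream as an audio-like signal whose spectrum discriminates gestures and grasped objects. The paper's actual implementation is not given here, so below is a minimal illustrative sketch in Python. Only the 4 kHz rate comes from the paper; the window size, Hann taper, normalized-FFT features, synthetic 200 Hz test signal, and the `spectral_features` helper are all hypothetical choices for illustration, not ViBand's code.

```python
import numpy as np

# From the paper: the custom kernel samples the watch accelerometer at 4 kHz.
SAMPLE_RATE_HZ = 4000
# Hypothetical analysis window; the paper does not specify this value here.
WINDOW_SIZE = 256

def spectral_features(frame: np.ndarray) -> np.ndarray:
    """Turn one window of accelerometer samples into a normalized FFT
    magnitude vector, the kind of feature a gesture/object classifier
    could consume."""
    windowed = frame * np.hanning(len(frame))   # taper to reduce spectral leakage
    magnitudes = np.abs(np.fft.rfft(windowed))  # one-sided magnitude spectrum
    return magnitudes / (np.linalg.norm(magnitudes) + 1e-9)  # amplitude-invariant

# Synthetic stand-in for a bio-acoustic signal: a 200 Hz vibration plus noise.
t = np.arange(WINDOW_SIZE) / SAMPLE_RATE_HZ
frame = 0.5 * np.sin(2 * np.pi * 200.0 * t) + 0.01 * np.random.randn(WINDOW_SIZE)

features = spectral_features(frame)
peak_hz = np.argmax(features) * SAMPLE_RATE_HZ / WINDOW_SIZE
print(f"Dominant vibration component: ~{peak_hz:.0f} Hz")
```

For contrast, a stock accelerometer sampled at a typical rate of around 100 Hz can only represent content below its ~50 Hz Nyquist limit, so the 200 Hz component above would be invisible to it; that is the gap the boosted 4 kHz sampling closes.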

Supplementary Material

Supplemental video: suppl.mov (uist4645-file3.mp4)
MP4 file: p321-laput.mp4
MP4 file: uist-0410.mp4


Published In

UIST '16: Proceedings of the 29th Annual Symposium on User Interface Software and Technology
October 2016, 908 pages
ISBN: 9781450341899
DOI: 10.1145/2984511

Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. gestures
    2. object detection
    3. vibro-tags
    4. wearables

Conference

UIST '16

Acceptance Rates

UIST '16 paper acceptance rate: 79 of 384 submissions (21%)
Overall acceptance rate: 561 of 2,567 submissions (22%)
