Research Article
DOI: 10.1145/3544548.3580775

LipIO: Enabling Lips as both Input and Output Surface

Published: 19 April 2023

Abstract

We engineered LipIO, a novel device that enables the lips to be used simultaneously as an input and output surface. LipIO comprises two overlapping flexible electrode arrays: an outward-facing array for capacitive touch sensing and a lip-facing array for electrotactile stimulation. While wearing LipIO, users feel the interface's state via lip stimulation and respond by touching their lip with their tongue or opposing lip. More importantly, LipIO provides co-located tactile feedback that lets users feel where on the lip they are touching; this is key to enabling eyes- and hands-free interaction. Our three studies verified that participants wearing a five-electrode LipIO with co-located feedback perceived electrotactile output on their lips and then touched the target location with their tongue with an average accuracy of 93%. Finally, we demonstrate the potential of LipIO in four exemplary applications that illustrate how it enables new types of eyes- and hands-free micro-interactions.
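To make the co-located I/O principle concrete, below is a minimal sketch (in Python, not the authors' firmware) of how the two overlapping arrays could be driven: one loop reads the outward-facing capacitive pads, stimulates the lip-facing pad currently under the tongue for co-located feedback while also stimulating the pad that announces the interface's state, and reports the pad the user selects. The names NUM_PADS, read_capacitance, and set_stimulation are hypothetical placeholders for real driver code.

```python
# Minimal, illustrative sketch of LipIO's co-located input/output loop.
# NOT the authors' firmware: it assumes a five-pad array where each index
# pairs an outward-facing capacitive pad with a lip-facing stimulation pad.
# read_capacitance() and set_stimulation() are hypothetical hardware hooks.

NUM_PADS = 5
TOUCH_THRESHOLD = 0.5  # normalized capacitance above which a pad counts as touched


def read_capacitance(pad: int) -> float:
    """Hypothetical driver hook: normalized capacitance of outward-facing pad."""
    raise NotImplementedError("replace with the real capacitive-sensing driver")


def set_stimulation(pad: int, on: bool) -> None:
    """Hypothetical driver hook: toggle electrotactile pulses on a lip-facing pad."""
    raise NotImplementedError("replace with the real stimulation driver")


def select_with_colocated_feedback(target: int) -> int:
    """Announce `target` on the lip, then return the pad the user selects.

    While the tongue slides across the array, the pad under the tongue is
    also stimulated, so the user can feel where they are touching without
    looking -- the co-located feedback described in the abstract.
    """
    while True:
        touched = [p for p in range(NUM_PADS)
                   if read_capacitance(p) > TOUCH_THRESHOLD]
        for p in range(NUM_PADS):
            # Stimulate the announced target (output) plus any pad the
            # tongue is on (co-located feedback); all other pads stay off.
            set_stimulation(p, p == target or p in touched)
        # Simplification: treat the first stable single touch as the answer.
        # A real implementation would debounce and detect touch release.
        if len(touched) == 1:
            return touched[0]
```

Keeping sensing and stimulation on the same pad index is what makes the feedback co-located: the user's percept of "where I am touching" and the system's notion of "which pad is active" coincide by construction.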

Supplementary Material

MP4 File (3544548.3580775-video-figure.mp4)
Video Figure
MP4 File (3544548.3580775-video-preview.mp4)
Video Preview
MP4 File (3544548.3580775-talk-video.mp4)
Pre-recorded Video Presentation



Published In
    CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
April 2023, 14911 pages
ISBN: 9781450394215
DOI: 10.1145/3544548

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. Haptics
    2. lips
    3. on-skin interface
    4. tongue


    Funding Sources

• NSF (CAREER)

    Conference

    CHI '23

    Acceptance Rates

Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
