DOI: 10.1145/3411763.3451741

JINSense: Repurposing Electrooculography Sensors on Smart Glass for Midair Gesture and Context Sensing

Published: 08 May 2021

Abstract

In this work, we explore a new sensing technique for smart eyewear equipped with Electrooculography (EOG) sensors. We repurpose the EOG sensors embedded in JINS MEME smart eyewear, originally designed to detect eye movement, to detect midair hand gestures. We also explore the potential of sensing human proximity and rubbing actions, and of differentiating materials and objects, using these sensors. These newfound sensing capabilities enable various novel input and interaction scenarios for such wearable eyewear devices, whether worn on the body or resting on a desk.
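The abstract does not detail the recognition pipeline, but the general approach to gesture sensing from multi-channel electrode signals can be illustrated with a sketch. The following is a minimal, hypothetical example (not the authors' method): it summarizes a windowed multi-channel signal with simple per-channel statistics and classifies windows with a nearest-centroid rule, using synthetic data in place of real EOG recordings.

```python
import numpy as np

def window_features(window: np.ndarray) -> np.ndarray:
    """Summarize a (samples x channels) signal window with simple statistics.

    Per channel: mean absolute amplitude, standard deviation, and
    peak-to-peak range -- a crude, hypothetical feature set chosen only
    to separate coarse gestures from baseline noise.
    """
    return np.concatenate([
        np.mean(np.abs(window), axis=0),
        np.std(window, axis=0),
        np.ptp(window, axis=0),
    ])

class NearestCentroidGestures:
    """Tiny nearest-centroid classifier over window features."""

    def fit(self, windows, labels):
        feats = np.array([window_features(w) for w in windows])
        self.classes_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[lab == c for lab in labels]].mean(axis=0)
             for c in self.classes_]
        )
        return self

    def predict(self, window):
        f = window_features(window)
        dists = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.classes_[int(np.argmin(dists))]

# Synthetic stand-in for 3-channel EOG-like data: a "swipe" produces a
# large low-frequency deflection; "idle" is near-baseline noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)[:, None]

def swipe():
    return 50.0 * np.sin(2 * np.pi * t) + rng.normal(0, 2, (100, 3))

def idle():
    return rng.normal(0, 2, (100, 3))

clf = NearestCentroidGestures().fit(
    [swipe() for _ in range(5)] + [idle() for _ in range(5)],
    ["swipe"] * 5 + ["idle"] * 5,
)
print(clf.predict(swipe()))  # → swipe
```

A real system would of course work with actual electrode readings and a learned model; the sketch only shows the window-features-then-classify structure common to this kind of signal-based gesture sensing.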

Supplemental Material

MP4 File
Supplemental video


Cited By

  • (2024) Controlled and Real-Life Investigation of Optical Tracking Sensors in Smart Glasses for Monitoring Eating Behaviour using Deep Learning: Cross-Sectional Study (Preprint). JMIR mHealth and uHealth. https://doi.org/10.2196/59469. Online publication date: 12-Apr-2024.
  • (2023) Towards smart glasses for facial expression recognition using OMG and machine learning. Scientific Reports, 13(1). https://doi.org/10.1038/s41598-023-43135-5. Online publication date: 25-Sep-2023.
  • (2022) SmartLens. In Proceedings of the 28th Annual International Conference on Mobile Computing And Networking, 473-486. https://doi.org/10.1145/3495243.3560532. Online publication date: 14-Oct-2022.

Recommendations


Information

Published In

CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
May 2021
2965 pages
ISBN:9781450380959
DOI:10.1145/3411763
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Context sensing
  2. Electrooculography
  3. Midair gesture
  4. Smart eyewear

Qualifiers

  • Poster
  • Research
  • Refereed limited

Conference

CHI '21

Acceptance Rates

Overall acceptance rate: 1,558 of 5,301 submissions (29%)



Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months)27
  • Downloads (Last 6 weeks)4
Reflects downloads up to 16 Nov 2024
