DOI: 10.1145/2700648.2811369
Poster

"Read That Article": Exploring Synergies between Gaze and Speech Interaction

Published: 26 October 2015

Abstract

Gaze information has the potential to benefit Human-Computer Interaction (HCI) tasks, particularly when combined with speech. Gaze can improve our understanding of the user's intention when used as a secondary input modality, or it can serve as the main input modality for users with some level of permanent or temporary impairment. In this paper we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios.
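
As a rough illustration of the gaze-speech synergy described above, the sketch below resolves a deictic spoken command such as "read that article" using the most recent gaze fixation. This is a minimal sketch, not the authors' implementation: the data structures, the two-second fixation window, and the element names are illustrative assumptions.

    # Minimal sketch (illustrative, not the paper's implementation): fuse a deictic
    # spoken command ("read that article") with the most recent gaze fixation to
    # decide which on-screen article "that" refers to.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Fixation:
        element_id: str   # UI element the gaze landed on (e.g. a headline)
        timestamp: float  # seconds since session start

    @dataclass
    class SpeechCommand:
        action: str       # e.g. "read"
        referent: str     # e.g. "that article" (deictic, needs resolution)
        timestamp: float

    def resolve_referent(cmd: SpeechCommand,
                         fixations: List[Fixation],
                         max_lag: float = 2.0) -> Optional[str]:
        """Return the gazed-at element closest in time to the spoken command,
        provided the fixation happened no more than `max_lag` seconds earlier."""
        candidates = [f for f in fixations
                      if 0.0 <= cmd.timestamp - f.timestamp <= max_lag]
        if not candidates:
            return None  # fall back to speech-only disambiguation
        return max(candidates, key=lambda f: f.timestamp).element_id

    # Usage: the user looks at article_42 and then says "read that article".
    fixations = [Fixation("article_17", 10.2), Fixation("article_42", 12.8)]
    cmd = SpeechCommand(action="read", referent="that article", timestamp=13.5)
    print(resolve_referent(cmd, fixations))  # -> article_42

The intent mirrors the synergy the abstract points to: speech carries the action, while gaze supplies the referent that the spoken words leave ambiguous.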




      Published In

      ASSETS '15: Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility
      October 2015
      466 pages
      ISBN: 9781450334006
      DOI: 10.1145/2700648
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 26 October 2015


      Author Tags

      1. fusion
      2. gaze
      3. multimodal
      4. speech

      Qualifiers

      • Poster

      Funding Sources

      • Marie Curie Industry-Academia Partnerships and Pathways: FP7-PEOPLE-2013-IAPP

      Conference

      ASSETS '15

      Acceptance Rates

      ASSETS '15 paper acceptance rate: 30 of 127 submissions (24%)
      Overall acceptance rate: 436 of 1,556 submissions (28%)



      Cited By

      • (2022) Integrating Gaze and Speech for Enabling Implicit Interactions. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-14. DOI: 10.1145/3491102.3502134. Online publication date: 29-Apr-2022.
      • (2020) Digital ethnography of home use of digital personal assistants. Behaviour & Information Technology 41(4), pp. 740-758. DOI: 10.1080/0144929X.2020.1834620. Online publication date: 17-Oct-2020.
      • (2019) Data analysis from cognitive games interaction in Smart TV applications for patients with Parkinson's, Alzheimer's, and other types of dementia. Artificial Intelligence for Engineering Design, Analysis and Manufacturing 33(4), pp. 442-457. DOI: 10.1017/S0890060419000386. Online publication date: 31-Dec-2019.
      • (2017) "Tell Your Day": Developing Multimodal Interaction Applications for Children with ASD. Universal Access in Human-Computer Interaction. Design and Development Approaches and Methods, pp. 525-544. DOI: 10.1007/978-3-319-58706-6_43. Online publication date: 16-May-2017.
      • (2016) Multi-Device Applications Using the Multimodal Architecture. Multimodal Interaction with W3C Standards, pp. 367-383. DOI: 10.1007/978-3-319-42816-1_17. Online publication date: 18-Nov-2016.
      • (2016) Applications of the Multimodal Interaction Architecture in Ambient Assisted Living. Multimodal Interaction with W3C Standards, pp. 271-291. DOI: 10.1007/978-3-319-42816-1_12. Online publication date: 18-Nov-2016.
      • (2016) On the Creation of a Persona to Support the Development of Technologies for Children with Autism Spectrum Disorder. Universal Access in Human-Computer Interaction. Users and Context Diversity, pp. 213-223. DOI: 10.1007/978-3-319-40238-3_21. Online publication date: 21-Jun-2016.
      • (2016) Interactive, Multi-device Visualization Supported by a Multimodal Interaction Framework: Proof of Concept. Human Aspects of IT for the Aged Population. Design for Aging, pp. 279-289. DOI: 10.1007/978-3-319-39943-0_27. Online publication date: 21-Jun-2016.
