
DOI: 10.1145/2157689.2157746

Vision-based attention estimation and selection for social robot to perform natural interaction in the open world

Published: 05 March 2012

Abstract

In this paper, a novel vision system is proposed to estimate the attention of people from rich visual cues, enabling a social robot to perform natural interactions with multiple participants in public environments. The vision detection and recognition modules include multi-person detection and tracking, upper-body pose recognition, face and gaze detection, lip-motion analysis for speaking recognition, and facial expression recognition. A computational approach is proposed to generate a quantitative estimate of human attention. The vision system has been implemented on a robotic receptionist, "EVE", and encouraging results have been obtained.
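The abstract does not give the scoring formula, but the cues it lists (face and gaze, body pose, speaking, expression) suggest combining per-person detector outputs into a single attention value. A minimal sketch, assuming a weighted linear combination with illustrative weights (the `PersonCues` fields and weights are hypothetical, not the authors' published method):

```python
# Hypothetical sketch: fuse the visual cues named in the abstract into a
# quantitative attention score per person, then pick the most attentive
# participant. Weights and the linear form are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class PersonCues:
    face_toward_robot: float    # 0..1 confidence from face/gaze detection
    body_facing: float          # 0..1 from upper-body pose recognition
    speaking: float             # 0..1 from lip-motion analysis
    expression_positive: float  # 0..1 from facial expression recognition


def attention_score(c: PersonCues,
                    weights=(0.4, 0.2, 0.3, 0.1)) -> float:
    """Weighted linear combination of cues; stays in [0, 1] since
    the weights sum to 1 and each cue is in [0, 1]."""
    cues = (c.face_toward_robot, c.body_facing,
            c.speaking, c.expression_positive)
    return sum(w * v for w, v in zip(weights, cues))


def select_addressee(people: dict) -> str:
    """Pick the participant id with the highest estimated attention."""
    return max(people, key=lambda pid: attention_score(people[pid]))


people = {
    "p1": PersonCues(0.9, 0.8, 0.7, 0.5),  # facing robot and speaking
    "p2": PersonCues(0.2, 0.5, 0.0, 0.3),  # looking away, silent
}
print(select_addressee(people))  # p1: strongest combined cues
```

In a multi-party setting such a score would be recomputed every frame and smoothed over time before the robot commits to an addressee; the single-frame version above is only the fusion step.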

Supplementary Material

JPG File (lbr139.jpg)
WMV File (lbr139.wmv)


Cited By

  • (2023) LiteGaze: Neural architecture search for efficient gaze estimation. PLOS ONE 18(5): e0284814. DOI: 10.1371/journal.pone.0284814
  • (2023) L2CS-Net: Fine-Grained Gaze Estimation in Unconstrained Environments. 2023 8th International Conference on Frontiers of Signal Processing (ICFSP), pp. 98-102. DOI: 10.1109/ICFSP59764.2023.10372944
  • (2021) Effect of the projection of robot's talk information on the perception of communicating human. Advanced Robotics, pp. 1-14. DOI: 10.1080/01691864.2021.1964597
  • (2017) Sensing and Handling Engagement Dynamics in Human-Robot Interaction Involving Peripheral Computing Devices. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 556-567. DOI: 10.1145/3025453.3025469
  • (2013) Designing engagement-aware agents for multiparty conversations. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2233-2242. DOI: 10.1145/2470654.2481308
  • (2013) Building Companionship through Human-Robot Collaboration. Proceedings of the 5th International Conference on Social Robotics, LNCS 8239, pp. 1-7. DOI: 10.1007/978-3-319-02675-6_1
  • (2012) Towards more engaging telepresence by face tracking. Proceedings of the Workshop at SIGGRAPH Asia, pp. 137-141. DOI: 10.1145/2425296.2425320
  • (2012) Attention-based addressee selection for service and social robots to interact with multiple persons. Proceedings of the Workshop at SIGGRAPH Asia, pp. 131-136. DOI: 10.1145/2425296.2425319

      Published In

HRI '12: Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
March 2012
518 pages
ISBN: 9781450310635
DOI: 10.1145/2157689

      In-Cooperation

      • IEEE-RAS: Robotics and Automation

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. attention estimation
      2. social robots
      3. vision understanding

      Qualifiers

      • Abstract

      Conference

HRI '12: International Conference on Human-Robot Interaction
March 5-8, 2012
Boston, Massachusetts, USA

      Acceptance Rates

Overall acceptance rate: 268 of 1,124 submissions (24%)
