
DOI: 10.1145/3171221.3171276

Expressing Robot Incapability

Published: 26 February 2018

Abstract

Our goal is to enable robots to express their incapability, and to do so in a way that communicates both what they are trying to accomplish and why they are unable to accomplish it. We frame this as a trajectory optimization problem: maximize the similarity between the motion expressing incapability and what would amount to successful task execution, while obeying the physical limits of the robot. We introduce and evaluate candidate similarity measures, and show that one in particular generalizes to a range of tasks, while producing expressive motions that are tailored to each task. Our user study supports that our approach automatically generates motions expressing incapability that communicate both what and why to end-users, and improve their overall perception of the robot and willingness to collaborate with it in the future.
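
To make the trajectory-optimization framing above concrete, the sketch below poses a toy version of the problem in Python with NumPy and SciPy: stay as close as possible to the motion that would have succeeded, while respecting the robot's physical limits. The successful trajectory, the joint limits, and the squared-distance "similarity" cost are illustrative assumptions only, not the paper's formulation or its candidate similarity measures.

    # Toy sketch of the framing in the abstract (NOT the paper's formulation):
    # find a feasible trajectory as similar as possible to the successful one.
    import numpy as np
    from scipy.optimize import minimize

    T, D = 20, 2                                    # 20 waypoints, 2 joints (toy setup)
    xi_success = np.linspace(0.0, 1.5, T)[:, None] * np.ones((1, D))  # hypothetical successful motion
    lower, upper = -1.0, 1.0                        # hypothetical joint limits

    def cost(xi_flat):
        # Negative "similarity": squared distance to the successful trajectory,
        # plus a small smoothness term over consecutive waypoints.
        xi = xi_flat.reshape(T, D)
        similarity = np.sum((xi - xi_success) ** 2)
        smoothness = np.sum(np.diff(xi, axis=0) ** 2)
        return similarity + 0.1 * smoothness

    bounds = [(lower, upper)] * (T * D)             # physical limits as box constraints
    x0 = np.clip(xi_success, lower, upper).ravel()  # start from the clipped successful motion
    res = minimize(cost, x0, method="L-BFGS-B", bounds=bounds)
    xi_incapability = res.x.reshape(T, D)           # feasible motion closest to success

In this toy setup the successful motion reaches past the joint limit, so the optimized trajectory tracks it as far as it can and then stalls at the limit: qualitatively, a motion that visibly attempts the task and fails where the robot's limits bite.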

Supplementary Material

MP4 File (fp1262.mp4)

Published In

HRI '18: Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
February 2018
468 pages
ISBN: 9781450349536
DOI: 10.1145/3171221
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 February 2018

Permissions

Request permissions for this article.

Author Tags

  1. expressive robot motion
  2. incapability
  3. trajectory optimization

Qualifiers

  • Research-article

Conference

HRI '18

Acceptance Rates

HRI '18 Paper Acceptance Rate: 49 of 206 submissions, 24%
Overall Acceptance Rate: 268 of 1,124 submissions, 24%

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 103
  • Downloads (Last 6 weeks): 13
Reflects downloads up to 01 Dec 2024

Cited By

  • (2024) Effects of Incoherence in Multimodal Explanations of Robot Failures. Companion Proceedings of the 26th International Conference on Multimodal Interaction, 6-10. DOI: 10.1145/3686215.3690155. Online publication date: 4-Nov-2024.
  • (2024) When to Explain? Exploring the Effects of Explanation Timing on User Perceptions and Trust in AI systems. Proceedings of the Second International Symposium on Trustworthy Autonomous Systems, 1-17. DOI: 10.1145/3686038.3686066. Online publication date: 16-Sep-2024.
  • (2024) Influence of Simulation and Interactivity on Human Perceptions of a Robot During Navigation Tasks. ACM Transactions on Human-Robot Interaction. DOI: 10.1145/3675784. Online publication date: 16-Jul-2024.
  • (2024) Encouraging Bystander Assistance for Urban Robots: Introducing Playful Robot Help-Seeking as a Strategy. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 2514-2529. DOI: 10.1145/3643834.3661505. Online publication date: 1-Jul-2024.
  • (2024) The Who in XAI: How AI Background Shapes Perceptions of AI Explanations. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-32. DOI: 10.1145/3613904.3642474. Online publication date: 11-May-2024.
  • (2024) From Agent Autonomy to Casual Collaboration: A Design Investigation on Help-Seeking Urban Robots. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3613904.3642389. Online publication date: 11-May-2024.
  • (2024) A Generalizable Architecture for Explaining Robot Failures Using Behavior Trees and Large Language Models. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 1038-1042. DOI: 10.1145/3610978.3640551. Online publication date: 11-Mar-2024.
  • (2024) Generative Expressive Robot Behaviors using Large Language Models. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 482-491. DOI: 10.1145/3610977.3634999. Online publication date: 11-Mar-2024.
  • (2024) Aligning Human and Robot Representations. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 42-54. DOI: 10.1145/3610977.3634987. Online publication date: 11-Mar-2024.
  • (2024) Reactive or Proactive? How Robots Should Explain Failures. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 413-422. DOI: 10.1145/3610977.3634963. Online publication date: 11-Mar-2024.
