
DOI: 10.1145/3610977.3634958
Research article, HRI Conference Proceedings

Fast Perception for Human-Robot Handovers with Legged Manipulators

Published: 11 March 2024

Abstract

Deploying perception modules for human-robot handovers is challenging because they must be highly reactive, generalizable, and robust to work reliably across diverse cases. Further complications arise because each object can be handed over in a variety of ways, causing occlusions and viewpoint changes. On legged robots, deployment is particularly challenging because of limited computational resources and the image-space noise induced by locomotion. In this paper, we introduce an efficient, object-agnostic, real-time tracking framework designed specifically for human-to-robot handover tasks with a legged manipulator. The proposed method combines optical flow with Siamese-network-based tracking and depth segmentation in an adaptive Kalman filter framework. We show that our system outperforms the state of the art for tracking during human-to-robot handovers with our legged manipulator. We demonstrate its generalizability, reactivity, and robustness through experiments in different scenarios and through a user study. Additionally, since timing has been shown to matter more than spatial accuracy in human-robot handovers, we show that we come close to human timing performance during the approach phase, both in objective metrics and in subjective feedback from the participants of our user study.
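The abstract describes fusing multiple cues (optical flow, Siamese-network tracking, depth segmentation) inside an adaptive Kalman filter. As a rough, hypothetical sketch of the adaptation idea only (not the authors' implementation; all constants and the residual-based update rule are illustrative assumptions), a constant-velocity filter can adjust its measurement-noise covariance from post-fit residuals:

```python
import numpy as np

def make_cv_model(dt):
    # Constant-velocity state [x, y, vx, vy]; position-only measurements.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    return F, H

def adaptive_kf_step(x, P, z, F, H, Q, R, alpha=0.3):
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Innovation, its covariance, and the Kalman gain.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    # Update.
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    # Residual-based adaptation of the measurement noise: blend R
    # toward the empirical residual covariance (illustrative rule).
    eps = z - H @ x  # post-update residual
    R = alpha * R + (1 - alpha) * (np.outer(eps, eps) + H @ P @ H.T)
    return x, P, R

# Track a point moving at constant velocity with noisy detections.
rng = np.random.default_rng(0)
dt = 0.1
F, H = make_cv_model(dt)
x, P = np.zeros(4), np.eye(4)
Q, R = 1e-3 * np.eye(4), np.eye(2)
for k in range(100):
    true_pos = np.array([0.5 * k * dt, 0.2 * k * dt])
    z = true_pos + 0.05 * rng.standard_normal(2)
    x, P, R = adaptive_kf_step(x, P, z, F, H, Q, R)
print(np.round(x[:2], 2))  # estimated position near the true final position
```

Starting from a deliberately pessimistic R, the adaptation shrinks the assumed measurement noise toward the observed residual level, so the filter trusts detections more once tracking has stabilized.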

Supplemental Material

MP4 file: supplemental video



Published In

HRI '24: Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction
March 2024, 982 pages
ISBN: 9798400703225
DOI: 10.1145/3610977

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. human-robot handover
    2. legged robotics
    3. physical human-robot interaction


    Funding Sources

    • SNSF
    • National Centre of Competence in Research (NCCR) Digital Fabrication

Conference

HRI '24

Acceptance Rates

Overall Acceptance Rate: 268 of 1,124 submissions, 24%

