DOI: 10.1145/1957656.1957802

Humanoid robot control using depth camera

Published: 06 March 2011

Abstract

Most human interactions with the environment depend on our ability to navigate freely and to use our hands and arms to manipulate objects. Developing natural means of controlling these abilities in humanoid robots can significantly broaden the usability of such platforms. An ideal interface for humanoid robot teleoperation would be inexpensive, person-independent, and easy to use, requiring no wearable equipment and little or no user training.
This work presents a new humanoid robot control and interaction interface that uses depth images and skeletal tracking software to control the navigation, gaze and arm gestures of a humanoid robot. To control the robot, the user stands in front of a depth camera and assumes a specific pose to initiate skeletal tracking. The user's initial location automatically becomes the origin of the control coordinate system. The user can then use leg and arm gestures to turn the robot's motors on and off, to switch operation modes and to control the behavior of the robot. We present two control modes. The body control mode enables the user to control the robot's arms and navigation direction with the user's own arm movements and body location, respectively. The gaze direction control mode enables the user to control the robot's focus of attention by pointing with one hand while giving commands through gestures with the other hand. We present a demonstration of this interface in which a combination of these two control modes successfully enables an Aldebaran Nao robot to carry an object from one location to another. Our work makes use of the Microsoft Kinect depth sensor.
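To make the kind of mapping described above concrete, the sketch below shows one hypothetical way tracked skeleton joints could be converted into arm joint angles and a navigation command. It is illustrative only, not the paper's implementation: the joint layout, axis conventions, dead-zone threshold and example positions are assumptions, and a real system would read the skeleton stream from the Kinect tracking software and send the resulting targets to the Nao's motion API.

```python
# Minimal, hypothetical sketch of skeleton-to-robot mapping (not the paper's
# code): joint names, axis conventions and thresholds are illustrative
# assumptions only.
import numpy as np

def unit(v):
    """Normalize a 3-D vector, guarding against zero length."""
    n = np.linalg.norm(v)
    return v / n if n > 1e-6 else v

def arm_angles(shoulder, elbow, hand):
    """Approximate shoulder pitch/roll and elbow flexion (radians) from three
    tracked joint positions in a camera frame (x right, y up, z forward)."""
    upper = unit(elbow - shoulder)                      # upper-arm direction
    fore = unit(hand - elbow)                           # forearm direction
    pitch = np.arctan2(-upper[1], upper[2])             # raise/lower the arm
    roll = np.arcsin(np.clip(upper[0], -1.0, 1.0))      # move the arm sideways
    flex = np.arccos(np.clip(np.dot(upper, fore), -1.0, 1.0))  # elbow bend
    return pitch, roll, flex

def walk_command(torso, origin, deadzone=0.15):
    """Map the user's displacement from the calibration origin (captured when
    tracking starts) to a forward/sideways step command, with a dead zone so
    small shifts of weight do not move the robot."""
    offset = torso - origin
    forward = -offset[2] if abs(offset[2]) > deadzone else 0.0  # toward camera
    sideways = offset[0] if abs(offset[0]) > deadzone else 0.0
    return forward, sideways

# Example with made-up joint positions (metres, camera frame):
origin = np.array([0.0, 0.9, 2.0])
torso = np.array([0.1, 0.9, 1.7])
shoulder = np.array([0.2, 1.3, 1.7])
elbow = np.array([0.2, 1.1, 1.5])
hand = np.array([0.2, 1.1, 1.2])

print(arm_angles(shoulder, elbow, hand))   # angles for the robot's arm
print(walk_command(torso, origin))         # step command for navigation
```

In an interface like the one described, such targets would be streamed continuously to the robot while dedicated gestures switch between the body control and gaze direction control modes.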

Supplementary Material

JPG File (vid112r.jpg)
MP4 File (vid112r.mp4)




    Published In

    HRI '11: Proceedings of the 6th International Conference on Human-Robot Interaction
    March 2011
    526 pages
    ISBN: 9781450305617
    DOI: 10.1145/1957656

    In-Cooperation

    • IEEE Robotics and Automation Society (RA)
    • Human Factors and Ergonomics Society
    • Association for the Advancement of Artificial Intelligence (AAAI)
    • IEEE Systems, Man and Cybernetics Society

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 06 March 2011


    Author Tags

    1. control
    2. depth camera
    3. human-robot interaction
    4. humanoid
    5. interface
    6. robotics
    7. robots
    8. teleoperation

    Qualifiers

    • Short-paper

    Conference

    HRI'11

    Acceptance Rates

    Overall Acceptance Rate: 268 of 1,124 submissions (24%)


    Article Metrics

    • Downloads (last 12 months): 7
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 21 Nov 2024


    Cited By

    • (2024) A Gesture-based Interactive System for Automated Material Handling Vehicles: Implementation and Comparative Study. Proceedings of the 12th International Conference on Human-Agent Interaction, 176-184. DOI: 10.1145/3687272.3688304. Online publication date: 24-Nov-2024.
    • (2022) Phyx.io: Expert-Based Decision Making for the Selection of At-Home Rehabilitation Solutions for Active and Healthy Aging. International Journal of Environmental Research and Public Health, 19(9), 5490. DOI: 10.3390/ijerph19095490. Online publication date: 1-May-2022.
    • (2020) Introducing the NEMO-Lowlands iconic gesture dataset, collected through a gameful human–robot interaction. Behavior Research Methods. DOI: 10.3758/s13428-020-01487-0. Online publication date: 19-Oct-2020.
    • (2020) MarioControl: An Intuitive Control Method for a Mobile Robot from a Third-Person Perspective. Companion Proceedings of the 2020 Conference on Interactive Surfaces and Spaces, 9-13. DOI: 10.1145/3380867.3426205. Online publication date: 8-Nov-2020.
    • (2019) A Humanoid Robot Object Perception Approach Using Depth Images. 2019 IEEE National Aerospace and Electronics Conference (NAECON), 437-442. DOI: 10.1109/NAECON46414.2019.9057808. Online publication date: Jul-2019.
    • (2019) Autonomous Color Based Object Tracking of a Hexapod with Efficient Intuitive Characteristics. 2019 IEEE International Symposium on Measurement and Control in Robotics (ISMCR), D3-4-1-D3-4-7. DOI: 10.1109/ISMCR47492.2019.8955728. Online publication date: Sep-2019.
    • (2019) Deep Correspondence Learning for Effective Robotic Teleoperation using Virtual Reality. 2019 IEEE-RAS 19th International Conference on Humanoid Robots (Humanoids), 477-483. DOI: 10.1109/Humanoids43949.2019.9035031. Online publication date: Oct-2019.
    • (2018) Real-Time Whole-Body Imitation by Humanoid Robots and Task-Oriented Teleoperation Using an Analytical Mapping Method and Quantitative Evaluation. Applied Sciences, 8(10), 2005. DOI: 10.3390/app8102005. Online publication date: 22-Oct-2018.
    • (2016) How would you gesture navigate a drone? Proceedings of the 20th International Academic Mindtrek Conference, 113-121. DOI: 10.1145/2994310.2994348. Online publication date: 17-Oct-2016.
    • (2016) Evaluation of an Inverse-Kinematics Depth-Sensing Controller for Operation of a Simulated Robotic Arm. Design, User Experience, and Usability: Technological Contexts, 373-381. DOI: 10.1007/978-3-319-40406-6_36. Online publication date: 22-Jun-2016.
