

Robot identification and localization with pointing gestures

Gromov et al., 2018

Document ID: 9413185718909217900
Authors: Gromov B, Gambardella L, Giusti A
Publication year: 2018
Publication venue: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)


Snippet

We propose a novel approach to establish the relative pose of a mobile robot with respect to an operator that wants to interact with it; we focus on scenarios in which the robot is in the same environment as the operator, and is visible to them. The approach is based on …
Continue reading at www.researchgate.net (PDF)
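
The snippet is cut off before it states what the approach is based on, so the following is only a minimal sketch of the kind of alignment that the title and the related entries below (pointing gestures, wrist-mounted IMUs) suggest: fitting a planar rigid transform between the robot's odometry frame and the operator's frame so that the robot's reported positions fall on the operator's pointing rays. The scipy-based solver, the synthetic data, and the helper name ray_residuals are illustrative assumptions, not the paper's actual formulation.

    import numpy as np
    from scipy.optimize import least_squares

    def ray_residuals(params, robot_xy, ray_origins, ray_dirs):
        # 2D rigid transform (tx, ty, theta) from the robot's odometry frame
        # into the operator's frame.
        tx, ty, theta = params
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        p = robot_xy @ R.T + np.array([tx, ty])
        d = ray_dirs / np.linalg.norm(ray_dirs, axis=1, keepdims=True)
        v = p - ray_origins
        # Signed perpendicular distance of each transformed robot position to the
        # corresponding pointing ray (treated here as an infinite line, so a
        # 180-degree ambiguity can remain and would need a front/back check).
        return v[:, 0] * d[:, 1] - v[:, 1] * d[:, 0]

    # Synthetic data: the operator (origin of their own frame) points at the robot
    # at 20 moments while the robot logs its positions in its odometry frame.
    rng = np.random.default_rng(0)
    true_tx, true_ty, true_theta = 2.0, -1.0, 0.7
    robot_xy = rng.uniform(-3.0, 3.0, size=(20, 2))
    R_true = np.array([[np.cos(true_theta), -np.sin(true_theta)],
                       [np.sin(true_theta),  np.cos(true_theta)]])
    robot_in_operator = robot_xy @ R_true.T + np.array([true_tx, true_ty])
    ray_origins = np.zeros_like(robot_in_operator)
    ray_dirs = robot_in_operator + rng.normal(0.0, 0.02, robot_in_operator.shape)

    fit = least_squares(ray_residuals, x0=[0.0, 0.0, 0.0],
                        args=(robot_xy, ray_origins, ray_dirs))
    print("estimated (tx, ty, theta):", fit.x)  # ideally close to (2.0, -1.0, 0.7)

Because each observation constrains only the perpendicular distance to a line, a sketch like this needs several well-spread observations, and a check that the fitted positions lie in front of the operator to discard the mirrored solution.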

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Similar Documents

Publication and title
Gromov et al. Robot identification and localization with pointing gestures
Gromov et al. Proximity human-robot interaction using pointing gestures and a wrist-mounted IMU
US11772266B2 (en) Systems, devices, articles, and methods for using trained robots
Yuan et al. Human gaze-driven spatial tasking of an autonomous MAV
Kang et al. Flycam: Multitouch gesture controlled drone gimbal photography
CN106354161A (en) Robot motion path planning method
Gromov et al. Wearable multi-modal interface for human multi-robot interaction
Kalinov et al. Warevr: Virtual reality interface for supervision of autonomous robotic system aimed at warehouse stocktaking
Dwivedi et al. Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system
Stückler et al. Increasing flexibility of mobile manipulation and intuitive human-robot interaction in RoboCup@Home
Gromov et al. Intuitive 3D control of a quadrotor in user proximity with pointing gestures
Hebert et al. Supervised remote robot with guided autonomy and teleoperation (SURROGATE): a framework for whole-body manipulation
Li et al. Visual–inertial fusion-based human pose estimation: A review
Schmidt et al. An indoor RGB-D dataset for the evaluation of robot navigation algorithms
Gromov et al. Guiding quadrotor landing with pointing gestures
Wang et al. Robot programming by demonstration with a monocular RGB camera
Pellois et al. An inertial human upper limb motion tracking method for robot programming by demonstration
Jin et al. Human-robot interaction for assisted object grasping by a wearable robotic object manipulation aid for the blind
WO2021073733A1 (en) Method for controlling a device by a human
Andersh et al. Modeling visuo-motor control and guidance functions in remote-control operation
CN206913156U (en) An unmanned aerial vehicle
Gootjes-Dreesbach et al. Comparison of view-based and reconstruction-based models of human navigational strategy
Jin et al. A wearable robotic device for assistive navigation and object manipulation
Sorribes et al. Visual tracking of a jaw gripper based on articulated 3d models for grasping
Abbate et al. Pointing at moving robots: Detecting events from wrist IMU data