
Gromov et al., 2019 - Google Patents

Proximity human-robot interaction using pointing gestures and a wrist-mounted IMU


Document ID
10345054958896876881
Authors
Gromov B
Abbate G
Gambardella L
Giusti A
Publication year
2019
Publication venue
2019 International Conference on Robotics and Automation (ICRA)


Snippet

We present a system for interaction between co-located humans and mobile robots, which uses pointing gestures sensed by a wrist-mounted IMU. The operator begins by pointing, for a short time, at a moving robot. The system thus simultaneously determines: that the …
Full text available at idsia-robotics.github.io (PDF)
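The snippet describes identifying which robot the operator is pointing at while that robot moves. One way this kind of identification can work, sketched here as a hypothetical illustration (the function, its inputs, and the cosine-similarity scoring are assumptions for this example, not the authors' published method), is to compare the IMU-derived pointing directions over a short window against each candidate robot's bearing from the operator and pick the best match:

```python
import numpy as np

def identify_pointed_robot(pointing_dirs, robot_tracks, operator_pos):
    """Pick the robot whose bearing from the operator best matches
    the pointing directions over a short time window.

    pointing_dirs: (T, 3) unit vectors estimated from the wrist IMU
                   (hypothetical input format).
    robot_tracks:  dict robot_id -> (T, 3) robot positions over the
                   same window (assumed known, e.g. from each robot's
                   own localization).
    operator_pos:  (3,) operator position.
    """
    best_id, best_score = None, -np.inf
    for rid, track in robot_tracks.items():
        # Unit bearing vectors from the operator to the robot.
        bearings = track - operator_pos
        bearings /= np.linalg.norm(bearings, axis=1, keepdims=True)
        # Mean cosine similarity between pointed and actual bearings.
        score = np.mean(np.sum(pointing_dirs * bearings, axis=1))
        if score > best_score:
            best_id, best_score = rid, score
    return best_id, best_score
```

Because the robot is moving, the pointing direction and the true bearing co-vary over the window, which is what lets a single gesture both select and help localize the robot.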

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1674: Programme controls characterised by safety, monitoring, diagnostic
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0225: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • G05D 1/0255: Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D 1/0268: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means

Similar Documents

Gromov et al. Proximity human-robot interaction using pointing gestures and a wrist-mounted IMU
US8577126B2 (en) System and method for cooperative remote vehicle behavior
US20210205986A1 (en) Teleoperating Of Robots With Tasks By Mapping To Human Operator Pose
US20090180668A1 (en) System and method for cooperative remote vehicle behavior
Monajjemi et al. Hri in the sky: Creating and commanding teams of uavs with a vision-mediated gestural interface
US9836117B2 (en) Autonomous drones for tactile feedback in immersive virtual reality
Monajjemi et al. UAV, come to me: End-to-end, multi-scale situated HRI with an uninstrumented human and a distant UAV
US20170348858A1 (en) Multiaxial motion control device and method, in particular control device and method for a robot arm
Yuan et al. Human gaze-driven spatial tasking of an autonomous MAV
Jevtić et al. Comparison of interaction modalities for mobile indoor robot guidance: Direct physical interaction, person following, and pointing control
CN106354161A (en) Robot motion path planning method
Gromov et al. Wearable multi-modal interface for human multi-robot interaction
Villani et al. A natural infrastructure-less human–robot interaction system
Gromov et al. Robot identification and localization with pointing gestures
Kalinov et al. WareVR: Virtual reality interface for supervision of autonomous robotic system aimed at warehouse stocktaking
Sathiyanarayanan et al. Gesture controlled robot for military purpose
Gromov et al. Intuitive 3D control of a quadrotor in user proximity with pointing gestures
Ibrahimov et al. DronePick: Object picking and delivery teleoperation with the drone controlled by a wearable tactile display
Stückler et al. Increasing flexibility of mobile manipulation and intuitive human-robot interaction in RoboCup@Home
Gromov et al. Guiding quadrotor landing with pointing gestures
CN104656676B (en) Servo-control apparatus and method for the hand, leg, and eye of a humanoid robot
Galbraith et al. A neural network-based exploratory learning and motor planning system for co-robots
Quesada et al. Holo-SpoK: Affordance-aware augmented reality control of legged manipulators
Jorgensen et al. Cockpit interface for locomotion and manipulation control of the NASA Valkyrie humanoid in virtual reality (VR)
Zhang et al. An egocentric vision based assistive co-robot