Gromov et al., 2019 - Google Patents
Proximity human-robot interaction using pointing gestures and a wrist-mounted IMU
- Document ID
- 10345054958896876881
- Author
- Gromov B
- Abbate G
- Gambardella L
- Giusti A
- Publication year
- 2019
- Publication venue
- 2019 International Conference on Robotics and Automation (ICRA)
Snippet
We present a system for interaction between co-located humans and mobile robots, which uses pointing gestures sensed by a wrist-mounted IMU. The operator begins by pointing, for a short time, at a moving robot. The system thus simultaneously determines: that the …
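The snippet describes estimating what an operator points at from a wrist-mounted IMU's orientation. As a rough illustration only (not the paper's actual method), the sketch below casts a ray from an assumed shoulder position along the IMU-derived forearm direction and intersects it with the ground plane. The mounting convention (sensor +x axis along the forearm) and the shoulder height are hypothetical assumptions.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return 2.0 * np.dot(u, v) * u + (w * w - np.dot(u, u)) * v + 2.0 * w * np.cross(u, v)

def pointed_ground_location(q_wrist, shoulder=np.array([0.0, 0.0, 1.4])):
    """Intersect the forearm pointing ray with the ground plane z = 0.

    q_wrist: world-frame orientation of the wrist IMU, assuming the sensor's
             +x axis runs along the forearm toward the hand (hypothetical
             mounting convention).
    shoulder: assumed 3-D position of the operator's shoulder, in metres.
    Returns the pointed-at point on the ground, or None if the ray
    points horizontally or upward and never reaches the ground.
    """
    direction = quat_rotate(q_wrist, np.array([1.0, 0.0, 0.0]))
    if direction[2] >= 0.0:
        return None
    t = -shoulder[2] / direction[2]  # ray parameter at which z = 0
    return shoulder + t * direction

# Example: arm tilted 45 degrees downward (rotation about the y axis)
q = (np.cos(np.pi / 8), 0.0, np.sin(np.pi / 8), 0.0)
print(pointed_ground_location(q))  # point ~1.4 m in front of the operator
```

In the paper itself the pointed ray is additionally matched against the robot's known motion over time to identify and localize it; the plane intersection above is only the simplest geometric core of such a pipeline.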
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
  - B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    - B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
      - B25J9/00—Programme-controlled manipulators
        - B25J9/16—Programme controls
          - B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
      - G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
        - G05D1/02—Control of position or course in two dimensions
          - G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
            - G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
              - G05D1/0225—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
            - G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
            - G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
Similar Documents
Publication | Title
---|---
Gromov et al. | Proximity human-robot interaction using pointing gestures and a wrist-mounted IMU
US8577126B2 (en) | System and method for cooperative remote vehicle behavior
US20210205986A1 (en) | Teleoperating of robots with tasks by mapping to human operator pose
US20090180668A1 (en) | System and method for cooperative remote vehicle behavior
Monajjemi et al. | HRI in the sky: Creating and commanding teams of UAVs with a vision-mediated gestural interface
US9836117B2 (en) | Autonomous drones for tactile feedback in immersive virtual reality
Monajjemi et al. | UAV, come to me: End-to-end, multi-scale situated HRI with an uninstrumented human and a distant UAV
US20170348858A1 (en) | Multiaxial motion control device and method, in particular control device and method for a robot arm
Yuan et al. | Human gaze-driven spatial tasking of an autonomous MAV
Jevtić et al. | Comparison of interaction modalities for mobile indoor robot guidance: Direct physical interaction, person following, and pointing control
CN106354161A (en) | Robot motion path planning method
Gromov et al. | Wearable multi-modal interface for human multi-robot interaction
Villani et al. | A natural infrastructure-less human–robot interaction system
Gromov et al. | Robot identification and localization with pointing gestures
Kalinov et al. | WareVR: Virtual reality interface for supervision of autonomous robotic system aimed at warehouse stocktaking
Sathiyanarayanan et al. | Gesture controlled robot for military purpose
Gromov et al. | Intuitive 3D control of a quadrotor in user proximity with pointing gestures
Ibrahimov et al. | DronePick: Object picking and delivery teleoperation with the drone controlled by a wearable tactile display
Stückler et al. | Increasing flexibility of mobile manipulation and intuitive human-robot interaction in RoboCup@Home
Gromov et al. | Guiding quadrotor landing with pointing gestures
CN104656676B (en) | A kind of servo-controlled apparatus and method of anthropomorphic robot hand leg eye
Galbraith et al. | A neural network-based exploratory learning and motor planning system for co-robots
Quesada et al. | Holo-SpoK: Affordance-aware augmented reality control of legged manipulators
Jorgensen et al. | Cockpit interface for locomotion and manipulation control of the NASA Valkyrie humanoid in virtual reality (VR)
Zhang et al. | An egocentric vision based assistive co-robot