Hand-object interaction: From human demonstrations to robot manipulation
Carfì et al., 2021 - Google Patents
- Document ID
- 12675381783862767889
- Author
- Carfì A
- Patten T
- Kuang Y
- Hammoud A
- Alameh M
- Maiettini E
- Weinberg A
- Faria D
- Mastrogiovanni F
- Alenyà G
- Natale L
- Perdereau V
- Vincze M
- Billard A
- Publication year
- 2021
- Publication venue
- Frontiers in Robotics and AI
Snippet
Human-object interaction is of great relevance for robots to operate in human environments. However, state-of-the-art robotic hands are far from replicating human skills. It is, therefore, essential to study how humans use their hands to develop similar robotic capabilities. This …
Classifications
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F17/5009 — Computer-aided design using simulation
- G05B2219/39 — Robotics, robotics hand
- G05B2219/40 — Robotics, robotics mapping to robotics vision
- G06N3/02 — Computer systems based on biological models using neural network models
- G06N5/00 — Computer systems utilising knowledge based models
- G06N99/00 — Subject matter not provided for in other groups of this subclass
Similar Documents
Publication | Title
---|---
Yin et al. | Modeling, learning, perception, and control methods for deformable object manipulation
Fang et al. | Survey of imitation learning for robotic manipulation
Carfì et al. | Hand-object interaction: From human demonstrations to robot manipulation
Wang et al. | Real-virtual components interaction for assembly simulation and planning
Li et al. | Survey on mapping human hand motion to robotic hands for teleoperation
Bohg et al. | Data-driven grasp synthesis — a survey
Xu et al. | Robot teaching by teleoperation based on visual interaction and extreme learning machine
Khalil et al. | Dexterous robotic manipulation of deformable objects with multi-sensory feedback — a review
Zhou et al. | Advanced robot programming: A review
Zhang et al. | Industrial robot programming by demonstration
Huang et al. | Grasping novel objects with a dexterous robotic hand through neuroevolution
Kumar et al. | Contextual reinforcement learning of visuo-tactile multi-fingered grasping policies
Fang et al. | Vision-based posture-consistent teleoperation of robotic arm using multi-stage deep neural network
Kadalagere Sampath et al. | Review on human-like robot manipulation using dexterous hands
Zelensky et al. | Control system of collaborative robotic based on the methods of contactless recognition of human actions
Cichon et al. | Simulation-based user interfaces for digital twins: Pre-, in-, or post-operational analysis and exploration of virtual testbeds
Nguyen et al. | Merging physical and social interaction for effective human-robot collaboration
Lu et al. | Visual-tactile robot grasping based on human skill learning from demonstrations using a wearable parallel hand exoskeleton
Wang et al. | Learning robotic insertion tasks from human demonstration
Inamura et al. | HRP-2W: A humanoid platform for research on support behavior in daily life environments
Brock et al. | A framework for learning and control in intelligent humanoid robots
Li et al. | An incremental learning framework to enhance teaching by demonstration based on multimodal sensor fusion
Wang et al. | Robot programming by demonstration with a monocular RGB camera
Gu et al. | A Survey on Robotic Manipulation of Deformable Objects: Recent Advances, Open Challenges and New Frontiers
Ma et al. | Flexible robotic grasping strategy with constrained region in environment