Eiband et al., 2021 - Google Patents
Identification of common force-based robot skills from the human and robot perspective
- Document ID
- 11549803531131598986
- Author
- Eiband T
- Lee D
- Publication year
- 2021
- Publication venue
- 2020 IEEE-RAS 20th International Conference on Humanoid Robots (Humanoids)
Snippet
Learning from Demonstration (LfD) can significantly speed up the knowledge transfer from human to robot, which has been proven for relatively unconstrained actions such as pick and place. However, transferring contact or force-based skills (contact skills) to a robot is …
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6217—Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
- G06K9/6232—Extracting features by transforming the feature space, e.g. multidimensional scaling; Mappings, e.g. subspace methods
- G06K9/6247—Extracting features by transforming the feature space, e.g. multidimensional scaling; Mappings, e.g. subspace methods based on an approximation criterion, e.g. principal component analysis
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6267—Classification techniques
- G06K9/6268—Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39376—Hierarchical, learning, recognition and skill level and adaptation servo level
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6201—Matching; Proximity measures
- G06K9/6202—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
- G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/02—Gripping heads and other end effectors servo-actuated
Similar Documents
Publication | Title
---|---
Vahrenkamp et al. | Part-based grasp planning for familiar objects
Sahbani et al. | An overview of 3D object grasp synthesis algorithms
Racca et al. | Learning in-contact control strategies from demonstration
Lu et al. | Modeling grasp type improves learning-based grasp planning
Eiband et al. | Identification of common force-based robot skills from the human and robot perspective
Simão et al. | Natural control of an industrial robot using hand gesture recognition with neural networks
Huang et al. | A modular approach to learning manipulation strategies from human demonstration
Bhattacharjee et al. | Rapid categorization of object properties from incidental contact with a tactile sensing robot arm
Kadalagere Sampath et al. | Review on human-like robot manipulation using dexterous hands
Dang et al. | Robot learning of everyday object manipulations via human demonstration
Eiband et al. | Learning haptic exploration schemes for adaptive task execution
Franzel et al. | Detection of collaboration and collision events during contact task execution
Lin et al. | Grasp mapping using locality preserving projections and KNN regression
Palm et al. | Recognition of human grasps by time-clustering and fuzzy modeling
Perico et al. | Learning robust manipulation tasks involving contact using trajectory parameterized probabilistic principal component analysis
Qiu et al. | Hand pose-based task learning from visual observations with semantic skill extraction
Lin et al. | Robot learning from human demonstration with remote lead-through teaching
Romero et al. | Human-to-robot mapping of grasps
Wang et al. | Human intention estimation with tactile sensors in human-robot collaboration
Fresnillo et al. | A method for understanding and digitizing manipulation activities using programming by demonstration in robotic applications
Coppola et al. | Master of puppets: multi-modal robot activity segmentation from teleoperated demonstrations
Palm et al. | Grasp recognition by Time-clustering, Fuzzy Modeling, and Hidden Markov Models (HMM) - a comparative study
Zhang et al. | DexGrasp-Diffusion: Diffusion-based Unified Functional Grasp Synthesis Method for Multi-Dexterous Robotic Hands
Lu et al. | System of robot learning from multi-modal demonstration and natural language instruction
Kato et al. | Contact state recognition for selective cutting task of flexible objects