Zollner et al., 2001 - Google Patents
Integration of tactile sensors in a programming by demonstration system
- Document ID: 6064503030643636046
- Authors: Zollner R; Rogalla O; Dillmann R
- Publication year: 2001
- Publication venue: Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation (Cat. No. 01CH37164)
Snippet
Easy programming methods following the programming by demonstration (PbD) paradigm have been developed. The main goal of these systems is to allow an inexperienced human user to easily integrate motion and perception skills or complex problem solving strategies …
Classifications
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B19/00—Programme-control systems
        - G05B19/02—Programme-control systems electric
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/39—Robotics, robotics to robotics hand
          - G05B2219/40—Robotics, robotics mapping to robotics vision
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00362—Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; Recognising body parts, e.g. hand
        - G06K9/62—Methods or arrangements for recognition using electronic means
          - G06K9/6217—Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
Similar Documents
Publication | Title
---|---
Dillmann et al. | Learning robot behaviour and skills based on human demonstration and advice: the machine learning paradigm
Yang et al. | Haptics electromyography perception and learning enhanced intelligence for teleoperated robot
Wang et al. | Controlling object hand-over in human–robot collaboration via natural wearable sensing
Ehrenmann et al. | Programming service tasks in household environments by human demonstration
Chen et al. | A human–robot interface for mobile manipulator
Lee et al. | Interaction force estimation using camera and electrical current without force/torque sensor
Kang et al. | A robot system that observes and replicates grasping tasks
Zollner et al. | Integration of tactile sensors in a programming by demonstration system
Aleotti et al. | Position teaching of a robot arm by demonstration with a wearable input device
Pan et al. | Robot teaching system based on hand-robot contact state detection and motion intention recognition
Palm et al. | Recognition of human grasps by time-clustering and fuzzy modeling
Zhang et al. | A feasibility study on an intuitive teleoperation system combining IMU with sEMG sensors
Brock et al. | A framework for learning and control in intelligent humanoid robots
Ku et al. | Modeling objects as aspect transition graphs to support manipulation
Gäbert et al. | Gesture based symbiotic robot programming for agile production
Nogales et al. | A proposal for Hand gesture control applied to the KUKA youBot using motion tracker sensors and machine learning algorithms
Dillmann et al. | Interactive natural programming of robots: Introductory overview
Iossifidis et al. | Anthropomorphism as a pervasive design concept for a robotic assistant
Huang et al. | Robot learning from demonstration with tactile signals for geometry-dependent tasks
Fresnillo et al. | A method for understanding and digitizing manipulation activities using programming by demonstration in robotic applications
Ehrenmann et al. | Interaction with robot assistants: Commanding ALBERT
Dillman et al. | Human friendly programming of humanoid robots—The German collaborative research center
Steil et al. | Learning issues in a multi-modal robot-instruction scenario
Kuniyoshi et al. | Haptic detection of object affordances by a multi-fingered robot hand
Chen et al. | Teaching manipulation skills to a robot through a haptic rendered virtual environment