Ogawara et al., 2000 - Google Patents
Acquiring hand-action models in task and behavior levels by a learning robot through observing human demonstrations
- Document ID
- 14977209148909239553
- Author
- Ogawara K
- Takamatsu J
- Iba S
- Tanuki T
- Kimura H
- Ikeuchi K
- Publication year
- 2000
- Publication venue
- The IEEE-RAS Intl. Conf. on Humanoid Robots
Snippet
This paper describes our current research on learning tasks (what to do) and behaviors (how to do it) by a robot through observation of human demonstrations. We focus on human hand actions and represent such hand actions in symbolic behavior and task models. We …
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Similar Documents
Publication | Title
---|---
CN114080583B (en) | Visual teaching and repetitive movement manipulation system
Li et al. | Survey on mapping human hand motion to robotic hands for teleoperation
US8843236B2 (en) | Method and system for training a robot using human-assisted task demonstration
Sayour et al. | Autonomous robotic manipulation: real-time, deep-learning approach for grasping of unknown objects
US11969893B2 (en) | Automated personalized feedback for interactive learning applications
JPWO2003019475A1 (en) | Robot device, face recognition method, and face recognition device
US20220161424A1 (en) | Device and method for controlling a robotic device
WO2022014312A1 (en) | Robot control device and robot control method, and program
Ogawara et al. | Acquiring hand-action models in task and behavior levels by a learning robot through observing human demonstrations
Skoglund et al. | Programming by demonstration of pick-and-place tasks for industrial manipulators using task primitives
Lloyd et al. | Programming contact tasks using a reality-based virtual environment integrated with vision
Ben Abdallah et al. | Kinect-Based Sliding Mode Control for Lynxmotion Robotic Arm
CN115686193A (en) | Virtual model three-dimensional gesture control method and system in augmented reality environment
Wu et al. | Improving human-robot interactivity for tele-operated industrial and service robot applications
Jiang et al. | Flexible virtual fixture enhanced by vision and haptics for unstructured environment teleoperation
Pedrosa et al. | A skill-based architecture for pick and place manipulation tasks
Palla et al. | Position based visual servoing control of a wheelchair mounted robotic arm using parallel tracking and mapping of task objects
Infantino et al. | Visual control of a robotic hand
Liu et al. | Manipulating complex robot behavior for autonomous and continuous operations
Fang et al. | Learning from wearable-based teleoperation demonstration
Aleotti et al. | A multimodal user interface for remote object exploration in teleoperation systems
Elachkar | Robot Learning From Human Observation Using Deep Neural Networks
Bisson | Development of an interface for intuitive teleoperation of COMAU manipulator robots using RGB-D sensors
Vinayavekhin et al. | Representation and mapping of dexterous manipulation through task primitives
Schebor | Virtual environment for undersea telepresence