A framework for tool cognition in robots without prior tool learning or observation

Tee et al., 2022

Document ID: 15841632774882831208
Authors: Tee K, Cheong S, Li J, Ganesh G
Publication year: 2022
Publication venue: Nature Machine Intelligence

Snippet

Human tool use prowess distinguishes us from other animals. In many scenarios, a human is able to recognize objects seen for the first time as potential tools for a task and use them without requiring any learning. Here we propose a framework to enable similar abilities in …
Continue reading at www.nature.com

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/62 - Methods or arrangements for recognition using electronic means
    • G06K 9/6217 - Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/36 - Image preprocessing, i.e. processing the image information without deciding about the identity of the image
    • G06K 9/46 - Extraction of features or characteristics of the image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06N - COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computer systems based on biological models
    • G06N 3/02 - Computer systems based on biological models using neural network models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06N - COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 99/00 - Subject matter not provided for in other groups of this subclass
    • G06N 99/005 - Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/50 - Computer-aided design
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06N - COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 5/00 - Computer systems utilising knowledge based models
    • G06N 5/04 - Inference methods or devices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06F - ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 19/00 - Digital computing or data processing equipment or methods, specially adapted for specific applications
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis

Similar Documents

Publication / Title
Newbury et al. Deep learning approaches to grasp synthesis: A review
Levine et al. Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection
Levine et al. End-to-end training of deep visuomotor policies
Finn et al. Deep visual foresight for planning robot motion
Saxena et al. Robotic grasping of novel objects using vision
Bohg et al. Data-driven grasp synthesis—a survey
Martinez-Hernandez et al. Feeling the shape: Active exploration behaviors for object recognition with a robotic hand
Tee et al. A framework for tool cognition in robots without prior tool learning or observation
Ottenhaus et al. Visuo-haptic grasping of unknown objects based on gaussian process implicit surfaces and deep learning
Chen et al. Learning robust real-world dexterous grasping policies via implicit shape augmentation
Pozzi et al. Hand closure model for planning top grasps with soft robotic hands
Staretu et al. Leap motion device used to control a real anthropomorphic gripper
Faria et al. Knowledge-based reasoning from human grasp demonstrations for robot grasp synthesis
Liu et al. Rdt-1b: a diffusion foundation model for bimanual manipulation
Xu et al. Dexterous manipulation from images: Autonomous real-world rl via substep guidance
Bütepage et al. From visual understanding to complex object manipulation
Tang et al. Selective object rearrangement in clutter
Matsushima et al. World robot challenge 2020–partner robot: a data-driven approach for room tidying with mobile manipulator
Zhang et al. Digital twin-enabled grasp outcomes assessment for unknown objects using visual-tactile fusion perception
Kent et al. Construction of a 3D object recognition and manipulation database from grasp demonstrations
Hu et al. Reboot: Reuse data for bootstrapping efficient real-world dexterous manipulation
Alaaudeen et al. Intelligent robotics harvesting system process for fruits grasping prediction
Li et al. Interactive learning for multi-finger dexterous hand: A model-free hierarchical deep reinforcement learning approach
de La Bourdonnaye et al. Stage-wise learning of reaching using little prior knowledge
Kang et al. Team Tidyboy at the WRS 2020: A modular software framework for home service robots