
CN114683299A - Robot tool and method of operating the same - Google Patents

Robot tool and method of operating the same

Info

Publication number
CN114683299A
Authority
CN
China
Prior art keywords
tool
objects
plan
handling
plans
Prior art date
Legal status
Granted
Application number
CN202210215933.7A
Other languages
Chinese (zh)
Other versions
CN114683299B (en)
Inventor
鲁仙·出杏光
松冈伸太郎
沟口弘悟
Current Assignee
Mujin Technology
Original Assignee
Mujin Technology
Priority date
Filing date
Publication date
Priority claimed from US17/392,108 (US11981518B2)
Application filed by Mujin Technology
Priority to CN202210215933.7A
Publication of CN114683299A
Application granted
Publication of CN114683299B
Legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
        • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
            • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
                • B25J 9/00 Programme-controlled manipulators
                    • B25J 9/16 Programme controls
                        • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
                            • B25J 9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
                        • B25J 9/1679 Programme controls characterised by the tasks executed
                • B25J 11/00 Manipulators not otherwise provided for
                • B25J 15/00 Gripping heads and other end effectors
                    • B25J 15/02 Gripping heads and other end effectors servo-actuated
        • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
            • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
                • B65G 1/00 Storing articles, individually or in orderly arrangement, in warehouses or magazines
                    • B65G 1/02 Storage devices
                        • B65G 1/04 Storage devices mechanical
                • B65G 37/00 Combinations of mechanical conveyors of the same kind, or of different kinds, of interest apart from their application in particular machines or use in particular manufacturing processes
                • B65G 43/00 Control devices, e.g. for safety, warning or fault-correcting
                • B65G 47/00 Article or material-handling devices associated with conveyors; Methods employing such devices
                    • B65G 47/74 Feeding, transfer, or discharging devices of particular kinds or types
                        • B65G 47/90 Devices for picking-up and depositing articles or materials
                            • B65G 47/902 Devices for picking-up and depositing articles or materials provided with drive systems incorporating rotary and rectilinear movements
                            • B65G 47/91 Devices for picking-up and depositing articles or materials incorporating pneumatic, e.g. suction, grippers
                • B65G 67/00 Loading or unloading vehicles
                    • B65G 67/02 Loading or unloading land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Manipulator (AREA)

Abstract

A system and method for operating a robotic system to manipulate an object using two or more tools. The robotic system may coordinate a planning process and/or planning implementation based on grouping the objects and available manipulation tools. The robotic system may use two or more tools to coordinate the planning process and/or implementation to reduce task completion time and/or task error rates.

Description

Robot tool and method of operating the same
This application is a divisional application of Chinese application CN202180004739.8, filed on November 3, 2021, and entitled "Robot tool and method of operating the same."
Cross Reference to Related Applications
This application claims the benefit of U.S. provisional patent application serial No. 63/109,870, filed on November 5, 2020, which is incorporated herein by reference in its entirety.
Technical Field
The present technology relates generally to robotic systems and, more particularly, to robotic tools configured to grasp and hold objects.
Background
Robots (e.g., machines configured to automatically/autonomously perform physical actions) are now widely used in many areas. Robots may be used, for example, to perform various tasks (e.g., manipulating or handling objects) in manufacturing, packaging, transporting, and/or shipping, etc. In performing such tasks, robots can replicate human actions, thereby replacing or reducing the human involvement otherwise required for dangerous or repetitive tasks. However, robots often lack the sophistication necessary to replicate the human sensitivity and/or adaptability needed for more complex tasks. For example, robots often have difficulty grasping objects that are in sub-optimal positions or poses. Accordingly, there remains a need for improved robotic systems and techniques for handling objects using sets of grasping tools.
Drawings
Fig. 1 is an illustration of an exemplary environment in which a robotic system transports objects in accordance with one or more embodiments of the present technique.
Fig. 2 is a block diagram illustrating a robotic system in accordance with one or more embodiments of the present technology.
Fig. 3 is a top view of a robotic system, according to one or more embodiments of the present technique.
Fig. 4A is an illustration of an exemplary handling unit in accordance with one or more embodiments of the present technology.
Fig. 4B is an illustration of an exemplary tool set in accordance with one or more embodiments of the present technology.
Fig. 5A is an illustration of a standard grip scenario in accordance with one or more embodiments of the present technology.
Fig. 5B is an illustration of an angled grip scenario in accordance with one or more embodiments of the present technology.
Fig. 6A is an illustration of a standard release scenario in accordance with one or more embodiments of the present technology.
Fig. 6B is an illustration of a first angled release scenario in accordance with one or more embodiments of the present technology.
Fig. 6C is an illustration of a second angled release scenario in accordance with one or more embodiments of the present technology.
Fig. 7 is an exemplary timing diagram in accordance with one or more embodiments of the present technology.
Fig. 8 is a flow diagram for operating a robotic system, in accordance with some embodiments of the present technique.
Detailed Description
Systems and methods for selecting, modifying, and using end effector toolsets are described herein. For example, a robotic system may include a transport robot configured to selectively connect to an end effector tool. The robotic system may select and connect to different end effector tools to grasp and carry objects. The robotic system may access and select from a set of tools. In some embodiments, the tool set may include standard fixed tools, fixed angle tools, and/or flexible head tools. A standard fixed tool may have an end effector rigidly attached to the tool arm and may be configured to grasp an object whose general orientation or top surface is orthogonal to the orientation of the tool arm. A fixed angle tool may be configured with a non-orthogonal angle between the tool arm and the gripper head/interface. A flexible or adjustable tool head may be configured such that the relative angle/orientation between the tool arm and the gripper head/interface can be adjusted according to the pose of the target object.
In some embodiments, the robotic system may select an end effector tool based on: simulating and/or planning the handling of the object with a candidate tool; deriving metrics for the handling (e.g., cost and/or benefit); deriving a set of plans for a set of target objects; and/or selecting a combination/sequence of plans that optimizes a corresponding metric (e.g., total handling time and/or estimated loss/error rate). Additionally or alternatively, the robotic system may implement a selected plan in parallel with the plan derivation for subsequent objects.
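A rough sketch of this selection logic follows; it scores a simulated plan for each candidate tool and keeps the lowest-cost feasible option. All names here (PlanResult, simulate_plan, the cost weighting) are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical tool-selection loop: simulate a handling plan per candidate
# tool, score it on time and estimated error, and keep the best option.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class PlanResult:
    feasible: bool           # e.g., no collision predicted in simulation
    handling_time_s: float   # estimated seconds to execute the plan
    error_rate: float        # estimated probability of a grasp/drop error

def select_tool(target_object, tools, simulate_plan: Callable) -> Optional[object]:
    """Return the tool whose simulated plan minimizes a combined cost."""
    best_tool, best_cost = None, float("inf")
    for tool in tools:
        result: PlanResult = simulate_plan(target_object, tool)
        if not result.feasible:
            continue  # e.g., the tool/arm would collide with a bin wall
        # Weighted cost: completion time plus a penalty for estimated loss.
        cost = result.handling_time_s + 10.0 * result.error_rate
        if cost < best_cost:
            best_tool, best_cost = tool, cost
    return best_tool
```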
Using the tool set (e.g., a plurality of selectable tools) and/or coordinating the planning and tool usage across a set of objects (e.g., rather than handling individual objects as they are detected) enables reduced resource consumption and/or fewer errors. In some embodiments, the planning and validation process may be completed in a duration (e.g., one second or less) that is shorter than the duration (e.g., one to five seconds) necessary to execute/implement the plan and/or the duration (e.g., five to ten seconds) necessary to change the tool. Thus, by planning the handling for a set of multiple objects (e.g., rather than planning for one object at a time), the robotic system may derive a set of plans that minimizes the highest-cost actions (e.g., in time/duration), such as tool change operations.
In the following description, numerous specific details are set forth to provide a thorough understanding of the presently disclosed technology. In other embodiments, the techniques introduced herein may be practiced without these specific details. In other instances, well-known features, such as specific functions or routines, are not described in detail in order to avoid unnecessarily obscuring the present disclosure. Reference in this specification to "an embodiment," "one embodiment," or similar language means that a particular feature, structure, material, or characteristic described is included in at least one embodiment of the disclosure. Thus, the appearances of such phrases in this specification are not necessarily all referring to the same embodiment. On the other hand, such references are not necessarily mutually exclusive. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments. It is to be understood that the various embodiments shown in the figures are merely illustrative representations and are not necessarily drawn to scale.
For the sake of brevity, several details describing structures or processes that are well known and often associated with robotic systems and subsystems and that may unnecessarily obscure some important aspects of the disclosed technology are not set forth in the following description. Furthermore, while the following disclosure sets forth several embodiments of different aspects of the technology, several other embodiments may have configurations or components different from those described in this section. Accordingly, the disclosed technology may have other embodiments with additional elements or without several of the elements described below.
Many of the embodiments or aspects of the disclosure described below can be in the form of computer-or controller-executable instructions, including routines executed by a programmable computer or controller. One skilled in the relevant art will appreciate that the disclosed techniques can be practiced on computer or controller systems other than those shown and described below. The techniques described herein may be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions described below. Thus, the terms "computer" and "controller" as generally used herein refer to any data processor and may include internet appliances and hand-held devices (including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, mini-computers, and the like). The information processed by these computers and controllers may be presented at any suitable display medium, including a Liquid Crystal Display (LCD). Instructions for performing computer or controller-executable tasks may be stored in or on any suitable computer-readable medium including hardware, firmware, or a combination of hardware and firmware. The instructions may be embodied in any suitable memory device, including, for example, a flash drive, a USB device, and/or other suitable media, including tangible, non-transitory computer-readable media.
The terms "coupled" and "connected," along with their derivatives, may be used herein to describe structural relationships between components. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, "connected" may be used to indicate that two or more elements are in direct contact with each other. Unless otherwise stated in context, the term "coupled" may be used to indicate that two or more elements are in direct or indirect (with other intervening elements between them) contact each other, or that two or more elements cooperate or interact with each other (e.g., as in a causal relationship, such as for signal transmission/reception or for function calls), or both.
Suitable environment
Fig. 1 is an illustration of an example environment in which a robotic system 100 transports objects in accordance with one or more embodiments of the present technique. The robotic system 100 may include and/or communicate with one or more units (e.g., robots) configured to perform one or more tasks. Aspects of tool selection and usage may be practiced or implemented by various units.
For the example shown in fig. 1, the robotic system 100 may include an unloading unit 102, a handling unit 104 (e.g., a palletizing robot and/or a pick-up robot), a transport unit 106, a loading unit 108, or a combination thereof, located in a warehouse or a distribution/shipping hub. Each of the units in the robotic system 100 may be configured to perform one or more tasks. The tasks may be combined in sequence to perform operations that achieve a goal, such as unloading objects from trucks or vans and storing them in a warehouse, or unloading objects from storage locations and preparing them for shipment. As another example, a task may include placing an object at a target location (e.g., on top of a pallet and/or inside a bin/cage/box/case). As described below, the robotic system may derive plans for placing and/or stacking the objects (e.g., a placement location/orientation, a sequence for handling the objects, and/or a corresponding motion plan). Each of the units may be configured to perform a sequence of actions (e.g., by operating one or more components therein) in accordance with one or more of the derived plans to execute a task.
In some embodiments, the task may include manipulating (e.g., moving and/or reorienting) the target object 112 (e.g., one of a package, a box, a case, a cage, a pallet, etc., corresponding to the task being performed), such as moving the target object 112 from the starting location 114 to the task location 116. For example, the unloading unit 102 (e.g., an unpacking robot) may be configured to carry the target object 112 from a location in a vehicle (e.g., a truck) to a location on a conveyor belt. Additionally, the handling unit 104 may be configured to handle the target object 112 from one location (e.g., a conveyor belt, pallet, or bin) to another location (e.g., a pallet, bin, etc.). As another example, the handling unit 104 (e.g., a palletizing robot) may be configured to handle target objects 112 from a source location (e.g., a pallet, a pick area, and/or a conveyor) to a destination pallet. Upon completion of the operation, the transport unit 106 may transport the target object 112 from the area associated with the handling unit 104 to the area associated with the loading unit 108, and the loading unit 108 may transport the target object 112 from the handling unit 104 (e.g., by moving a pallet carrying the target object 112) to a storage location (e.g., a location on a shelf). Details regarding the tasks and associated actions are described below.
For illustrative purposes, the robotic system 100 is described in the context of a shipping center; however, it should be understood that the robotic system 100 may be configured to perform tasks in other environments/for other purposes (such as for manufacturing, assembly, packaging, healthcare, and/or other types of automation). It should also be understood that the robotic system 100 may include other units not shown in fig. 1, such as manipulators, service robots, modular robots, and the like. For example, in some embodiments, the robotic system 100 may include a de-palletizing unit for transporting objects from a cage or pallet onto a conveyor or other pallet, a container exchange unit for transporting objects from one container to another, a packing unit for wrapping objects, a sorting unit for grouping objects according to one or more characteristics of the objects, a pick-up unit for manipulating (e.g., sorting, grouping, and/or handling) the objects differently according to one or more characteristics of the objects, or a combination thereof.
The robotic system 100 may include and/or be coupled to physical or structural members (e.g., robotic manipulator arms) connected at joints for movement (e.g., rotational and/or translational displacement). The structural members and joints may form a kinematic chain configured to manipulate an end effector (e.g., a gripper) configured to perform one or more tasks (e.g., grasping, spinning, welding, etc.) depending on the use/operation of the robotic system 100. The robotic system 100 may include actuation devices (e.g., motors, actuators, wires, artificial muscles, electroactive polymers, etc.) configured to drive or manipulate (e.g., displace and/or reorient) the structural members with respect to or at the corresponding joints. In some embodiments, the robotic system 100 may include transport motors configured to transport the corresponding units/chassis from place to place.
The robotic system 100 may include sensors configured to obtain information for accomplishing tasks, such as for manipulating structural members and/or for transporting robotic units. The sensors may include devices configured to detect or measure one or more physical characteristics of the robotic system 100 (e.g., the state, condition, and/or position of one or more structural members/joints thereof) and/or one or more physical characteristics of the surrounding environment. Some examples of sensors may include accelerometers, gyroscopes, force sensors, strain gauges, tactile sensors, torque sensors, position encoders, and the like.
In some embodiments, for example, the sensor may include one or more imaging devices (e.g., a visual and/or infrared camera, a 2D and/or 3D imaging camera, a distance measuring device such as a lidar or radar, etc.) configured to detect the surrounding environment. The imaging device may generate a representation of the detected environment, such as a digital image and/or a point cloud, that may be processed through machine/computer vision (e.g., for automated inspection, robotic guidance, or other robotic applications). The robotic system 100 may process the digital image and/or point cloud to identify the target object 112, the starting location 114, the task location 116, the pose of the target object 112, or a combination thereof.
For manipulating the target object 112, the robotic system 100 may capture and analyze an image of a designated area (e.g., a pickup location, such as the interior of a truck or on a conveyor belt) to identify the target object 112 and its starting location 114. Similarly, the robotic system 100 may capture and analyze an image of another designated area (e.g., a drop location for placing objects on a conveyor, a location for placing objects inside containers, or a location on a pallet for stacking purposes) to identify the task location 116. For example, the imaging device may include one or more cameras configured to generate images of the pick-up area and/or one or more cameras configured to generate images of the task area (e.g., the drop zone). Based on the captured images, the robotic system 100 may determine a starting position 114, a task position 116, an associated pose, a packing/placement plan, a handling/packing order, and/or other processing results, as described below.
In some embodiments, for example, the sensors may include position sensors configured to detect the position of structural members (e.g., robotic arms and/or end effectors) and/or corresponding joints of the robotic system 100. The robotic system 100 may use position sensors to track the position and/or orientation of the structural members and/or joints during performance of a task.
Robot system
Fig. 2 is a block diagram illustrating a robotic system 100 in accordance with one or more embodiments of the present technique. In some embodiments, for example, the robotic system 100 (e.g., at one or more of the units and/or robots described above) may include electronic/electrical devices, such as one or more processors 202, one or more storage devices 204, one or more communication devices 206, one or more input-output devices 208, one or more actuation devices 212, one or more transport motors 214, one or more sensors 216, or a combination thereof. The various devices may be coupled to one another through wired and/or wireless connections. For example, the robotic system 100 may include a bus, such as a system bus, a Peripheral Component Interconnect (PCI) bus or PCI express bus, a hypertransport or Industry Standard Architecture (ISA) bus, a Small Computer System Interface (SCSI) bus, a Universal Serial Bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus (also referred to as a "firewire"). Additionally, for example, the robotic system 100 may include a bridge, adapter, processor, or other signal-related device for providing a wired connection between devices. The wireless connection may be based on, for example, a cellular communication protocol (e.g., 3G, 4G, LTE, 5G, etc.), a wireless Local Area Network (LAN) protocol (e.g., wireless fidelity (WIFI)), a peer-to-peer or inter-device communication protocol (e.g., bluetooth, Near Field Communication (NFC), etc.), an internet of things (IoT) protocol (e.g., NB-IoT, LTE-M, etc.), and/or other wireless communication protocols.
The processor 202 may include a data processor (e.g., a Central Processing Unit (CPU), a special purpose computer, and/or an on-board server) configured to execute instructions (e.g., software instructions) stored on a storage device 204 (e.g., a computer memory). In some embodiments, the processor 202 may be included in a separate/independent controller operatively coupled to the other electronic/electrical devices shown in fig. 2 and/or the robotic unit shown in fig. 1. The processor 202 may implement program instructions that control/interact with other devices to cause the robotic system 100 to perform actions, tasks, and/or operations.
The storage 204 may include a non-transitory computer-readable medium having program instructions (e.g., software) stored thereon. Some examples of storage 204 may include volatile memory (e.g., cache and/or Random Access Memory (RAM)) and/or non-volatile memory (e.g., flash memory and/or a disk drive). Other examples of storage 204 may include portable memory drives and/or cloud storage.
In some embodiments, the storage device 204 may be used to further store and provide access to processing results and/or predetermined data/thresholds. For example, the storage device 204 may store master data that includes a description of objects (e.g., boxes, cases, and/or products) that may be manipulated by the robotic system 100. In one or more embodiments, the master data may include registration data for each such object. The registration data may include a size, shape (e.g., a template of potential poses and/or a computer-generated model for identifying objects in different poses), color scheme, image, identification information (e.g., a barcode, Quick Response (QR) code, logo, etc., and/or their expected locations), expected weight, other physical/visual characteristics, or a combination thereof, of an object expected to be manipulated by the robotic system 100. In some embodiments, the master data may include information related to the manipulation of the objects, such as a centroid (CoM) location or an estimate thereof for each of the objects, expected sensor measurements (e.g., for force, torque, pressure, and/or contact measurements) corresponding to one or more actions/manipulations, or a combination thereof.
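For illustration only, the registration data described above might be organized along the following lines; the field names and types are assumptions rather than the patent's own schema.

```python
# Hypothetical shape of one master-data entry for a registered object.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ObjectRegistration:
    object_id: str
    dimensions_mm: tuple[float, float, float]      # length, width, height
    expected_weight_kg: float
    barcode: Optional[str] = None                  # or a QR-code payload
    color_scheme: Optional[str] = None
    template_images: list[str] = field(default_factory=list)  # pose templates
    com_offset_mm: tuple[float, float, float] = (0.0, 0.0, 0.0)  # CoM estimate
    expected_grip_vacuum_kpa: Optional[float] = None  # expected sensor reading
```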
The communication device 206 may include circuitry configured to communicate with an external or remote device over a network. For example, the communication device 206 may include a receiver, transmitter, modulator/demodulator (modem), signal detector, signal encoder/decoder, connector ports, network cards, and so forth. The communication device 206 may be configured to transmit, receive, and/or process electrical signals according to one or more communication protocols (e.g., Internet Protocol (IP), wireless communication protocols, etc.). In some embodiments, the robotic system 100 may use the communication device 206 to exchange information between units of the robotic system 100 and/or to exchange information with systems or devices external to the robotic system 100 (e.g., for reporting, data collection, analysis, and/or troubleshooting purposes).
Input-output devices 208 may include user interface devices configured to communicate information to and/or receive information from a human operator. For example, input-output devices 208 may include a display 210 and/or other output devices (e.g., speakers, haptic circuits, or haptic feedback devices, etc.) for communicating information to a human operator. Additionally, input-output devices 208 may include control or receiving devices, such as a keyboard, mouse, touch screen, microphone, User Interface (UI) sensors (e.g., a camera for receiving motion commands), wearable input devices, and so forth. In some embodiments, the robotic system 100 may use the input-output devices 208 to interact with a human operator performing an action, task, operation, or a combination thereof.
In some implementations, a controller (e.g., a separate electronic device) may include the processor 202, the storage device 204, the communication device 206, and/or the input-output device 208. The controller may be a separate component or part of the unit/assembly. For example, each of the unloading units, handling assemblies, transport units, and loading units of the system 100 may include one or more controllers. In some embodiments, a single controller may control multiple units or independent components.
The robotic system 100 may include physical or structural members (e.g., robotic manipulator arms) connected at joints for movement (e.g., rotational and/or translational displacement). The structural members and joints may form a kinematic chain configured to manipulate an end effector (e.g., a gripper) configured to perform one or more tasks (e.g., grasping, spinning, welding, etc.) depending on the use/operation of the robotic system 100. The robotic system 100 may include the actuation devices 212 (e.g., motors, actuators, wires, artificial muscles, electroactive polymers, etc.) configured to drive or manipulate (e.g., displace and/or reorient) the structural members with respect to or at the corresponding joints. In some embodiments, the robotic system 100 may include the transport motors 214 configured to transport the corresponding units/chassis from place to place. For example, the actuation devices 212 and the transport motors 214 may be connected to or be part of a robotic arm, a linear slide, or another robotic component.
The sensors 216 may be configured to obtain information for accomplishing the tasks, such as for manipulating the structural members and/or for transporting the robotic units. The sensors 216 may include devices configured to detect or measure one or more physical characteristics of the robotic system 100 (e.g., the state, condition, and/or position of one or more structural members/joints thereof) and/or one or more physical characteristics of the surrounding environment. Some examples of the sensors 216 may include contact sensors, proximity sensors, accelerometers, gyroscopes, force sensors, strain gauges, torque sensors, position encoders, pressure sensors, vacuum sensors, and the like.
In some embodiments, for example, the sensors 216 may include one or more imaging devices 222 (e.g., visual and/or infrared cameras, 2D and/or 3D imaging cameras, distance measuring devices such as lidar or radar, etc.) configured to detect the surrounding environment. The imaging devices 222 may generate representations of the detected environment, such as digital images and/or point clouds, that may be processed through machine/computer vision (e.g., for automated inspection, robotic guidance, or other robotic applications).
For manipulating the target object 112, the robotic system 100 (e.g., via the various circuits/devices described above) may capture and analyze an image of a designated area (e.g., a pickup location, such as the interior of a truck or on a conveyor) to identify the target object 112 and its starting location 114. Similarly, the robotic system 100 may capture and analyze an image of another designated area (e.g., a drop location for placing objects on a conveyor, a location for placing objects inside containers, or a location on a pallet for stacking purposes) to identify the task location 116. For example, the imaging device 222 may include one or more cameras configured to generate images of the pick-up area and/or one or more cameras configured to generate images of the task area (e.g., the drop zone). Based on the captured images, the robotic system 100 may determine a starting position 114, a task position 116, an associated pose, a packing/placement plan, a handling/packing order, and/or other processing results, as described below.
In some embodiments, for example, the sensors 216 may include position sensors 224 configured to detect the position of structural members (e.g., robotic arms and/or end effectors) and/or corresponding joints of the robotic system 100. The robotic system 100 may use the position sensors 224 to track the position and/or orientation of the structural members and/or joints during performance of the task. The robotic system 100 may use the detected position, tracked orientation, etc. from the sensors 216 to derive tracking data representing a current position and/or a set of past positions of the target object 112 and/or structural member of fig. 1.
Fig. 3 is a top view of a robotic system 100 in accordance with one or more embodiments of the present technique. In some embodiments, the robotic system 100 may include a system manager 302, a planner 304, and/or a robotic arm 306. The system manager 302 and/or the planner 304 may be implemented or include one or more circuits (e.g., the processor 202, the storage 204, the communication 206, etc.) shown in fig. 2.
The system manager 302 may include mechanisms (e.g., devices and/or software applications) configured to manage the overall operation of one or more of the workstations and/or corresponding robotic units. For example, the system manager 302 may include a facility management system, such as for a warehouse or shipping hub. In managing the overall operation, the system manager 302 may receive an order input 312 (e.g., a customer request for a set of objects accessible by the robotic system 100). The system manager 302 may derive the various tasks, the interactions/controls between the workstations and/or corresponding robots, the associated sequences or timing, etc. to retrieve the objects listed in the order input 312. The system manager 302 may further interact with the robotic arm 306 to perform/execute the tasks.
The planner 304 may include a mechanism (e.g., a device, software application/feature, or combination thereof) configured to derive detailed controls for operating one or more robots or components therein. Planner 304 may derive detailed steps, such as a motion plan of the robot and/or communication protocols or sequences with other subsystems to operate one or more robot cells and complete tasks determined by system manager 302.
To operate the robotic arm 306, the robotic system 100 may use the planner 304 to derive a transfer plan 314 corresponding to a path for moving one or more items from the start position 114 of fig. 1 to the task position 116 of fig. 1. For example, the robotic system 100 may obtain one or more images of the start position 114 and the task position 116 via the imaging device 222 of fig. 2. The robotic system 100 may process the images to identify or recognize the order objects at the starting location 114 and/or their poses within a bin (e.g., a container having at least one vertical wall). Similarly, the robotic system 100 may use the image of the task location 116 to derive or identify the placement location for each of the target objects at the starting location 114. The planner 304 may derive a transfer plan 314 that includes a path and/or corresponding commands, settings, timing, etc. for operating the robotic arm 306 (e.g., via the actuation devices 212 and/or the transport motors 214 of fig. 2) to transfer the target object between the corresponding locations. In some embodiments, the planner 304 may derive the transfer plan 314 iteratively, deriving the next incremental position starting from the placement position and moving toward the destination. Each next incremental position may be a test position that satisfies one or more predetermined rules, such as for avoiding collisions, minimizing distance/time, etc.
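The iterative derivation can be illustrated with a minimal sketch that repeatedly selects a rule-satisfying incremental position. The helpers (candidate_steps, in_collision, distance_to) are assumptions for the example, not part of the patent disclosure.

```python
# Hypothetical incremental path derivation: extend the path one test position
# at a time, accepting only positions that satisfy the predetermined rules
# (collision-free, progress toward the goal).
def derive_transfer_path(start, goal, candidate_steps, in_collision,
                         tolerance=1e-3, max_iters=1000):
    path, current = [start], start
    for _ in range(max_iters):
        if current.distance_to(goal) < tolerance:
            return path  # reached the task location
        # Rank candidate increments by remaining distance (minimize distance/time).
        for step in sorted(candidate_steps(current),
                           key=lambda p: p.distance_to(goal)):
            if not in_collision(current, step):
                path.append(step)
                current = step
                break
        else:
            return None  # no rule-satisfying increment; derivation fails
    return None
```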
The robotic system 100 may implement the transfer plan 314 and operate the robotic arm 306 accordingly to transfer the one or more target objects identified in the order input 312. For example, the system manager 302 can interact with the robotic arm 306, such as by communicating the path and/or commands/settings of the transfer plan 314 to the robotic arm 306. The robotic arm 306 may execute the received information to perform the task.
As an illustrative example, the system manager 302 may interact with one or more robots (e.g., transport units, such as Automated Guided Vehicles (AGVs), conveyors, etc., and/or subsystems) to retrieve, from a storage area, containers (including, for example, the starting bin 322) in which the ordered objects are stored. Thus, the robotic system 100 may operate the transport unit to transport the starting bin 322 to the starting position 114 of the robotic arm 306. Similarly, a target container 324 (e.g., a package or destination bin) may be placed at the task location 116 of the robotic arm 306. Alternatively, the task location 116 may correspond to a drop or placement location on a conveyor or robot (e.g., an AGV). Once the start location 114 and the task location 116 are ready, the system manager 302 can interact with the robotic arm 306 according to the transfer plan 314. Accordingly, the robotic arm 306 may grasp the target objects 112 (e.g., one or more of the objects specified in the order input 312) and transport them from the start location 114 to the task location 116. For example, the robotic arm 306 may pick order objects from one or more starting bins 322 at the starting location 114 and place them in one or more target containers 324 and/or designated locations at the task location 116 to complete the order.
In some embodiments, the robotic system 100 may utilize a set of tools (e.g., dedicated end effectors) to perform different tasks and/or improve the performance of a given task using the same robot. For example, the robotic system 100 may selectively connect the robotic arm 306 to a gripper, welder, or cutter to perform a corresponding function according to the assigned task. Additionally, the robotic system 100 may selectively connect the robotic arm 306 to a clamp gripper or a vacuum gripper based on physical characteristics of the target objects and/or their surroundings (e.g., relative position of other objects, availability of an access path, etc.).
In some embodiments, as described in detail below, the robotic system 100 may selectively connect the robotic arm 306 to grippers having different angled interfaces depending on the target object pose. For example, the robotic system 100 may select the first tool when the target object is positioned flat on the bottom of the bin, presenting a relatively horizontal top surface and/or a relatively vertical circumferential surface. Alternatively, the robotic system 100 may select the second tool when the target object has an angular pose with respect to the lateral plane, such as when the target object has a non-parallel surface or is placed on top of an uneven or non-planar contact point. The second tool may include a contact or gripping interface configured to grip such angled objects.
When utilizing a tool set, the system manager 302 may provide a target selection 313 to the planner 304 to identify a tool and/or target object 112 selected for one or more of the tasks/objects. The planner 304 may derive the transfer plan 314 from the target selection 313 and produce the feedback 318 accordingly. For example, when the derivation succeeds (e.g., one or more threshold conditions are met, such as avoiding collisions or meeting a minimum error estimate), the feedback 318 may include the transfer plan 314 for handling the target object with the specified tool. The feedback 318 may include an error message when the derivation is unsuccessful, such as when the designated tool is not suitable for grasping and/or handling the target object (due to, for example, a resulting collision event). In an alternative embodiment, the planner 304 may select a tool and derive a corresponding transfer plan 314 without interacting with the system manager 302 with respect to tool selection.
When multiple tools are available, the robotic system 100 may derive and evaluate multiple tasks or actions as a single group rather than processing each task or action individually. Alternatively or additionally, the robotic system 100 may coordinate the sequence or timing of the derivation/evaluation, tool changes, and plan implementation to improve or maximize the efficiency (e.g., overall completion time) of a set of tasks. Because the derivation process, the tool change process, and the action implementation have different costs and benefits, deriving and evaluating the individual tasks and actions as a group may improve overall performance. For example, the duration (e.g., one second or less) necessary to derive and evaluate the transfer plan 314 may be less than the duration (e.g., five seconds or more) necessary to change tools and/or less than the duration (e.g., one to five seconds) necessary to implement the plan at the robotic arm 306. In some embodiments, changing the tool may take longer than the average or maximum duration necessary to complete the handling of an item.
To derive and evaluate a set of tasks, the robotic system 100 (e.g., the system manager 302 and/or the planner 304) may determine a tool-based object grouping 316 that identifies or groups items according to the tools suitable for their manipulation. For example, the robotic system 100 may derive the tool-based object groupings 316 based on the pose of each target object within the starting bin 322 (using, for example, the angle or orientation of the exposed surface relative to one or more predetermined horizontal/vertical lines or planes). The robotic system 100 may evaluate different orders or combinations of the transfer plans 314 according to a set of predetermined rules that account for the cost of changing and utilizing the tools. In one or more embodiments, the robotic system 100 may calculate a cost associated with each transfer plan 314, such as an estimated transfer time and/or an estimated error probability. The robotic system 100 may screen out any handling plans that result in collisions and/or whose error probabilities (e.g., loss likelihoods) exceed the predetermined thresholds described above, thereby determining, for each object, a plan that grasps it using an appropriate tool. The robotic system 100 may derive different sequences for the determined set of plans and include or plan tool change operations accordingly. For each sequence, the robotic system 100 may calculate an overall cost metric (e.g., an overall completion time or an estimated completion time that takes into account possible error rates) for evaluation. For implementation, the robotic system 100 may select the sequence that minimizes the overall cost metric.
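The grouping and sequence evaluation can be pictured with a short sketch: group the derived plans by required tool, then score candidate tool orders by total completion time, charging a fixed penalty for each tool change. The durations, plan attributes (tool_id, est_duration_s), and penalty value are illustrative assumptions.

```python
# Hypothetical sequence evaluation over tool-based object groupings.
from collections import defaultdict
from itertools import permutations

TOOL_CHANGE_S = 7.0  # e.g., tool changes take roughly five to ten seconds

def group_by_tool(plans):
    groups = defaultdict(list)
    for plan in plans:  # each plan carries its required tool and duration
        groups[plan.tool_id].append(plan)
    return groups

def sequence_cost(tool_order, groups, current_tool):
    total, tool = 0.0, current_tool
    for tool_id in tool_order:
        if tool_id != tool:          # schedule a tool change operation
            total, tool = total + TOOL_CHANGE_S, tool_id
        total += sum(p.est_duration_s for p in groups[tool_id])
    return total

def best_sequence(plans, current_tool):
    groups = group_by_tool(plans)
    # Small tool counts keep an exhaustive permutation search tractable.
    return min(permutations(groups),
               key=lambda order: sequence_cost(order, groups, current_tool))
```

Because tool changes carry the highest cost, the minimizing sequence tends to batch same-tool plans together, matching the behavior described above.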
In some embodiments, the robotic system 100 may improve the overall performance of the tasks by controlling the timing and parallelism of scheduling, tool changes, and plan implementation. For example, the robotic system 100 may identify the tool already attached to the robotic arm 306 when an object is identified or when the starting bin 322 is placed at the starting location 114. The robotic system 100 may identify a first set of target objects that can be grasped with the existing tool. The robotic system 100 may select one of the objects in the group and derive a corresponding plan. Once the first plan has been derived, the robotic system 100 may implement it in parallel with the derivation of a second plan for another object in the group. When the planning for the first set of objects is complete, the robotic system 100 may derive the plans for a second set of objects in parallel with (e.g., concurrently/simultaneously with) the implementation of the first set of plans and/or tool change operations.
In other exemplary embodiments, the system manager 302 may identify the initial tool already attached to the robotic arm 306. The system manager 302 may interact with the planner 304 to identify and plan a first object that can be grasped and handled using the attached tool. During implementation of the first plan, the system manager 302 may interact with the planner 304 to plan a second object. When a derivation fails, the system manager 302 can iteratively select and plan different objects in parallel with the implementation of previously derived plans. When a derivation succeeds for the existing tool, the corresponding object and plan can then be implemented. When no remaining target objects are suited to the currently connected tool, the robotic system 100 may schedule a tool change operation to occur when the ongoing or previous plan implementation ends. In an alternative embodiment, the robotic system 100 may derive and evaluate the feasibility and cost of every available tool for each target object, and analyze all of the group derivations when determining the preferred order.
Exemplary tool
Fig. 4A is an illustration of an exemplary handling unit (e.g., robotic arm 306), in accordance with one or more embodiments of the present technique. The robotic arm 306 may be the handling unit 104 of fig. 1 (e.g., a picking robot) or a portion thereof. The robotic arm 306 may include an arm portion 402 configured to manipulate an end effector (e.g., a gripper) across an operating space. The arm portion 402 may include a set of structural members (e.g., beams, columns, etc.), a set of joints between the structural members, and/or a corresponding set of actuators/motors configured to move the structural members about the joints.
An end effector tool 404 may be attached to the arm portion 402, such as at a distal end of the arm portion 402. The end effector tool 404 may include a tool connector 412 (e.g., a selective locking/attachment mechanism) configured to interface with and attach the tool to the arm portion 402. The tool connector 412 may be structurally connected to or integral with a tool arm 414 having an end effector attached at the opposite end. For the example shown in fig. 4A, the end effector may include a gripper configured to grip an object for manipulation (e.g., handling or shifting across space). The gripper end effector may include a gripper head 416 that houses or facilitates a gripping interface 418 (e.g., a set of suction cups for a vacuum-based gripper). The gripper interface 418 may be used to generate an attachment force or mechanism (e.g., a vacuum) that attaches the target object to the gripper head 416 and/or the robotic arm 306.
Fig. 4B is an illustration of an exemplary tool set 430 in accordance with one or more embodiments of the present technique. The tool set 430 may represent tools that the robotic system 100 of fig. 1 may use to perform one or more tasks. In some embodiments, the tool set 430 may include a standard fixed gripping tool 440, a fixed angle gripping tool 450, and/or an adjustable gripping tool 460. Although not shown in fig. 4B, it should be understood that the tool set 430 may include other tools, such as finger-based pinch grippers, grippers with different types of suction/contact interfaces, different categories of end effectors (e.g., non-gripping end effectors such as those used for welding or cutting), and so forth.
The standard fixed gripping tool 440 may be configured to grip objects that are placed relatively flat compared to the floor of the starting bin 322. For example, the standard fixed gripping tool 440 may include a laterally oriented gripping interface 418 for gripping a laterally oriented top surface of an object. The joint connecting the tool arm 414 and the gripper head 416 may be secured together with the tool arm 414 extending normal to the suction interface or parallel to the clamp interface.
The fixed angle gripping tool 450 may be configured to grip objects placed at an angle compared to the floor of the starting bin 322. For example, the fixed angle gripping tool 450 may include a suction gripping interface 418 and a tool arm 414 configured at a non-orthogonal angle for gripping a non-lateral top surface of an object. The angled joint 452 may fixedly couple the tool arm 414 and the gripper head 416 together with structures that form corresponding non-orthogonal angles or orientations.
The adjustable gripping tool 460 may be configured to adjust the orientation of the gripper head 416 relative to the tool arm 414. For example, the adjustable gripping tool 460 may include an orientation control mechanism 464, such as a manipulator arm, a set of cables, an actuator, a motor, or the like, configured to adjust the orientation of the gripper head relative to the tool arm 414 and about the rotatable joint 462. Accordingly, the robotic system 100 may operate the orientation control mechanism 464 to adjust the pose or orientation of the gripping interface 418 according to the orientation of the target object or one or more portions thereof (e.g., exposed top surface).
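As a toy example of this orientation control, the head tilt can be computed from the exposed surface's normal and checked against the rotatable joint's range. The unit-normal input and the joint limits are assumed for illustration.

```python
# Hypothetical head-tilt computation for the adjustable gripping tool.
import math

def head_tilt_deg(surface_normal, joint_limits_deg=(0.0, 45.0)):
    """Tilt (degrees) that aligns the gripping interface with the surface."""
    nx, ny, nz = surface_normal  # assumed to be a unit vector
    # Angle between the surface normal and vertical (0 deg = flat object).
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, nz))))
    lo, hi = joint_limits_deg
    if not lo <= tilt <= hi:
        raise ValueError(f"surface tilt {tilt:.1f} deg outside joint range")
    return tilt
```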
Exemplary tool use
Fig. 5A is an illustration of a standard grip scenario 500 in accordance with one or more embodiments of the present technology. The standard grip scenario 500 may be used to grip an object 502 in a flat pose. The flat-pose object 502 may correspond to a target object having an orientation (e.g., bottom surface of the object) parallel to the bottom surface of the starting bin 322. Thus, the standard grip scenario 500 may correspond to the standard fixed gripping tool 440. For example, the robotic system 100 of fig. 1 may operate the robotic arm 306 of fig. 3 to position the standard fixed gripping tool 440 directly above the object 502 in a flat pose and lower the end effector to grip the object.
Fig. 5B is an illustration of an angled grip scenario 510 in accordance with one or more embodiments of the present technology. The angled grip scene 510 may be used to grasp an angled object 512. The angled objects 512 may include objects that lean or rest on uneven contact points (e.g., rest along non-horizontal planes) and/or objects having non-parallel opposing surfaces. In some embodiments, the angled object 512 may correspond to a gesture having one or more surfaces oriented along an angled direction/plane relative to a horizontal/vertical reference direction. Thus, the angled grip scenario 510 may correspond to the fixed angle gripping tool 450. For example, the robotic system 100 of fig. 1 may operate the robotic arm 306 of fig. 3 to position the fixed angle gripping tool 450 directly over the angled object 512 and lower the end effector to grip the object.
For comparison, the standard fixed gripping tool 440 may not be suitable for the angled grip scenario 510. For example, to grasp the angled object 512, the standard fixed gripping tool 440 would have to be tilted or angled such that the gripping interface 418 is oriented parallel to the interfacing portion of the angled object 512. Lowering the standard fixed gripping tool 440 at such a tilt may cause a collision event (shown by an 'X' in fig. 5B) between the tool and/or the robotic arm 306 and the starting bin 322.
In some embodiments, the robotic system 100 may select the fixed angle gripping tool 450 based on the surface pose 514 of the angled object 512. For example, the robotic system 100 may process one or more images (e.g., top view images) of the start bin 322 and/or the angled object 512 therein as captured by the imaging device 222 of fig. 2. The robotic system 100 may identify edges depicted in the image based on an edge detection mechanism (e.g., Sobel filter). The robotic system 100 may identify each continuous surface depicted in the image based on determining the connections and/or relative orientations between a set of edges and/or recognizing shapes, colors, and/or designs located between the edges. The robotic system 100 may map the surfaces to a three-dimensional image (e.g., a depth map) and use the depth metric to calculate one or more slopes for each surface. Using the one or more calculated slopes, the robotic system 100 may derive a surface pose 514 for each surface. The robotic system 100 may determine the angled object 512 based on comparing the surface pose 514 to a threshold value representing a horizontal or flat surface.
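A minimal sketch of this slope-based check follows, assuming a depth map aligned with the top-view image and a region mask produced by the surface segmentation described above; the flatness threshold is an illustrative value.

```python
# Hypothetical angled-object check: fit a plane to the depth values inside a
# detected surface region and compare its slope to a flatness threshold.
import numpy as np

FLAT_SLOPE_THRESHOLD_DEG = 5.0  # assumed tolerance for a "flat" pose

def surface_tilt_deg(depth_map, region_mask):
    ys, xs = np.nonzero(region_mask)
    zs = depth_map[ys, xs]
    # Least-squares fit of z = a*x + b*y + c over the region's pixels.
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, _), *_ = np.linalg.lstsq(A, zs, rcond=None)
    # Slope magnitude of the fitted plane, converted to degrees.
    return float(np.degrees(np.arctan(np.hypot(a, b))))

def is_angled(depth_map, region_mask):
    return surface_tilt_deg(depth_map, region_mask) > FLAT_SLOPE_THRESHOLD_DEG
```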
Fig. 6A is an illustration of a standard release scenario 600 in accordance with one or more embodiments of the present technology. The standard release scenario 600 may be used to release a gripped object (e.g., the target object 112 of fig. 1) from the standard fixed gripping tool 440. The standard release scenario 600 may be used to place a gripped object (e.g., the target object 112 of fig. 1) at a corresponding placement location at the task location 116 and/or in the target container 324 of fig. 3. The standard release scenario 600 may correspond to a flat placement of a gripped object and/or an upright orientation of the standard fixed gripping tool 440.
Fig. 6B is an illustration of a first angled release scenario 610, in accordance with one or more embodiments of the present technology. The first angled release scenario 610 may be used to release a gripped object (e.g., the target object 112 of fig. 1) from the fixed angle gripping tool 450, such as to place the object at a corresponding placement location at the task location 116 and/or in the target container 324 of fig. 3. The first angled release scenario 610 may correspond to a flat placement pose 612 of the gripped object. Accordingly, the robotic system 100 of fig. 1 may orient the fixed angle gripping tool 450 to an angled tool release pose 614 (e.g., a non-vertical or tilted orientation of the tool arm therein). The tool release pose 614 may correspond to the angled joint 452 of fig. 4B of the fixed angle gripping tool 450.
Fig. 6C is an illustration of a second angled release scenario 620 in accordance with one or more embodiments of the present technology. The second angled release scenario 620 may be used to release a gripped object (e.g., the target object 112 of fig. 1) from the fixed angle gripping tool 450, such as to place the object at a corresponding placement location at the task location 116 and/or in the target container 324 of fig. 3. The second angled release scenario 620 may correspond to an angled placement pose 622 of the gripped object. Accordingly, the robotic system 100 of fig. 1 may orient the fixed angle gripping tool 450 to an upright tool release pose 624 (e.g., a vertical orientation of the tool arm therein). The angled placement pose 622 may correspond to the angled joint 452 of fig. 4B of the fixed angle gripping tool 450. The angled placement pose 622 and the upright tool release pose 624 may be used to release a gripped object without the orientation manipulations associated with the tool release pose 614 and the flat placement pose 612 shown in fig. 6B. Accordingly, the angled placement pose 622 and the upright tool release pose 624 may correspond to lower implementation costs (e.g., shorter implementation times) than the tool release pose 614 and the flat placement pose 612. The angled placement pose 622 and the upright tool release pose 624 may be implemented for objects having predetermined characteristics, such as softer and/or more flexible objects that are less susceptible to impact damage.
Exemplary task timing
Fig. 7 is an exemplary timing diagram 700 in accordance with one or more embodiments of the present technology. The timing diagram 700 may represent a sequence or temporal relationship of operations and/or processes of the robotic system 100 of fig. 1 (e.g., the system manager 302 of fig. 3, the planner 304 of fig. 3, and/or the robotic arm 306 of fig. 3). For example, the timing diagram 700 may show a temporal relationship between the planning schedule 702 and the implementation schedule 704. The planning schedule 702 may represent a sequence of processes that each derive an instance of the handling plan 314 of fig. 3, such as for handling target objects with assigned tools as specified by the target selection 313 of fig. 3. The implementation schedule 704 may represent the sequence in which plans are implemented/executed at/by the robotic arm 306.
The robotic system 100 may perform the plan derivation process and the plan implementation process in parallel, thereby improving overall efficiency and reducing overall task execution time. For the example shown in fig. 7, the system manager 302 may send a first object selection (e.g., an instance of the target selection 313) identifying an object 1 to be grasped and handled using tool 1 (e.g., the standard fixed gripping tool 440 of fig. 4B). In response to the first object selection, the planner 304 may derive a first object plan 712 for grasping and handling object 1 with tool 1. Upon successful derivation, the planner 304 may communicate the first object plan 712 to the system manager 302 through the corresponding feedback 318 of fig. 3, and the system manager 302 may interact with the robotic arm 306 to implement the first object plan 712. The first object handling 732 of the implementation schedule 704 may represent the implementation of the first object plan 712 at the robotic arm 306.
Additionally, upon the successful derivation, the system manager 302 may send a second object selection (a new instance of the target selection 313) identifying an object 2 to be grasped and handled using the connected tool 1. In response to the second object selection, the planner 304 may derive a second object plan 714, such as using the example iterative process described above. When the second object is the angled object 512 of fig. 5B, the planner 304 may return a failure status indicating that a successful plan cannot be derived, such as due to an estimated collision event occurring while attempting to grasp the object. The planner 304 may communicate the failure status to the system manager 302, such as through the corresponding feedback 318. In some embodiments, the system manager 302 can identify other target objects to be considered for planning with the currently connected tool. The system manager 302 may identify the second object for planning using an updated tool selection (e.g., an alternate gripper-head angle, such as that of the fixed angle gripping tool 450 of fig. 4B and/or the adjustable gripping tool 460 of fig. 4B), such as when no remaining objects can be grasped using the currently connected tool and/or gripper-head orientation. With the updated tool, the planner 304 may derive an updated second object plan 716. The second object handling 736 of the implementation schedule 704 may represent the implementation of the updated second object plan 716 at the robotic arm 306. The various planning processes may occur in parallel with (e.g., independently of) the implementation schedule 704.
Upon successful derivation of a subsequent plan, the system manager 302 can queue it for implementation in the implementation schedule 704. When a successful derivation is based on providing an updated tool in the target selection 313, such as for the updated second object plan 716, the system manager 302 may include the tool change 734 in the implementation schedule 704.
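A minimal sketch of this pipelined interaction is given below; the planner, robot, and executor interfaces are hypothetical abstractions of the system manager 302, planner 304, and robotic arm 306, and error handling is reduced to a single retry with an updated tool:

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipelined_transfers(targets, planner, robot, connected_tool):
    """Derive the plan for object N+1 while the robot executes the plan for
    object N; schedule a tool change 734 when derivation fails with the
    currently connected tool."""
    executor = ThreadPoolExecutor(max_workers=1)  # serializes arm operations
    in_flight = None
    for target in list(targets):
        plan = planner.derive(target, connected_tool)  # overlaps execution
        if plan is None:  # failure status, e.g., estimated grasp collision
            connected_tool = planner.suggest_tool(target)
            executor.submit(robot.change_tool, connected_tool)  # tool change 734
            plan = planner.derive(target, connected_tool)  # assume retry succeeds
        if in_flight is not None:
            in_flight.result()  # wait for the previous transfer to finish
        in_flight = executor.submit(robot.execute, plan)
    if in_flight is not None:
        in_flight.result()
    executor.shutdown()
```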
In some embodiments, the robotic system 100 (e.g., at the system manager 302 and/or the planner 304) may estimate implementation metrics for various implementations. For example, the robotic system 100 may derive a first plan metric 722 for the first object plan 712 (e.g., including a total execution time to pick and/or place the object), a second plan metric 726 for the updated second object plan 716, and/or a tool change metric 724 for the tool change 734 (e.g., an expected maximum and/or average time). The robotic system 100 may use the implementation metrics to derive a planning sequence that minimizes the combined metrics. In some implementations, the tool change metric 724 may represent a cost associated with switching the pose of the grip interface 418 on the adjustable gripping tool 460 between functioning as the standard fixed gripping tool 440 and the fixed angle gripping tool 450. The tool change metric 724 may also include or account for costs (e.g., time and/or change in failure estimate) associated with implementing the tool release pose 614 (e.g., additional maneuvers as compared to a standard release from the standard fixed gripping tool). Additionally, the tool change metric 724 may include or account for costs associated with changes in drag or air resistance caused by changes in the exposed surface area relative to the direction of movement, such as angling the grip to reduce drag and/or performing additional maneuvers.
As an illustrative example, the robotic system 100 may derive the planning schedule 702 for handling a plurality of objects from a starting location (e.g., a bin) to a target location (e.g., a different bin or a conveyor). The robotic system 100 may derive estimated implementation metrics for handling one or more of the objects using the corresponding tools. The estimated metrics may account for any speed changes, additional maneuvers, and/or adjusted pick/drop maneuvers or poses for the different tools. In some implementations, the estimated metrics may further account for an expected error rate associated with the corresponding object-tool combination. For example, the estimated metrics may include a transit time increased by a calculated amount to account for remedial actions deployed in error situations (e.g., piece loss). The adjustment may be calculated based on weighting the average duration of the remedial action by the error rate associated with the object-tool combination and/or the corresponding movement settings (e.g., speed). Thus, the robotic system 100 may balance (1) any negative adjustments for handling objects with sub-optimal tools against (2) the cost of changing tools to handle objects with optimal tools and the impact of the tool changes on handling other objects. Accordingly, the robotic system 100 may evaluate the overall cost of different combinations of motion plans and select the combination that minimizes the cost of handling one or more or all of the target objects. In some embodiments, the estimated metric may be expressed as a number of picks per minute or hour using the same robotic unit.
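As a purely illustrative sketch of this weighting (the function names and tuple layout are hypothetical), the expected handling time and the cost of one candidate plan sequence might be computed as:

```python
def expected_handling_time(base_time_s, error_rate, remedial_time_s):
    """Transit time inflated by the expected cost of remedial actions
    (e.g., recovering a lost piece), weighted by the error rate."""
    return base_time_s + error_rate * remedial_time_s

def sequence_cost(transfers, tool_change_time_s):
    """Total cost of one candidate sequence; each transfer is a tuple of
    (base_time_s, error_rate, remedial_time_s, tool_id)."""
    total, previous_tool = 0.0, None
    for base_time_s, error_rate, remedial_time_s, tool_id in transfers:
        if previous_tool is not None and tool_id != previous_tool:
            total += tool_change_time_s  # tool change metric 724
        total += expected_handling_time(base_time_s, error_rate, remedial_time_s)
        previous_tool = tool_id
    return total
```

Candidate sequences may then be compared and the lowest-cost one selected; expressing the result as picks per hour amounts to dividing the number of objects by the total time.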
Operation process
Fig. 8 is a flow diagram of a method 800 for operating a robotic system (e.g., the robotic system 100 of fig. 1) in accordance with one or more embodiments of the present technology. The method 800 may be implemented using one or more of the above-described devices, such as the system manager 302 of fig. 3, the planner 304 of fig. 3, and/or the robotic arm 306 of fig. 3, and one or more of the components described above (such as the processor 202 of fig. 2, the storage device 204 of fig. 2, etc.). The method 800 may be used to plan and implement tasks (e.g., handling objects) using a tool set (e.g., the tool set 430 of fig. 4B). As described in detail below, the method 800 may correspond to planning and implementing tasks in parallel and/or processing objects in groups according to the appropriate tools, thereby minimizing the tool changes 734 of fig. 7.
At block 802, the robotic system 100 may receive an order for a set of items (e.g., the order input 312 of fig. 3). For example, the robotic system 100 (via, e.g., the communication device 206 of fig. 2) may receive the order input 312 from a customer or a requesting warehouse. Additionally, the order input 312 may correspond to an internally generated order to repackage or regroup items for storage, such as to rearrange/group objects and reduce the number of storage containers. Thus, the robotic system 100 may identify a set of objects that need to be moved from a storage location to a different location (e.g., an outbound container) to satisfy the order.
At block 804, the robotic system 100 may coordinate acquisition of the ordered objects. The robotic system 100 (via, e.g., the system manager 302 of fig. 3) may identify one or more storage locations for the ordered objects. For example, the robotic system 100 may compare the order input 312 to a record of stored/retrievable objects. Accordingly, as shown at block 805, the robotic system 100 identifies the bins and/or objects corresponding to the order input 312. The robotic system 100 may determine a storage location and/or an identifier of a container in which each ordered object is stored.
The robotic system 100 may use the determined locations and/or container identifiers to coordinate acquisition of the ordered objects. For example, the robotic system 100 may directly operate one or more transport units (e.g., AGVs and/or conveyors) to transport the storage container from the storage location to the start location 114 of fig. 1 for the robotic arm 306 of fig. 3. The robotic system 100 may similarly operate the transport units to transport the target container 324 of fig. 3 from its storage location to the task location 116 of fig. 1. Additionally or alternatively, the robotic system 100 may interact or communicate with one or more subsystems (e.g., storage access systems) to place one or more storage containers and/or target containers 324 at their respective locations.
At block 806, the robotic system 100 may obtain image data. For example, the robotic system 100 may use the imaging device 222 of fig. 2 to obtain a two-dimensional and/or three-dimensional image depicting the storage container at the starting location. Thus, the obtained image data may depict, in real-time, one or more of the ordered objects stored within the storage container.
At block 808, the robotic system 100 may identify groupings of objects (e.g., targets or ordered objects) in the storage container. The robotic system 100 may process the obtained image data to detect or identify objects within the starting bin 322. For example, the robotic system 100 may identify surfaces based on detected lines (via, e.g., a Sobel detection mechanism) and the connections/arrangements between the detected lines. The robotic system 100 may compare the depicted image or a portion thereof (e.g., a portion within a surface) to the surface image data in the master data. Additionally or alternatively, the robotic system 100 may compare the dimensions of an identified surface to the dimensions of the objects stored in the master data. The robotic system 100 may identify an object based on a match of the depicted image and/or the measured dimensions with corresponding images and/or predetermined dimensions stored in the master data.
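The master-data matching step might look like the following sketch, with a naive appearance score standing in for whatever matching mechanism a deployment actually uses; all names here are hypothetical:

```python
import numpy as np

def appearance_score(crop, template):
    # Placeholder appearance check; a real system might use normalized
    # cross-correlation or learned features instead.
    if crop.shape != template.shape:
        return 0.0
    a = (crop - crop.mean()) / (crop.std() + 1e-9)
    b = (template - template.mean()) / (template.std() + 1e-9)
    return float(np.mean(a * b))

def identify_object(surface_crop, measured_dims_mm, master_data,
                    dim_tol_mm=5.0, score_threshold=0.8):
    """Return the id of the first master-data entry whose registered
    dimensions and surface image both match the detected surface."""
    for entry in master_data:  # entry: {"id", "template", "dims_mm"}
        dims_match = all(abs(m - r) <= dim_tol_mm
                         for m, r in zip(measured_dims_mm, entry["dims_mm"]))
        if dims_match and appearance_score(surface_crop, entry["template"]) >= score_threshold:
            return entry["id"]
    return None  # unrecognized object
```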
The robotic system 100 may compare the identified set of objects in the starting bin 322 to the targeted or expected portion of the order input 312. Thus, the robotic system 100 may locate the target objects in the starting bin 322. The robotic system 100 may further process the target objects to sort/group them according to the tool set 430, thereby determining the tool-based object grouping 316 of fig. 3. For example, the robotic system 100 may determine whether each of the target objects is the flat-pose object 502 of fig. 5A or the angled object 512 of fig. 5B. The robotic system 100 may use the depth metrics associated with the depicted surfaces to calculate one or more slopes. The robotic system 100 may use the calculated slopes to derive the surface pose 514 of fig. 5B for each of the target objects. The surface poses 514 may be used to classify or group the objects, such as the flat-pose objects 502, the angled objects 512, etc., according to the tool that is appropriate for or assigned to manipulate the corresponding object.
The robotic system 100 may further classify/group the objects according to other characteristics corresponding to different end effectors and/or target tasks. For example, the robotic system 100 may group objects according to structural rigidity and/or outer surface materials (e.g., boxes, plastic wraps, bags, etc.), overall shape, etc., associated with different types of grippers (e.g., different types of contact interfaces, such as suction grippers, suction cup sizes/positions, finger-based grippers, etc.). Additionally, the robotic system 100 may group objects that are bound together, thus requiring a cutting tool, separately from other unbound objects. Thus, the grouped objects may correspond to the tool-based object grouping 316.
As an illustrative example, the robotic system 100 may determine at least a first set of objects (e.g., a set of flat-pose objects 502) and a second set of objects (e.g., a set of angled objects 512) based on the image data depicting objects within the starting bin 322. The first set of objects may have one or more aspects (e.g., pose) corresponding to a characteristic of a first tool (e.g., the angle/orientation of the grip interface 418), and the second set of objects may have one or more aspects corresponding to a characteristic of a second tool. The flat-pose objects 502 may each have a top surface parallel to the bottom surface of the starting bin 322, thus corresponding to the standard fixed gripping tool 440. The angled objects 512 may each have a top surface that forms an angle relative to the bottom surface of the starting bin 322, thereby corresponding to the fixed angle gripping tool 450.
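Continuing the illustrative example, the tool-based grouping may be sketched as a simple partition over the derived surface tilts; the dictionary keys and the threshold are hypothetical:

```python
def group_by_tool(detections, flat_threshold_deg=5.0):
    """Partition detections into tool-based object groupings 316: flat-pose
    objects for the standard fixed gripping tool, angled objects for the
    fixed angle gripping tool."""
    groups = {"standard_fixed_440": [], "fixed_angle_450": []}
    for det in detections:  # det: {"id": ..., "tilt_deg": ...}
        key = ("standard_fixed_440" if det["tilt_deg"] <= flat_threshold_deg
               else "fixed_angle_450")
        groups[key].append(det)
    return groups
```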
At block 810, the robotic system 100 may track and/or update the connected tools. The robotic system 100 may use internal mechanisms (e.g., Radio Frequency Identification (RFID) circuitry, hard coded identifiers, etc.) to dynamically detect a tool connected to the robotic arm 306. Additionally or alternatively, the robotic system 100 may track tool change operations and tool selections to track connected tools.
The robotic system 100 may access or determine the set of tools available to the robotic system 100 when tracking and/or updating the connected tools. For example, the robotic system 100 may access predetermined data regarding the tools and/or corresponding characteristics of the tool set 430 (e.g., pose of the grasping interface 418) available to the robotic arm 306. Additionally or alternatively, the robotic system 100 may dynamically track (e.g., in real-time) the tools used or connected to the robotic arm. Thus, the robotic system 100 may determine the tools that are remaining (e.g., not connected to any other robotic arms) and available for acquisition/connection. The robotic system 100 may further access predetermined or previously measured data to determine tool change metrics 724 (e.g., costs, such as time) of fig. 7 associated with switching between tools (e.g., disconnecting from one tool and connecting to a different tool) while manipulating the target object. In some embodiments, the tool change metric 724 may represent a cost associated with changing the pose of the gripping interface 418 for the adjustable gripping tool 460, such as by manipulating the orientation control mechanism 464 of fig. 4.
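Tracking connected tools and looking up the associated tool change metric 724 might reduce to a table lookup, as in the hypothetical sketch below; the timings are invented placeholders, not measured values:

```python
TOOL_CHANGE_TIME_S = {  # hypothetical average swap times
    ("standard_fixed_440", "fixed_angle_450"): 12.0,
    ("fixed_angle_450", "standard_fixed_440"): 12.0,
    ("adjustable_460:flat", "adjustable_460:angled"): 4.0,  # pose switch only
    ("adjustable_460:angled", "adjustable_460:flat"): 4.0,
}

def tool_change_cost(connected_tool, requested_tool, default_s=15.0):
    """Tool change metric 724 for swapping tools; zero when no change is needed."""
    if connected_tool == requested_tool:
        return 0.0
    return TOOL_CHANGE_TIME_S.get((connected_tool, requested_tool), default_s)
```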
Additionally, at block 810, the robotic system 100 may update the connected tools, such as according to the status determination described below (e.g., decision block 820). For example, when it is determined from the status that a new tool is needed to manipulate/transport the object, the robotic system 100 may operate the robotic arm 306 to disconnect the connected tool and connect to the new tool for subsequent transport. The robotic system 100 may update the tracked tool information. In some embodiments, the robotic system 100 may schedule tool updates (e.g., tool changes 734) and update the tracked information when the tool changes 734 are implemented.
At block 812, the robotic system 100 may select one or more objects for planning. The robotic system 100 may select from the grouping corresponding to the tool connected to the robotic arm 306. Within the selected grouping, the robotic system 100 may iteratively select one object for handling. The robotic system 100 may make the selection based on a set of predetermined selection rules, such as selecting higher-positioned and/or closer objects first.
At block 814, the robotic system 100 (via, e.g., the planner 304) may derive one or more plans (e.g., the handling plan 314) for the selection. For example, the system manager 302 may provide the target selection 313 to the planner 304 based on the selected object and/or the connected tool. In some embodiments, the robotic system 100 may provide a list of objects as a group. In response, the planner 304 may derive one or more handling plans for each object. When the plan derivation is successful, the planner 304 may provide the resulting handling plan 314 to the system manager 302 via the feedback 318. Otherwise, when the plan derivation is unsuccessful (due to, e.g., an estimated collision caused by an improper tool assignment), the planner 304 may provide feedback 318 reporting an unsuccessful status (e.g., an indication of an improper match between the assigned tool and the target object).
Thus, the robotic system 100 may derive a set of handling plans 314 for handling the target objects according to the tools. In some embodiments, for example, the robotic system 100 may follow a process for planning the handling based on the tool change metric 724 (according to, e.g., a set of rules and/or an architecture for minimizing the overall handling cost/time). For systems where tool changes have a larger average/minimum metric than the planning process and/or the plan implementation, the robotic system 100 may plan the handling to minimize tool changes. As an example, the robotic system 100 may plan and schedule the implementation for the objects/grouping applicable to the connected tool, followed by planning for a different set of objects. In some embodiments, the robotic system 100 may iteratively conduct the selection and derivation processes and implement the derived plans accordingly. In other embodiments, the robotic system 100 may derive plans for one or more groupings of objects and arrange/queue the plans according to a set of rules for minimizing the overall cost.
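Under the minimize-tool-changes rule, planning whole groupings back to back may be sketched as follows; the planner interface and naming conventions are hypothetical:

```python
def plan_groupings(groups, connected_tool, planner):
    """Derive plans one grouping at a time so that at most one tool change
    is scheduled per grouping, starting with the grouping that matches the
    currently connected tool."""
    schedule = []
    ordered_tools = sorted(groups, key=lambda tool: tool != connected_tool)
    for tool in ordered_tools:
        if tool != connected_tool:
            schedule.append(("tool_change", tool))  # tool change 734
            connected_tool = tool
        for obj in groups[tool]:
            plan = planner.derive(obj, tool)
            if plan is not None:  # skip infeasible object/tool pairs
                schedule.append(("transfer", plan))
    return schedule
```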
In deriving the plans, the robotic system 100 may compute metrics (e.g., the plan metrics 720 of fig. 7), as shown at block 816. For example, the robotic system 100 may calculate an estimated duration for each handling plan based on the calculated travel distance, the number and/or type of maneuvers, the corresponding settings, and the like.
For the example shown in fig. 8, the robotic system 100 may derive a set of plans based on iteratively selecting and planning the objects in one grouping before processing another grouping. At decision block 818, the planner 304 may determine whether the plan derivation was successful and provide the corresponding feedback 318 to the system manager 302. When the plan derivation is unsuccessful, the robotic system 100 may determine whether any objects remain in the currently processed object grouping, as shown at decision block 820. When an object remains in the currently selected/processed grouping, the flow may proceed to block 812, and the robotic system 100 may select the next object in the grouping. When no objects remain in the grouping (e.g., the tool-based object grouping 316 is empty), the flow may pass to block 810, and the robotic system 100 may update the connected tool (by, e.g., performing the tool change 734). The robotic system 100 may select and connect to a new tool based on the objects remaining in the bin and the one or more corresponding tools. Along with the tool change, the robotic system 100 may derive plans for a different grouping associated with the new tool.
In some embodiments, when the planner 304 derives a plan, the robotic system 100 may derive and/or update a sequence for implementing the plan. For example, the system manager 302 may schedule the implementation of the plan, such as by queuing the plan at the system manager 302 or the robotic arm 306.
In one or more alternative embodiments, the robotic system 100 may derive multiple plans for each object (e.g., one plan for each available tool). At block 822, the robotic system 100 may derive an order/combination of plans based on the corresponding plan metrics and tool change metrics 724. The robotic system 100 may select and schedule the sequence with the lowest overall cost (e.g., overall fulfillment/handling time).
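In this alternative, selecting the order/combination may be a search over one plan choice per object; the brute-force sketch below is only tractable for small object counts, and its names are hypothetical:

```python
from itertools import product

def cheapest_combination(per_object_plans, tool_change_time_s):
    """per_object_plans[i] lists candidate (tool_id, duration_s) plans for
    object i; returns the combination with the lowest overall cost, charging
    a tool change whenever consecutive plans use different tools."""
    best_cost, best_combo = float("inf"), None
    for combo in product(*per_object_plans):
        cost, previous_tool = 0.0, None
        for tool_id, duration_s in combo:
            if previous_tool is not None and tool_id != previous_tool:
                cost += tool_change_time_s
            cost += duration_s
            previous_tool = tool_id
        if cost < best_cost:
            best_cost, best_combo = cost, combo
    return best_combo, best_cost
```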
As an illustrative example of block 808 and the subsequent blocks, the robotic system 100 may determine and select a first group of objects corresponding to a first tool (e.g., based on an instance of the tool-based object grouping 316, such as the flat-pose objects 502). The robotic system 100 may iteratively derive a first set of plans for operating the robotic arm 306 to handle the first group of objects. In some embodiments, the robotic system 100 may derive the first set of plans based on selecting an object in the group, deriving a test plan for the selected object, and determining whether the test plan is feasible for implementation. After deriving the first set of plans, the robotic system 100 may schedule the tool change 734 to follow the last plan in the first set. Accordingly, after handling the first group of objects, the robotic system 100 may schedule a second tool (e.g., the fixed angle gripping tool 450) to be connected to the robotic arm 306. The robotic system 100 may continue by deriving plans for a second group of objects.
At block 824, the robotic system 100 (via, e.g., the system manager 302) may implement the scheduled plans. In other words, the robotic system 100 may operate the robotic arm 306 and handle the objects according to the derived handling plans and their scheduled order. Plan implementation may occur in parallel with (e.g., independently of) subsequent derivation processes. For example, the system manager 302 may initiate implementation of the first plan at the robotic arm 306 immediately before or after providing the target selection 313 to the planner 304 to initiate plan derivation for a second object. Thus, the derivation process for the second object or any subsequent object may occur while the robotic arm 306 is handling the first object. Accordingly, the robotic system 100 may continuously derive and implement motion plans until the starting location or the set of target objects is empty, as shown at block 826.
In some embodiments, the robotic system 100 may include, in the handling plan, operations for reducing the tension of the orientation control mechanism 464 of the adjustable gripping tool 460. The robotic system 100 may include the corresponding commands and/or settings (1) after contacting the corresponding object with at least a portion of the gripping interface and/or (2) before gripping the corresponding object. Thus, during implementation, the grip interface 418 may displace and/or rotate to increase the contact area with the exposed surface of the target object. In other words, the grip interface 418 may rotate or adjust about the rotatable joint and match the surface pose 514 of the target object. The tension reduction may occur before or during activation of the gripping interface 418 (e.g., suction cups) in order to improve the grip established between the gripping interface 418 and the target object.
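The command ordering for the tension reduction may be visualized with the following hypothetical sketch; the command names are illustrative placeholders, not an actual robot instruction set:

```python
def tension_release_place_sequence(object_id):
    """Illustrative command ordering for the adjustable gripping tool 460:
    tension is reduced after first contact and before the grip is activated,
    letting the grip interface settle onto the object's surface pose."""
    return [
        ("move_to_contact", object_id),   # touch with part of the grip interface
        ("reduce_orientation_tension",),  # interface rotates to match the surface
        ("activate_suction",),            # grip with the enlarged contact area
        ("lift_and_transfer", object_id),
    ]
```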
Conclusion
The foregoing detailed description of the examples of the disclosed technology is not intended to be exhaustive or to limit the disclosed technology to the precise form disclosed above. While specific examples of the disclosed technology are described for illustrative purposes, various equivalent modifications are possible within the scope of the disclosed technology, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative implementations may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Additionally, while processes or blocks are sometimes shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times. Moreover, any specific numbers indicated herein are merely examples; alternative implementations may employ different values or ranges.
These and other changes can be made to the disclosed technology in light of the above detailed description. While the detailed description describes certain examples of the disclosed technology and the best mode contemplated, the disclosed technology can be practiced in many ways, no matter how detailed the above description appears. The details of the system may vary considerably in their specific implementations while still being encompassed by the technology disclosed herein. As noted above, particular terminology used in describing certain features or aspects of the disclosed technology should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosed technology with which that terminology is associated. Accordingly, the invention is not limited except as by the appended claims. In general, the terms used in the following claims should not be construed to limit the disclosed technology to the specific examples disclosed in the specification, unless the above detailed description section explicitly defines such terms.
While certain aspects of the invention are presented below in certain claim forms, the applicant contemplates the various aspects of the invention in any number of claim forms. Accordingly, the applicant reserves the right to add claims after filing the application, to pursue such additional claim forms in the present application or in a subsequent application.

Claims (20)

1. A method for operating a robotic system, the method comprising:
identifying a toolset comprising two or more tools each uniquely configured to manipulate an object, wherein each tool is configured to be selectively attached to a robotic arm;
obtaining image data representing a set of target objects at a starting location, wherein the image data depicts representations of two or more objects in the set of target objects;
grouping the two or more objects according to a set of physical features and/or a current pose of the two or more objects, wherein the set of physical features and/or the current pose correspond to unique configurations of the two or more tools in the tool set;
deriving a set of plans comprising at least a first plan and a second plan based on the image data, wherein the set of plans is (1) for operating a robotic arm to transport at least a portion of the set of target objects from the starting location to a target location and (2) associated with changes in connecting and using different tools;
determining one or more tool change metrics for the derived plan, wherein each tool change metric represents a cost associated with removing a previous tool from the robotic arm and attaching a subsequent tool to the robotic arm;
calculating a cost metric for each plan in the set of plans, wherein the calculated cost metric takes into account (1) the tool change metric and/or (2) a handling time associated with picking, handling, and placing at least the portion of the set of target objects;
selecting a plan from the set of plans based on the cost metric, wherein the selected plan corresponds to a lowest cost metric associated with handling at least the portion of the set of target objects.
2. The method of claim 1, wherein:
the first plan represents effecting a tool change from an initial tool to a subsequent tool and then handling an object with the subsequent tool, wherein the first plan corresponds to a first cost metric that accounts for the tool change and a first handling time of the object;
the second plan represents handling the object with the initial tool, wherein the second plan corresponds to a second cost metric that accounts for a second handling time of the object that is longer than the first handling time; and
selecting the plan includes comparing at least the second cost metric to a combination of the first cost metric and the tool change metric.
3. The method of claim 1, wherein:
grouping the two or more objects includes determining, based on the image data, (1) a set of flat-posed objects and/or (2) a set of angled objects, wherein
The set of flat posed objects represents a first subset of objects each having a top portion parallel to the bottom surface of the starting position,
the set of angled objects represents a second subset of objects each having a top portion surface pose forming an angle relative to a bottom surface of the starting position;
the tool set represents an adjustable tool and/or two or more fixed tools configured to change an angle between the robotic arm and a grasping interface;
selecting the plan includes (1) minimizing tool changes and/or (2) evaluating a cost difference associated with (a) changing the tool to accommodate different poses and (b) keeping the tool across the different poses while performing the additional maneuvers and/or speed adjustments necessary therefor.
4. The method of claim 3, wherein the set of plans is derived based on:
identifying a first tool connected to the robotic arm prior to handling the set of target objects, wherein the first tool represents one of: (1) a standard fixed gripping tool configured to grip the set of flat-posed objects and (2) a fixed-angle gripping tool configured to grip the set of angled objects;
determining a first set of objects corresponding to the first tool, wherein the first set of objects represents a corresponding one of (1) the set of flat posed objects and (2) the set of angled objects;
deriving a first set of plans for operating the robotic arm to carry the first set of objects using the first tool; and
scheduling a tool change after deriving the first set of plans, wherein the tool change is for connecting a second tool to the robotic arm in place of the first tool after handling the first set of objects, the second tool representing the other of: (1) the standard fixed gripping tool configured to grip the set of flat-posed objects and (2) the fixed-angle gripping tool configured to grip the set of angled objects.
5. The method of claim 4, wherein the first set of plans is derived based on iteratively:
selecting objects in the first group of objects;
deriving a test plan for handling the selected object using the first tool; and
determining whether the test plan is feasible for implementation, wherein said selecting, said deriving, and said determining are repeated across the first group of objects.
6. The method of claim 5, wherein iteratively deriving the first set of plans comprises:
deriving a first plan for handling a first object, wherein the first plan is verified as feasible for implementation; and
deriving a second plan for handling a second object, wherein the second plan is derived after the first plan and verified as feasible for implementation;
the method further comprising:
implementing the first plan in parallel with deriving the second plan, wherein the first plan is for operating the robotic arm to transport the first object from the storage container to the target location.
7. The method of claim 1, wherein deriving the set of plans comprises:
deriving a test plan for each tool of the tool set for each object depicted in the image data and/or the set of target objects; and
determining a set of validated plans based on determining feasibility of implementation of each plan according to one or more predetermined rules and/or thresholds;
the method further comprising:
deriving an implementation sequence based on minimizing a sum of costs for handling the set of target objects, wherein the implementation sequence is an order for implementing a handling plan for each object in the set of target objects.
8. The method of claim 1, wherein:
the robotic system includes (1) a system manager configured to coordinate handling of the set of target objects and (2) a planner configured to derive one or more plans for operating the robotic arm to handle corresponding objects;
deriving the validated set of plans comprises deriving the validated set of plans at the planner; and
deriving the implementation sequence includes deriving the implementation sequence at the system manager.
9. The method of claim 1, wherein the calculated cost metric accounts for additional robot operations for achieving a tool release pose and/or an adjusted handling speed.
10. The method of claim 1, wherein:
the robotic system includes (1) a system manager configured to coordinate handling of the set of target objects and (2) a planner configured to derive one or more plans for operating the robotic arm;
deriving the set of plans includes
Sending, from the system manager to the planner, a first target selection specifying a first object of the set of target objects;
receiving, at the system manager, a first transfer plan corresponding to the first target selection;
initiating, using the system manager, implementation of the first transfer plan to operate the robotic arm to transfer the first object; and
sending, from the system manager to the planner, a second target selection specifying a second object of the set of target objects during implementation of the first handling plan.
11. The method of claim 10, wherein deriving the set of plans comprises:
receiving feedback from the planner at the system manager, wherein the feedback indicates a failure to derive a handling plan based on the second target selection; and
sending, from the system manager to the planner, a third target selection specifying remaining objects in the group associated with the connected tool in response to the feedback.
12. The method of claim 1, wherein:
the robotic system includes (1) a system manager configured to coordinate handling of the set of target objects and (2) a planner configured to derive one or more plans for operating the robotic arm;
deriving the set of plans includes
Sending, from the system manager to the planner, a first target selection specifying a first object of the set of target objects to be handled using the first tool;
receiving feedback from the planner at the system manager, wherein the feedback indicates a failure to derive a transfer plan based on the first target selection;
sending, from the system manager to the planner, a second target selection specifying the first object to be handled using the second tool in response to the feedback; and
scheduling a tool change in response to receiving a transfer plan corresponding to the second target selection, wherein the scheduled tool change precedes the transfer plan and represents operation of the robotic arm to disconnect from the first tool and connect to a second tool between transfers of objects.
13. The method of claim 12, wherein the derived plan and/or the tool changes are queued at the system manager, thereby controlling timing for implementing the derived plan and/or the tool changes.
14. The method of claim 12, further comprising:
communicating the derived plan and/or the tool changes to the robotic arm, the robotic arm configured to implement the derived plan and/or the tool changes according to an order of receipt.
15. The method of claim 1, wherein the tool change metric represents a cost associated with switching a pose of a gripping interface on an adjustable gripping tool.
16. The method of claim 1, wherein deriving the set of plans comprises deriving commands and/or settings for reducing tension of an orientation control mechanism of the adjustable gripping tool to allow the gripping interface to rotate about a rotatable joint and match a surface pose, wherein the tension reduction occurs (1) after contacting a corresponding object with at least a portion of the gripping interface and (2) before gripping the corresponding object.
17. The method of claim 1, further comprising:
determining a container identifier representing a bin comprising the set of target objects, wherein the bin comprises two or more vertical walls;
wherein:
the set of derived plans includes commands and/or settings for operating the vacuum gripper to contact and grip one or more objects while avoiding contact between the vertical wall and the robotic arm and attached tool.
18. A tangible, non-transitory computer-readable medium having processor instructions stored thereon, which, when executed by one or more processors, cause the one or more processors to perform a method, the method comprising:
determining at least a first set of objects and a second set of objects based on image data depicting objects at a starting location, wherein the first set of objects and the second set of objects correspond to a first tool and a second tool, respectively, in a tool set;
determining a tool change metric associated with switching between the first tool and the second tool for handling the object;
deriving at least a first plan and a second plan based on the image data, wherein
the first plan is for operating a robotic arm to handle the first set of objects with the first tool, handle the second set of objects with the second tool, and switch between the first tool and the second tool, and
the second plan is for operating the robotic arm to handle at least a portion of the first and second sets of objects with the first tool prior to or without switching tools;
calculating a first cost metric of the first plan, wherein the first cost metric represents an overall handling time for handling the first and second sets of objects and switching tools according to the first plan;
calculating a second cost metric for the second plan, wherein the second cost metric represents an overall handling time for handling the first and second sets of objects according to the second plan; and
selecting one of the first plan or the second plan to implement based on the lower of the respective first cost metric or second cost metric.
19. A robotic system, comprising:
communication circuitry configured to communicate data, commands and/or settings with (1) a planner configured to derive a plan for operating a robotic arm and/or (2) a robotic arm configured to selectively connect to a set of tools and grasp and transport an object using the connected tools according to a corresponding plan;
at least one processor coupled to the communication circuitry and configured to:
determine at least a first set of objects and a second set of objects based on image data depicting a set of objects within a container, wherein the first set of objects and the second set of objects correspond to a first tool and a second tool, respectively, of the set of tools;
determine a tool change metric associated with switching between the first tool and the second tool for handling the objects; and
derive a set of plans based on the tool change metric, wherein the set of plans is for operating the robotic arm to handle the first set of objects with the first tool, handle the second set of objects with the second tool, and switch between the first tool and the second tool.
20. The system of claim 19, further comprising:
the planner communicatively coupled to the communication circuitry;
the robotic arm communicatively coupled to the communication circuitry; and
the set of tools, including the first tool and the second tool, configured to (1) selectively connect to the robotic arm and (2) grasp one or more objects for manipulation.
CN202210215933.7A 2020-11-05 2021-11-03 Robot tool and method of operating the same Active CN114683299B (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US202063109870P 2020-11-05 2020-11-05
US63/109,870 2020-11-05
US17/392,108 US11981518B2 (en) 2020-11-05 2021-08-02 Robotic tools and methods for operating the same
US17/392,108 2021-08-02
CN202180004739.8A CN114746224A (en) 2020-11-05 2021-11-03 Robot tool and method of operating the same
PCT/US2021/057831 WO2022098706A1 (en) 2020-11-05 2021-11-03 Robotic tools and methods for operating same
CN202210215933.7A CN114683299B (en) 2020-11-05 2021-11-03 Robot tool and method of operating the same

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202180004739.8A Division CN114746224A (en) 2020-11-05 2021-11-03 Robot tool and method of operating the same

Publications (2)

Publication Number Publication Date
CN114683299A (en) 2022-07-01
CN114683299B CN114683299B (en) 2023-12-29

Family ID: 82156262

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08174389A (en) * 1994-12-27 1996-07-09 Omron Corp Work programing method and device
JP2006309577A (en) * 2005-04-28 2006-11-09 Fuji Electric Systems Co Ltd Production plan preparation system
US20070282475A1 (en) * 2006-05-31 2007-12-06 Kilian Schmidt Method and system for determining utilization of process tools in a manufacturing environment based on characteristics of an automated material handling system
US20160221187A1 (en) * 2013-03-15 2016-08-04 Industrial Perception, Inc. Object Pickup Strategies for a Robotic Device
CN108241336A (en) * 2016-12-27 2018-07-03 发那科株式会社 Production plan device
CN110494258A (en) * 2017-04-04 2019-11-22 牧今科技 Control device, picking up system, logistics system, program, control method and production method
US20200030977A1 (en) * 2017-04-04 2020-01-30 Mujin, Inc. Control device, picking system, distribution system, program, control method and production method
CN111699501A (en) * 2019-01-14 2020-09-22 牧今科技 Robot system with coordination mechanism and operation method thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant