SE1930157A1 - Robot cell configuration system and process
- Publication number
- SE1930157A1
- Authority
- SE
- Sweden
- Prior art keywords
- robot
- cell
- computer
- work
- factory
- Prior art date
- 2019-05-15
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J21/00—Chambers provided with manipulation devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41815—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41885—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by modeling, simulation of the manufacturing system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/31—From computer integrated manufacturing till monitoring
- G05B2219/31054—Planning, layout of assembly system
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32085—Layout of factory, facility, cell, production system planning
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35083—Parametric design, parameters for geometric design and for process planning
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39082—Collision, real time collision avoidance
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39386—Cell configuration, selection and connection of cell combinations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40115—Translate goal to task program, use of expert system
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40323—Modeling robot environment for sensor based robot system
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40392—Programming, visual robot programming language
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40393—Learn natural high level command, associate its template with a plan, sequence
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40419—Task, motion planning of objects in contact, task level programming, not robot level
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40425—Sensing, vision based motion planning
Abstract
A robot cell configuration system and process comprising at least three sets of interacting system components: a sensor- or vision-guided robot (11; 12), a robot controller (21), and a cell generator (23). A robot edge or onboard computer (20) is configured to run software designed to formulate instructions for robot movement and work within the robot cell based on a cell definition file (25) in computer-readable format imported from the cell generator (23), and to implement these instructions as robot control code in the robot controller (21).
Description
Robot cell configuration system and process

TECHNICAL FIELD OF THE INVENTION

The present invention relates to a system and a process designed for configuration of robot cells for an automated industrial process. The invention also relates to an automation process implementing embodiments of the robot cell configuration system and process of the invention.
BACKGROUND AND PRIOR ART

In this context, a robot is an industrial robot, which can be described as an automatically controlled multipurpose manipulator, programmable and re-programmable in three or more axes, which can be either fixed in position or mobile for use in industrial automation applications. The system and process of the present invention can be implemented on any robot system that fits this definition, including but not limited to articulated robots, SCARA robots, and Cartesian or parallel robots.
Industrial robots can be used and programmed for various operations such as machining, assembly, picking, packaging etc. In operation the robot moves a tool or robot hand in repetitive cycles between pickup, manipulation and release, each manoeuvre performed at defined positions within the working range of the robot. The positions for pickup, manipulation and release, as well as the feed of objects for handling by the robot, are subjects that need consideration in the task of organizing a physical working area for the robot, a robot cell.
The concept of configuring and setting up a robot cell is, however, not limited to physical parameters only, but also involves non-physical parameters and instructions that control the movements of the robot and the feed of objects.
Conventionally, automation takes significant time to achieve, is usually costly, and is strongly tied to a single variant or a few variants of a product.
For illustration of the background to the present invention, a traditional model for setting up a robot cell for automated production in a factory facility may comprise the following steps:
- drawing up a requirements specification for the work to be done in the robot cell,
- requesting quotations on physical cell construction cost from one or several system integrators (persons or companies),
- producing a virtual robot cell that is adapted to the physical constraints of the factory and to the current product flow,
- installing the robot cell and associated systems in the factory.
The robot cell is usually maintained in the factory until the end of life of the product, at which point the robot cell needs to be re-configured for a new product. Overall, this means that automation is not viable for small series or for products with a short life span. Previous attempts to improve the flow of robot cell setup mostly aim at cost savings, e.g. by making robot programming easier.
SUMMARY OF THE INVENTION

It is an object of the present invention to provide a system and a corresponding process designed for creation of robot cells with a minimum of human intervention.
It is another object to provide an automation process wherein embodiments of the robot cell configuration system and process of the invention are implemented.
The first-mentioned object is met by a system as defined in appended claim 1 and a process as defined in appended claim 6.
The second-mentioned object is met by the automation process defined in appended claim 12.
The present invention is implemented in connection with a robot system wherein robots are capable of identification of objects and determination of object positions within the working range of the robot. This capability involves computing capacity paired with a dedicated sensing technology including, e.g., image acquisition means, or motion limit sensors and proximity sensors carried on the robot and/or mounted stationary on surrounding structures in the close environment of the robot.
What is provided to meet the first-mentioned object of the invention, briefly, is a robot cell configuration system comprising at least three sets of interacting system components:
- a sensor- or vision-guided robot, programmable/reprogrammable in at least three axes of motion, the robot served by a robot edge computer and dedicated sensing means,
- a robot controller, the robot controller configured to run software designed to execute robot control code,
- a cell generator, the cell generator configured to run software designed to compile the robot cell by processing digitized process and product information into a cell definition file of computer-readable format.
In a system aspect of the invention, the robot edge computer is configured to run software designed to formulate instructions for robot movement and robot work within the robot cell based on the cell definition file imported from the cell generator, and to implement these instructions as robot control code in the robot controller.
In other words, figuratively speaking, the present invention automates the automation setup itself. The cell generator and the robot edge computer thus make it possible to go directly from the requirements specification for a processed product to creation of the automation cell from scratch (or to reuse or modify a previously created robot cell, if appropriate).
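The component roles described above can be made concrete with a small sketch. The following Python outline is illustrative only: the class names, fields and method signatures are assumptions made for the example, not an API defined by the patent.

```python
# Minimal sketch of the three-component flow: cell generator -> cell definition
# file -> robot edge computer -> robot controller. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class CellDefinitionFile:
    """Computer-readable output of the cell generator (item 25 in the figures)."""
    robot_location: tuple                      # position of axis Z in the cell
    pickup_locations: list = field(default_factory=list)
    work_locations: list = field(default_factory=list)
    delivery_locations: list = field(default_factory=list)
    work_instructions: list = field(default_factory=list)

class CellGenerator:
    """Compiles digitized process and product information into a cell definition file."""
    def compile_cell(self, factory_data: dict, product_data: dict) -> CellDefinitionFile:
        # Stub standing in for the planner programmes described further below.
        return CellDefinitionFile(robot_location=(0.0, 0.0))

class RobotEdgeComputer:
    """Formulates movement and work instructions from an imported cell definition file."""
    def formulate(self, cell: CellDefinitionFile) -> list:
        return [("MOVE", loc) for loc in cell.pickup_locations]

class RobotController:
    """Executes the robot control code produced by the edge computer."""
    def execute(self, code: list) -> None:
        for instruction in code:
            print("executing", instruction)
```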
The sensing means may alternatively and advantageously include 2D or 3D imaging. 2D and 3D imaging can be accomplished by use of single-lens or double-lens digital cameras and associated software.
Briefly, 2D and 3D image acquisition and analysis rely on software designed to translate captured images into computable data which form the basis for real-time decisions made by the robot system. This data may include, e.g., object identification data, object location, object classification data, data defining relative motion between objects, relative motion between the robot system and the environment, or other data.
The 2D and 3D image acquisition and analysis software provides a digital representation of a physical product and its location in space, which can be compared with a CAD drawing file of the same product.
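As a rough illustration of this image-to-data step, the sketch below locates the largest object in a pre-thresholded binary image with OpenCV and compares its measured pose against a reference pose taken from a CAD file. The function names and the binary-image assumption are ours; the patent does not prescribe any particular vision pipeline.

```python
# Hedged sketch: object pose from a 2D image, compared with a CAD reference.
import cv2
import numpy as np

def estimate_object_pose(binary_image: np.ndarray):
    """Return (centre_x, centre_y, angle_deg) of the largest contour, or None."""
    contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (cx, cy), _size, angle = cv2.minAreaRect(largest)
    return cx, cy, angle

def pose_error(measured, cad_reference):
    """Elementwise difference between measured pose and the CAD-expected pose."""
    return tuple(m - r for m, r in zip(measured, cad_reference))
```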
In one embodiment, accordingly, the robot cell configuration system comprises image acquisition means and image processing software integrated with the robot, wherein the robot edge computer software is designed to apply image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer software.
In one embodiment, the cell generator is configured to run a neural network algorithm for robot training, based on CAD drawings and product specifications imported to the cell generator.
In another embodiment, the system comprises a visualization generator which provides graphic presentation of the robot cell of the cell definition file.
In a further embodiment, the system comprises a construction plans generator for physical installation of the robot cell in the factory, based on the cell definition file.

In a process aspect of the invention, a robot cell configuration process comprises:
- generating a digital map of a robot cell area in a factory,
- defining a robot location within the robot cell area,
- calculating effective robot range with regard to physical factory constraints and robot load limits,
- determining pickup, work and delivery locations within the robot range and with regard to processed product specifications,
- generating a layout of the robot cell including means for feeding products to and from the robot cell,
- compiling the above data in computer-readable format, and
- generating robot control code automatically by digital processing of the compiled data file.
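Read as a pipeline, the seven steps above might look like the following sketch. Every function body here is a stub, since the patent specifies the steps but not their algorithms.

```python
# Illustrative stubs only -- each step hides the real planning logic.
def generate_digital_map(factory_map_data):   return {"bounds": factory_map_data}
def define_robot_location(cell_map):          return (0.0, 0.0)
def calculate_effective_range(xy, cell_map):  return 1.5            # metres
def determine_stations(reach, product_spec):  return {"pickup": [], "work": [], "delivery": []}
def generate_layout(cell_map, stations):      return {"map": cell_map, "stations": stations, "feed": "conveyor"}
def compile_cell_definition(layout):          return {"layout": layout}
def generate_control_code(cell_file):         return [("MOVE", (0.0, 0.0, 0.0))]

def configure_robot_cell(factory_map_data, product_spec):
    cell_map = generate_digital_map(factory_map_data)         # step 1: digital map
    robot_xy = define_robot_location(cell_map)                # step 2: robot location
    reach = calculate_effective_range(robot_xy, cell_map)     # step 3: constraints, load limits
    stations = determine_stations(reach, product_spec)        # step 4: pickup/work/delivery
    layout = generate_layout(cell_map, stations)              # step 5: layout incl. feed means
    cell_file = compile_cell_definition(layout)               # step 6: computer-readable format
    return generate_control_code(cell_file)                   # step 7: automatic control code
```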
In one embodiment the process comprises:
- providing a sensor- or vision-guided robot, programmable/reprogrammable in at least three axes of motion, the robot served by a robot edge computer and dedicated sensing means,
- feeding digitized process and product information for processing by a cell generator generating a cell definition file of computer-readable format,
- importing the cell definition file to the robot edge computer, and
- formulating, by software installed in the robot edge computer, instructions for robot movement and work based on the cell definition file and implementing these instructions as robot control code in a robot controller.
In one embodiment, the process further comprises the steps of:
- providing image acquisition means and image processing software integrated with the robot,
- applying image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer software.
In an automation aspect of the invention, embodiments of the above robot cell configuration system and process can be implemented in an automation process comprising:
- configuration of a robot cell, including formulation of robot movement and work instructions through the robot edge computing software of a vision-guided robot programmable/reprogrammable in at least three axes of motion,
- installing the robot cell physically in a factory, taking into account one or some of the following requisites:
  - physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,
  - way of feeding products to and from the robot cell,
  - coordination in time and space with nearby robot cells and robots,
  - specified work and tools required for the steps of production in the robot cell, and
- training the robot in a neural network by comparison of CAD drawings of products with digital representations of the physical products and their location in space using the vision-based robot control.
The present invention also relates to a computer programme product storable on a computer-usable medium containing instructions for a processor of a cell generator or robot edge computer to execute the inventive process.
The computer programme product may be provided at least in part over a data transfer network, such as Ethernet or Internet.
The present invention further relates to a computer readable medium which contains the computer programme product.
SHORT DESCRIPTION OF THE DRAWINGS

Additional details and further embodiments of the invention are defined in the subordinate claims and explained in the following detailed description of the invention, with reference made to the accompanying drawings:

Fig. 1 is a schematic overview illustrating a pair of robot cells in a factory installation.
Fig. 2 is a block diagram illustrating system components and process flow in an automated robot cell configuration system and process.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

Fig. 1 shows two robot cells 1 and 2 operating in parallel. Each robot cell is served by a pair of supplying conveyors 3 and 4, and a discharge conveyor 5, respectively. The supplying conveyors 3 and 4 are each associated with a pickup table 6 and off-pushers 7 and 8, respectively, at a turning end of the conveyors. The discharge conveyors 5 are each associated with a delivery table 9 and an on-pusher 10 at the turning end, respectively, of the discharge conveyors. Optical or other sensors (not shown) can be applied to initiate the off- and on-pushers.
In this connection it should be pointed out that, instead of the conveyor feed illustrated in Fig. 1, the processed products may be picked by a robot from a pallet or from a bin, if appropriate.
In the robot cells 1 and 2, a robot 11 or 12 is operable and programmed to pick an item or product from the pickup table 6, placing the item or product on a work table 13 to perform a value-adding operation, before the item or product is placed on the delivery table 9 for discharge. The value-adding operation can be any kind of assembling, machining, painting, inspection, testing etc. In the illustrated example, the robot cells 1 and 2 may be suitable for assembling two or more incoming product parts into a composite, singular, outgoing item.
The robots 11 and 12 are articulated robots, driven and programmed for turning motion about at least three axes X, Y and Z in an orthogonal coordinate space. Each robot has a robot arm 14 carrying in its end a robot hand 15. The robot arm 14 is extendable to reach the pickup, delivery and work tables by turning about the axes X, Y and Z. On one hand, axis Z represents a centre around which the robot and robot arm 14 can rotate in a horizontal plane; on the other hand, axis Z also represents the point of location of the robot in the robot cell. The axes X provide turnability in vertical planes, while the axis Y provides rotation in any arbitrary inclined plane parallel with the X axes. Naturally, the robot 11 or 12 may include other pivot axes for additional mobility, e.g. in the robot hand.
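To make the notion of reach-by-turning concrete, here is a toy forward-kinematics calculation for a two-link arm rotating in a vertical plane. The link lengths and joint angles are invented for the example, since the patent does not specify robot geometry.

```python
# Toy forward kinematics: where the robot hand ends up for given joint angles.
import math

def planar_hand_position(l1: float, l2: float, shoulder_deg: float, elbow_deg: float):
    a1 = math.radians(shoulder_deg)
    a2 = a1 + math.radians(elbow_deg)        # elbow angle is relative to link 1
    x = l1 * math.cos(a1) + l2 * math.cos(a2)
    y = l1 * math.sin(a1) + l2 * math.sin(a2)
    return x, y

print(planar_hand_position(0.8, 0.6, 30.0, 45.0))   # hand position in metres
```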
The robot hand 15 is arranged for docking with different kinds of tools for grasping, drilling, probing, measuring etc. In Fig. 1, the robots 11 and 12 have each put together an individual set of tools 16 and 17, chosen from a supply of tool boxes 18.
In operation, the robots 11 and 12 are each guided by a vision system 19 and a robot edge computer 20. In the drawing of Fig. 1, the robot edge computer 20 is illustrated as being installed onboard the robot, but it may alternatively be arranged separately, within or outside the robot cell. The vision system 19 may include 2D or 3D image acquisition means such as a single-lens or double-lens camera and associated image processing software.
The robot edge or onboard computer 20 communicates with a robot controller 21 via a wireless or cable network 22. Connected to the network 22 is a cell generator 23 containing a processor and software designed to organize the robot cells 1 and 2 based on digitized product and factory information 24. The cell generator 23 may be installed on a cloud server or on a private server, as appropriate.
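The patent leaves the network protocol open; purely as an assumption, the edge computer might push control code to the controller as a length-prefixed JSON message over TCP, as in this sketch. Real robot controllers each use their own vendor-specific protocol.

```python
# Hedged sketch: shipping control code over the network 22. The port number
# and message framing are invented for the example.
import json
import socket

def send_control_code(host: str, code: list, port: int = 9000) -> None:
    payload = json.dumps({"control_code": code}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(len(payload).to_bytes(4, "big"))   # 4-byte length prefix
        conn.sendall(payload)
```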
The block diagram of Fig. 2 shows a cell configuration system and process for a factory installation, substantially as the one illustrated in Fig. 1 as an example. Main components of the cell configuration system are the cell generator 23, the robot edge computer 20 and the robot controller 21. Each of the cell generator, the robot edge computer and the robot controller comprises data input and output functions, hardware or emulated processor and memory, and programmes executing software.
The cell configuration system and process use different kinds of input data 24 in order to set up the robot cell and to generate operating instructions for the robot. The input data to the cell generator 23 includes, but is not necessarily limited to: numerical factory data and digitized factory layout drawings; data on concurrent production in nearby robot cells; CAD-produced drawings of products; product specifications; digital 2D or 3D representations of physical products, etc.
Robot cell definition file

The input data is processed in the cell generator 23, which provides computer-readable output in the form of a robot cell definition file 25 containing, inter alia: digital 2D or 3D representations of products; robot work instructions; robot training algorithms; time schedules and factory production integration files; graphic presentation of robot cells; robot cell manufacturing files; production statistics; or other case-specific data as appropriate.
Creation of the cell definition file 25 involves various portions of the cell generator software, such as:

Process and work planner: a computer programme which uses CAD drawings and product specifications to make decisions on assembly or manufacture, and on the choice between tools and operations (fitting, drilling, welding etc.);

Cell location planner: a computer programme which uses factory data and data on concurrent production to make decisions on the appropriate location of the new robot cell within the factory;

Robot selection planner: a computer programme which uses factory data and data on concurrent production to decide which robot among the ones available is the best choice for the work in the new robot cell;

Product infeed/outfeed planner: a computer programme which uses all the above data, or the output from the above planner programmes, to regulate the flow of products into and out from the new robot cell;

Product placement planner: a computer programme which uses CAD drawings and product specifications together with output from the process and work planner programme to make decisions on product orientations at the pickup tables, work tables and delivery tables;

Robot movement planner: a computer programme which can use the input to the cell generator and/or the output from the above planners to generate movement patterns for the robot;

Robot training planner: a computer programme which defines neural networks for deep learning and fine-tuning of robot movements and operations.
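What such a cell definition file might contain can be illustrated as a plain data structure. The keys and values below are invented for the example; the patent lists the categories of content, not a file format.

```python
# Assumed shape of a cell definition file, expressed as a Python dict
# (it could equally be JSON or any other computer-readable format).
cell_definition = {
    "cell_id": "cell-1",
    "robot": {"type": "articulated", "axes": 6, "location": [2.0, 3.5]},
    "stations": {
        "pickup":   [[1.2, 3.0], [1.2, 4.0]],
        "work":     [[2.0, 4.5]],
        "delivery": [[2.8, 3.5]],
    },
    "work_instructions": [
        {"op": "pick",  "object": "part-A", "station": "pickup/0"},
        {"op": "place", "object": "part-A", "station": "work/0"},
    ],
    "training": {"network": "pose-refinement", "epochs": 50},
    "integration": {"schedule": "shift-1", "neighbour_cells": ["cell-2"]},
}
```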
It may be noted that the task of coding the different planners of the cell generator, per se, may lie within the ordinary skill of a computer programmer of industrial program code, whereas in the present invention the leverage of improvement over the prior art lies in the combination of planners and the resulting cell definition file which, by extended involvement of the robot and robot edge computer, provides a higher degree of automation in the configuration and setup of a robot cell.
Thus, at least some of the above planners will be involved in the creation of a cell definition file 25 from which robot control code can be generated automatically through digital processing in the robot edge computer 20. Running the programmes installed in the cell generator 23 results in a set of procedural steps leading towards an effective and functional cell definition file:
- generating a digital map of a robot cell area in a factory,
- defining a robot location (Z) within the robot cell area,
- calculating effective robot range with regard to physical factory constraints and robot load limits,
- determining pickup, work and delivery locations within the robot range and with regard to processed product specifications,
- generating a layout of the robot cell including means for feeding processed products to and from the robot cell, and
- compiling the above data in computer-readable format.
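The range-calculation step can be illustrated with a deliberately simple model: nominal reach derated by payload and clipped against a factory constraint. The 20 % derating factor is an arbitrary assumption; real reach/payload curves come from the robot datasheet.

```python
# Toy model of "effective robot range with regard to physical factory
# constraints and robot load limits".
def effective_range(nominal_reach_m: float, payload_kg: float,
                    max_payload_kg: float, wall_clearance_m: float) -> float:
    if payload_kg > max_payload_kg:
        raise ValueError("payload exceeds robot load limit")
    derated = nominal_reach_m * (1.0 - 0.2 * payload_kg / max_payload_kg)
    return min(derated, wall_clearance_m)

# A 1.8 m robot at half its rated load, 1.5 m from the nearest wall:
print(effective_range(1.8, 5.0, 10.0, 1.5))   # -> 1.5 (wall-limited)
```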
Accordingly, creation of the cell definition file 25 comprises processing of digitized descriptions of at least one or some of the following parameters:
- physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,
- way of feeding products to and from the robot cell,
- coordination in time and space with nearby robot cells and robots,
- specification of work and choice of tools required for the steps of production in the new robot cell.
The cell generator 23 software thus contains executable programmes configured to receive and process at least the following import data:
- numerical factory data and digitized factory layout drawings,
- data on concurrent production in nearby robot cells,
- CAD drawing files,
- product specifications,
- digital 2D or 3D representations of processed products,

and to generate at least one of the following export data:
- robot cell definition files in computer-readable format,
- graphic presentations of robot cells,
- robot cell manufacturing files,
- digital 2D or 3D representations of processed products,
- robot work instructions,
- robot training algorithms,
- time schedules and factory production integration files,
- production statistics.

One basic input to the cell generator is a CAD assembly file. The CAD assembly file is generated in a CAD programme that uses a design tool which determines the proper order of assembling the parts of the product, or defines other manipulation of the product to be done in the robot cell. The CAD assembly file can be generated under supervision by a human designer, or it can be generated automatically by simulation, such as by simulator reinforcement learning, with trial and error using feedback from its own actions until a satisfactory result is achieved.
The CAD assembly file provides computer-readable data containing, inter alia, a set of constraints for calculation and generation of the cell definition file, such as:

Grippers/gripper fingers required for assembly: these can be defined using grasp planner software, which may be built on brute-force simulation, possibly assisted by algorithms such as constraint solvers or deep learning. The grasp planner software will optimize for a low number of grippers and fingers by finding a set of possible gripper configurations for each part and then minimizing the required configurations by picking options that satisfy as many parts as possible. The grasp planner software may also identify the required machining tools for welding, drilling, deburring, etc., and determine the suitable size of machining tools, if appropriate;

Supplements to the robot cell: requisition of magnetic tables, shakers for screws etc., and other requisites that are unsuitable for picking by the robot;

Infeed/outfeed of products from the robot cell: the CAD assembly file defines which products/product parts need to be fed into and out from the robot cell, and with the help of information on how parts are packaged it is possible to calculate necessary entrance and exit routes as well as define ways of transport, such as conveyor belts, pallet drop-off points, or other ways of supplying products/product parts to the robot cell.
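The gripper-minimization described above is, in essence, a set-cover problem: choose the fewest gripper configurations such that every part can be grasped. A standard greedy approximation, with invented example data, could look like this:

```python
def minimize_grippers(parts: set, gripper_options: dict) -> list:
    """gripper_options maps gripper name -> set of parts it can grasp."""
    chosen, uncovered = [], set(parts)
    while uncovered:
        best = max(gripper_options,
                   key=lambda g: len(gripper_options[g] & uncovered))
        if not gripper_options[best] & uncovered:
            raise ValueError("some parts cannot be grasped by any gripper")
        chosen.append(best)
        uncovered -= gripper_options[best]
    return chosen

print(minimize_grippers(
    {"A", "B", "C", "D"},
    {"parallel-jaw": {"A", "B"}, "suction": {"B", "C", "D"}, "magnet": {"D"}},
))  # -> ['suction', 'parallel-jaw']
```

In practice a grasp planner would also weight each option by cycle time and tool-change cost rather than part count alone; the greedy choice here only mirrors the "satisfy as many parts as possible" heuristic named in the text.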
The cell generator software is thus configured to generate a robot cell layout that satisfies all incoming constraints and selects a robot that has sufficient range to reach all stations in the robot cell (or assigns more robots/stations as needed). The robot cell layout forms a basis for making part lists, drawings for assembly, etc. In other words, the sum of data imported to the cell generator is processed therein for generation of the cell definition file which, by its content in computer-readable format, enables automatic generation of robot control code by processing through the planner programmes of the robot edge computer.

Robot control code

The cell definition file 25 is exported to the robot edge computer 20 and used by its software for creating instructions for robot movements and work operations. These instructions are exported to the robot controller 21 in the form of robot control code. The robot control code takes the form of orthogonal x/y/z coordinates, polar coordinates in terms of distance + direction + elevation, speed of motion, time lapse and duration of holds and halts in the robot moves and operations, or other data appropriate for the control of an automation robot, as is known per se.
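Where control code is expressed in polar form (distance + direction + elevation), converting a waypoint to the orthogonal x/y/z form is a standard spherical-to-Cartesian transform; the angle conventions below are an assumption.

```python
import math

def polar_to_xyz(distance: float, direction_deg: float, elevation_deg: float):
    d = math.radians(direction_deg)       # rotation about the vertical axis Z
    e = math.radians(elevation_deg)       # elevation above the horizontal plane
    horizontal = distance * math.cos(e)
    return (horizontal * math.cos(d),     # x
            horizontal * math.sin(d),     # y
            distance * math.sin(e))       # z

print(polar_to_xyz(1.0, 90.0, 30.0))      # ~ (0.0, 0.866, 0.5)
```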
Creation of the robot control code involves various portions of the robot edge computer software, such as:

Movement planner: a computer programme which uses sensor or vision guidance 19 and/or a deep learning algorithm 26 in neural network training of the robot for fine-tuning of robot movements;

Work planner: a computer programme which uses sensor or vision guidance 19 and/or a deep learning algorithm 26 in neural network training of the robot for fine-tuning of robot tool operations.
Accordingly, running the programmes installed in the robot edge computer 20 results in a set of procedural steps leading towards an effective and functional robot control code (in this case for an assembly process, used as an example):
- reading the cell definition file for spatial coordinates of the robot location and the objects' pickup, work and delivery locations,
- reading the cell definition file for the work specification and digital drawings,
- moving the robot hand to the pickup location of object A,
- detecting the object's position at the pickup location,
- grasping and placing object A at the assembly location,
- moving the robot hand to the pickup location of object B,
- detecting the object's position at the pickup location,
- grasping and placing object B with object A at the assembly location,
- comparing the resulting assembly with the work specification and digital drawings,
- if assembly is successful, memorizing the robot movements as new robot control code,
- if assembly is unsuccessful, making corrections to the robot movements based on the comparison of the assembled objects with the work specification and digital drawings, and running through the programme until assembly is successful.
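The trial-and-correct loop above can be sketched as follows. The callables detect_position, grasp_and_place and compare_with_drawing stand in for vision and motion primitives whose implementations the patent leaves open.

```python
def learn_assembly(stations, detect_position, grasp_and_place,
                   compare_with_drawing, max_attempts: int = 20):
    """Repeat the assembly until it matches the digital drawings, then
    return the successful movements as new robot control code."""
    for _attempt in range(max_attempts):
        moves = []
        for obj in ("A", "B"):
            pos = detect_position(stations["pickup"], obj)    # vision feedback
            moves.append(grasp_and_place(obj, pos, stations["work"]))
        if compare_with_drawing(moves) == 0:                  # zero deviation
            return moves                                      # memorize as control code
        # Otherwise: corrections to the movements would be derived from the
        # comparison before the next attempt (left abstract in this sketch).
    raise RuntimeError("assembly not successful within the attempt budget")
```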
In other words, the compilation of computer-readable data in a cell definition file as provided results in automation of the setup of a robot cell through computer processing in the robot edge computer, substantially without need for human intervention.
In a highly advantageous embodiment, the robot edge computer 20 is integrated with 3D imaging and processing means 19 onboard the robot, this preferred constellation providing unmatched flexibility in the configuration and setup of robot cells, and a fast and flexible learning process including visual feedback at the very end of the automation line.

Sensor- or vision-based robot control

In both training and productive operation, sensor or vision guidance is arranged to provide feedback to the robot edge computer.
Depending on the nature of the products and the automated production, sensor guidance through limit switches, proximity sensors, or other sensing technology such as light detection and ranging (lidar), lasers etc., may be applied as appropriate.
In other applications, vision guidance through image capture and image processing may be the relevant choice. In the latter case, single- or double-lens cameras can be fitted on the robot close to the robot hand 15 to capture a view whose centre is lined up with the robot arm 14. Alternatively, or in addition to a robot-mounted camera, stationary cameras (not shown in Fig. 1) may be arranged in the robot cell to overview the location and orientation of products on, e.g., pickup tables and work tables.

Vision-based robot control and intelligent vision systems for robots are widespread technology, and their general implementation in automated production is well known to persons skilled in the art of automation. However, in the present invention the vision-based robot control is additionally used in the process of setting up the robot cell. More precisely, the vision guidance system can be utilized for gathering information on structural robot cell components and physical constraints in the robot's environment, and for providing this information as input and feedback to the cell generator or to the robot edge computer for creation or modification of the cell definition file or the robot control code.
Thus, in accordance with embodiments of the invention, a process of automated configuration of a robot cell may also be summarized as follows:
- providing a vision-guided robot, programmable/reprogrammable in three or more axes of motion, the robot being served by a robot edge computer and vision-based robot control,
- defining, for the subject robot cell, the sequential steps of an automated production process based on CAD drawings of products, product specifications, or digital 2D or 3D representations of products,
- determining the spatial locations, in the robot's local coordinate system, of the pickup position, manufacturing/assembly position and delivery position of processed products,
- choosing the relevant tool for the robot work (assembly, manufacture, inspection etc.), and
- training the robot in a neural network against feedback provided by the vision-based robot control.
Neural network training

Generally speaking, machine learning or deep learning and training of a robot in a neural network may involve recognition of characteristic object elements in images captured by a digital camera onboard the robot, and comparison of the spatial position of these elements with the position coordinates for the same elements of a digitized representation of the object, in the robot's coordinate system. Each robot cell, however, may require individually designed training algorithms, and a detailed description of any specific neural network for robot training in a specific robot cell will not be given or required in this disclosure. For persons skilled in the art of robot control, guidance can be found in the extensive literature on neural network training.
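As a minimal illustration of that idea, the sketch below regresses detected image features to CAD-frame coordinates and minimizes the position error with a small PyTorch network. The network size, random stand-in data and training schedule are all invented for the example; as the text notes, each robot cell may require an individually designed network.

```python
import torch
import torch.nn as nn

features = torch.randn(256, 8)     # stand-in feature vectors from camera images
cad_xyz = torch.randn(256, 3)      # matching coordinates from the CAD model

model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 3))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(features), cad_xyz)   # position error vs. CAD reference
    loss.backward()
    optimizer.step()
```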
Finally, with reference to Fig. 2, reference number 27 indicates a generator for visual/graphic display of the robot cell, and/or a generator for cell manufacturing, i.e. for creating construction plans for the robot cell based on data and information compiled in the cell definition file 25.
In other words, what is disclosed hereinabove is a system and a process to initiate the whole automation sequence from the product itself in a flexible way. This is done by taking, inter alia, CAD file(s) and informing various parts of the system, as explained in the text and shown in the drawings. Combined with the fact that central parts of the automation itself are handled by a sensor- or vision-guided robot, there is provided automation which is flexible, requires little or no subsequent external programming, can generate graphic representations and other sales material, and can be used to generate the automation cell itself or variations thereof.
The present invention may be implemented as software, hardware, or a combination thereof. A computer program product or a computer program implementing the process or a part thereof comprises software or a computer program run on a general-purpose or specially adapted computer, processor or microprocessor. The software includes computer program code elements or software code portions that make the computer perform the process. The program may be stored, in whole or in part, on, or in, one or more suitable computer-readable media or data storage means such as a magnetic disk, CD-ROM or DVD disk, hard disk, magneto-optical memory storage means, in RAM or volatile memory, in ROM or flash memory, as firmware, on a data server, or on a cloud server. Such a computer program product or computer program can also be supplied via a network, such as the Internet.
It is to be understood that the embodiments described above and illustrated in the drawings are to be regarded only as non-limiting examples of the present invention, and may be modified within the scope of the appended claims.
Claims (16)

1. A robot cell configuration system comprising at least three sets of interacting system components:
- a sensor- or vision-guided robot (11; 12), programmable/re-programmable in at least three axes of motion (X, Y, Z), the robot served by a robot edge computer (20) and dedicated sensing means (19),
- a robot controller (21), the robot controller running software designed to execute robot control code,
- a cell generator (23), the cell generator configured to run software designed to compile the robot cell (1; 2) by processing of digitized process and product information (24) into a cell definition file (25) of computer readable format,
- wherein the robot edge computer (20) is configured to run software designed to formulate instructions for robot movement and work within the robot cell based on the cell definition file (25) imported from the cell generator (23), and to implement these instructions as robot control code in the robot controller (21).

2. The system of claim 1, comprising 2D or 3D image acquisition means (19) and image processing software integrated with the robot, wherein the robot edge computer software is designed to apply image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer software.

3. The system of claim 1 or 2, wherein the cell generator (23) is configured to run a neural network algorithm for robot training, based on CAD-drawings and product specifications imported to the cell generator.

4. The system of any of claims 1 to 3, comprising a visualization generator providing graphic presentation of the robot cell of the cell definition file (25).

5. The system of any of claims 1 to 4, comprising a construction plans generator for installation of the robot cell physically in the factory, based on the cell definition file (25).

6. A robot cell configuration process comprising:
- generating a digital map of a robot cell area in a factory,
- defining a robot location (Z) within the robot cell area,
- calculating effective robot range with regard to physical factory constraints and robot load limits,
- determining pickup, work and delivery locations within the robot range and with regard to processed product specifications,
- generating a layout of the robot cell including means for feeding products to and from the robot cell,
- compiling the above data in computer readable format, and
- generating robot control code automatically by digital processing of the compiled data file.

7. The process of claim 6, comprising:
- providing a sensor- or vision-guided robot (11; 12), programmable/reprogrammable in at least three axes of motion (X, Y, Z), the robot served by a robot edge computer (20) and dedicated sensing means (19),
- feeding digitized process and product information for processing by a cell generator (23) generating a cell definition file (25) of computer readable format,
- importing the cell definition file (25) to the robot edge computer (20),
- formulating, by software installed in the robot edge computer (20), instructions for robot movement and work based on the cell definition file (25) and implementing these instructions as robot control code in a robot controller (21).

8. The process of claim 7, comprising the steps of:
- providing 2D or 3D image acquisition means (19) and image processing software integrated with the robot (11; 12),
- applying image data as input to at least one of a movement planner programme and a work planner programme installed in the robot edge computer (20) software.

9. The process of any of claims 6 to 8, comprising:
- providing a vision-guided robot (11; 12), programmable/reprogrammable in at least three axes of motion (X, Y, Z), the robot served by a robot edge computer (20) and vision-based robot control (19),
- defining, for the subject robot cell, the sequential steps of an automated production process based on CAD-drawings of products, product specifications, or digital 2D or 3D representations of products (24),
- determining the spatial locations, in the robot's local coordinate system, of pickup position (6), manufacturing/assembly position (13) and delivery position (9) of processed products,
- choosing the relevant tool (16; 17) for the robot work (assembly, manufacture, inspection etc.), and
- training the robot in a neural network (26) against feedback provided by the vision-based robot control (19).

10. The process of any of claims 7 to 9, wherein generation of the robot cell definition file (25) includes processing of digitized descriptions of the following process and product parameters:
- physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,
- feed of products to and from the robot cell,
- coordination in time and space with nearby robot cells and robots,
- specification of work and choice of tools required for the steps of production in the robot cell.

11. The process of any of claims 7 to 10, wherein the cell generator (23) software contains executable programmes configured to receive and process the following import data:
- numerical factory data and digitized factory layout drawings,
- data on concurrent production in nearby robot cells,
- CAD drawing files,
- product specifications,
- digital 2D or 3D representation of processed products,
and to generate at least one of the following export data:
- robot cell definition files on computer readable format,
- graphic presentation of robot cells,
- robot cell manufacturing files,
- digital 2D or 3D representation of processed products,
- robot work instructions,
- robot training algorithms,
- time schedules and factory production integration files,
- production statistics.

12. An automation process comprising:
- configuration of a robot cell (1; 2), including formulation of robot movement and work instructions through the robot edge computing (20) software of a vision-guided robot (11; 12), programmable/reprogrammable in at least three axes of motion (X, Y, Z),
- installing the robot cell physically in a factory taking into account one or some of the following requisites:
  - physical constraints in the form of available space, robot cell limits and location according to factory plans and concurrent production,
  - way of feeding products to and from the robot cell,
  - coordination in time and space with nearby robot cells and robots,
  - specified work and tools required for the steps of production in the robot cell, and
- training the robot in a neural network (26) by comparison of CAD-drawings of products with digital representations of the physical products and their location in space using vision-based robot control (19).

13. The process of claim 12, wherein the digital representation of a physical product and its location in space is produced by means of 2D or 3D image acquisition means and image processing software integrated with the robot.

14. A computer programme product storable on a computer usable medium containing instructions for a processor of a cell generator or robot edge computer to execute the process of any of claims 6 to 13.

15. The computer program product of claim 14, provided at least in part over a data transfer network, such as Ethernet or Internet.

16. A computer readable medium, characterized in that it contains a computer programme product according to claim 14.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1930157A SE1930157A1 (en) | 2019-05-15 | 2019-05-15 | Robot cell configuration system and process |
PCT/SE2020/050493 WO2020231319A1 (en) | 2019-05-15 | 2020-05-14 | Robot cell setup system and process |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1930157A SE1930157A1 (en) | 2019-05-15 | 2019-05-15 | Robot cell configuration system and process |
Publications (1)
Publication Number | Publication Date |
---|---|
SE1930157A1 true SE1930157A1 (en) | 2020-11-16 |
Family
ID=73290302
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE1930157A SE1930157A1 (en) | 2019-05-15 | 2019-05-15 | Robot cell configuration system and process |
Country Status (2)
Country | Link |
---|---|
SE (1) | SE1930157A1 (en) |
WO (1) | WO2020231319A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4074471A1 (en) * | 2021-04-14 | 2022-10-19 | BAE SYSTEMS plc | Robotic cells |
EP4323158A1 (en) * | 2021-04-14 | 2024-02-21 | BAE SYSTEMS plc | Robotic cells |
EP4074470A1 (en) * | 2021-04-14 | 2022-10-19 | BAE SYSTEMS plc | Robotic cells |
EP4323159A1 (en) * | 2021-04-14 | 2024-02-21 | BAE SYSTEMS plc | Robotic cells |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10105844B2 (en) * | 2016-06-16 | 2018-10-23 | General Electric Company | System and method for controlling robotic machine assemblies to perform tasks on vehicles |
KR20110095700A (en) * | 2010-02-19 | 2011-08-25 | 현대중공업 주식회사 | Industrial robot control method for workpiece object pickup |
KR101919463B1 (en) * | 2016-11-24 | 2019-02-08 | 한국폴리텍7대학 산학협력단 | Gripper robot control system for picking of atypical form package |
CN108029340A (en) * | 2017-12-05 | 2018-05-15 | 江苏科技大学 | A kind of picking robot arm and its control method based on adaptive neural network |
US11292133B2 (en) * | 2018-09-28 | 2022-04-05 | Intel Corporation | Methods and apparatus to train interdependent autonomous machines |
CN109048926A (en) * | 2018-10-24 | 2018-12-21 | 河北工业大学 | A kind of intelligent robot obstacle avoidance system and method based on stereoscopic vision |
- 2019-05-15: SE SE1930157A patent/SE1930157A1/en not_active Application Discontinuation
- 2020-05-14: WO PCT/SE2020/050493 patent/WO2020231319A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2020231319A1 (en) | 2020-11-19 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NAV | Patent application has lapsed |