
CN116330279A - Welding robot parameterized programming method and system based on machine vision and neural network - Google Patents

Welding robot parameterized programming method and system based on machine vision and neural network

Info

Publication number
CN116330279A
Authority
CN
China
Prior art keywords
welding
robot
weld
neural network
seam
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310228303.8A
Other languages
Chinese (zh)
Inventor
刘宝
刘天宝
叶飞
梁福学
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Longlifu Machinery Technology Co ltd
China University of Petroleum East China
Original Assignee
Qingdao Longlifu Machinery Technology Co ltd
China University of Petroleum East China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Longlifu Machinery Technology Co ltd, China University of Petroleum East China filed Critical Qingdao Longlifu Machinery Technology Co ltd
Priority to CN202310228303.8A
Publication of CN116330279A


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K 37/02 Carriages for supporting the welding or cutting element
    • B23K 37/0258 Electric supply or control circuits therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a welding robot parameterized programming method and system based on machine vision and a neural network. The weld seam type is identified by a deep-learning object detection algorithm, the target weld seam region is determined, and the corresponding welding gun posture is selected according to the seam type; the target weld region is taken as an ROI and given primary processing, and Cartesian-space welding path feature points are determined from the processing result. Trajectory planning in the robot's joint space is then performed from these feature points, the weld seam width and pre-weld seam depth are calculated, the welding gun's working parameters are computed by combining a prediction neural network model with preset welding parameters, and a command to execute the welding program is sent to the robot actuator to complete the weld. The invention achieves teaching-free rapid parameterized programming, solving the cumbersome and inefficient online programming of conventional manually taught welding robots, the non-ideal gun working parameters for differing seam types and plates, and the need to relocate the seam before every weld, thereby raising the welding robot's degree of intelligence.

Description

Welding robot parameterized programming method and system based on machine vision and neural network
Technical Field
The invention relates to the technical field of industrial intelligent manufacturing, in particular to a welding robot parameterized programming method and system based on machine vision and a neural network.
Background
Industrial welding robots are indispensable automation tools for large-scale, complex industrial production and play an important role in the automotive, shipbuilding, aerospace, and other manufacturing industries. Through sustained key-technology research programs, China's industrial robot technology has made great progress and its market is gradually maturing. As the demands on the refinement and automation of industrial production keep rising, China is also pursuing development and innovation in the field of intelligent robot control.
Welding efficiency and welding quality are known to strongly affect industrial processes and product quality. At present, most industrial welding robots are programmed by manual teaching: for each combination of plate material, seam width and depth, desired weld reinforcement (excess height), penetration, welding speed, and so on, an operating technician sets the gun voltage, current, wire-feed speed, shielding-gas flow, and gun posture by experience, and guides the robot's gun to the specified welding position using operations such as high-voltage locating or laser locating. Whenever the weld type changes (butt weld, inside corner weld, outside corner weld, etc.), re-teaching is required, and when the teaching personnel's experience is insufficient the gun parameters may be set poorly. This traditional manual-teaching online programming process is cumbersome and inefficient, leaves the welding robot with a low degree of unattended operation, and poorly serves China's intelligent manufacturing strategy and modernization. Meanwhile, structured light vision alone struggles to locate and measure seams against complex backgrounds, while a multi-camera vision system alone suffers from calibration errors of the same feature point across multiple images, difficulty measuring narrow seams, and similar drawbacks.
Disclosure of Invention
To address these technical problems, the invention provides a parameterized programming method and system for a welding robot based on machine vision and a neural network that performs rapid parameterized programming without teaching. An operator only needs to fix the plate to be welded on the welding table and enter, on the welding system's human-machine interface, the plate material, the desired post-weld reinforcement (excess height), the penetration, and the travel speed of the manipulator; the robot then finds the welding position and starts welding autonomously. This solves the cumbersome, inefficient online programming of conventional welding robots, the non-ideal welding machine parameter settings for differing seam types and plates, and the need to relocate the seam before every weld, and raises the robot's degree of intelligence.
Specifically, the invention provides a welding robot parameterized programming method and system based on machine vision and a neural network, wherein the method comprises the following steps:
s1: initializing, and collecting a weld joint area image;
s2: judging the type of a welding seam through a deep learning target detection algorithm, determining a target welding seam area, and determining a corresponding welding gun posture according to the type of the welding seam;
s3: taking the target weld seam region as the ROI region, performing primary processing, and determining Cartesian-space welding path feature points according to the primary processing result; performing trajectory planning of the robot joint space according to the Cartesian-space welding path feature points, and calculating the weld seam width and the pre-weld seam depth;
s4: and executing a welding procedure according to the weld width, the weld depth before welding and preset welding parameters.
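The seam-type-to-gun-posture step (S2) amounts to a lookup from the detected class to a preset posture. The sketch below illustrates that idea; the angle values and all names are illustrative assumptions, since the patent does not publish its posture presets.

```python
# Hedged sketch: preset torch postures per weld seam type (step S2).
# The angle values are placeholders, not the patent's actual presets.
TORCH_POSTURE = {
    "butt_weld":           {"work_angle_deg": 90.0, "travel_angle_deg": 75.0},
    "inside_corner_weld":  {"work_angle_deg": 45.0, "travel_angle_deg": 75.0},
    "outside_corner_weld": {"work_angle_deg": 45.0, "travel_angle_deg": 80.0},
}

def torch_posture(weld_type: str) -> dict:
    """Return the preset gun posture for a detected seam type."""
    try:
        return TORCH_POSTURE[weld_type]
    except KeyError:
        raise ValueError(f"unknown weld type: {weld_type!r}") from None
```

A real system would extend the table per plate thickness and material; the dispatch structure stays the same.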
The initialization in the invention specifically comprises: establishing the world coordinate system, each axis coordinate system of the robot, and the tool coordinate system; performing forward and inverse kinematic modeling of the robot; calibrating the multi-vision cameras and the structured light module; and performing hand-eye calibration between the multi-vision cameras, the structured light module, and the robot.
Further, the initialization in the invention also includes: setting preset welding parameters according to the plate material, the preset welding parameters including at least, but not limited to, the desired weld reinforcement (excess height), penetration, welding speed, and plate material.
In the present invention, the primary processing includes at least, but is not limited to, filtering, segmentation, morphological processing, and weld centerline extraction on the ROI image.
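The filtering → segmentation → morphology → centerline chain can be sketched in a few lines of NumPy. This is a minimal stand-in, not the patent's implementation: a production system would use OpenCV (Gaussian/median filtering, Otsu thresholding, proper structuring elements), and the threshold and kernel choices here are assumptions.

```python
import numpy as np

def primary_processing(roi: np.ndarray, thresh: float = 0.5) -> np.ndarray:
    """Hedged sketch of the primary-processing chain on an ROI image:
    filtering -> segmentation -> morphology -> centerline extraction.
    Returns (row, col) centerline points."""
    # 1) filtering: 3x3 mean blur via padded neighborhood average
    padded = np.pad(roi.astype(float), 1, mode="edge")
    blurred = sum(padded[r:r + roi.shape[0], c:c + roi.shape[1]]
                  for r in range(3) for c in range(3)) / 9.0
    # 2) segmentation: fixed fractional threshold (Otsu etc. in practice)
    mask = blurred > thresh * blurred.max()
    # 3) morphology: one erosion step (a pixel survives only if its
    #    4-neighbourhood is fully inside the mask)
    m = np.pad(mask, 1)
    eroded = (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
              & m[1:-1, :-2] & m[1:-1, 2:])
    # 4) centerline: column centroid of the surviving mask in each row
    points = [(r, cols.mean()) for r, cols in
              ((r, np.flatnonzero(eroded[r])) for r in range(eroded.shape[0]))
              if cols.size]
    return np.array(points)
```

On a synthetic bright vertical stripe, the returned centerline sits on the stripe's central column, which is exactly the seam-line input the path-planning step needs.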
In the present invention, the weld types include, but are not limited to, inside corner welds, outside corner welds, and butt welds.
Before performing trajectory planning in the robot joint space, the method further comprises: converting the feature point positions from the pixel coordinate system to the world and joint coordinate systems, and recording the feature point coordinate information.
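The pixel-to-world conversion mentioned above can be sketched as a standard back-projection through the camera intrinsics followed by the hand-eye extrinsic transform. The matrices and the assumption of a depth value from structured light are illustrative; the patent does not give its calibration equations.

```python
import numpy as np

def pixel_to_world(u, v, K, T_cam_to_world, z_cam):
    """Hedged sketch: back-project pixel (u, v), at a known camera-frame
    depth z_cam (e.g. from the structured light module), into world
    coordinates. K is the 3x3 intrinsic matrix; T_cam_to_world is the
    4x4 homogeneous transform obtained from hand-eye calibration."""
    # pixel -> normalized camera ray, then scale the ray to the depth
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    p_cam = ray * (z_cam / ray[2])
    # camera frame -> world frame via the homogeneous transform
    return (T_cam_to_world @ np.append(p_cam, 1.0))[:3]
```

With the principal point at (320, 240), that pixel back-projects straight down the optical axis; off-center pixels shift laterally in proportion to depth over focal length.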
In the invention, calculating the weld seam width and pre-weld seam depth specifically comprises: receiving the light reflected from the structured light source's illumination, finding the corresponding positions on the structured light image from the recorded feature point coordinates, calculating the seam width, and taking the calculated distance between the weld seam and the plate surface as the pre-weld seam depth.
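A one-profile version of that width/depth measurement can be sketched as follows. The input is a height profile along one structured-light stripe (millimetres per pixel column); the gap criterion (a fixed drop below the median plate surface) is an illustrative assumption, not the patent's stated rule.

```python
import numpy as np

def seam_width_and_depth(profile: np.ndarray, px_to_mm: float,
                         drop: float = 0.5):
    """Hedged sketch: estimate seam width and pre-weld depth from one
    structured-light stripe height profile. 'drop' (mm below the plate
    surface) marks columns lying inside the seam gap."""
    surface = np.median(profile)           # nominal plate surface height
    gap = profile < surface - drop         # columns inside the seam
    if not gap.any():
        return 0.0, 0.0
    cols = np.flatnonzero(gap)
    width = (cols[-1] - cols[0] + 1) * px_to_mm
    depth = surface - profile[gap].min()   # deepest point below surface
    return width, depth
```

On a flat 10 mm profile with a 10-column, 3 mm-deep notch at 0.2 mm per pixel, this yields a 2 mm width and 3 mm depth.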
Step S4 specifically comprises: inputting the preset welding parameters and the calculated seam width and pre-weld seam depth into the prediction neural network model to compute the predicted gun voltage and current values, wire-feed speed, and shielding-gas flow; and controlling the robot actuator according to these voltage and current values, wire-feed speed, shielding-gas flow, gun posture, joint-space trajectory plan, and preset welding parameters.
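The shape of that prediction network can be sketched as a small feed-forward model: measured seam geometry plus preset parameters in, gun working parameters out. The layer sizes, input ordering, and random weights below are placeholders; a real system trains the weights by backpropagation on welding-trial data, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

class GunParamPredictor:
    """Hedged sketch of the BP-style prediction network. Assumed inputs:
    [seam width, seam depth, reinforcement target, penetration target,
    welding speed, material code]; assumed outputs: [voltage, current,
    wire-feed speed, shielding-gas flow]. Weights are random stand-ins."""

    def __init__(self, n_in=6, n_hidden=12, n_out=4):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict(self, x: np.ndarray) -> np.ndarray:
        h = np.tanh(x @ self.W1 + self.b1)   # nonlinear hidden layer
        return h @ self.W2 + self.b2         # linear output layer
```

The forward pass returns one value per gun parameter; training (not shown) would minimize the error against parameters recorded from successful welds.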
As another preferred aspect, the present invention also provides a welding robot parameterized programming system based on machine vision and neural network, the system comprising at least:
the multi-vision camera set, used for capturing multi-dimensional weld seam images and completing image acquisition of the weld region, in communication connection with the upper computer;
a structured light module, comprising a structured light source and structured light image acquisition equipment, both in communication connection with the upper computer;
the upper computer, used for presetting welding parameters, determining the target weld region, performing image processing and robot trajectory planning, calculating the seam width and depth, predicting the welding machine's working parameters, and controlling the robot actuator;
a PLC controller, which communicates with the upper computer over the field bus, receiving information sent by the upper computer and returning a value informing the upper computer whether execution succeeded, while also communicating interactively with the robot actuator over the field bus;
a robotic actuator comprising: a mechanical arm body structure and a welding module; the welding module comprises a welding machine power supply and a welding gun, and the welding gun is installed at the tail end of the mechanical arm through flange connection.
The upper computer identifies the weld seam type through a deep-learning object detection algorithm, determines the target weld seam region, and selects the corresponding gun posture according to the seam type; the detection algorithm adopts at least one of a YOLO object detection model or another convolutional-neural-network-based prediction model.
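Whatever detector is used, the upper computer must reduce its output to one target seam. The sketch below assumes the generic shape YOLO-style models emit, a list of (class, confidence, box) tuples, and keeps the highest-confidence detection; the tuple layout is an assumption, not the patent's interface.

```python
def select_target_seam(detections):
    """Hedged sketch: given detector outputs as (class_name, confidence,
    (x, y, w, h)) tuples, keep the highest-confidence weld and return
    its type and ROI box; return None when nothing was detected."""
    if not detections:
        return None
    cls, conf, box = max(detections, key=lambda d: d[1])
    return {"weld_type": cls, "confidence": conf, "roi": box}
```

The returned "roi" box is what the primary-processing stage then crops and filters.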
The upper computer calculates the predicted gun voltage and current values, wire-feed speed, and shielding-gas flow through the prediction neural network model from the preset welding parameters and the calculated seam width and pre-weld seam depth; the prediction neural network model adopts at least one of a BP network model or another artificial-neural-network-based prediction model.
In summary, the invention provides a welding robot parameterized programming method and system based on machine vision and a neural network: the weld seam type is identified by a deep-learning object detection algorithm, the target weld region is determined, the corresponding gun posture is selected according to the seam type, the target weld region is taken as the ROI and given primary processing, and Cartesian-space welding path feature points are determined from the processing result; trajectory planning of the robot joint space is then performed from these feature points, the seam width and pre-weld seam depth are calculated, and the welding program is executed according to the seam width, pre-weld seam depth, and preset welding parameters.
Compared with the prior art, the invention achieves teaching-free rapid parameterized programming, overcomes the cumbersome and inefficient manual-teaching online programming, the non-ideal gun parameter settings, and the need to relocate the seam before every weld that burden conventional welding robots, and raises the welding robot's degree of intelligence.
Drawings
Fig. 1 is a schematic block diagram of an embodiment of a welding robot based on machine vision and neural network according to the present invention.
Fig. 2 is a block diagram of a welding robot parameterized programming system based on machine vision and neural network according to the present invention.
Fig. 3 is a flowchart of a welding robot parameterized programming system based on machine vision and neural network according to the present invention.
Fig. 4 is a diagram of a human-machine interface according to the present invention.
Reference numerals: upper computer 1, robot PLC controller 2, welding machine power supply 3, robot body 4, welding table 5, structured light module 6, multi-vision camera set 7, and welding gun 8.
Detailed Description
So that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings. Evidently, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from these embodiments without inventive effort fall within the scope of the invention.
As shown in fig. 1-2, the present invention provides a welding robot parameterized programming system based on machine vision and neural network, the system comprising at least:
the multi-vision camera set is used for shooting multi-dimensional weld joint images, completing image acquisition of weld joint areas and being in communication connection with the upper computer;
a structured light module, comprising a structured light source and structured light image acquisition equipment, both in communication connection with the upper computer;
the upper computer is used for presetting welding parameters, determining a target welding line area, performing image processing and robot track planning, calculating the width and depth of the welding line, predicting the working parameters of the welding machine and controlling the working of the robot executing mechanism.
The preset welding parameters include at least, but are not limited to, the desired weld reinforcement (excess height), penetration, welding speed, and plate material.
a PLC controller, which communicates with the upper computer over the field bus, receiving information sent by the upper computer and returning a value informing the upper computer whether execution succeeded, while also communicating interactively with the robot actuator over the field bus;
a robotic actuator comprising: a mechanical arm body structure and a welding module; the welding module comprises a welding machine power supply and a welding gun, and the welding gun is installed at the tail end of the mechanical arm through flange connection.
Referring to fig. 1, the multi-vision camera set is distributed at several positions near the welding table (not limited to the number and placement shown in the figure); each camera carries an image acquisition card, is in communication connection with the upper computer, and is fixed by a camera bracket, and the set is used to acquire and transmit weld seam images.
The structured light module comprises a structured light source and structured light image acquisition equipment fixed on a bracket at the robot's end joint; each carries a built-in image acquisition card and is in communication connection with the upper computer, and the module is used to acquire and transmit weld images carrying pre-encoded structured light stripes.
Wherein the structured light is emitted by an infrared emitter, but is not limited thereto.
The structured light image acquisition device may be an infrared camera, but is not limited thereto.
The welding gun is installed at the tail end of the robot through flange connection.
The robot PLC controller 2 is respectively connected with an upper computer, a robot and a welding machine power supply through a field bus.
And the welding machine power supply is connected with the welding gun.
The upper computer is used for manual entry of the characteristic parameters, processing the image information acquired by the multi-vision camera set and the structured light module, identifying the seam type and position, determining the gun posture, planning the robot trajectory, and predicting the gun parameters. Optionally, the upper computer identifies the seam type with a deep-learning object detection algorithm, determines the target weld region, and selects the corresponding gun posture according to the seam type; the detection algorithm adopts at least one of a YOLO object detection model or another convolutional-neural-network-based prediction model.
The upper computer calculates the predicted gun voltage and current values, wire-feed speed, and shielding-gas flow through the prediction neural network model from the preset welding parameters and the calculated seam width and pre-weld seam depth; the prediction neural network model adopts at least one of a BP network model or another artificial-neural-network-based prediction model.
Further, the upper computer sends welding speed, welding gun posture, track planning information and welding gun parameters of the welding robot to the robot PLC.
Further, the robot PLC controller generates a motion instruction to drive the mechanical arm according to the robot track planning information, and sends welding gun parameters to a welding machine power supply to drive a welding gun.
Preferably, the human-machine interface inputs for manual parameter entry include: the desired post-weld reinforcement (excess height), the expected penetration, the plate material, and the welding speed.
Preferably, the field bus uses the Modbus communication protocol, but is not limited thereto.
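At the Modbus level, every RTU frame carries a CRC-16 checksum; the sketch below shows that standard computation (polynomial 0xA001 reflected, initial value 0xFFFF, appended little-endian). This illustrates the fieldbus framing only; the patent does not specify RTU versus TCP, and the PLC register map is not disclosed.

```python
def modbus_crc16(frame: bytes) -> bytes:
    """CRC-16/MODBUS over an RTU frame body, returned as the two bytes
    (low byte first) that are appended to the frame on the wire."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            # shift right; on carry-out, fold in the reflected polynomial
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc.to_bytes(2, "little")
```

The standard check input "123456789" yields CRC 0x4B37, i.e. bytes 0x37 0x4B on the wire.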
As another preferred embodiment, as shown in fig. 3, the welding robot parameterized programming method based on machine vision and neural network provided by the present invention includes the following steps:
step 1) establishing a world coordinate system, a robot (each axis) coordinate system and a tool coordinate system, performing forward and backward kinematic modeling on the robot, calibrating a multi-vision camera and a structured light module, and performing hand-eye calibration on the camera and the structured light module and the robot respectively.
Step 2): an operator enters the characteristic parameters (fig. 4) through the upper computer's human-machine interface, fixes the plate to be welded on the welding table 5, and starts the welding program.
Step 3): the multi-vision camera set transmits the acquired images to the upper computer for processing; a pre-trained weld seam detection neural network judges the seam type, the gun posture corresponding to that seam type is determined, and the network's prediction box is taken as the weld region image.
Step 4): the upper computer performs image processing on the weld region image, determines the welding path feature points, converts the feature point positions from the pixel coordinate system to the world and joint coordinate systems, records the feature point coordinate information, and performs trajectory planning in joint space.
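The joint-space planning between consecutive feature points can be sketched with a cubic blend, the common textbook choice with zero boundary velocities; the patent does not name its interpolation scheme, so this is an illustrative assumption.

```python
import numpy as np

def cubic_segment(q0: np.ndarray, q1: np.ndarray, T: float, n: int = 50):
    """Hedged sketch of one joint-space segment: interpolate joint
    vectors q0 -> q1 over duration T with a cubic polynomial whose
    velocity is zero at both ends. Returns (n, n_joints) samples."""
    t = np.linspace(0.0, T, n)
    s = 3 * (t / T) ** 2 - 2 * (t / T) ** 3   # smooth 0 -> 1 blend
    return q0 + (q1 - q0) * s[:, None]
```

A full path chains one such segment per pair of adjacent Cartesian feature points after inverse kinematics has mapped them to joint vectors.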
Step 5): the structured light module's receiver receives the light reflected from the structured light source's illumination, finds the corresponding position on the structured light image using the welding path feature point coordinates recorded in step 4), calculates the seam width, and computes the distance between the weld seam and the plate surface as the pre-weld seam depth.
Step 6): the upper computer feeds the characteristic parameters entered in step 2) and the seam width and depth from step 5) into the prediction neural network to obtain the gun voltage and current values, wire-feed speed, and shielding-gas flow.
And 7) the upper computer sends the corresponding parameter information about the welding gun and the robot motion planning obtained in the steps 1) to 6) to the robot PLC.
And 8) generating a motion instruction by the robot PLC according to the robot track planning information to drive the mechanical arm, and sending welding gun parameters to a welding machine power supply to drive the welding gun.
The method further comprises step 9): when the robot actuator finishes the motion instruction or encounters an error, it sends the corresponding information to the robot PLC controller, which sends the corresponding communication information to notify the upper computer; if an error occurred, the program is interrupted immediately and waits for operator intervention.
In the description of the present invention, "multi" in "multi-vision camera set" means two or more cameras, unless specifically defined otherwise.
In the present invention, the processing of the ROI image also includes, but is not limited to, filtering, segmentation, morphological processing, and weld centerline extraction.
In the invention, calculating the weld seam width and pre-weld seam depth specifically comprises: receiving the light reflected from the structured light source's illumination, finding the corresponding positions on the structured light image from the recorded feature point coordinates, calculating the seam width, and taking the calculated distance between the weld seam and the plate surface as the pre-weld seam depth.
In the invention, the preset welding parameters and the calculated seam width and pre-weld seam depth are input into the prediction neural network model to compute the predicted gun voltage and current values, wire-feed speed, and shielding-gas flow; the robot actuator is then controlled according to these voltage and current values, wire-feed speed, shielding-gas flow, gun posture, joint-space trajectory plan, and preset welding parameters.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present invention thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules according to embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between them. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises it. While the invention has been described in conjunction with the specific embodiments above, many alternatives, modifications, and variations will evidently be apparent to those skilled in the art in light of the foregoing description. Accordingly, all such alternatives, modifications, and variations are included within the spirit and scope of the following claims.

Claims (10)

1. A welding robot parameterized programming method based on machine vision and neural network, the method comprising the steps of:
s1: initializing, and collecting a weld joint area image;
s2: judging the type of a welding seam through a deep learning target detection algorithm, determining a target welding seam area, and determining a corresponding welding gun posture according to the type of the welding seam;
s3: taking the target weld seam region as the ROI region, performing primary processing, and determining Cartesian-space welding path feature points according to the primary processing result; performing trajectory planning of the robot joint space according to the Cartesian-space welding path feature points, and calculating the weld seam width and the pre-weld seam depth;
s4: and executing a welding procedure according to the weld width, the weld depth before welding and preset welding parameters.
2. The welding robot parameterized programming method based on machine vision and a neural network of claim 1, wherein the initializing specifically comprises: establishing the world coordinate system, the coordinate system of each robot axis, and the tool coordinate system; performing forward and inverse kinematics modeling of the robot; calibrating the multi-vision camera set and the structured light module; and performing hand-eye calibration between the multi-vision camera set, the structured light module, and the robot.
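Claim 2 names forward and inverse kinematics modeling of the robot without disclosing a formulation. A minimal illustrative sketch (not the patent's implementation) of forward kinematics using standard Denavit-Hartenberg parameters, applied here to a hypothetical two-link planar arm:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one Denavit-Hartenberg link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Chain the per-link transforms to obtain the tool pose in the base frame."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Two revolute links of unit length; joints at +90 and -90 degrees.
dh = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
T = forward_kinematics([np.pi / 2, -np.pi / 2], dh)
print(np.round(T[:3, 3], 6))  # tool position in the base frame
```

A real six-axis welding robot would use six links with the manufacturer's DH table, and inverse kinematics would solve the reverse mapping, but the chaining of per-link homogeneous transforms is the same.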
3. The welding robot parameterized programming method based on machine vision and a neural network of claim 2, wherein the initializing further comprises: setting the preset welding parameters according to the material of the welding plate, the preset welding parameters comprising at least an expected weld reinforcement height, weld penetration, welding speed, and the welding plate material.
4. The welding robot parameterized programming method based on machine vision and a neural network of claim 3, wherein the preliminary processing comprises at least filtering, segmentation, morphological processing, and weld seam centerline extraction on the ROI image.
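Claim 4 lists the preliminary processing steps but not how the centerline is extracted. One common technique — used here purely as an illustration, since the patent does not specify one — is to segment the bright weld region and take the intensity-weighted centroid of each image row as a sub-pixel centerline point:

```python
import numpy as np

def extract_centerline(gray, thresh=128):
    """For each image row, take the intensity-weighted centroid of the
    above-threshold pixels as the weld centerline column (sub-pixel)."""
    mask = gray > thresh                      # crude segmentation step
    cols = np.arange(gray.shape[1])
    centerline = []
    for row_mask, row in zip(mask, gray):
        if not row_mask.any():
            centerline.append(np.nan)         # no weld pixels in this row
            continue
        w = row * row_mask                    # weight columns by intensity
        centerline.append((w * cols).sum() / w.sum())
    return np.array(centerline)

# Synthetic ROI: a bright vertical stripe centred on column 10.
img = np.zeros((5, 21), dtype=float)
img[:, 9:12] = 255.0
cl = extract_centerline(img)
print(cl)
```

In practice the filtering and morphological steps of claim 4 (e.g. median filtering, opening/closing) would run before this, so that `mask` contains a single clean connected region per row.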
5. The welding robot parameterized programming method based on machine vision and a neural network of claim 4, further comprising, prior to performing the trajectory planning in the robot joint space: converting the positions of the feature points from the pixel coordinate system to the world coordinate system and the joint coordinate system, and recording the coordinate information of the feature points.
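The pixel-to-world conversion in claim 5 is not spelled out in the claim. The standard route — back-project the pixel through the camera intrinsic matrix at a known depth (here supplied by the structured light), then apply the calibrated camera-to-world homogeneous transform — can be sketched as follows; the intrinsics `K` and transform `T_world_cam` are illustrative values, not calibration results from the patent:

```python
import numpy as np

def pixel_to_world(u, v, depth, K, T_world_cam):
    """Back-project pixel (u, v) at a known depth into camera coordinates,
    then map the point into the world frame with the extrinsic transform."""
    p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    p_world = T_world_cam @ np.append(p_cam, 1.0)   # homogeneous transform
    return p_world[:3]

# Illustrative intrinsics; camera placed 0.5 m above the world origin.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
T = np.eye(4)
T[2, 3] = 0.5
p = pixel_to_world(320.0, 240.0, 0.4, K, T)
print(p)
```

The further conversion to the joint coordinate system mentioned in the claim would then go through the robot's inverse kinematics.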
6. The welding robot parameterized programming method based on machine vision and a neural network of claim 5, wherein calculating the weld seam width and the pre-welding weld seam depth specifically comprises: capturing the light of the structured light source reflected from the workpiece, locating the corresponding positions on the structured light image according to the recorded coordinate information of the feature points, calculating the weld seam width, and taking the distance between the weld seam and the welding plate surface as the pre-welding weld seam depth.
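Claim 6 does not detail how width and depth are derived from the structured-light data. One plausible reading, sketched here under the assumption that the laser stripe yields a height profile z(x) across the seam: estimate the plate surface from the profile, take the width as the extent of the groove below the surface, and the depth as the surface-to-bottom distance:

```python
import numpy as np

def weld_width_and_depth(profile_x, profile_z):
    """Given a laser-stripe height profile z(x) across the seam, estimate
    the plate surface as the median height, then measure the groove:
    width = extent of points below the surface, depth = surface - minimum."""
    surface = np.median(profile_z)
    groove = profile_z < surface - 1e-6
    x_in = profile_x[groove]
    width = x_in.max() - x_in.min() if x_in.size else 0.0
    depth = surface - profile_z.min()
    return width, depth

# Flat plate at z = 10 mm with a 4 mm wide, 3 mm deep groove.
x = np.arange(0.0, 20.0, 1.0)
z = np.full_like(x, 10.0)
z[(x >= 8) & (x <= 12)] = 7.0
width, depth = weld_width_and_depth(x, z)
print(width, depth)
```

A production system would fit the plate surface more robustly (e.g. line fitting on the flanking stripe segments) rather than using a median, but the width/depth definitions match the claim.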
7. The welding robot parameterized programming method based on machine vision and a neural network of claim 6, wherein step S4 specifically comprises: inputting the preset welding parameters, the calculated weld seam width, and the calculated pre-welding weld seam depth into a prediction neural network model to obtain predicted welding gun voltage and current values, a predicted wire feed speed, and a predicted shielding gas flow rate; and controlling the robot actuator according to the voltage and current values, the wire feed speed, the shielding gas flow rate, the welding gun posture, the trajectory plan in the robot joint space, and the preset welding parameters.
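Claim 10 names a BP (back-propagation) network as one option for the prediction model of claim 7. The sketch below shows only the forward pass of such a feed-forward network, with random (untrained) weights and illustrative feature/output choices — a real system would train the weights on logged welding data, which the patent does not disclose:

```python
import numpy as np

rng = np.random.default_rng(0)

class BPNetwork:
    """Minimal feed-forward (BP-style) network: inputs are the measured weld
    seam width/depth plus preset parameters; outputs stand in for voltage,
    current, wire-feed speed, and shielding-gas flow rate."""
    def __init__(self, n_in=4, n_hidden=8, n_out=4):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict(self, x):
        h = np.tanh(x @ self.W1 + self.b1)   # hidden layer
        return h @ self.W2 + self.b2         # linear output layer

net = BPNetwork()
# Hypothetical features: width (mm), depth (mm), speed, reinforcement height.
features = np.array([4.0, 3.0, 1.5, 0.8])
out = net.predict(features)
print(out)  # 4 predicted welder parameters (untrained, illustrative only)
```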
8. A system implementing the welding robot parameterized programming method based on machine vision and a neural network according to any one of claims 1-7, characterized in that the system comprises at least:
the multi-vision camera set, used for capturing multi-dimensional weld seam images to complete image acquisition of the weld seam region, and communicatively connected to the upper computer;
the structured light module, comprising a structured light source and a structured light image acquisition device, both communicatively connected to the upper computer;
the upper computer, used for presetting welding parameters, determining the target weld seam region, performing image processing and robot trajectory planning, calculating the weld seam width and depth, predicting the welder operating parameters, and controlling the robot actuator;
the PLC is communicated with the upper computer through the field bus, and is used for receiving information sent by the upper computer and returning a value to inform the upper computer whether the upper computer is successfully executed or not; meanwhile, the robot actuator is in interactive communication with the field bus;
the robot actuator, comprising a mechanical arm body and a welding module; the welding module comprises a welder power supply and a welding gun, the welding gun being mounted on the end of the mechanical arm through a flange connection.
9. The system of claim 8, wherein the upper computer identifies the weld seam type with a deep-learning target detection algorithm, determines the target weld seam region, and selects the corresponding welding gun posture according to the weld seam type; the deep-learning target detection algorithm adopts at least one of a YOLO target detection model or another convolutional-neural-network-based prediction model.
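Claim 9 names YOLO-family detectors without detailing their inner workings. One component such detectors rely on when suppressing duplicate weld-region candidates is the intersection-over-union (IoU) overlap measure, sketched here as an illustration (not code from the patent):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2) --
    the overlap measure YOLO-style detectors use during non-maximum
    suppression to discard duplicate detections of the same weld region."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union > 0 else 0.0

overlap = iou((0, 0, 10, 10), (5, 0, 15, 10))  # half-overlapping boxes
print(overlap)
```

Two boxes sharing half their area have IoU 1/3 (intersection 50 over union 150), which is above the typical NMS suppression threshold, so the lower-scoring box would be discarded.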
10. The system of claim 9, wherein the upper computer calculates the predicted welding gun voltage and current values, wire feed speed, and shielding gas flow rate through the prediction neural network model according to the preset welding parameters and the calculated weld seam width and depth; the prediction neural network model adopts at least one of a BP network model or another artificial-neural-network-based prediction model.
CN202310228303.8A 2023-03-10 2023-03-10 Welding robot parameterized programming method and system based on machine vision and neural network Pending CN116330279A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310228303.8A CN116330279A (en) 2023-03-10 2023-03-10 Welding robot parameterized programming method and system based on machine vision and neural network


Publications (1)

Publication Number Publication Date
CN116330279A true CN116330279A (en) 2023-06-27

Family

ID=86892249

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310228303.8A Pending CN116330279A (en) 2023-03-10 2023-03-10 Welding robot parameterized programming method and system based on machine vision and neural network

Country Status (1)

Country Link
CN (1) CN116330279A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118237825A (en) * 2024-05-28 2024-06-25 凯沃智能装备(青岛)有限公司 Welding machine method and welding robot based on artificial intelligence


Similar Documents

Publication Publication Date Title
US8706300B2 (en) Method of controlling a robotic tool
CN104841593B (en) Control method of robot automatic spraying system
CN102424971B (en) Rapid laser repair method and device for defect of aluminum alloy guide blade
CN108453439A (en) The robot welding track self-programming system and method for view-based access control model sensing
CN114515924B (en) Automatic welding system and method for tower foot workpiece based on weld joint identification
CN113119122B (en) Hybrid off-line programming method of robot welding system
CN114161048B (en) 3D vision-based parameterized welding method and device for tower legs of iron tower
CN112958959A (en) Automatic welding and detection method based on three-dimensional vision
CN113333998A (en) Automatic welding system and method based on cooperative robot
CN114474041A (en) Welding automation intelligent guiding method and system based on cooperative robot
CN202607049U (en) Wheeled autonomously-moving welding robot control system with function of image monitoring
CN113634958A (en) Three-dimensional vision-based automatic welding system and method for large structural part
CN112453648A (en) Off-line programming laser welding seam tracking system based on 3D vision
CN113787245A (en) Robot intelligent welding program generation method and system
CN106583974A (en) Laser quick locating welding system and laser quick locating welding method without programming structural part
CN114769988B (en) Welding control method, system, welding equipment and storage medium
CN111992895A (en) Intelligent marking system and method
CN116117373A (en) Intelligent welding method and system for small assembly components in ship
CN116330279A (en) Welding robot parameterized programming method and system based on machine vision and neural network
CN115723133B (en) Automatic positioning and correcting system for space welding seam of robot based on virtual-real combination
CN117047237A (en) Intelligent flexible welding system and method for special-shaped parts
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN117381788A (en) High-precision positioning and intelligent operation guiding method for composite robot
CN114851209B (en) Industrial robot working path planning optimization method and system based on vision
CN110076767A (en) A kind of Intelligent welding control system and method based on image recognition technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination