
CN112958960B - Robot hand-eye calibration device based on optical target - Google Patents

Robot hand-eye calibration device based on optical target

Info

Publication number
CN112958960B
CN112958960B (application CN202110171661.0A; published as CN112958960A)
Authority
CN
China
Prior art keywords
robot
coordinate system
optical target
point
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110171661.0A
Other languages
Chinese (zh)
Other versions
CN112958960A (en)
Inventor
李欢欢
杨涛
李晓晓
王丛华
王芳
奈斌
周翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gedian Technology Shenzhen Co ltd
Original Assignee
Gedian Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gedian Technology Shenzhen Co Ltd
Priority to CN202110171661.0A
Publication of CN112958960A
Application granted
Publication of CN112958960B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/0095 Means or methods for testing manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1653 Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The invention provides a robot hand-eye calibration method based on an optical target, which addresses the cumbersome procedure and large error of robot hand-eye calibration. An optical target whose tip and feature points have a precisely known relative position is used as the calibration tool: the tip of the optical target is used to calibrate the tool coordinate system against the robot, the feature points on the target are used to calibrate the area-array three-dimensional vision sensor against the tool coordinate system, and an optimized computation achieves fast, high-precision robot hand-eye calibration. The method avoids the error accumulation that arises when the tool coordinate system and the vision coordinate system are calibrated separately, and eliminates the complex operation and difficult teaching of the calibration process.

Description

Robot hand-eye calibration device based on optical target
Technical Field
The invention belongs to the fields of three-dimensional vision and robot applications, and particularly relates to a robot hand-eye calibration device based on an optical target.
Background
Robots are increasingly used in many industries. Welding is heavily polluting and harmful to human health, so such industries urgently need robots to replace manual work and realize automatic operation. However, robot deployment suffers from complex calibration, difficult path planning and high technical requirements on operators, which slows its industrial adoption. A three-dimensional vision sensor can automatically identify the welding path and guide the robot to operate automatically, but the vision sensor and the welding gun are calibrated separately, so even when the welding-gun calibration and the vision calibration each meet their accuracy requirements, the overall system error remains large. Existing schemes also require a standard artifact to calibrate the welding gun and the vision sensor separately, and the calibration process is cumbersome.
Disclosure of Invention
The invention aims to provide a robot hand-eye calibration device based on an optical target, which addresses the cumbersome procedure and large error of robot hand-eye calibration. An optical target whose tip and feature points have a precisely known relative position is used as the calibration tool: the tip of the optical target is used to calibrate the tool coordinate system against the robot, the feature points on the target are used to calibrate the area-array three-dimensional vision sensor against the tool coordinate system, and an optimized computation achieves fast, high-precision robot hand-eye calibration. The method avoids the error accumulation that arises when the tool coordinate system and the vision coordinate system are calibrated separately, and eliminates the complex operation and difficult teaching of the calibration process.
The invention is realized as follows:
The robot hand-eye calibration device based on an optical target comprises a robot 1, an area-array three-dimensional vision sensor (hereinafter, vision sensor) 2, a welding gun 3, an optical target 4 and a computer 5, as shown in Fig. 1. The robot 1 consists of a robot body 11, a robot controller 12 and a teach pendant 13. The welding gun 3 and the vision sensor 2 are both fixed to the end of the robot; the welding gun 3 is the actuator of the system and is also called the tool (hereinafter "tool" refers to the welding gun 3); the vision sensor 2 is the "eye" of the system; the optical target 4 is the workpiece in this system. If the relative positions of the welding gun 3, the vision sensor 2 and the end joint of the robot body 11 change, the hand-eye calibration of the system must be repeated. The optical target 4 has a tip 41 and a number of recognizable feature points 42, and the positional relationship between the feature points and the target tip is precisely known.
The robot hand-eye calibration based on the optical target according to the invention proceeds as follows (see Fig. 2):
S1: fixing the optical target within the working range of the robot;
S2: by manual teaching, aligning the tool center point (TCP) with the tip of the optical target (Target Top Point, TTP);
S3: keeping the TCP of the robot stationary, changing the robot posture, and recording the robot pose coordinates;
S4: computing the transformation between the robot base coordinate system and the tool coordinate system;
S5: by manual teaching, letting the vision sensor photograph the feature points TMP (Target Mark Point) of the optical target;
S6: computing the shooting postures for the optical target, photographing it from multiple angles, and recording the robot pose coordinates at each angle;
S7: computing the coordinates of the optical target feature points TMP in the vision coordinate system;
S8: computing the transformation between the vision coordinate system and the robot end coordinate system;
S9: optimizing the transformation between the robot end coordinate system and the tool coordinate system according to the relationship between the TTP and the TMP of the optical target, to obtain a more accurate transformation matrix;
and S10: computing the hand-eye transformation matrix of the robot by multiplying the inverse of the optimized tool-to-end transformation obtained in S9 by the transformation between the vision coordinate system and the robot end coordinate system obtained in S8 (a minimal sketch of this composition follows below).
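As an illustration of how the transformations from the steps above fit together, the following minimal Python/NumPy sketch composes the hand-eye matrix of S10 from the results of S8 and S9. The function and variable names are illustrative assumptions, not part of the patent; all transforms are assumed to be 4x4 homogeneous matrices.

import numpy as np

def compose_hand_eye(H_e_t_opt, H_e_c):
    """S10 (sketch): hand-eye matrix H^t_c (camera frame -> tool frame)
    from the optimized tool->end transform of S9 and the camera->end
    transform of S8. Both inputs are 4x4 homogeneous matrices."""
    return np.linalg.inv(H_e_t_opt) @ H_e_c

# Illustrative data flow (placeholder names, not from the patent):
#   H_e_t0 : initial tool->end transform from the 4-point method (S4)
#   H_e_c  : camera->end transform from the multi-view optimization (S8)
#   H_e_t  : tool->end transform refined with the TTP/TMP constraint (S9)
#   H_t_c  = compose_hand_eye(H_e_t, H_e_c)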
Advantageous effects
The calibration method of the robot hand-eye calibration device based on the optical target completes robot hand-eye calibration with only simple teaching; the operation is simple, which lowers the requirements on operators and helps promote the wide application of industrial robots; errors of manual operation are reduced and system accuracy is ensured.
The invention uses the optical target to calibrate the end tool and the vision sensor at the same time, achieving fast and automatic calibration of the tool coordinate system and the vision coordinate system. Using an optical target whose tip point and feature points have precisely known relative positions as the calibration tool avoids the error accumulation of calibrating the tool coordinate system and the vision coordinate system separately, realizes closed-loop control of the hand-eye calibration data, and improves the hand-eye calibration accuracy of the system; a global least-squares optimization is adopted, which is simple to compute and efficient.
Drawings
FIG. 1 is a system composition diagram.
Fig. 2 illustrates a system coordinate system.
Detailed Description
Embodiments of the invention are described in detail below with reference to the accompanying drawings and examples. The following embodiments illustrate the invention but do not cover its full scope. Note that the letter symbols in the formulas below have the same meaning as the identical symbols in the corresponding formulas of the summary of the invention and are not explained again.
The robot hand-eye calibration device based on an optical target according to the invention comprises a robot 1, an area-array three-dimensional vision sensor (hereinafter, vision sensor) 2, a welding gun 3, an optical target 4 and a computer 5, as shown in Fig. 1. The robot 1 consists of a robot body 11, a robot controller 12 and a teach pendant 13.
The welding gun 3 and the vision sensor 2 are both fixed to the end of the robot; the welding gun 3 is the actuator of the system and is also called the tool (hereinafter "tool" refers to the welding gun 3); the vision sensor 2 is the "eye" of the system; the optical target 4 is the workpiece in this system. If the relative positions of the welding gun 3, the vision sensor 2 and the end joint of the robot body 11 change, the hand-eye calibration of the system must be repeated.
The optical target 4 has a tip 41 and a number of recognizable feature points 42, and the positional relationship between the feature points and the target tip is precisely known. A recognizable feature point may be a pattern or a light source such as an LED; to meet the accuracy requirement it must be photographable from multiple angles, and the spatial position of the feature must remain fixed. The precise relative positions of the tip 41 and the feature points 42 of the optical target are obtained beforehand through a careful calibration process; the mapping relation is f, i.e. P_0 = f(P_j), j = 1..k, where P_j and P_0 are expressed in the same coordinate system, P_j are the coordinates of the optical target feature points TMP and P_0 is the coordinate of the optical target tip TTP.
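Since the workpiece coordinate system of the target has its origin at the tip (see below), the mapping f can be realized, for example, as a rigid-body fit between the pre-calibrated feature coordinates and the measured ones. The following minimal Python/NumPy sketch illustrates this under that assumption; the function names are illustrative and not taken from the patent.

import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t (Kabsch/SVD)."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def tip_from_features(tmp_measured, tmp_in_wcs, ttp_in_wcs=np.zeros(3)):
    """Mapping f (sketch): recover the target tip TTP in the measurement
    frame from the measured feature points TMP, using the pre-calibrated
    feature/tip geometry of the optical target (WCS origin at the tip)."""
    R, t = rigid_fit(tmp_in_wcs, tmp_measured)
    return R @ ttp_in_wcs + t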
The coordinate systems used by the invention are shown in Fig. 2: the robot base coordinate system BCS (Base Coordinate System), in which a point is denoted P_b(X_b, Y_b, Z_b); the robot end coordinate system ECS (End Coordinate System), in which a point is denoted P_e(X_e, Y_e, Z_e); the vision coordinate system CCS (Camera Coordinate System), in which a point is denoted P_c(X_c, Y_c, Z_c); the tool coordinate system TCS (Tool Coordinate System), in which a point is denoted P_t(X_t, Y_t, Z_t); and the workpiece coordinate system WCS (Work Coordinate System), in which a point is denoted P_w(X_w, Y_w, Z_w).
The robot hand-eye calibration method based on the optical target according to the invention, shown in Fig. 2, proceeds as follows:
S1: Fix the optical target within the working range of the robot, generally near the center of the working range, with the tip TTP (Target Top Point) of the optical target pointing upward and the front face carrying the feature points TMP (Target Mark Point) facing the robot.
S2: Operate the teach pendant to bring the tool center point TCP (Tool Center Point) into contact alignment with the tip TTP of the optical target.
S3: Keep the TCP at the robot tool end stationary, change the robot posture, and record the robot pose coordinates H^b_{e,i}, i = 1..m; the larger the change in posture the better, and m ≥ 4.
S4: Compute the initial transformation H^e_t from the tool coordinate system to the robot end coordinate system. The usual method is the 4-point method, and the result H^e_t can usually be read directly from the robot controller.
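As a sketch of the 4-point idea under common assumptions (the robot flange poses from S3 are available as rotation matrices R_i and translations p_i of the end frame in the base frame, and only the TCP translation is estimated), the tool offset could be obtained by linear least squares as below. The function name is illustrative, not from the patent.

import numpy as np

def tcp_from_poses(R_list, p_list):
    """4-point (multi-pose) TCP calibration sketch.
    While the TCP touches one fixed point, R_i @ t_tcp + p_i is the same
    base-frame point for every pose i, so the stacked linear system
    [R_i | -I] @ [t_tcp; p_fix] = -p_i is solved in a least-squares sense."""
    A = np.vstack([np.hstack([np.asarray(R, float), -np.eye(3)]) for R in R_list])
    b = np.concatenate([-np.asarray(p, float) for p in p_list])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]   # TCP offset in the end frame, fixed point in the base frame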
S5: Operate the teach pendant so that the vision sensor photographs the feature points TMP of the optical target; all feature points must be imaged clearly, and the distance from the vision sensor to the optical target should be the sensor's optimal working distance.
S6: Photograph the feature points of the optical target from n angles; at each angle record the robot pose coordinates H^b_{e,j}, j = 1..n, n ≥ 6.
S7: Compute the coordinates of the photographed feature points TMP in the vision coordinate system CCS as P^c_{j,s}, j = 1..n, s = 1..k, where k is the number of TMPs on the optical target and n is the number of angles from which the target is photographed.
S8: Compute the transformation H^e_c between the vision coordinate system and the robot end coordinate system.
S9: Using the known positional relationship between the tip TTP and the feature points TMP of the optical target, optimize H^e_t to obtain an accurate new transformation Ĥ^e_t between the tool coordinate system TCS and the robot end coordinate system ECS.
S10: The hand-eye transformation matrix of the robot, i.e. the matrix transforming the vision sensor coordinate system CCS into the tool coordinate system TCS, is

H^t_c = (Ĥ^e_t)^{-1} · H^e_c

where H^t_c is the robot hand-eye transformation matrix.
The key steps are described in more detail below.
S7: Computing the coordinates of the photographed TMPs
The three-dimensional vision sensor is a binocular vision sensor. When the target feature points are photographed, the two cameras (left and right) are exposed simultaneously to obtain a left image and a right image. The two images are preprocessed and the feature points are extracted to obtain the center coordinates of the mark points in each image; feature matching and coordinate computation using the intrinsic parameters of the binocular system then give the coordinates of the TMPs in the vision coordinate system CCS, {P^c_{j,s} = (x_{j,s}, y_{j,s}, z_{j,s}) | j = 1..n, s = 1..k}. Because the feature points TMP are complete in every photographed image, the computed three-dimensional points can be arranged in a fixed order to form an ordered point set.
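A minimal sketch of the triangulation step under common assumptions: the stereo pair is calibrated (3x4 projection matrices P1, P2 known) and the extracted mark-point centers are already matched in the same order in the left and right images. The OpenCV routine used here is a standard one and is not necessarily what the patent implements.

import numpy as np
import cv2

def triangulate_tmp(P1, P2, pts_left, pts_right):
    """Triangulate ordered, matched feature-point centers (TMP) from a
    calibrated stereo pair; returns (N, 3) coordinates in the CCS.
    P1, P2: 3x4 projection matrices; pts_left/pts_right: (N, 2) pixels."""
    X_h = cv2.triangulatePoints(P1, P2,
                                np.asarray(pts_left, float).T,
                                np.asarray(pts_right, float).T)
    return (X_h[:3] / X_h[3]).T   # homogeneous -> Euclidean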
S8: Computing an initial value H^b_c of the transformation between the vision coordinate system and the robot base coordinate system
The vision coordinate system CCS is first transformed into the robot end coordinate system ECS and then into the robot base coordinate system BCS, with a homogeneous matrix between each pair of adjacent coordinate systems representing their relative pose. The three-dimensional coordinates of the feature points in the camera coordinate system can therefore be converted into the robot base coordinate system through this chain of pose relations:

P_b = H^b_e · H^e_c · P_c    (1)

where P_c is the three-dimensional coordinate of a feature point TMP in the vision coordinate system CCS, P_b is the corresponding three-dimensional coordinate in the robot base coordinate system BCS, H^e_c is the transformation matrix from the vision coordinate system CCS to the robot end coordinate system ECS, and H^b_e is the transformation matrix from the robot end coordinate system ECS to the robot base coordinate system BCS. P_c denotes the ordered feature-point coordinates computed in S7.
In equation (1), P_b and H^e_c are unknown and the other quantities are known. To solve for the unknowns, the equation is written for every view and every feature point:

P^b_{j,s} = H^b_{e,j} · H^e_c · P^c_{j,s}    (2)

where j = 1..n indexes the robot movement (view) and s = 1..k indexes the feature point on the target; P^b_{j,s} is the coordinate in the BCS of the s-th feature point at the j-th robot movement, and H^b_{e,j} is the pose of the robot end relative to the base coordinate system when the robot moves to the j-th position.
During this process the position of each TMP in the robot base coordinate system does not change, so the base-frame coordinates of the same TMP obtained by photographing from different positions should be equal. Based on this principle, the objective function is

min_{H^e_c}  Σ_{s=1}^{k} Σ_{j=1}^{n} ‖ H^b_{e,j} · H^e_c · P^c_{j,s} − P^b_s ‖²    (3)

where P^b_s is the view-independent base-frame position of the s-th feature point (in practice taken as the mean over the n views). Solving equation (3) yields H^e_c.
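A minimal sketch of how objective (3) could be minimized with an off-the-shelf nonlinear least-squares solver (SciPy), parameterizing H^e_c by a rotation vector and a translation. This reflects the least-squares idea described above; the solver, parameterization and function names are assumptions, not the patent's exact implementation.

import numpy as np
from scipy.spatial.transform import Rotation
from scipy.optimize import least_squares

def solve_H_e_c(H_b_e_list, P_c_list):
    """Estimate the camera->end transform H^e_c from n views (eq. 2/3).
    H_b_e_list: list of n 4x4 end->base poses H^b_{e,j} from the robot.
    P_c_list:   list of n (k, 3) arrays of TMP coordinates in the CCS,
                in the same feature order for every view."""
    def residuals(x):
        H_e_c = np.eye(4)
        H_e_c[:3, :3] = Rotation.from_rotvec(x[:3]).as_matrix()
        H_e_c[:3, 3] = x[3:]
        P_b = []
        for H_b_e, P_c in zip(H_b_e_list, P_c_list):
            P_c = np.asarray(P_c, float)
            P_h = np.hstack([P_c, np.ones((len(P_c), 1))])      # (k, 4)
            P_b.append((H_b_e @ H_e_c @ P_h.T).T[:, :3])        # (k, 3) in base frame
        P_b = np.stack(P_b)                                     # (n, k, 3)
        # the target is fixed, so every view should give the same base-frame point
        return (P_b - P_b.mean(axis=0)).ravel()

    sol = least_squares(residuals, np.zeros(6))   # rough initial guess: identity pose
    H = np.eye(4)
    H[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    H[:3, 3] = sol.x[3:]
    return H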
S9: Using the known positional relationship between the tip TTP and the feature points TMP of the optical target, optimize H^e_t to obtain an accurate new transformation matrix Ĥ^e_t between the tool coordinate system TCS and the robot end coordinate system ECS
The relationship between the TTP and the TMPs of the optical target is obtained beforehand by precise calibration. The coordinate system of the optical target is the workpiece coordinate system WCS, whose origin is the target tip TTP; the coordinates of the feature points TMP in the WCS are P^w_j, j = 1..k. The coordinate of the target tip TTP can therefore be computed from the coordinates of the feature points TMP through the mapping f, i.e. P_0 = f(P_j), j = 1..k. The coordinates of the target feature points TMP in the vision coordinate system CCS are known from step S7, so the coordinate of the target tip TTP in the same coordinate system follows from this mapping. Because an alignment error is introduced when the tool TCP is aligned with the target tip TTP, the tip position computed from the vision sensor is more accurate than the taught one; the tool-end calibration is therefore compensated by taking the vision-derived coordinate of the target tip TTP as the true value.
With the vision-derived tip position as the reference, H^e_t is optimized so that, for each robot pose recorded in S3, the TCP position implied by H^e_t agrees with this reference, yielding the new transformation Ĥ^e_t between the tool coordinate system TCS and the robot end coordinate system ECS.
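A minimal sketch of the S9 refinement under the assumptions above: the tip position computed from the vision measurements (mapping f applied to the TMPs, then transformed into the base frame with H^e_c and the corresponding robot pose) is taken as the true value, and the TCP translation in H^e_t is re-estimated from the poses recorded in S3. Names are illustrative, not from the patent.

import numpy as np

def refine_H_e_t(H_e_t_init, H_b_e_touch, ttp_in_base):
    """S9 (sketch): refine the tool->end transform against the
    vision-derived tip position.
    H_e_t_init : 4x4 initial tool->end transform from S4 (rotation kept).
    H_b_e_touch: list of 4x4 end->base poses recorded in S3 while the
                 TCP touched the target tip.
    ttp_in_base: (3,) tip position TTP in the base frame from vision."""
    # For each touching pose: R_i @ t_tcp + p_i = ttp_in_base
    A = np.vstack([np.asarray(H, float)[:3, :3] for H in H_b_e_touch])
    b = np.concatenate([np.asarray(ttp_in_base, float) - np.asarray(H, float)[:3, 3]
                        for H in H_b_e_touch])
    t_tcp, *_ = np.linalg.lstsq(A, b, rcond=None)
    H_e_t = np.array(H_e_t_init, float, copy=True)
    H_e_t[:3, 3] = t_tcp
    return H_e_t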

Claims (6)

1. A robot hand-eye calibration device based on an optical target, characterized in that: the device comprises a robot, an area-array three-dimensional vision sensor, a welding gun, an optical target and a computer; the robot consists of a robot body, a robot controller and a teach pendant; the welding gun and the area-array three-dimensional vision sensor are both fixed to the end of the robot body; the optical target is arranged within the visible range of the area-array three-dimensional vision sensor; the optical target has a target tip and a plurality of recognizable feature points, and the relative positional relationship between the feature points and the target tip is determined; the working process of the robot hand-eye calibration device is as follows:
S1: fixing the optical target within the working range of the robot;
S2: by manual teaching, aligning the tool center point with the tip of the optical target;
S3: keeping the tool center point at the robot end stationary, changing the robot posture, and recording the robot pose coordinates;
S4: computing the transformation between the robot base coordinate system and the tool coordinate system;
S5: by manual teaching, letting the vision sensor photograph the feature points of the optical target;
S6: computing the shooting postures for the optical target, photographing it from multiple angles, and recording the robot pose coordinates at each angle;
S7: computing the coordinates of the optical target feature points in the vision coordinate system;
S8: computing the transformation between the vision coordinate system and the robot end coordinate system;
S9: optimizing the transformation between the robot end coordinate system and the tool coordinate system according to the relationship between the tip of the optical target and the feature points to obtain a more accurate transformation matrix; in S9 the calibration result between the vision coordinate system and the robot end coordinate system is kept unchanged, the coordinates of the target tip are computed from the vision sensor, and the transformation matrix between the tool coordinate system and the robot end is optimized;
and S10: computing the hand-eye transformation matrix of the robot: the inverse of the optimized matrix converting the robot end coordinate system into the tool coordinate system obtained in S9 is multiplied by the transformation matrix between the vision coordinate system and the robot end coordinate system obtained in S8.
2. The robot hand-eye calibration device according to claim 1, wherein: the recognizable feature point is a pattern or a light-emitting source; the mapping relation between the tip and the feature points of the optical target is f, i.e. P_0 = f(P_s), s = 1..k, where P_s and P_0 are expressed in the same coordinate system, P_s are the coordinates of the optical target feature points TMP and P_0 is the coordinate of the optical target tip TTP; the robot pose coordinates in S3 are recorded for i = 1..m, the larger the change in robot posture the better, and m ≥ 4.
3. The robot hand-eye calibration device according to claim 1, wherein: in S4 the initial value H^e_t of the transformation from the tool coordinate system to the robot end coordinate system is computed; the usual method is the 4-point method, and the result H^e_t can usually be read directly from the robot controller.
4. The robot hand-eye calibration device according to claim 1, wherein: in S7 all feature points must be imaged clearly, and the distance from the vision sensor to the optical target is the optimal working distance of the vision sensor; when the feature points of the optical target are photographed from n angles, the robot pose coordinates are H^b_{e,j}, j = 1..n.
5. The robot hand-eye calibration device according to claim 1, wherein: in S7 the coordinates of the photographed feature points TMP of the optical target in the vision coordinate system CCS are computed as P^c_{j,s}, j = 1..n, s = 1..k, where k is the number of TMPs on the optical target and n is the number of angles from which the optical target is photographed.
6. The robot hand-eye calibration device according to claim 1, wherein: in S8 an initial value H^b_c of the transformation between the vision coordinate system and the robot base coordinate system is computed; the vision coordinate system CCS is first transformed into the robot end coordinate system ECS and then into the robot base coordinate system BCS, with a homogeneous matrix between each pair of adjacent coordinate systems representing their relative pose; the three-dimensional coordinates of the feature points in the camera coordinate system can therefore be converted into the robot base coordinate system through this chain of pose relations, as expressed by the following formula:

P_b = H^b_e · H^e_c · P_c    (1)

where P_c is the three-dimensional coordinate of a feature point TMP in the vision coordinate system CCS, P_b is the corresponding three-dimensional coordinate in the robot base coordinate system BCS, H^e_c is the transformation matrix from the vision coordinate system CCS to the robot end coordinate system ECS, H^b_e is the transformation matrix from the robot end coordinate system ECS to the robot base coordinate system BCS, and P_c denotes the ordered feature-point coordinates computed in S7;

in formula (1), P_b and H^e_c are unknown and the other quantities are known; to solve for the unknowns, the formula is written for every view and every feature point:

P^b_{j,s} = H^b_{e,j} · H^e_c · P^c_{j,s}    (2)

where j = 1..n indexes the robot movement and s = 1..k indexes the feature point on the target; P^b_{j,s} is the coordinate in the BCS of the s-th feature point at the j-th robot movement, and H^b_{e,j} is the pose of the robot end relative to the base coordinate system when the robot moves to the j-th position;

during this process the position of each TMP in the robot base coordinate system does not change, and the base-frame coordinates of the same TMP obtained by photographing from different positions are equal; based on this principle the objective function is

min_{H^e_c}  Σ_{s=1}^{k} Σ_{j=1}^{n} ‖ H^b_{e,j} · H^e_c · P^c_{j,s} − P^b_s ‖²    (3)

where P^b_s is the view-independent base-frame position of the s-th feature point; solving formula (3) yields H^e_c.
CN202110171661.0A 2021-02-08 2021-02-08 Robot hand-eye calibration device based on optical target Active CN112958960B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110171661.0A CN112958960B (en) 2021-02-08 2021-02-08 Robot hand-eye calibration device based on optical target

Publications (2)

Publication Number Publication Date
CN112958960A CN112958960A (en) 2021-06-15
CN112958960B 2023-01-24

Family

ID=76275382

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110171661.0A Active CN112958960B (en) 2021-02-08 2021-02-08 Robot hand-eye calibration device based on optical target

Country Status (1)

Country Link
CN (1) CN112958960B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114310881B (en) * 2021-12-23 2024-09-13 中国科学院自动化研究所 Calibration method and system of mechanical arm quick-change device and electronic equipment
CN115139283B (en) * 2022-07-18 2023-10-24 中船重工鹏力(南京)智能装备系统有限公司 Robot hand-eye calibration method based on random mark dot matrix
CN115847423B (en) * 2022-12-30 2024-05-28 合肥工业大学 Industrial robot eye-to-hand vision system calibration method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system
CN108765489A (en) * 2018-05-29 2018-11-06 中国人民解放军63920部队 A kind of pose computational methods, system, medium and equipment based on combination target
CN108717715A (en) * 2018-06-11 2018-10-30 华南理工大学 A kind of line-structured light vision system automatic calibration method for arc welding robot
CN109658460A (en) * 2018-12-11 2019-04-19 北京无线电测量研究所 A kind of mechanical arm tail end camera hand and eye calibrating method and system
WO2021012122A1 (en) * 2019-07-19 2021-01-28 西门子(中国)有限公司 Robot hand-eye calibration method and apparatus, computing device, medium and product
CN110906863A (en) * 2019-10-30 2020-03-24 成都绝影智能科技有限公司 Hand-eye calibration system and calibration method for line-structured light sensor
CN111504183A (en) * 2020-04-22 2020-08-07 无锡中车时代智能装备有限公司 Calibration method for relative position of linear laser three-dimensional measurement sensor and robot
CN112223302A (en) * 2020-12-17 2021-01-15 国网瑞嘉(天津)智能机器人有限公司 Rapid calibration method and device of live working robot based on multiple sensors

Also Published As

Publication number Publication date
CN112958960A (en) 2021-06-15

Similar Documents

Publication Publication Date Title
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN109794938B (en) Robot hole-making error compensation device and method suitable for curved surface structure
CN108717715B (en) Automatic calibration method for linear structured light vision system of arc welding robot
CN110666798B (en) Robot vision calibration method based on perspective transformation model
CN108582076A (en) A kind of Robotic Hand-Eye Calibration method and device based on standard ball
CN111941425B (en) Rapid workpiece positioning method of robot milling system based on laser tracker and binocular camera
CN111775154A (en) Robot vision system
CN110276806A (en) Online hand-eye calibration and crawl pose calculation method for four-freedom-degree parallel-connection robot stereoscopic vision hand-eye system
CN112894823B (en) Robot high-precision assembling method based on visual servo
CN109794963B (en) Robot rapid positioning method facing curved surface component
CN108098762A (en) A kind of robotic positioning device and method based on novel visual guiding
CN111781894B (en) Method for carrying out space positioning and attitude navigation on assembly tool by utilizing machine vision
CN104827480A (en) Automatic calibration method of robot system
CN113146620A (en) Binocular vision-based double-arm cooperative robot system and control method
CN114519738A (en) Hand-eye calibration error correction method based on ICP algorithm
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
CN112648934A (en) Automatic elbow geometric form detection method
CN111482964A (en) Novel robot hand-eye calibration method
CN109059755B (en) High-precision hand-eye calibration method for robot
CN114643578A (en) Calibration device and method for improving robot vision guide precision
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN110962127B (en) Auxiliary calibration device for tail end pose of mechanical arm and calibration method thereof
CN113172632A (en) Simplified robot vision servo control method based on images
CN115179323A (en) Machine end pose measuring device based on telecentric vision constraint and precision improving method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant