
CN108789404A - Vision-based kinematic parameter calibration method for serial robots - Google Patents

Vision-based kinematic parameter calibration method for serial robots

Info

Publication number
CN108789404A
CN108789404A (application CN201810510980.8A); granted as CN108789404B
Authority
CN
China
Prior art keywords
robot
optical axis
kinematic
camera
kinematic parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810510980.8A
Other languages
Chinese (zh)
Other versions
CN108789404B (en)
Inventor
朱齐丹
谢心如
李超
夏桂华
张智
蔡成涛
吕晓龙
刘志林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University
Priority to CN201810510980.8A
Publication of CN108789404A
Application granted
Publication of CN108789404B
Legal status: Active
Anticipated expiration

Links

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1605: Simulation of manipulator lay-out, design, modelling of manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a vision-based method for calibrating the kinematic parameters of serial robots. The optical axis of a camera is taken as a virtual straight-line constraint, and a kinematic error model based on this line constraint is established. A fixed point on a calibration board mounted at the robot end-effector is selected as the feature point, and an image-based visual control method drives the manipulator so that the feature point reaches the camera's optical axis. From the robot's joint-angle data, forward kinematics gives the nominal position of the feature point, from which the alignment error matrix is computed. An iterative least-squares algorithm then estimates the kinematic parameter errors, and the actual kinematic parameters are obtained from the nominal ones. Because the camera's optical axis serves as a virtual constraint, calibration is completed using only the robot's joint-angle data: the method is low-cost, easy to operate, requires no expensive high-precision measuring equipment, generalizes to any serial robot, and can be widely applied to improve the absolute positioning accuracy of manipulators in industrial, space, and underwater environments.

Description

Vision-Based Calibration Method for Kinematic Parameters of Serial Robots

Technical Field

The invention relates to a vision-based method for calibrating the kinematic parameters of serial robots, and belongs to the field of robot calibration.

Background

As robots are increasingly used in tasks such as assembly, surgery, and collaboration, ever higher end-effector positioning accuracy is required. The position and orientation error at the robot end cannot be measured directly at present; it is usually computed indirectly from a kinematic model and joint-angle data. Owing to manufacturing tolerances, wear, transmission errors, mounting position, environmental changes, and other factors, the actual kinematic parameters of the robot model deviate from their nominal values, so computing the end pose from the nominal parameters degrades the absolute positioning accuracy. To improve that accuracy, the kinematic parameters must be calibrated effectively, which remains one of the difficult problems in robotics research.

Robot calibration algorithms usually measure the actual end pose with high-precision measuring equipment, but such instruments are very expensive, the calibration procedure is complex, and installation, commissioning, and measurement demand considerable skill. Vision-based calibration methods typically use a camera as the measuring tool, so the camera's field of view and camera-parameter errors strongly affect the identified kinematic parameters. A low-cost, easy-to-operate calibration method that avoids high-precision measuring equipment and reduces the influence of the measuring device on the result is therefore essential for improving the absolute positioning accuracy of robots across application scenarios.

The method of the invention targets a typical six-degree-of-freedom industrial manipulator and solves the problem of low end positioning accuracy by defining the camera's optical axis as a virtual straight-line constraint. The vision-based calibration method is of direct practical value for improving the absolute positioning accuracy of serial manipulators and can be applied directly to their kinematic parameter calibration.

Summary of the Invention

The purpose of the invention is to provide a vision-based method for calibrating the kinematic parameters of serial robots, in which the robot completes calibration under the virtual constraint of the camera's optical axis.

The purpose of the invention is achieved by the following steps:

Step 1: Take the camera's optical axis as a virtual straight-line constraint and establish a kinematic error model based on this line constraint.

Step 2: Select a fixed point on a calibration board mounted at the robot end-effector as the feature point, and use an image-based visual control method to move the manipulator so that the feature point reaches the camera's optical axis.

Step 3: From the robot's joint-angle data, compute the nominal position of the feature point with the forward kinematic model and compute the position alignment error matrix.

Step 4: Estimate the kinematic parameter errors with an iterative least-squares algorithm, compute the actual kinematic parameters from the nominal ones, and thereby complete the kinematic calibration of the serial robot.

The invention further includes the following features:

1. Establishing the kinematic error model in Step 1 specifically comprises:

(1) Establishing the kinematic model of the serial robot with the modified DH method and obtaining the transformation matrix from the robot base frame to the end-effector;

(2) Establishing the kinematic error model of the serial robot and obtaining the linear relationship between the end pose error vector and the kinematic parameter error vector Δφ, where ΔPe and ΔRe denote the small translational and rotational errors of the robot end, respectively;

(3) Taking the camera's optical axis as a virtual straight-line constraint, establishing an error model based on this constraint, and obtaining the relationship between the alignment error matrix E and the kinematic parameter error vector Δφ.

2. The image-based visual control of Step 2 specifically comprises:

(1) Mounting a calibration board at the robot end-effector and selecting one point on the board as the feature point, with the board lying entirely within the camera's field of view;

(2) The image-based visual control method consists of an outer image-feature loop and an inner robot-control loop. The desired image feature is that the feature point lies on the camera's optical axis; from the deviation between the desired image feature and the current image feature acquired by the camera, an end-pose adjustment strategy converts the image-coordinate deviation into a pose deviation of the robot end;

(3) Computing the desired end pose from the robot end pose deviation and driving the robot to that pose through the inner robot-control loop.

3. The position alignment error matrix computation of Step 3 specifically comprises:

(1) After the feature point reaches the camera's optical axis, recording the robot's joint angles at that moment and computing the nominal position of the robot end with the forward kinematic model;

(2) Moving the robot so that the feature point reaches multiple positions on the optical axis in turn, and computing the nominal end position at each position;

(3) Selecting several optical-axis vectors for measurement, moving the feature point to multiple positions on each axis, and computing the position alignment error matrix E from the nominal end position of the manipulator at each point:

E(i,j,k) = [μk ×](v(j,k) − v(i,k))

where p is the number of positions on each optical axis, q is the number of optical axes, μk is the k-th optical-axis vector, and v(i,k) and v(j,k) are the perpendicular offset vectors from the respective nominal points to the constraint line μk.
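As a numerical illustration of the formula above, the sketch below computes the pairwise alignment errors from nominal point positions; since [μk×]μk = 0, [μk×](v(j,k) − v(i,k)) equals [μk×](P′(j,k) − P′(i,k)), so the perpendicular offsets never need to be formed explicitly. The function names are illustrative (assuming NumPy), not part of the patent:

```python
import numpy as np

def skew(u):
    """Skew-symmetric operator [u x]: skew(u) @ w equals np.cross(u, w)."""
    return np.array([[0.0, -u[2], u[1]],
                     [u[2], 0.0, -u[0]],
                     [-u[1], u[0], 0.0]])

def alignment_errors(nominal_pts, mu):
    """Pairwise alignment errors E(i,j) = [mu x](P'_j - P'_i) on one axis.

    The component of each point along the axis drops out because
    [mu x] mu = 0, so the result is zero exactly when the nominal points
    lie on the constraint line mu."""
    mu = np.asarray(mu, float)
    mu = mu / np.linalg.norm(mu)
    S = skew(mu)
    errs = []
    for i in range(len(nominal_pts)):
        for j in range(i + 1, len(nominal_pts)):
            errs.append(S @ (np.asarray(nominal_pts[j], float)
                             - np.asarray(nominal_pts[i], float)))
    return np.array(errs)
```

Nominal points that truly lie on the axis give zero alignment error; any residual reflects kinematic parameter deviations.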

4. The kinematic parameter identification of Step 4 specifically comprises:

(1) Fitting the optical-axis vector μk by least squares from the nominal positions of the feature point at each position on the axis;

(2) Estimating the kinematic parameter error Δφ with the iterative least-squares method based on the kinematic error model, and computing the actual kinematic parameters.

Compared with the prior art, the beneficial effects of the invention are as follows. Using the camera's optical axis as a virtual straight-line constraint, the kinematic parameters of a serial robot can be calibrated from the optical axis and the manipulator's joint-angle data alone, without expensive high-precision measuring instruments or dedicated calibration equipment; the method is low-cost and easy to operate. The line-constraint error model is robust to poor camera depth-measurement accuracy, and apart from the image coordinates of the optical-axis center point no other camera parameters need to be calibrated, which reduces the influence of camera-parameter errors on the calibration result. The method is of direct practical value for improving the absolute positioning accuracy of serial manipulators and can be applied to their kinematic parameter calibration in many fields.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the feature-point position on the calibration board of the invention;

Fig. 2 is a schematic diagram of the image-based visual control method of the invention;

Fig. 3 is the image-based visual control block diagram of the invention;

Fig. 4 shows the positions of the feature point on the optical axis.

Detailed Description of the Embodiments

The invention is described in further detail below with reference to the accompanying drawings and specific embodiments.

The invention provides a vision-based method for calibrating the kinematic parameters of serial robots. It addresses two problems: the absolute positioning accuracy of current robots is low while more and more applications demand higher accuracy; and existing calibration methods require expensive high-precision measuring instruments or dedicated calibration equipment, with results that depend on the measuring device. First, the camera's optical axis is taken as a virtual straight-line constraint, and on the basis of a robot kinematic model built with the modified DH method, a kinematic error model based on the line constraint is established. Then a fixed point on a calibration board mounted at the robot end is selected as the feature point, and image-based visual control moves the manipulator so that the feature point reaches the camera's optical axis; several optical axes are chosen, the feature point is driven to several positions on each axis in turn, and the robot's joint angles are recorded at each position. Finally, from the joint-angle data, forward kinematics gives the nominal position of the feature point and the alignment error matrix is computed; based on the kinematic error model, an iterative least-squares algorithm estimates the kinematic parameter errors, and the actual kinematic parameters are computed from the nominal ones. Because the optical axis serves as a virtual constraint, calibration uses only the robot's joint-angle data; the method is low-cost, easy to operate, requires no expensive high-precision measuring equipment, is general for serial robots, and can be widely applied in industrial, space, and underwater environments to improve the absolute positioning accuracy of manipulators.

The invention adopts the following technical solution:

The vision-based kinematic parameter calibration method for serial robots comprises establishing the kinematic error model, image-based visual control, position alignment error matrix computation, and kinematic parameter identification, as follows:

Step 1: Take the camera's optical axis as a virtual straight-line constraint and establish a kinematic error model based on this line constraint.

Step 2: Select a fixed point on a calibration board mounted at the robot end-effector as the feature point, and use an image-based visual control method to move the manipulator so that the feature point reaches the camera's optical axis.

Step 3: From the robot's joint-angle data, compute the nominal position of the feature point using forward kinematics and compute the position alignment error matrix.

Step 4: Estimate the kinematic parameter errors with an iterative least-squares algorithm, compute the actual kinematic parameters from the nominal ones, and thereby complete the kinematic calibration of the serial robot.

Establishing the kinematic error model in Step 1 specifically comprises:

(1) Establishing the kinematic model of the serial robot with the modified DH method and obtaining the transformation matrix from the robot base frame to the end-effector;

(2) Establishing the kinematic error model of the serial robot and obtaining the linear relationship between the end pose error vector and the kinematic parameter error vector Δφ, where ΔPe and ΔRe denote the small translational and rotational errors of the robot end, respectively;

(3) Taking the camera's optical axis as a virtual straight-line constraint, establishing an error model based on this constraint, and obtaining the relationship between the alignment error matrix E and the kinematic parameter error vector Δφ.

The image-based visual control in Step 2 specifically comprises:

(1) Mounting a calibration board at the robot end-effector and selecting one point on the board as the feature point, with the board lying entirely within the camera's field of view;

(2) The image-based visual control method consists of an outer image-feature loop and an inner robot-control loop. The desired image feature is that the feature point lies on the camera's optical axis; from the deviation between the desired image feature and the current image feature acquired by the camera, an end-pose adjustment strategy converts the image-coordinate deviation into a pose deviation of the robot end;

(3) Computing the desired end pose from the robot end pose deviation and driving the robot to that pose through the inner robot-control loop.

The position alignment error matrix computation in Step 3 specifically comprises:

(1) After the feature point reaches the camera's optical axis, recording the robot's joint angles at that moment;

(2) Moving the robot so that the feature point reaches multiple positions on the optical axis in turn, and computing the nominal end position at each position from the joint angles;

(3) Selecting several optical-axis vectors for measurement, moving the feature point to multiple positions on each axis, and computing the alignment error matrix E from the nominal end position of the manipulator at each point:

E(i,j,k) = [μk ×](v(j,k) − v(i,k))

where p is the number of positions on each optical axis, q is the number of optical axes, μk is the k-th optical-axis vector, and v(i,k) and v(j,k) are the perpendicular offset vectors from the respective nominal points to the constraint line μk.

The kinematic parameter identification in Step 4 specifically comprises:

(1) Fitting the optical-axis vector μk by least squares from the nominal positions of the feature point at each position on the axis;

(2) Estimating the kinematic parameter error Δφ with the iterative least-squares method based on the kinematic error model, and computing the actual kinematic parameters.

A specific embodiment of the invention with concrete values is given below.

(1) Establishing the kinematic error model

The link parameter vectors are defined as follows: for a link whose joint axis is approximately parallel to the previous one, the parameter vector contains ai, di, αi, θi, βi; for the other links it contains ai, di, αi, θi. Here ai, di, αi, θi, βi are the kinematic parameters of the i-th link (i = 1, 2, 3, 4, 5, 6): the link length, link offset, link twist, joint angle, and the extra link twist used for parallel joint axes, respectively.

Let the actual kinematic parameter vector of the robot be φ, its nominal value φ′, and the kinematic parameter error vector Δφ = φ − φ′. From the modified DH model, the robot kinematic error model is obtained:

ΔPe = JP Δφ,  ΔRe = JR Δφ

where ΔPe and ΔRe denote the small translational and rotational errors of the robot end, and JP and JR are the error Jacobian matrices with respect to position and orientation, respectively.

Define the constraint vector of the k-th optical axis as μk. Let the i-th actual position of the feature point on the axis be Pe(i,k), with corresponding robot joint angles θe(i,k). The nominal feature-point position computed from the nominal kinematic parameters is P′e(i,k). The difference between the nominal and actual positions is:

where 0R6′ is the nominal orientation transformation matrix from the manipulator base to the end.

Referring to Fig. 1, Oc is the coordinate origin of the camera, Pe(i,k) is the actual position of the feature point on the optical-axis vector μk, and P′e(i,k) is the corresponding nominal position; Pe(i,k) and P′e(i,k) are expressed respectively as:

where s(i,k) is the distance from the point Pe(i,k) to the camera origin Oc along the axis, s′(i,k) is the corresponding distance for the nominal point P′e(i,k), and v(i,k) is the perpendicular offset vector from P′e(i,k) to the constraint line μk.

Define the operator [μk ×] as the skew-symmetric cross-product matrix of μk; evidently [μk ×]μk = 0. Transforming the difference between the nominal and actual feature-point positions yields:

Substituting two different positions Pe(i,k) and Pe(j,k) on the virtual constraint into the equation and subtracting gives:

Define E(i,j,k) as the alignment error, with the corresponding relative alignment-error Jacobian matrix:

E(i,j,k) = [μk ×](v(j,k) − v(i,k))

Selecting q optical axes for measurement and moving the feature point to p positions on each axis, the position alignment error matrix E and the regression matrix Φ are constructed.

This yields the kinematic error model for solving the kinematic parameter error vector Δφ:

E = Φ Δφ

(2) Image-based visual control

Referring to Fig. 2, the feature point on the robot-end calibration board is Fp; the distance between the current feature-point position Pf and the optical axis μk is d; the image coordinates of the feature point are (uf, vf) and those of the optical-axis center point are (u0, v0). The goal of visual control is to move Pf to the point P′f on the optical axis, at which point d→0, uf→u0, vf→v0.

Referring to Fig. 3, the image-based visual control loop consists of an outer image-feature loop and an inner robot position-control loop. The control objective is to bring the image coordinates (uf, vf) of the feature point into coincidence with the image coordinates (u0, v0) of the optical-axis center. The current camera image serves as visual feedback; the deviation between the feature point's current and desired image coordinates is (uf − u0, vf − v0). The end-pose adjustment strategy converts this image-coordinate deviation into a deviation of the manipulator end in the base frame, realizing the transformation from image space to Cartesian space. In the inner loop, the desired pose is computed from the fed-back end pose deviation, the desired joint angles are obtained by inverse kinematics, and the robot's joint position controllers drive it to the desired position and orientation. In the end-pose adjustment strategy, the conversion matrix from image deviation to Cartesian deviation is:

where the left-hand side is the deviation between the actual and desired feature-point positions in the base frame, the rotation matrix from the camera frame to the robot base frame maps the feature-point position deviation in the camera frame (computable from the camera model) into the base frame, (u0, v0) is obtained by camera calibration, and kx, ky are constant coefficients.
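The conversion described above can be sketched as follows. The patent's exact conversion matrix is given as a formula image, so this minimal version, which scales the pixel error by the constants kx, ky, assumes a zero depth component, and rotates the result into the base frame, is an assumption consistent with the surrounding text; the function name is illustrative:

```python
import numpy as np

def pixel_to_base_offset(uf, vf, u0, v0, R_cam_to_base, kx, ky):
    """Map the image-coordinate deviation (uf - u0, vf - v0) to a Cartesian
    offset of the manipulator end in the base frame.

    kx, ky are the constant gains named in the text; the zeroed depth
    component and the function signature are illustrative assumptions."""
    d_cam = np.array([kx * (uf - u0), ky * (vf - v0), 0.0])  # camera frame
    return np.asarray(R_cam_to_base, float) @ d_cam          # base frame
```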

In the inner position-control loop, kinematic parameter errors prevent inverse kinematics from moving the manipulator exactly to the desired position, so the manipulator is commanded in small increments, bringing the feature point gradually closer to the optical axis until its distance from the axis falls within an acceptable error band. With k a positive constant less than 1, the actual motion increment of the manipulator end is:
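The small-increment inner loop can be illustrated in simplified form as a point converging onto its perpendicular foot on the axis; the real system issues the commands through inverse kinematics and joint controllers, which this sketch abstracts away:

```python
import numpy as np

def servo_to_axis(p0, axis_origin, axis_dir, k=0.3, tol=1e-6, max_iter=500):
    """Small-increment convergence of a point onto the optical axis.

    Each iteration applies only the fraction k (< 1) of the computed
    correction, mimicking the incremental inner-loop commands, until the
    perpendicular distance to the axis falls within tol."""
    p = np.asarray(p0, float).copy()
    o = np.asarray(axis_origin, float)
    u = np.asarray(axis_dir, float)
    u = u / np.linalg.norm(u)
    for _ in range(max_iter):
        w = p - o
        perp = w - np.dot(w, u) * u      # perpendicular offset from the axis
        if np.linalg.norm(perp) < tol:   # acceptable error band reached
            break
        p -= k * perp                    # commanded increment: k * deviation
    return p
```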

(3) Position alignment error matrix computation

Referring to Fig. 4, the feature point reaches multiple positions on the optical axis μk in turn; the robot joint angles θ are recorded at each position, and the nominal position of the feature point in the base frame is computed from the robot's forward kinematic model as elements of the transformation matrix 0Tn from the robot base frame to the end frame, where:

where n is the number of links of the robot and i−1Ti is the transformation matrix between frame i−1 and frame i. For a link whose joint axis is approximately parallel to the previous one:

The transformation matrix i−1Ti of the other links is:

Several optical-axis vectors are selected and the feature point reaches multiple positions on each axis in turn; the position alignment error matrix E is computed from the nominal feature-point position calculated at each position.
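A sketch of the forward-kinematics chain using the modified DH transform, with an extra rotation β about y for links whose joint axis is nearly parallel to the previous one. The patent's transform matrices are given as images, so the Craig-convention form used below is an assumption:

```python
import numpy as np

def mdh_transform(a, d, alpha, theta, beta=0.0):
    """One link transform in the modified DH (Craig) convention, assumed
    here since the patent's matrices appear only as images; beta is the
    extra y-rotation of the five-parameter model for near-parallel axes."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    T = np.array([[ct,     -st,     0.0,  a],
                  [st * ca, ct * ca, -sa, -sa * d],
                  [st * sa, ct * sa,  ca,  ca * d],
                  [0.0,     0.0,     0.0,  1.0]])
    if beta != 0.0:
        cb, sb = np.cos(beta), np.sin(beta)
        Ry = np.array([[cb, 0.0, sb, 0.0],
                       [0.0, 1.0, 0.0, 0.0],
                       [-sb, 0.0, cb, 0.0],
                       [0.0, 0.0, 0.0, 1.0]])
        T = T @ Ry
    return T

def forward_kinematics(link_params, joint_angles):
    """Chain 0_T_n as the product of the link transforms;
    link_params is a list of (a, d, alpha, beta) tuples."""
    T = np.eye(4)
    for (a, d, alpha, beta), th in zip(link_params, joint_angles):
        T = T @ mdh_transform(a, d, alpha, th, beta)
    return T
```

The nominal feature-point position is then read from the translation column of the resulting 0Tn.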

(4) Kinematic parameter identification

From the computed nominal positions of the feature point on one optical axis, least-squares fitting yields the optical-axis vector μk of the error model. Let the nominal coordinates of position i (i ≤ s) on the k-th axis be (xi, yi, zi); the line equation of axis k is:

其中,参数x0,y0,m,n为:Among them, the parameters x 0 , y 0 , m, n are:
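Under the four-parameter line form x = x0 + m·z, y = y0 + n·z (assumed here; it requires the axis not to be perpendicular to the z-axis), the fit reduces to two independent linear regressions on z, sketched below:

```python
import numpy as np

def fit_axis(points):
    """Least-squares fit of x = x0 + m*z, y = y0 + n*z to 3-D points.
    Returns (x0, y0, m, n) and a unit direction vector mu."""
    pts = np.asarray(points, float)
    # Design matrix [1, z] shared by both regressions.
    A = np.column_stack([np.ones(len(pts)), pts[:, 2]])
    (x0, m), _, _, _ = np.linalg.lstsq(A, pts[:, 0], rcond=None)
    (y0, n), _, _, _ = np.linalg.lstsq(A, pts[:, 1], rcond=None)
    mu = np.array([m, n, 1.0])        # direction of the fitted line
    return (x0, y0, m, n), mu / np.linalg.norm(mu)
```

For axes nearly perpendicular to z, a parameterization-free alternative is to take μk as the dominant singular vector of the centered point cloud; the simple regression above suffices when the camera looks roughly along z.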

Given the regression matrix Φ and the measurement matrix E computed above, the kinematic parameter error vector Δφ is estimated from the error model by an iterative least-squares method. The estimate at the t-th iteration is:

Δφ̂(t) = (Φ(t)ᵀΦ(t) + λ(t)I)⁻¹Φ(t)ᵀE(t)

where Φ(t) and E(t) are the regression and measurement matrices recomputed after each iteration's update, and λ(t) is a penalty factor whose schedule is governed by a constant h. Here h is usually a constant between 2 and 10, and λ(0) a constant between 0.001 and 0.1.
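The estimator can be sketched as a Levenberg-Marquardt-style damped least-squares loop. The exact λ(t) update rule is not reproduced in this text, so the decrease/increase-by-h schedule below is an assumption consistent with the stated roles of h and λ(0); `build_phi_E` is a hypothetical callback that rebuilds Φ and E after each parameter update:

```python
import numpy as np

def identify_parameters(build_phi_E, phi_nominal, lam0=0.01, h=5.0, iters=20):
    """Iteratively estimate the kinematic parameters.

    build_phi_E(phi): hypothetical callback returning the regression
        matrix Phi and measurement vector E for the current estimate
        (recomputed after every accepted update).
    Each step solves the damped normal equations
        dphi = (Phi^T Phi + lam * I)^-1 Phi^T E.
    """
    phi = np.asarray(phi_nominal, float).copy()
    lam = lam0
    Phi, E = build_phi_E(phi)
    err = np.linalg.norm(E)
    for _ in range(iters):
        d = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]),
                            Phi.T @ E)
        Phi_new, E_new = build_phi_E(phi + d)
        if np.linalg.norm(E_new) < err:   # residual improved: accept, relax
            phi, Phi, E = phi + d, Phi_new, E_new
            err = np.linalg.norm(E)
            lam /= h
        else:                             # residual worsened: reject, damp
            lam *= h
    return phi
```

Recomputing Φ and E at the updated parameters each iteration is what distinguishes this from a single linearized solve: the error model is only locally linear in Δφ, so the damping keeps early steps conservative while λ shrinks as the estimate converges.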

The actual kinematic parameters of the serial robot are then computed by adding the estimated kinematic parameter errors Δφ̂ to the nominal parameters.

In summary, the present invention belongs to the field of robot calibration and relates to a vision-based method for calibrating the kinematic parameters of serial robots. The invention comprises: taking the camera's optical axis as a virtual straight-line constraint and establishing a kinematic error model based on that constraint; selecting a fixed point on a calibration plate mounted on the robot end-effector as the feature point, and controlling the manipulator with an image-based visual control method so that the feature point reaches the camera's optical axis; computing the nominal position of the feature point by forward kinematics from the robot's joint-angle data and forming the alignment error matrix; and estimating the kinematic parameter errors by an iterative least-squares algorithm, from which the actual kinematic parameters are computed from the nominal ones. Because the camera's optical axis serves as a virtual constraint, calibration is completed using only the robot's joint-angle data. The method is low-cost and easy to operate, requires no expensive high-precision measuring equipment, is generic to serial robots, and can be widely applied in industrial, space, and underwater environments to improve the absolute positioning accuracy of manipulators.

Claims (5)

1. A vision-based serial robot kinematic parameter calibration method, characterized in that the method comprises the following steps:
step one: taking the optical axis of a camera as a virtual straight-line constraint, and establishing a kinematic error model based on the straight-line constraint;
step two: selecting a fixed point on a calibration plate fixed to the end of the robot as a feature point, and controlling the motion of the manipulator by an image-based visual control method so that the feature point reaches the optical axis of the camera;
step three: calculating the nominal position of the feature point based on the forward kinematic model according to the joint-angle data of the robot, and calculating the position alignment error matrix;
step four: estimating the kinematic parameter error by an iterative least-squares algorithm, and calculating the actual kinematic parameters from the nominal kinematic parameters to complete the kinematic parameter calibration of the serial robot.
2. The vision-based serial robot kinematic parameter calibration method according to claim 1, characterized in that the establishing of the kinematic error model in step one is specifically:
(1) establishing a kinematic model of the serial robot based on the improved DH method to obtain the transformation matrix from the robot base frame to the end-effector;
(2) establishing a kinematic error model of the serial robot to obtain the linear relationship between the end-effector pose error vector and the kinematic parameter error vector, wherein ΔPe and ΔRe respectively denote the small translation and rotation errors of the robot end-effector;
(3) taking the optical axis of the camera as a virtual straight-line constraint, and establishing an error model based on the virtual straight-line constraint to obtain the relationship between the alignment error matrix E and the kinematic parameter error vector.
3. The vision-based serial robot kinematic parameter calibration method according to claim 2, characterized in that the image-based visual control in step two is specifically:
(1) a calibration plate is mounted on the end of the robot, a feature point on the calibration plate is selected, and the calibration plate is kept entirely within the field of view of the camera;
(2) the image-based visual control method consists of an image-feature outer loop and a robot-control inner loop; the desired image feature is that the feature point lies on the optical axis of the camera, and according to the deviation between the desired image feature and the current image feature acquired by the camera, the image-coordinate deviation is converted into a deviation of the robot end-effector through an end-pose adjustment strategy;
(3) according to the end-pose deviation of the robot, the desired end pose is calculated, and the robot is driven to the desired pose by the robot-control inner loop.
4. The vision-based serial robot kinematic parameter calibration method according to claim 3, characterized in that the position alignment error matrix calculation in step three is specifically:
(1) after the feature point reaches the optical axis of the camera, the joint angles of the robot at that moment are recorded, and the nominal position of the robot end-effector is calculated based on the forward kinematic model;
(2) the robot is controlled to move so that the feature point reaches a plurality of positions on the optical axis in turn, and the nominal position of the robot end-effector is calculated at each position;
(3) a plurality of optical-axis vectors are selected for measurement, the feature point is moved in turn to a plurality of position points on each optical axis, and the position alignment error matrix E is calculated from the nominal position of the manipulator end-effector at each position:
E(i,j,k)=[μk×](v(j,k)-v(i,k))
wherein p is the number of position points on each optical axis, q is the number of optical axes, μk is the optical-axis vector, and v(i,k) and v(j,k) are respectively the vectors from the i-th and j-th nominal points to the constraint axis μk.
5. The vision-based serial robot kinematic parameter calibration method according to claim 4, characterized in that the kinematic parameter identification in step four is specifically:
(1) fitting the optical-axis vector μk by the least-squares method from the nominal positions of the feature point at the positions on the optical axis;
(2) solving the kinematic parameter error by the iterative least-squares method based on the kinematic error model, and calculating the actual kinematic parameters.
CN201810510980.8A 2018-05-25 2018-05-25 A vision-based method for calibrating kinematic parameters of serial robots Active CN108789404B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810510980.8A CN108789404B (en) 2018-05-25 2018-05-25 A vision-based method for calibrating kinematic parameters of serial robots

Publications (2)

Publication Number Publication Date
CN108789404A true CN108789404A (en) 2018-11-13
CN108789404B CN108789404B (en) 2021-06-18

Family

ID=64091681


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109571477A (en) * 2018-12-17 2019-04-05 西安工程大学 A kind of improved robot vision and conveyer belt composite calibration method
CN110125944A (en) * 2019-05-14 2019-08-16 中国地质大学(武汉) A kind of mechanical arm teaching system and method
CN110480658A (en) * 2019-08-15 2019-11-22 同济大学 A kind of six-joint robot control system merging vision self-calibration
CN111360812A (en) * 2018-12-26 2020-07-03 中国科学院沈阳自动化研究所 A camera vision-based industrial robot DH parameter calibration method and calibration device
CN111421552A (en) * 2020-05-09 2020-07-17 云南电网有限责任公司电力科学研究院 A collaborative control method of double manipulators for inspection robots
CN111912310A (en) * 2020-08-10 2020-11-10 深圳市智流形机器人技术有限公司 Calibration method, device and equipment
CN112643658A (en) * 2019-10-10 2021-04-13 南京邮电大学 Calibration method for adaptive error modeling of series robot based on SIR dimension reduction DH model
CN112894814A (en) * 2021-01-25 2021-06-04 江苏集萃智能制造技术研究所有限公司 Mechanical arm DH parameter identification method based on least square method
CN112975913A (en) * 2021-03-10 2021-06-18 清华大学 Self-calibration method and system for cable-driven parallel mechanism
CN113733088A (en) * 2021-09-07 2021-12-03 河南大学 Mechanical arm kinematics self-calibration method based on binocular vision
CN113977574A (en) * 2021-09-16 2022-01-28 南京邮电大学 Mechanical arm point constraint control method
CN114043528A (en) * 2021-11-25 2022-02-15 成都飞机工业(集团)有限责任公司 Robot positioning performance testing method, system, equipment and medium
CN114161411A (en) * 2021-11-18 2022-03-11 浙江大学 A vision-based method for kinematic parameter calibration of multi-legged robots
CN114536347A (en) * 2022-04-08 2022-05-27 上海电气集团股份有限公司 Mechanical arm calibration position determining method, mechanical arm calibration system and electronic equipment
CN115026814A (en) * 2022-06-01 2022-09-09 中科苏州智能计算技术研究院 Camera automatic calibration method for mechanical arm motion space reconstruction
CN115250616A (en) * 2020-03-31 2022-10-28 美蓓亚三美株式会社 Calibration system, information processing system, robot control system, calibration method, information processing method, robot control method, calibration program, information processing program, calibration device, information processing device, and robot control device
CN115890654A (en) * 2022-10-09 2023-04-04 北京微链道爱科技有限公司 Depth camera automatic calibration algorithm based on three-dimensional feature points

Citations (6)

Publication number Priority date Publication date Assignee Title
US4675499A (en) * 1985-01-31 1987-06-23 Shibuya Kogyo Co., Ltd. Laser beam machining robot
CN101096101A (en) * 2006-06-26 2008-01-02 北京航空航天大学 Robot Foot Targeting Method and Calibration Device
CN104608129A (en) * 2014-11-28 2015-05-13 江南大学 Planar constraint based robot calibration method
CN106017339A (en) * 2016-06-06 2016-10-12 河北工业大学 Three-dimensional measurement method for projecting non-uniform stripes in non-complete constraint system
CN106061427A (en) * 2014-02-28 2016-10-26 索尼公司 Robot arm apparatus, robot arm control method, and program
CN107214703A (en) * 2017-07-11 2017-09-29 江南大学 A kind of robot self-calibrating method of view-based access control model auxiliary positioning




Similar Documents

Publication Publication Date Title
CN108789404B (en) A vision-based method for calibrating kinematic parameters of serial robots
CN109877840B (en) Double-mechanical-arm calibration method based on camera optical axis constraint
CN106777656B (en) A PMPSD-based Absolute Precision Calibration Method for Industrial Robots
CN112975973B (en) A hybrid calibration method and device applied to a flexible robot
CN111055273B (en) Two-step error compensation method for robot
CN113146620B (en) Dual-arm collaborative robot system and control method based on binocular vision
CN110253574B (en) Multi-task mechanical arm pose detection and error compensation method
CN111203861B (en) Calibration method and calibration system for robot tool coordinate system
CN105382835B (en) A kind of robot path planning method for passing through wrist singular point
CN105773622B (en) A kind of industrial robot absolute precision calibration method based on IEKF
CN105773609A (en) Robot kinematics calibration method based on vision measurement and distance error model
CN110370316B (en) Robot TCP calibration method based on vertical reflection
CN106799745A (en) A kind of industrial machinery arm precision calibration method based on collocating kriging
CN114474056B (en) A monocular vision high-precision target positioning method for grasping operation
CN113459094B (en) Industrial robot tool coordinate system and zero point self-calibration method
CN113927599A (en) Absolute precision compensation method, system, device and computer readable storage medium
CN111590566A (en) On-orbit calibration method for kinematic parameters of fully-configured space manipulator
CN109176487A (en) A kind of cooperating joint section scaling method, system, equipment, storage medium
CN108044651A (en) A kind of space manipulator kinematics parameters on-orbit calibration method based on binocular vision
CN113618738A (en) Mechanical arm kinematic parameter calibration method and system
WO2024031922A1 (en) Robot calibration method and device based on equivalent kinematic model
CN112894814B (en) Mechanical arm DH parameter identification method based on least square method
CN110940351A (en) Robot precision compensation method based on parameter dimension reduction identification
CN114161411B (en) A Vision-Based Calibration Method for Kinematics Parameters of Multi-legged Robot
CN114505865A (en) A method and system for path generation of a robotic arm based on pose tracking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant