TW202035084A - Device and method for calibrating coordinate of 3d camera and robot arm
- Publication number: TW202035084A
- Application number: TW108110367A
- Authority: TW (Taiwan)
- Prior art keywords: camera, coordinate system, plane, point cloud, point
- Prior art date: 2019-03-22
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
- B25J18/02—Arms extensible
- B25J18/025—Arms extensible telescopic
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Manipulator (AREA)
Abstract
Description
The present invention relates to a robotic arm, and more particularly to a method of calibrating the relative coordinate systems of a robotic arm and a 3D camera by means of a calibration device.
With the rapid development of artificial intelligence, factories install 3D (three-dimensional) cameras so that robotic arms can carry out machining, assembly, and manufacturing operations automatically, improving production efficiency. How accurately the coordinate system of the 3D camera is related to the coordinate system of the robotic arm determines the precision of such production.
FIG. 7 shows the calibration of a robotic arm and a 3D camera in the prior art. The robotic arm 1 defines an arm coordinate system R at its base 2, and a 3D camera 3 installed in the working environment of the robotic arm 1 defines a camera coordinate system C, enabling AI-driven automated assembly and manufacturing. To calibrate the camera coordinate system C against the arm coordinate system R, a calibration device 4 is needed for the 3D camera 3 to recognize and locate. When the 3D camera 3 is installed, the calibration device 4 is first placed in the working environment of the robotic arm 1 so that the calibration device 4 appears within the field of view of the 3D camera 3.
The 3D camera 3, also called a depth camera, may, depending on its operating principle, additionally provide 2D information such as RGB color or monochrome grayscale data for recognizing and locating the calibration device 4. If the 3D camera 3 provides no color information, or the 2D information is insufficient for precise recognition and localization, calibration relies on 3D information about the shape of the calibration device 4. This shape information takes the form of either a depth map or a point cloud; the two can be converted into each other through the camera's intrinsic parameters and are essentially the same 3D information. The 3D camera 3 captures the 3D information of the calibration device 4, yielding the spatial position of the calibration device 4 in the camera coordinate system C. The shape features of the calibration device 4 are then analyzed, and easily recognizable edge corners of its outline are used as the positioning points K1-K5 for calibrating the robotic arm 1 and the 3D camera 3. Once the coordinates of the positioning points K1-K5 in the camera coordinate system C are known, the tool center point 5 of the robotic arm 1 only has to touch each of the positioning points K1-K5; the coordinates of the tool center point 5 in the arm coordinate system R, obtained by moving the robotic arm 1, then allow the arm coordinate system R and the camera coordinate system C to be calibrated.
However, when coordinate system calibration is performed in this prior art, the 3D information of the 3D camera 3 is usually most accurate near the center of its field of view, so precise position information for the edges of the calibration device 4 cannot be obtained; and where the surface slope of the calibration device 4 changes sharply, the accuracy of the edge position information drops further. The positioning points K1-K5 located at the edge corners are therefore of poor accuracy, and if only this 3D information is used for calibration, a good coordinate transformation between the robotic arm and the 3D camera cannot be obtained. The method of calibrating the coordinates of a 3D camera and a robotic arm thus still has problems that urgently need to be solved.
An object of the present invention is to provide a device for calibrating the coordinates of a 3D camera and a robotic arm, in which three flat plates, fixed in relative position, separated from one another, and mutually non-parallel, are arranged on the calibration device so that the three planes extended from the plates intersect at a single point, which serves as the calibration positioning point and improves the accuracy of calibration.
Another object of the present invention is to provide a method for calibrating the coordinates of a 3D camera and a robotic arm, which uses the point cloud 3D information of the 3D camera to divide the points into groups forming three planes according to the Z-value differences and vector angles between adjacent points, thereby avoiding edge errors.
Another object of the present invention is to provide a method for calibrating the coordinates of a 3D camera and a robotic arm, in which three plane equations are constructed from the three point cloud plane groups by the least squares method and the intersection point of the three plane equations is computed and used as the calibration positioning point, improving the accuracy of the positioning point.
To achieve the foregoing objects, the device of the present invention for calibrating the coordinate systems of a 3D camera and a robotic arm has three flat plates mounted on a base frame, fixed in relative position, separated from one another, and mutually non-parallel, such that the three spatial planes extended from the plates intersect at a single point, which serves as the positioning point for extrinsic parameter calibration.
In the method of the present invention for calibrating the coordinate systems of a 3D camera and a robotic arm, the three-plate calibration device is placed in the working environment of the robotic arm and within the field of view of the 3D camera, and the 3D camera captures the 3D information of the calibration device. From the point cloud or depth map of this 3D information, the Z-value difference and vector angle between each pair of adjacent points are computed; the points are grouped according to equal Z-value differences and vector angles; the smaller groups are excluded, leaving three planar point cloud groups; three plane equations are built by the least squares method; the intersection of the three plane equations is computed with the three-plane intersection formula; and this intersection is used as the positioning point for extrinsic parameter calibration.
After the positioning point has been established, the calibration device can be fixed on the robotic arm so that the calibration device and the robotic arm maintain a fixed relative position, and the robotic arm is controlled to move the calibration device to several points within the field of view of the 3D camera. The 3D camera then obtains the coordinates of the positioning point at each pose in the camera coordinate system, allowing the coordinate systems to be calibrated.
In the method of the present invention for calibrating the coordinate systems of the 3D camera and the robotic arm, each plane equation built by the least squares method has the form z = Ax + By + C.
The coefficients A, B, and C are the least-squares solution over the n points $(x_i, y_i, z_i)$ of the corresponding point cloud group, obtained from the normal equations:

$$\begin{bmatrix} \sum x_i^2 & \sum x_i y_i & \sum x_i \\ \sum x_i y_i & \sum y_i^2 & \sum y_i \\ \sum x_i & \sum y_i & n \end{bmatrix}\begin{bmatrix} A \\ B \\ C \end{bmatrix}=\begin{bmatrix} \sum x_i z_i \\ \sum y_i z_i \\ \sum z_i \end{bmatrix}$$
In the method of the present invention for calibrating the coordinate systems of the 3D camera and the robotic arm, the plane equations of the three planes are rewritten as the following three plane equations:

$$a_1 x + b_1 y + c_1 z + d_1 = 0$$
$$a_2 x + b_2 y + c_2 z + d_2 = 0$$
$$a_3 x + b_3 y + c_3 z + d_3 = 0$$
The three-plane intersection point T is then solved with the three-plane intersection formula:

$$x = -\frac{1}{\mathrm{Det}}\begin{vmatrix} d_1 & b_1 & c_1 \\ d_2 & b_2 & c_2 \\ d_3 & b_3 & c_3 \end{vmatrix},\qquad
y = -\frac{1}{\mathrm{Det}}\begin{vmatrix} a_1 & d_1 & c_1 \\ a_2 & d_2 & c_2 \\ a_3 & d_3 & c_3 \end{vmatrix},\qquad
z = -\frac{1}{\mathrm{Det}}\begin{vmatrix} a_1 & b_1 & d_1 \\ a_2 & b_2 & d_2 \\ a_3 & b_3 & d_3 \end{vmatrix}$$
where

$$\mathrm{Det}=\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}$$

and Det ≠ 0.
10‧‧‧robotic arm
11‧‧‧base
12‧‧‧joint
13‧‧‧distal end
14‧‧‧tool
15‧‧‧actuating motor
16‧‧‧control device
17‧‧‧3D camera
20‧‧‧calibration device
21‧‧‧base frame
22, 23, 24‧‧‧flat plates
25, 27, 28‧‧‧main planes
26‧‧‧edge plane
FIG. 1 is a schematic diagram of the calibration of the robotic arm and the 3D camera of the present invention.
FIG. 2 is a point cloud of the calibration device of the present invention.
FIG. 3 is a schematic diagram of identifying the point cloud groups that form three planes according to the present invention.
FIG. 4 is a schematic diagram of the calibration positioning point formed at the intersection of the three plane equations according to the present invention.
FIG. 5 is a flowchart of the method for calibrating the coordinate systems of the 3D camera and the robotic arm according to the present invention.
FIG. 6 is a schematic diagram of calibration with the calibration device of the present invention fixed on the robotic arm.
FIG. 7 is a schematic diagram of the calibration of a 3D camera and a robotic arm coordinate system in the prior art.
The technical means adopted by the present invention to achieve the above objects, and their effects, are described below through preferred embodiments with reference to the drawings.
Please refer to FIG. 1, FIG. 2, FIG. 3, and FIG. 4 together. FIG. 1 is a schematic diagram of the calibration of the robotic arm and the 3D camera of the present invention, FIG. 2 is a point cloud of the calibration device of the present invention, FIG. 3 is a schematic diagram of identifying the point cloud groups that form three planes, and FIG. 4 is a schematic diagram of the calibration positioning point formed at the intersection of the three plane equations. In FIG. 1, one end of the robotic arm 10 of the present invention is a fixed base 11, from which a series of joints 12 extends to a movable distal end 13 at the other end; a tool 14 is mounted on the distal end 13, and each joint 12 is provided with an actuating motor 15 connected to a control device 16. Through the control device 16, the robotic arm 10 commands the rotation angle of the actuating motor 15 of each joint 12 and thereby moves the tool 14 on the distal end 13. Using the fixed base 11 as its reference point, the robotic arm 10 defines the arm coordinate system R. From the known lengths of the joints 12 and of the tool 14 at the distal end 13, together with the commanded rotation angles of the actuating motors 15 of the joints 12, the control device 16 computes the position of the tool 14 and locates it in the arm coordinate system R, so that the tool 14 can be moved precisely.
In the present invention, a 3D camera 17 is additionally installed in the working environment of the robotic arm 10. The space captured through the viewing window of the 3D camera 17 forms its own camera coordinate system C, and the information captured by the 3D camera 17 is transmitted to the control device 16 for processing. However, the positional relationship of the camera coordinate system C of the 3D camera 17 relative to the arm coordinate system R of the robotic arm 10 is unknown, so a transformation between the coordinates of the camera coordinate system C and those of the arm coordinate system R must be calibrated before the operations of the 3D camera 17 and the robotic arm 10 can be coordinated.
When the robotic arm 10 and the 3D camera 17 of the present invention are calibrated, the calibration device 20 is first placed in the working environment of the robotic arm 10 so that the calibration device 20 appears within the field of view of the 3D camera 17. In the calibration device 20 of the present invention, three flat plates 22, 23, 24, fixed in relative position, separated from one another, and mutually non-parallel, are mounted on a base frame 21, and the three spatial planes extended from the plates 22, 23, 24 intersect at a single point, which serves as the positioning point for extrinsic parameter calibration.
Because the positioning point at which the three spatial planes extended from the plates 22, 23, 24 of the calibration device 20 intersect is virtual, it must be obtained by computation. First, the 3D camera 17 captures the 3D information of the calibration device 20, yielding the point cloud of the calibration device 20 shown in FIG. 2. Although this embodiment takes a point cloud as its example, it is not limited to point clouds; a depth map is equally applicable. To identify the positions of the three plates 22, 23, 24 of the calibration device 20 in the point cloud, the present invention takes the coordinates (X, Y, Z) of each point captured by the 3D camera 17 in the camera coordinate system C and computes, for each pair of adjacent points, the Z-value difference and the vector angle Θ, where the Z-value difference is the difference between the Z-axis coordinates of adjacent points in the camera coordinate system C, and the vector angle Θ is the angle between the position vector V formed by adjacent points and the horizontal or vertical axis of the 3D camera 17.
Taking the point cloud of the plate 22 as an example, FIG. 3 shows the point cloud of the plate 22 from the side. For adjacent points Pn-1, Pn, Pn+1 on the main plane 25 of the plate 22, because the point cloud resolution set by the 3D camera 17 is constant and the slope of the main plane 25 of the plate 22 is uniform, the Z-axis coordinate difference ΔZ0 in the camera coordinate system C between Pn-1 and Pn, or between Pn and Pn+1, is the same; hence the Z-value differences between adjacent points on the same main plane 25 of the plate 22 are all equal. Likewise, because the point cloud resolution and the slope of the main plane 25 are constant, the position vectors V0 formed by Pn-1 and Pn, or by Pn and Pn+1, on the main plane 25 have the same direction and magnitude, and the angle Θ0 between the position vector V0 and the horizontal or vertical axis of the 3D camera 17 is also the same; hence the vector angles Θ between adjacent points on the same main plane 25 of the plate 22 are all equal.
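To make this property usable in practice, the sketch below quantizes the Z-value difference and vector angle between horizontally adjacent points and collects points with equal values into candidate plane groups, anticipating the grouping described in the next paragraph. It is only a minimal sketch under assumptions of mine: the function name `group_points_by_plane`, the tolerances `z_tol` and `angle_tol`, and the organized H×W×3 cloud layout are not specified by the patent.

```python
import numpy as np

def group_points_by_plane(cloud, z_tol=0.5e-3, angle_tol=np.deg2rad(1.0)):
    """Group an organized point cloud (H x W x 3, camera frame, metres) by the
    Z-value difference and vector angle between horizontally adjacent points.
    Points whose local (dZ, angle) fall in the same bin are treated as lying
    on the same plane; real data needs tolerances rather than exact equality."""
    h, w, _ = cloud.shape
    p0 = cloud[:, :-1, :]          # each point
    p1 = cloud[:, 1:, :]           # its right-hand neighbour
    v = p1 - p0                    # position vector V between neighbours
    dz = v[..., 2]                 # Z-value difference between neighbours
    # angle between V and the camera's horizontal (X) axis
    theta = np.arccos(np.clip(v[..., 0] / (np.linalg.norm(v, axis=-1) + 1e-12), -1.0, 1.0))
    # quantize (dZ, theta) so that equal values land in the same bin
    keys = np.stack([np.round(dz / z_tol), np.round(theta / angle_tol)], axis=-1)
    groups = {}
    for i in range(h):
        for j in range(w - 1):
            if not (np.isfinite(cloud[i, j]).all() and np.isfinite(keys[i, j]).all()):
                continue           # skip invalid depth returns
            k = (int(keys[i, j, 0]), int(keys[i, j, 1]))
            groups.setdefault(k, []).append(cloud[i, j])
    # largest groups first; the three biggest should correspond to the three plates
    return sorted((np.asarray(g) for g in groups.values()), key=len, reverse=True)
```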
As for adjacent points Pk-1, Pk on the edge plane 26 of the plate 22, because the slope of the edge plane 26 differs from that of the main plane 25, their Z-value difference ΔZ1 and the vector angle Θ1 of their position vector V1 both differ from the Z-value difference ΔZ0 and vector angle Θ0 of the main plane 25. It is therefore easy to divide the points into groups of the same kind by equal Z-value difference and vector angle and, as shown in FIG. 4, to exclude the smaller groups and retain the point cloud group of the main plane 25. In the same way, the point cloud groups of the main planes 27, 28 of the other plates 23, 24 can be distinguished and retained.

The detailed steps of a method for calibrating the tool center point of a robotic arm are as follows: in step S1, a tool is first mounted on the robotic arm; in step S2, the robotic arm moves the tool to an arbitrary posture, and the force sensor of the robotic arm detects and records the tool gravity and the tool torque; in step S3, the tool center point of the tool is brought into contact; in step S4, the force sensor detects the change in force, and when it is determined that the tool center point has been touched, the coordinates of the movable end are recorded together with the contact force and contact torque; in step S5, the net contact force is computed by subtracting the tool gravity from the recorded contact force, and the net contact torque by subtracting the tool torque from the recorded contact torque; in step S6, the net contact torque is divided by the net contact force to compute the net moment arm; in step S7, the coordinates of the tool center point are computed from the net moment arm and the coordinates of the movable end, calibrating the tool center point.
Next, the point cloud groups of the main planes 25, 27, 28 of the plates 22, 23, 24 are used to construct the plane equation of each main plane 25, 27, 28 by the well-known least squares method. Each plane equation is first written as z = Ax + By + C.
According to the least squares method, the coefficients A, B, and C of each main plane are the solution of the normal equations over the n points $(x_i, y_i, z_i)$ of its point cloud group:

$$\begin{bmatrix} \sum x_i^2 & \sum x_i y_i & \sum x_i \\ \sum x_i y_i & \sum y_i^2 & \sum y_i \\ \sum x_i & \sum y_i & n \end{bmatrix}\begin{bmatrix} A \\ B \\ C \end{bmatrix}=\begin{bmatrix} \sum x_i z_i \\ \sum y_i z_i \\ \sum z_i \end{bmatrix}$$
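As a minimal sketch of this least-squares fit (the helper name `fit_plane` is an assumption of mine, not the patent's), the same normal equations can be solved directly with `numpy.linalg.lstsq`:

```python
import numpy as np

def fit_plane(points):
    """Fit z = A*x + B*y + C to an (N, 3) point cloud group by least squares
    and return the coefficients (a, b, c, d) of a*x + b*y + c*z + d = 0."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    m = np.column_stack([x, y, np.ones_like(x)])       # design matrix rows [x_i, y_i, 1]
    (A, B, C), *_ = np.linalg.lstsq(m, z, rcond=None)  # least-squares A, B, C
    # rewrite z = A*x + B*y + C as A*x + B*y - z + C = 0
    return A, B, -1.0, C
```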
The plane equations of the three main planes 25, 27, 28 obtained above are rewritten as the following three plane equations:

$$a_1 x + b_1 y + c_1 z + d_1 = 0$$
$$a_2 x + b_2 y + c_2 z + d_2 = 0$$
$$a_3 x + b_3 y + c_3 z + d_3 = 0$$
The intersection point T of the three main planes 25, 27, 28 is then solved with the well-known three-plane intersection formula:

$$x = -\frac{1}{\mathrm{Det}}\begin{vmatrix} d_1 & b_1 & c_1 \\ d_2 & b_2 & c_2 \\ d_3 & b_3 & c_3 \end{vmatrix},\qquad
y = -\frac{1}{\mathrm{Det}}\begin{vmatrix} a_1 & d_1 & c_1 \\ a_2 & d_2 & c_2 \\ a_3 & d_3 & c_3 \end{vmatrix},\qquad
z = -\frac{1}{\mathrm{Det}}\begin{vmatrix} a_1 & b_1 & d_1 \\ a_2 & b_2 & d_2 \\ a_3 & b_3 & d_3 \end{vmatrix}$$
where

$$\mathrm{Det}=\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}$$

and Det ≠ 0.
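The intersection step can be sketched the same way (again, `intersect_three_planes` is an assumed name, not the patent's); it evaluates the determinant formula above, which is Cramer's rule for the 3×3 linear system:

```python
import numpy as np

def intersect_three_planes(p1, p2, p3):
    """Intersect three planes given as (a, b, c, d) with a*x + b*y + c*z + d = 0.
    Returns T = (x, y, z); raises if Det == 0 (planes not in general position)."""
    abc = np.array([p1[:3], p2[:3], p3[:3]], dtype=float)  # coefficient matrix
    d = np.array([p1[3], p2[3], p3[3]], dtype=float)
    det = np.linalg.det(abc)                               # the Det of the formula above
    if abs(det) < 1e-12:
        raise ValueError("Det is zero: the three planes do not meet at a single point")
    t = np.empty(3)
    for k in range(3):                                     # Cramer's rule: replace one
        m = abc.copy()                                     # column at a time with -d
        m[:, k] = -d
        t[k] = np.linalg.det(m) / det
    return t
```

Calling `np.linalg.solve` on the coefficient matrix and -d would return the same point; the explicit determinants are kept only to mirror the formula in the text.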
The present invention thus uses the 3D information of best accuracy, from the central region of the 3D camera 17, to compute the coordinates of the intersection point T of the three main planes 25, 27, 28 in the camera coordinate system C, which serves as the positioning point for extrinsic parameter calibration of the robotic arm 10. After the positioning point has been established, one calibration procedure using it is given by way of example, though the invention includes and is not limited to this example. FIG. 6 is a schematic diagram of calibration with the calibration device of the present invention fixed on the robotic arm. The base frame 21 of the calibration device 20 can be fixed on the distal end 13 of the robotic arm 10 so that the calibration device 20 and the robotic arm 10 maintain a known, fixed relative position, which determines the coordinates of the positioning point formed by the calibration device 20 in the arm coordinate system R. The robotic arm 10 is then controlled to move the calibration device 20 to several points within the field of view of the 3D camera 17, and the 3D camera 17, using the positioning-point method described above, obtains the coordinates of each positioning point in the camera coordinate system C one by one, thereby achieving calibration of the coordinate system transformation between the 3D camera 17 and the robotic arm 10.
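The patent leaves open how the collected pairs of coordinates are turned into the actual transformation between the camera coordinate system C and the arm coordinate system R. One common choice, shown below purely as an assumption (the function name `rigid_transform` and the SVD-based Kabsch solution are not taken from the patent), is to fit a rigid rotation and translation to the corresponding positioning-point coordinates:

```python
import numpy as np

def rigid_transform(points_cam, points_arm):
    """Estimate rotation `rot` and translation `t` such that
    points_arm ≈ rot @ points_cam + t, from N >= 3 non-collinear
    corresponding positioning points (Kabsch / SVD method)."""
    pc = np.asarray(points_cam, dtype=float)    # (N, 3) coordinates in camera system C
    pa = np.asarray(points_arm, dtype=float)    # (N, 3) same points in arm system R
    cc, ca = pc.mean(axis=0), pa.mean(axis=0)   # centroids
    h = (pc - cc).T @ (pa - ca)                 # 3x3 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    rot = vt.T @ u.T
    if np.linalg.det(rot) < 0:                  # guard against a reflection
        vt[-1, :] *= -1
        rot = vt.T @ u.T
    t = ca - rot @ cc
    return rot, t
```

With the rotation and translation estimated, a point p measured in the camera coordinate system C maps to `rot @ p + t` in the arm coordinate system R, which is the coordinate transformation the calibration seeks.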
FIG. 5 is a flowchart of the method of the present invention for calibrating the coordinate systems of the 3D camera and the robotic arm. The detailed steps are as follows: in step M1, the three-plate calibration device is placed in the working environment of the robotic arm and within the field of view of the 3D camera; in step M2, the 3D camera captures the 3D information of the calibration device; in step M3, the Z-value difference and vector angle between each pair of adjacent points are computed from the point cloud in the 3D information; in step M4, the points are grouped according to equal Z-value differences and vector angles; in step M5, the smaller point cloud groups are excluded, leaving three main-plane point cloud groups; in step M6, three plane equations are built by the least squares method; in step M7, the intersection of the three plane equations is computed with the three-plane intersection formula; in step M8, the intersection is used as the positioning point for extrinsic parameter calibration.
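Putting steps M1 to M8 together, the helpers sketched above (`group_points_by_plane`, `fit_plane`, `intersect_three_planes`, all assumed names rather than the patent's code) can be composed into one routine that turns a captured point cloud into the positioning point in the camera coordinate system C:

```python
def locate_positioning_point(cloud):
    """Steps M3-M8: from an organized point cloud of the three-plate calibration
    device, return the intersection point T in the camera coordinate system C."""
    groups = group_points_by_plane(cloud)        # M3-M4: group by dZ and vector angle
    planes = [fit_plane(g) for g in groups[:3]]  # M5-M6: three largest groups -> plane equations
    return intersect_three_planes(*planes)       # M7-M8: their intersection is the positioning point
```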
In the device and method of the present invention for calibrating the coordinates of a 3D camera and a robotic arm, therefore, three flat plates fixed in relative position, separated from one another, and mutually non-parallel are provided on the calibration device, with the three planes extended from the plates intersecting at a single point. From the point cloud 3D information captured by the 3D camera, the points are grouped and the smaller groups excluded according to equal Z-value differences and vector angles between adjacent points, forming three main-plane point cloud groups; three plane equations are then constructed from these groups by the least squares method, and their intersection is computed by formula and used as the calibration positioning point. Because the present invention uses the 3D information of best accuracy from the 3D camera to compute the positioning point directly, measurement errors and the cost of measuring instruments are avoided, as are deformation errors of selected edge shape features, achieving the object of improving the accuracy of the positioning point.
The above description is provided merely for convenience in explaining preferred embodiments of the present invention. The scope of the present invention is not limited to these preferred embodiments; any modification made in accordance with the present invention without departing from its spirit falls within the scope of the patent application of the present invention.
Claims (9)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW108110367A TWI706841B (en) | 2019-03-22 | 2019-03-22 | Device and method for calibrating coordinate of 3d camera and robot arm |
CN202010117159.7A CN111716340B (en) | 2019-03-22 | 2020-02-25 | Correcting device and method for coordinate system of 3D camera and mechanical arm |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW108110367A TWI706841B (en) | 2019-03-22 | 2019-03-22 | Device and method for calibrating coordinate of 3d camera and robot arm |
Publications (2)
Publication Number | Publication Date |
---|---|
TW202035084A (en) | 2020-10-01 |
TWI706841B (en) | 2020-10-11 |
Family
ID=72564064
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW108110367A TWI706841B (en) | 2019-03-22 | 2019-03-22 | Device and method for calibrating coordinate of 3d camera and robot arm |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111716340B (en) |
TW (1) | TWI706841B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI818715B (en) * | 2022-09-06 | 2023-10-11 | 正崴精密工業股份有限公司 | A method for visual inspection of curved objects |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5365379B2 (en) * | 2009-07-06 | 2013-12-11 | 富士電機株式会社 | Robot system and robot system calibration method |
US9393694B2 (en) * | 2010-05-14 | 2016-07-19 | Cognex Corporation | System and method for robust calibration between a machine vision system and a robot |
TWI408037B (en) * | 2010-12-03 | 2013-09-11 | Ind Tech Res Inst | A position method and a calibrating method for the robot arm |
US20130329012A1 (en) * | 2012-06-07 | 2013-12-12 | Liberty Reach Inc. | 3-d imaging and processing system including at least one 3-d or depth sensor which is continually calibrated during use |
CN103337066B (en) * | 2013-05-27 | 2016-05-18 | 清华大学 | 3D obtains the calibration steps of system |
CN104143194B (en) * | 2014-08-20 | 2017-09-08 | 清华大学 | A kind of point cloud segmentation method and device |
CN104463851B (en) * | 2014-11-19 | 2018-05-22 | 哈尔滨工业大学深圳研究生院 | A kind of sole edge line automatic tracking method based on robot |
CN105045263B (en) * | 2015-07-06 | 2016-05-18 | 杭州南江机器人股份有限公司 | A kind of robot method for self-locating based on Kinect depth camera |
US9965870B2 (en) * | 2016-03-29 | 2018-05-08 | Institut National D'optique | Camera calibration method using a calibration target |
TW201739587A (en) * | 2016-05-04 | 2017-11-16 | 廣明光電股份有限公司 | Calibration device and control method for a robot arm |
US10380767B2 (en) * | 2016-08-01 | 2019-08-13 | Cognex Corporation | System and method for automatic selection of 3D alignment algorithms in a vision system |
CN107945198B (en) * | 2016-10-13 | 2021-02-23 | 北京百度网讯科技有限公司 | Method and device for marking point cloud data |
WO2018182538A1 (en) * | 2017-03-31 | 2018-10-04 | Agency For Science, Technology And Research | Systems and methods that improve alignment of a robotic arm to an object |
CN107590836B (en) * | 2017-09-14 | 2020-05-22 | 斯坦德机器人(深圳)有限公司 | Kinect-based charging pile dynamic identification and positioning method and system |
CN107862716A (en) * | 2017-11-29 | 2018-03-30 | 合肥泰禾光电科技股份有限公司 | Mechanical arm localization method and positioning mechanical arm |
CN108346165B (en) * | 2018-01-30 | 2020-10-30 | 深圳市易尚展示股份有限公司 | Robot and three-dimensional sensing assembly combined calibration method and device |
CN108399639B (en) * | 2018-02-12 | 2021-01-26 | 杭州蓝芯科技有限公司 | Rapid automatic grabbing and placing method based on deep learning |
CN108629315B (en) * | 2018-05-07 | 2020-09-25 | 河海大学 | Multi-plane identification method for three-dimensional point cloud |
CN108830191B (en) * | 2018-05-30 | 2022-04-01 | 上海电力学院 | Mobile robot SLAM method based on improved environment measurement module EMM and ORB algorithm |
CN109202912B (en) * | 2018-11-15 | 2020-09-11 | 太原理工大学 | Method for registering target contour point cloud based on monocular depth sensor and mechanical arm |
CN109323656B (en) * | 2018-11-24 | 2024-06-21 | 上海勘察设计研究院(集团)股份有限公司 | Novel target for point cloud registration and extraction algorithm thereof |
- 2019-03-22: TW application TW108110367A filed (granted as TWI706841B, active)
- 2020-02-25: CN application CN202010117159.7A filed (granted as CN111716340B, active)
Also Published As
Publication number | Publication date |
---|---|
TWI706841B (en) | 2020-10-11 |
CN111716340B (en) | 2022-10-14 |
CN111716340A (en) | 2020-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108453701B (en) | Method for controlling robot, method for teaching robot, and robot system | |
JP6280525B2 (en) | System and method for runtime determination of camera miscalibration | |
JP6685199B2 (en) | System and method for combining machine vision coordinate spaces in a guided assembly environment | |
CN103302666B (en) | Messaging device and information processing method | |
JP2020116734A (en) | System and method for automatic hand-eye calibration of vision system for robot motion | |
US8520067B2 (en) | Method for calibrating a measuring system | |
US11548156B2 (en) | Device and method for calibrating coordinate system of 3D camera and robotic arm | |
CN110136208A (en) | A kind of the joint automatic calibration method and device of Visual Servoing System | |
JP7189988B2 (en) | System and method for three-dimensional calibration of vision systems | |
CN111612794A (en) | Multi-2D vision-based high-precision three-dimensional pose estimation method and system for parts | |
CN109272555B (en) | External parameter obtaining and calibrating method for RGB-D camera | |
CN108326850A (en) | A kind of accurate mobile mechanical arm of robot reaches the method and system of designated position | |
CN112258583B (en) | Distortion calibration method for close-range image based on equal distortion partition | |
US20230083150A1 (en) | Scanning system and calibration thereof | |
TW202212081A (en) | Calibration apparatus and calibration method for coordinate system of robotic arm | |
CN114001651B (en) | Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data | |
CN114310868A (en) | Coordinate system correction device and method for robot arm | |
TWI706841B (en) | Device and method for calibrating coordinate of 3d camera and robot arm | |
JP6807450B2 (en) | Articulated robot parallelism determination method and articulated robot tilt adjustment device | |
US11418771B1 (en) | Method for calibrating 3D camera by employing calibrated 2D camera | |
TWI749376B (en) | Method for calibrating 3d camera | |
JP7414850B2 (en) | robot system | |
JP6965422B2 (en) | Camera parallelism judgment method | |
KR101626374B1 (en) | Precision position alignment technique using edge based corner estimation | |
TW201900358A (en) | Method for calibrating coordinator of robot arm |