Laser Scanning and Parametrization of Weld Grooves with Reflective Surfaces
Figure 1. A commercial laser triangulation sensor (1) pointing at the weld groove of a test object (2). A local sensor-fixed frame is denoted Frame 0, and all data points p_j are obtained in the coordinates of Frame 0.
Figure 2. Three test objects used in the tests. Objects (a) and (b) represent a typical stub-joint groove, while object (c) represents a typical butt-joint groove. The objects are welded with root welds of different widths. A schematic arrangement of the plates and the weld root pass in the test objects is shown to the right.
Figure 3. Two examples of point sets S_i and S_j. Each point set has a center of mass (m_i or m_j) and a line fit (L_i or L_j). The point sets are marked with different colors, and the line fits are shown in the same colors.
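As a concrete illustration of the center of mass m_i and the line fit L_i used above, the following is a minimal NumPy sketch based on a total-least-squares (SVD) fit; the function name is chosen here for illustration and is not from the paper.

```python
import numpy as np

def centroid_and_line_fit(S):
    """Return the center of mass m and the unit direction v of the
    total-least-squares line fit L for a set of 2D points S (N x 2)."""
    m = S.mean(axis=0)                          # center of mass of the point set
    # The principal direction of the centered points gives the line direction.
    _, _, Vt = np.linalg.svd(S - m, full_matrices=False)
    v = Vt[0]                                   # unit vector along the fitted line
    return m, v

# Example: a small point set S_i as in Figure 3
S_i = np.array([[0.0, 0.00], [1.0, 0.10], [2.0, -0.10], [3.0, 0.05]])
m_i, v_i = centroid_and_line_fit(S_i)
```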
Figure 4. A line segment i is shown in blue, with three different cases of point-to-segment distances d_ji for j = 1, 2, 3.
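The point-to-segment distance d_ji covering the three cases in the figure (projection before the start point, inside the segment, or beyond the end point) can be sketched as below; this is standard geometry, not the authors' implementation.

```python
import numpy as np

def point_to_segment_distance(p, p_s, p_e):
    """Distance from point p to the segment with start p_s and end p_e (2D).
    Handles the three cases of Figure 4 by clamping the projection parameter."""
    v = p_e - p_s
    t = np.dot(p - p_s, v) / np.dot(v, v)    # parameter of the orthogonal projection
    t = np.clip(t, 0.0, 1.0)                 # clamp to the segment endpoints
    closest = p_s + t * v                    # closest point on the segment
    return np.linalg.norm(p - closest)
```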
Figure 5. Inliers for a line segment with the inlier tolerance d̄_in. The gap between points along the x_0 axis is denoted d_g,in.
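A minimal sketch of inlier selection with a distance tolerance and a gap check along the x_0 axis is given below; the thresholds and function names are placeholders, and the exact truncation rule at a gap is an assumption on our part.

```python
import numpy as np

def select_inliers(points, p_s, p_e, d_in_tol, d_gap_tol):
    """Inliers of a candidate line through p_s and p_e: points within the
    inlier tolerance d_in_tol, truncated at the first gap along the x_0 axis
    that exceeds d_gap_tol (cf. Figures 5 and 7). points is an N x 2 array."""
    v = (p_e - p_s) / np.linalg.norm(p_e - p_s)
    n = np.array([-v[1], v[0]])                   # unit normal of the candidate line
    d = np.abs((points - p_s) @ n)                # perpendicular distances to the line
    inliers = points[d <= d_in_tol]
    inliers = inliers[np.argsort(inliers[:, 0])]  # order the inliers along the x_0 axis
    gaps = np.diff(inliers[:, 0])
    big = np.flatnonzero(gaps > d_gap_tol)        # indices of gaps larger than the tolerance
    return inliers if big.size == 0 else inliers[: big[0] + 1]
```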
Figure 6. Two models for groove profiles: (a) a T-joint (i.e., stub-joint) groove and (b) a butt-joint groove (a V-groove) with a root weld. A stub groove and a butt groove consist of five or six corner points, respectively, which can also be represented as four or five sequentially connected line segments. The vectors v_i show the directions of the segments.
Figure 7. Procedure for finding the first two line segments using RANSAC. Inliers of segment 1 are shown in blue. These inliers, excluding those separated by a gap d_g,in > d̄_g,in, are used for defining the segment. After segment 1 is defined, the best inlier set for segment 2 is found (shown in green) and segment 2 is defined.
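A single RANSAC step for one segment might look like the sketch below: a generic two-point-sample RANSAC over 2D points, not necessarily the exact variant used by the authors. The sequential procedure repeats it on the points remaining after the inliers of segment 1 are removed.

```python
import numpy as np

def ransac_segment(points, d_in_tol, n_iter=200, rng=None):
    """Find the candidate line supported by the largest inlier set using a
    generic two-point-sample RANSAC over 2D points (N x 2)."""
    rng = np.random.default_rng() if rng is None else rng
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p_s, p_e = points[i], points[j]
        if np.allclose(p_s, p_e):
            continue                              # degenerate sample, try again
        v = (p_e - p_s) / np.linalg.norm(p_e - p_s)
        n = np.array([-v[1], v[0]])               # unit normal of the candidate line
        d = np.abs((points - p_s) @ n)            # distances of all points to the line
        inliers = points[d <= d_in_tol]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers
```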
Figure 8. (a) Segments 1 and 2 are found. The segments are divided since the gap d_g,in > d̄_g,in. (b) If the angle between v_1 and v_2 is less than the defined angle tolerance ᾱ_12, the two segments are merged.
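The merge test reduces to comparing the angle between the two segment direction vectors with the tolerance ᾱ_12; a minimal sketch (the use of the absolute dot product, making the test independent of segment orientation, is an assumption):

```python
import numpy as np

def should_merge(v1, v2, alpha_tol):
    """Return True if segments with direction vectors v1 and v2 should be
    merged, i.e., the angle between them is below alpha_tol (radians)."""
    c = abs(np.dot(v1, v2)) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    alpha_12 = np.arccos(np.clip(c, -1.0, 1.0))   # angle between the segments
    return alpha_12 < alpha_tol
```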
<p>(<b>a</b>) Two segments 1 and 2 are found. (<b>b</b>) If the angle <math display="inline"><semantics> <mrow> <msub> <mi>α</mi> <mn>12</mn> </msub> <mo>≥</mo> <msub> <mover accent="true"> <mi>α</mi> <mo>¯</mo> </mover> <mn>12</mn> </msub> </mrow> </semantics></math>, then the start point <math display="inline"><semantics> <msub> <mi mathvariant="bold">p</mi> <mrow> <mi>s</mi> <mn>2</mn> </mrow> </msub> </semantics></math> is moved to the intersection between the segments 1 and 2.</p> "> Figure 10
Figure 10. (a) Segment i and its local Frame li are defined over the inlier set. (b) A line fit L_in,i (expressed in Frame li) is found for the inliers and the coefficient a_in,i of the line equation is evaluated. (c) Several points are removed from the inlier set and the line fit is evaluated again. (d) When a_in,i is within the tolerance, segment i is defined over the remaining inliers.
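One way to read this step is as iterative trimming in the segment's local frame: fit a line y = a x + b to the inliers expressed in Frame li and, while the slope coefficient a_in,i exceeds the tolerance, drop points from the far end and refit. The sketch below follows that reading; which points are removed in each iteration is an assumption on our part.

```python
import numpy as np

def trim_inliers_by_slope(inliers_local, a_tol, drop=3):
    """Iteratively remove points from the far end of the inlier set
    (given in the segment's local frame, N x 2) until the slope of the
    least-squares line fit is within the tolerance a_tol."""
    pts = inliers_local[np.argsort(inliers_local[:, 0])]   # order along the local x axis
    while len(pts) > 2:
        a, _ = np.polyfit(pts[:, 0], pts[:, 1], 1)          # line fit y = a*x + b
        if abs(a) <= a_tol:
            break                                            # slope within tolerance
        pts = pts[:-drop] if len(pts) > drop + 2 else pts[:-1]
    return pts
```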
Figure 11. Input data (a) and output (b) of the iterative error elimination algorithm. The point colors indicate point sets associated with different segments; the initial association is shown in (a) and the association after the final iteration in (b).
Figure 12. Alignment of segment i to the point set S_i. First the segment is translated to the mass center m_i of S_i, then it is rotated to match the direction of L_i.
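The alignment step (translate the segment to the mass center m_i of S_i, then rotate it to the direction of the line fit L_i) can be sketched as below, reusing an SVD line fit; the names are illustrative and keeping the segment length unchanged is an assumption.

```python
import numpy as np

def align_segment(p_s, p_e, S):
    """Translate segment (p_s, p_e) to the mass center of the point set S,
    then rotate it to match the direction of the line fit of S (Figure 12)."""
    m = S.mean(axis=0)                                 # mass center m_i of the point set
    _, _, Vt = np.linalg.svd(S - m, full_matrices=False)
    v = Vt[0]                                          # direction of the line fit L_i
    if np.dot(p_e - p_s, v) < 0:                       # keep the original segment orientation
        v = -v
    half = 0.5 * np.linalg.norm(p_e - p_s)             # preserve the segment length
    return m - half * v, m + half * v                  # new start and end points
```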
Figure 13. Schematic representation of the iterative corner error elimination algorithm. Two iterations are shown with the initial and final groove profile arrangements.
Figure 14. Reflections from the laser projection onto a shiny metal surface and the corresponding data noise in the graph.
Figure 15. Schematic representation of the noise detection procedure. When segment 1 is found, the algorithm checks the following point set S_j for noise (a); the subsequent point sets S_{j+1} and S_{j+2} are then checked for noise (b); segment 2 is defined once a noise-free set is found (c).
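The caption does not specify the exact noise criterion; one plausible check, stated here purely as an assumption, is the spread of residuals about a line fit of the candidate point set, which separates a clean groove face from scattered reflection noise.

```python
import numpy as np

def is_noisy(S, residual_tol):
    """Heuristic noise test for a candidate point set S (N x 2).
    Assumed criterion, not from the paper: a large RMS residual about the
    best line fit indicates reflection noise rather than a groove face."""
    m = S.mean(axis=0)
    _, _, Vt = np.linalg.svd(S - m, full_matrices=False)
    n = np.array([-Vt[0][1], Vt[0][0]])           # normal of the line fit
    rms = np.sqrt(np.mean(((S - m) @ n) ** 2))    # RMS perpendicular residual
    return rms > residual_tol
```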
Figure 16. Flow diagram of the proposed procedure for groove parametrization with noisy data points. The diagram does not include the iterative error elimination step.
Figure 17. Parametrization results with groove corner points for dataset 1. The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it. A close-up of the graph is shown to the right.
Figure 18. Parametrization results with groove corner points for dataset 2. The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it. A close-up of the graph is shown to the right.
Figure 19. Parametrization results with groove corner points for dataset 3. The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it. A close-up of the graph is shown to the right.
Figure 20. Parametrization results with groove corner points for datasets 2 (left) and 3 (right) when the correction of assigned corners step is omitted. The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it.
Figure 21. Parametrization results with groove corner points for datasets 4 (left) and 5 (right). The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it.
Figure 22. Parametrization results with groove corner points for datasets 6 (left) and 7 (right). The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it. The grey points have been identified as noise by the noise detection algorithm.
Figure 23. Parametrization results with groove corner points for datasets 8 (left) and 9 (right). The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it. The grey points have been identified as noise by the noise detection algorithm.
Figure 24. Parametrization results with groove corner points for datasets 10 (left) and 11 (right). The blue dots show the result before the iterative error elimination procedure and the red dots show the result after it. The grey points have been identified as noise by the noise detection algorithm.
Figure 25. Parametrization results with groove corner points for datasets 12 (left) and 13 (right). The red dots show the result from the proposed procedure and the black dots show the result from the conventional RANSAC. The grey points have been identified as noise by the noise detection algorithm.
Figure 26. Parametrization results with groove corner points for datasets 14 (left) and 15 (right). The red dots show the result from the proposed procedure and the black dots show the result from the conventional RANSAC. The grey points have been identified as noise by the noise detection algorithm.
Abstract
1. Introduction
2. Materials and System Description
3. Theoretical Basis of the Method
3.1. Geometrical Definitions
3.2. Detection of Weld Groove Corners from Noise-Free Data
3.2.1. Sequential RANSAC
3.2.2. Merging and Intersecting Segments
3.2.3. Correction of Assigned Corners
3.2.4. Iterative Error Elimination
3.3. Detection of Weld Groove Corners from Noisy Data
3.3.1. Noise Detection
3.3.2. Iterative Error Elimination for Noisy Data
4. Experimental Results
4.1. Results Using Noise-Free Data
4.1.1. Stub Joints
4.1.2. Butt Joints
4.2. Results Using Noisy Data
5. Discussion and Comparison to the Conventional RANSAC Algorithm
- Can be used with commercial sensors where only triangulated data points are available;
- Can be used with strong data noise caused by laser reflections;
- Can be used for grooves in both stub and butt joints;
- Does not require pre-knowledge of the weld groove geometry;
- Might be too slow for some real-time robot path generators;
- Does not account for a gap between plates;
- Is not developed to filter out arc and splash noise.
5.1. Conventional RANSAC Algorithm
5.2. Comparison of Experimental Results
6. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
References
| No. | Type | Root [mm] | Noise | No. | Type | Root [mm] | Noise | No. | Type | Root [mm] | Noise |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Stub | 0 | No | 6 | Stub | 3 | Low N1 | 11 | Stub | 0 | Medium N3 |
| 2 | Stub | 11 | No | 7 | Stub | 3 | High N1 | 12 | Butt | 7 | Medium N1 |
| 3 | Stub | 3 | No | 8 | Stub | 3 | Medium N2 | 13 | Butt | 7 | Medium N2 |
| 4 | Butt | 0 | No | 9 | Stub | 3 | High N2 | 14 | Butt | 7 | Medium N3 |
| 5 | Butt | 7 | No | 10 | Stub | 0 | Low N3 | 15 | Butt | 7 | Low N3 |
| Dataset | 1 | | | 2 with IEES | | | 2 without IEES | | | 3 | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Max | Min | Mean | Max | Min | Mean | Max | Min | Mean | Max | Min | Mean |
| Time, [s] | 0.076 | 0.043 | 0.061 | 0.125 | 0.056 | 0.092 | - | - | - | 0.145 | 0.058 | 0.126 |
| , [mm] | 0.034 | 0.034 | 0.034 | 0.064 | 0.061 | 0.062 | 0.062 | 0.062 | 0.062 | 0.056 | 0.056 | 0.056 |
| , [mm] | 0.070 | 0.070 | 0.070 | 0.114 | 0.095 | 0.109 | 1.058 | 0.040 | 0.192 | 0.041 | 0.041 | 0.041 |
| , [mm] | 0.050 | 0.050 | 0.050 | 0.123 | 0.115 | 0.118 | 1.152 | 0.122 | 0.740 | 0.095 | 0.095 | 0.095 |
| , [mm] | - | - | - | 0.046 | 0.044 | 0.044 | 0.659 | 0.400 | 0.618 | 0.074 | 0.074 | 0.074 |
| Dataset | 4 | | | 5 | | |
|---|---|---|---|---|---|---|
| | Max | Min | Mean | Max | Min | Mean |
| Time, [s] | 0.138 | 0.062 | 0.091 | 0.233 | 0.093 | 0.142 |
| , [mm] | 0.030 | 0.028 | 0.030 | 0.062 | 0.030 | 0.046 |
| , [mm] | 0.074 | 0.074 | 0.074 | 0.077 | 0.026 | 0.052 |
| , [mm] | 0.036 | 0.035 | 0.036 | 0.161 | 0.160 | 0.160 |
| , [mm] | 0.032 | 0.032 | 0.032 | 0.042 | 0.041 | 0.041 |
| , [mm] | - | - | - | 0.019 | 0.019 | 0.019 |
| Dataset | 6 | | 7 | | 8 | | 9 | | 10 | | 11 | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Max | Mean | Max | Mean | Max | Mean | Max | Mean | Max | Mean | Max | Mean |
| Time, [s] | 0.133 | 0.126 | 0.262 | 0.139 | 0.089 | 0.061 | 0.133 | 0.108 | 0.162 | 0.125 | 0.066 | 0.046 |
| , [mm] | 0.258 | 0.258 | 0.178 | 0.115 | 0.086 | 0.086 | 0.106 | 0.105 | 0.077 | 0.077 | 0.070 | 0.070 |
| , [mm] | 0.144 | 0.141 | 0.040 | 0.040 | 0.078 | 0.076 | 0.050 | 0.047 | 0.122 | 0.120 | 0.086 | 0.086 |
| , [mm] | 0.229 | 0.220 | 0.108 | 0.110 | 0.062 | 0.044 | 0.160 | 0.160 | 0.102 | 0.102 | 0.120 | 0.120 |
| , [mm] | 0.077 | 0.077 | 0.039 | 0.039 | 0.049 | 0.048 | 0.157 | 0.157 | - | - | - | - |
Mean Values | Proposed | Conventional |
---|---|---|
Time, [s] | 0.142 | 1.886 |
, [mm] | 0.046 | 0.037 |
, [mm] | 0.052 | 0.062 |
, [mm] | 0.160 | 0.164 |
, [mm] | 0.041 | 0.042 |
, [mm] | 0.019 | 0.036 |
| Dataset | 12 | | 13 | | 14 | | 15 | |
|---|---|---|---|---|---|---|---|---|
| Mean Values | Prop. | Conv. | Prop. | Conv. | Prop. | Conv. | Prop. | Conv. |
| Time, [s] | 0.095 | 1.858 | 0.082 | 1.258 | 0.087 | 1.205 | 0.087 | 1.665 |
| , [mm] | 0.106 | 0.155 | 0.077 | 0.068 | 0.025 | 0.080 | 0.082 | 0.146 |
| , [mm] | 0.105 | 0.092 | 0.133 | 0.057 | 0.060 | 0.095 | 0.154 | 0.256 |
| , [mm] | 0.151 | 0.241 | 0.061 | 0.072 | 0.152 | 0.153 | 0.135 | 0.277 |
| , [mm] | 0.078 | 0.100 | 0.030 | 0.117 | 0.044 | 0.078 | 0.303 | 0.151 |
| , [mm] | 0.035 | 0.115 | 0.069 | 0.394 | 0.021 | 0.060 | 0.090 | 0.103 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).