
CN103727930B - Laser range finder and camera relative pose calibration method based on edge matching - Google Patents

Laser range finder and camera relative pose calibration method based on edge matching

Info

Publication number
CN103727930B
CN103727930B
Authority
CN
China
Prior art keywords
point set
range finder
laser range
edge
camera
Prior art date
Legal status
Active
Application number
CN201310742582.6A
Other languages
Chinese (zh)
Other versions
CN103727930A (en)
Inventor
熊蓉
李千山
Current Assignee
Which Hangzhou Science And Technology Co Ltd
Original Assignee
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201310742582.6A
Publication of CN103727930A
Application granted
Publication of CN103727930B

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/04 Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a laser range finder and camera relative pose calibration method based on edge matching. The method first extracts the edge contour of the laser range finder point cloud data and the edge contour of the camera image and establishes the probability distribution of the point cloud edges and the probability distribution of the image edges; it then minimizes the KL distance between the two distributions to obtain the relative pose parameters of the laser range finder and camera. The method does not depend on a specific environment structure or on auxiliary items such as a calibration board; it can run online, updating the relative pose of the laser range finder and camera in real time; and the extracted point cloud edge contours and image edge contours can be further used for other applications such as environmental object recognition and localization.

Description

Laser range finder and camera relative pose calibration method based on edge matching
Technical field
The present invention relates to the field of multi-sensor information fusion, and in particular to a laser range finder and camera relative pose calibration method based on edge matching.
Background art:
Traditional laser range finder and camera relative pose calibration methods are generally implemented with a calibration board: the corner points of the calibration board are identified visually, constraints are established that these corner points lie in the plane of the calibration board in space, and an error function is minimized to obtain the rotation and translation matrices (R, T) of the relative pose.
Some methods do not require a calibration board, but they still require special environment structures to establish geometric constraints.
All of these methods depend on specific objects, require special preparation, and are ill-suited to online operation. The present invention therefore proposes a laser range finder and camera relative pose calibration method based on edge matching: the method extracts edge lines representing environment edge contours from the laser data and from the image respectively, and computes the relative pose of the laser range finder and camera by minimizing the symmetric KL distance between the edge line distributions.
Summary of the invention:
The object of the invention is to overcome the deficiencies of the prior art and provide a laser range finder and camera relative pose calibration method based on edge matching.
The steps of the laser range finder and camera relative pose calibration method based on edge matching are as follows:
1) Acquire a three-dimensional point cloud of the surrounding environment with the laser range finder while simultaneously capturing an image of the same environment with the camera;
2) Extract the edge contour of the point cloud collected by the laser range finder, obtaining a three-dimensional point set L_3 that represents the three-dimensional edges;
3) Determine the probability distribution P_{L_3} of the three-dimensional edge point set according to the performance parameters and error model of the laser range finder;
4) Extract the edge contour of the camera image, obtaining a pixel set C_2 that represents the two-dimensional edges;
5) Determine the probability distribution P_{C_2} of the two-dimensional edge pixels according to the performance parameters and error model of the camera;
6) Represent the relative pose of the laser range finder and camera by a pair of coordinate transformation matrices consisting of a rotation matrix R and a translation matrix T, and project the three-dimensional edge point set L_3 into the camera coordinate system, obtaining a two-dimensional edge point set L_2;
7) Determine the probability distribution P_{L_2} of the two-dimensional edge point set L_2 from the probability distribution P_{L_3} of the three-dimensional edge point set and the projection relation;
8) Compute the symmetric KL distance KL_{LC} between the probability distribution P_{L_2} of the two-dimensional edge point set L_2 and the probability distribution P_{C_2} of the two-dimensional edge pixels; with R and T as parameters and the minimization of KL_{LC} as the optimization objective, obtain the optimal laser range finder to camera relative pose transformation matrices (R, T).
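For orientation, the following is a minimal sketch of how these eight steps compose, assuming Python with NumPy; every helper named here is a hypothetical stand-in, and each is fleshed out in the sketches that accompany the detailed description below.

```python
import numpy as np

def calibrate_laser_camera(cloud, image_edge_density, grid, x0=np.zeros(6)):
    """Compose steps 1)-8); the helpers are sketched later in this document."""
    # Step 2: extract the 3D edge contour L_3 of the laser point cloud.
    L3 = cloud[detect_edge_points(cloud)]
    # Step 3: per-point covariances Cov_q from the laser error model
    # (an isotropic placeholder is assumed here purely for illustration).
    covs_q = [1e-4 * np.eye(3) for _ in L3]
    # Steps 4-5: image_edge_density is P_C2, built from Canny edge pixels.
    # Steps 6-8: search for (R, T) minimizing the symmetric KL distance.
    return calibrate(L3, covs_q, image_edge_density, grid, x0)
```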
Step 2) is: a) for an unordered point cloud, for each point search all of its nearest neighbors within a radius r, up to at most n points, obtaining a point set N; fit a plane G to N; taking the projected positions of the points of N in plane G as the independent variables and the distances of the points of N to plane G as the function values, fit a binary quadratic function f and obtain its Hessian matrix H; compute the eigenvalues λ_1 and λ_2 of H, where λ_1 > λ_2; if λ_1 > thresh_1 and λ_2 < thresh_2, thresh_1 and thresh_2 being the respective thresholds, the center point of the current search range is considered an edge point of the unordered point cloud; b) for an ordered point cloud, i.e. a depth map, extract the edge point set with the Canny algorithm.
The method for fitting plane G to point set N is: compute the mean of point set N to obtain the center c_G of plane G; compute the eigenvectors of N^T N, the eigenvector corresponding to the smallest eigenvalue being the normal vector n_G of plane G; the center c_G and normal vector n_G then represent the plane through c_G with normal n_G.
The method for fitting the binary quadratic function f and obtaining the Hessian matrix H is: for each point q in point set N, let α_G and β_G be the eigenvectors of N^T N corresponding to the other two eigenvalues, and compute a key-value pair with x, y as the independent variables and f_N(x, y) as the value:

x = (q - ((q - c_G)^T n_G) n_G)^T α_G
y = (q - ((q - c_G)^T n_G) n_G)^T β_G
f_N(x, y) = (q - c_G)^T n_G

This finally forms a set of key-value mappings from (x, y) to f_N(x, y), from which the Hessian matrix H is obtained by least squares:

H = argmin_H Σ_{(x, y, f_N(x, y))} ((x, y) H (x, y)^T - f_N(x, y))²

so that f is expressed as

f = (x, y) H (x, y)^T
The probability distribution P_{L_3} of the three-dimensional edge point set in step 3) is:

P_{L_3}(x) = Σ_q X(x - q, Cov_q)

where X denotes a Gaussian distribution and Cov_q is the uncertainty covariance matrix of point q, which depends on the sensor performance parameters and error model.
The method for extracting the edge contour of the camera image in step 4) is the Canny algorithm.
The probability distribution P_{C_2} of the two-dimensional edge pixels in step 5) is:

P_{C_2}(x) = Σ_O X(x - O, Cov_O)

where X is a Gaussian distribution and Cov_O is the uncertainty covariance matrix of pixel O, which depends on the sensor performance parameters and error model.
The method in step 6) for projecting the three-dimensional edge point set L_3 into the camera coordinate system is: for a point q in the three-dimensional edge point set L_3, the corresponding projection point Q in the camera imaging plane is

Q = [[0, -1, 0], [0, 0, -1]] (Rq - T),

where R and T are the rotation matrix and translation matrix, respectively.
The method in step 7) for determining the probability distribution P_{L_2} of the two-dimensional edge point set L_2 from the probability distribution P_{L_3} of the three-dimensional edge point set L_3 and the projection relation is:

P_{L_2}(x) = Σ_Q X(x - Q, Cov_Q)

where

Cov_Q = [[0, -1, 0], [0, 0, -1]] R Cov_q R^T [[0, -1, 0], [0, 0, -1]]^T;

here X is a Gaussian distribution, and Cov_q and Cov_Q are the uncertainty covariance matrices of q and Q, respectively.
The method in step 8) for computing the symmetric KL distance KL_{LC} between the probability distribution P_{L_2} of the two-dimensional edge point set L_2 and the probability distribution P_{C_2} of the two-dimensional edge pixels is:

KL_{LC} = Σ_x P_{L_2}(x) ln(P_{L_2}(x) / P_{C_2}(x)) + Σ_x P_{C_2}(x) ln(P_{C_2}(x) / P_{L_2}(x))

Minimizing this symmetric KL distance yields the relative pose transformation matrices (R, T).
Compared with the prior art, the present invention has the following beneficial effects:
1. It does not depend on a specific environment structure or on auxiliary items such as a calibration board;
2. It can run online, updating the relative pose of the laser range finder and camera in real time;
3. The extracted point cloud edge contours and image edge contours can be further used for other applications such as environmental object recognition and localization.
Description of the drawings
Fig. 1 is a schematic diagram of the operating steps of the laser range finder and camera relative pose calibration method based on edge matching;
Fig. 2 shows implementation results of the laser range finder and camera relative pose calibration method based on edge matching.
Embodiment
With the laser range finder and camera relative pose calibration method based on edge matching of the present invention, once calibration is done the point cloud collected by the laser range finder can be accurately registered with the image collected by the camera. On the one hand, the image can color the point cloud, yielding a colored point cloud, or paste color texture onto a surface mesh whose vertices are the cloud points, yielding a textured surface model; on the other hand, the point cloud can supply the depth of part of the image region, supporting image-based applications such as recognition and localization.
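As an illustration of this correspondence, here is a minimal point cloud colorization sketch in Python/NumPy, assuming known camera intrinsics K and the calibrated extrinsics (R, T) of this method; the pinhole projection details and the function name are assumptions, not taken from the patent.

```python
import numpy as np

def color_point_cloud(points, image, K, R, T):
    """Color each 3D laser point from the registered camera image."""
    cam = (R @ points.T - T.reshape(3, 1)).T     # laser points in camera frame (Rq - T)
    uv = (K @ cam.T).T                           # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective division
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    h, w = image.shape[:2]
    ok = (cam[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = np.zeros((len(points), 3), dtype=image.dtype)
    colors[ok] = image[v[ok], u[ok]]             # sample the image at each projection
    return colors
```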
At calibration time, the laser range finder and the camera collect data simultaneously, with their observation ranges largely overlapping; the point cloud collected by the laser range finder and the image collected by the camera are then processed to obtain the relative pose of the laser range finder and camera online.
As shown in Fig. 1, the steps of the laser range finder and camera relative pose calibration method based on edge matching are as follows:
1) Acquire a three-dimensional point cloud of the surrounding environment with the laser range finder while simultaneously capturing an image of the same environment with the camera;
2) Extract the edge contour of the point cloud collected by the laser range finder, obtaining a three-dimensional point set L_3 that represents the three-dimensional edges;
3) Determine the probability distribution P_{L_3} of the three-dimensional edge point set according to the performance parameters and error model of the laser range finder;
4) Extract the edge contour of the camera image, obtaining a pixel set C_2 that represents the two-dimensional edges;
5) Determine the probability distribution P_{C_2} of the two-dimensional edge pixels according to the performance parameters and error model of the camera;
6) Represent the relative pose of the laser range finder and camera by a pair of coordinate transformation matrices consisting of a rotation matrix R and a translation matrix T, and project the three-dimensional edge point set L_3 into the camera coordinate system, obtaining a two-dimensional edge point set L_2;
7) Determine the probability distribution P_{L_2} of the two-dimensional edge point set L_2 from the probability distribution P_{L_3} of the three-dimensional edge point set and the projection relation;
8) Compute the symmetric KL distance KL_{LC} between the probability distribution P_{L_2} of the two-dimensional edge point set L_2 and the probability distribution P_{C_2} of the two-dimensional edge pixels; with R and T as parameters and the minimization of KL_{LC} as the optimization objective, obtain the optimal laser range finder to camera relative pose transformation matrices (R, T).
Step 2) is: a) for an unordered point cloud, for each point search all of its nearest neighbors within a radius r, up to at most n points, obtaining a point set N; fit a plane G to N; taking the projected positions of the points of N in plane G as the independent variables and the distances of the points of N to plane G as the function values, fit a binary quadratic function f and obtain its Hessian matrix H; compute the eigenvalues λ_1 and λ_2 of H, where λ_1 > λ_2; if λ_1 > thresh_1 and λ_2 < thresh_2, thresh_1 and thresh_2 being the respective thresholds, the center point of the current search range is considered an edge point of the unordered point cloud; b) for an ordered point cloud, i.e. a depth map, extract the edge point set with the Canny algorithm (Canny J. A computational approach to edge detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986 (6): 679-698).
The method for fitting plane G to point set N is: compute the mean of point set N to obtain the center c_G of plane G; compute the eigenvectors of N^T N, the eigenvector corresponding to the smallest eigenvalue being the normal vector n_G of plane G; the center c_G and normal vector n_G then represent the plane through c_G with normal n_G.
The method for fitting the binary quadratic function f and obtaining the Hessian matrix H is: for each point q in point set N, let α_G and β_G be the eigenvectors of N^T N corresponding to the other two eigenvalues, and compute a key-value pair with x, y as the independent variables and f_N(x, y) as the value:

x = (q - ((q - c_G)^T n_G) n_G)^T α_G
y = (q - ((q - c_G)^T n_G) n_G)^T β_G
f_N(x, y) = (q - c_G)^T n_G

This finally forms a set of key-value mappings from (x, y) to f_N(x, y), from which the Hessian matrix H is obtained by least squares:

H = argmin_H Σ_{(x, y, f_N(x, y))} ((x, y) H (x, y)^T - f_N(x, y))²

so that f is expressed as

f = (x, y) H (x, y)^T
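A minimal sketch of this edge test for an unordered point cloud, assuming Python with NumPy and SciPy; r, n, thresh_1 and thresh_2 are the parameters named above, and the concrete default values are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def detect_edge_points(points, r=0.1, n=30, thresh1=0.5, thresh2=0.05):
    """Return indices of edge points of an unordered point cloud (Nx3 array)."""
    tree = cKDTree(points)
    edge_idx = []
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, r)          # neighbors within radius r...
        if len(idx) > n:
            _, idx = tree.query(p, k=n)            # ...capped at the n nearest
        N = points[idx]
        if len(N) < 6:
            continue
        # Fit plane G: center c_G is the mean; normal n_G is the eigenvector of
        # the centered scatter matrix with the smallest eigenvalue.
        c_G = N.mean(axis=0)
        D = N - c_G
        w, V = np.linalg.eigh(D.T @ D)             # ascending eigenvalues
        n_G, beta_G, alpha_G = V[:, 0], V[:, 1], V[:, 2]
        d = D @ n_G                                # f_N: signed distances to G
        P = D - np.outer(d, n_G)                   # projections into plane G
        x, y = P @ alpha_G, P @ beta_G             # in-plane coordinates
        # Least-squares fit of f = (x, y) H (x, y)^T with H symmetric 2x2.
        A = np.column_stack([x * x, 2 * x * y, y * y])
        h, *_ = np.linalg.lstsq(A, d, rcond=None)
        H = np.array([[h[0], h[1]], [h[1], h[2]]])
        lam1, lam2 = sorted(np.abs(np.linalg.eigvalsh(H)), reverse=True)
        if lam1 > thresh1 and lam2 < thresh2:      # curved in one direction only
            edge_idx.append(i)
    return np.array(edge_idx, dtype=int)
```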
The probability distribution P_{L_3} of the three-dimensional edge point set in step 3) is:

P_{L_3}(x) = Σ_q X(x - q, Cov_q)

where X denotes a Gaussian distribution and Cov_q is the uncertainty covariance matrix of point q, which depends on the sensor performance parameters and error model (Bae K H, Belton D, Lichti D. A framework for position uncertainty of unorganised three-dimensional point clouds from near-monostatic laser scanners using covariance analysis [C] // Proceedings of the ISPRS Workshop "Laser scanning". 2005).
The method for extracting the edge contour of the camera image in step 4) is the Canny algorithm (Canny J. A computational approach to edge detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1986 (6): 679-698).
The probability distribution P_{C_2} of the two-dimensional edge pixels in step 5) is:

P_{C_2}(x) = Σ_O X(x - O, Cov_O)

where X is a Gaussian distribution and Cov_O is the uncertainty covariance matrix of pixel O, which depends on the sensor performance parameters and error model (De Santo M, Liguori C, Pietrosanto A. Uncertainty characterization in image-based measurements: a preliminary discussion [J]. IEEE Transactions on Instrumentation and Measurement, 2000, 49 (5): 1101-1107).
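Both edge distributions are Gaussian mixtures of this form. A minimal sketch that evaluates such a mixture on a pixel grid follows, assuming Python with NumPy; the grid construction and the per-point covariances are the caller's responsibility, since they come from the sensor error models above.

```python
import numpy as np

def mixture_density(grid, centers, covs):
    """Evaluate sum_q X(x - q, Cov_q) at every grid cell (grid is H x W x 2)."""
    H, W, _ = grid.shape
    density = np.zeros((H, W))
    for q, C in zip(centers, covs):
        Cinv = np.linalg.inv(C)
        norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(C)))
        d = grid - q                                     # offsets x - q
        e = np.einsum('hwi,ij,hwj->hw', d, Cinv, d)      # Mahalanobis terms
        density += norm * np.exp(-0.5 * e)
    return density

# Example grid for a 480x640 image, e.g. for P_C2 with Cov_O = sigma^2 I.
ys, xs = np.mgrid[0:480, 0:640]
grid = np.stack([xs, ys], axis=-1).astype(float)
```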
The method in step 6) for projecting the three-dimensional edge point set L_3 into the camera coordinate system is: for a point q in the three-dimensional edge point set L_3, the corresponding projection point Q in the camera imaging plane is

Q = [[0, -1, 0], [0, 0, -1]] (Rq - T),

where R and T are the rotation matrix and translation matrix, respectively.
The method in step 7) for determining the probability distribution P_{L_2} of the two-dimensional edge point set L_2 from the probability distribution P_{L_3} of the three-dimensional edge point set L_3 and the projection relation is:

P_{L_2}(x) = Σ_Q X(x - Q, Cov_Q)

where

Cov_Q = [[0, -1, 0], [0, 0, -1]] R Cov_q R^T [[0, -1, 0], [0, 0, -1]]^T;

here X is a Gaussian distribution, and Cov_q and Cov_Q are the uncertainty covariance matrices of q and Q, respectively.
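A minimal sketch of this projection and covariance propagation, assuming Python with NumPy and writing the 2×3 matrix from the formulas above as M:

```python
import numpy as np

M = np.array([[0.0, -1.0, 0.0],
              [0.0,  0.0, -1.0]])                     # 2x3 matrix from the formulas above

def project_edges(L3, covs_q, R, T):
    """Project 3D edge points, Q = M(Rq - T), and propagate their covariances."""
    Q = (M @ (R @ L3.T - T.reshape(3, 1))).T          # one row per projected point
    covs_Q = [M @ R @ C @ R.T @ M.T for C in covs_q]  # Cov_Q = M R Cov_q R^T M^T
    return Q, covs_Q
```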
The method in step 8) for computing the symmetric KL distance KL_{LC} between the probability distribution P_{L_2} of the two-dimensional edge point set L_2 and the probability distribution P_{C_2} of the two-dimensional edge pixels is:

KL_{LC} = Σ_x P_{L_2}(x) ln(P_{L_2}(x) / P_{C_2}(x)) + Σ_x P_{C_2}(x) ln(P_{C_2}(x) / P_{L_2}(x))

Minimizing this symmetric KL distance yields the relative pose transformation matrices (R, T).
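A minimal sketch of this final objective, assuming Python with NumPy/SciPy and the mixture_density and project_edges helpers sketched above; the rotation-vector parameterization of R and the choice of optimizer are assumptions, as the patent does not prescribe a particular search procedure.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def symmetric_kl(P, Q, eps=1e-12):
    """Symmetric KL distance between two gridded densities (normalized first)."""
    P = P / P.sum() + eps
    Q = Q / Q.sum() + eps
    return np.sum(P * np.log(P / Q)) + np.sum(Q * np.log(Q / P))

def calibrate(L3, covs_q, P_C2, grid, x0=np.zeros(6)):
    """Find (R, T) minimizing KL_LC; x packs a rotation vector and a translation."""
    def objective(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        Q, covs_Q = project_edges(L3, covs_q, R, x[3:])
        P_L2 = mixture_density(grid, Q, covs_Q)      # projected laser edge density
        return symmetric_kl(P_L2, P_C2)
    res = minimize(objective, x0, method='Nelder-Mead')
    return Rotation.from_rotvec(res.x[:3]).as_matrix(), res.x[3:]
```

In practice the initial guess x0 matters, since the objective is non-convex; a rough hand measurement of the mounting offset is a reasonable starting point.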

Claims (10)

1. A laser range finder and camera relative pose calibration method based on edge matching, characterized in that its steps are as follows:
1) Acquire a three-dimensional point cloud of the surrounding environment with the laser range finder while simultaneously capturing an image of the same environment with the camera;
2) Extract the edge contour of the point cloud collected by the laser range finder, obtaining a three-dimensional point set L_3 that represents the three-dimensional edges;
3) Determine the probability distribution P_{L_3} of the three-dimensional edge point set according to the performance parameters and error model of the laser range finder;
4) Extract the edge contour of the camera image, obtaining a pixel set C_2 that represents the two-dimensional edges;
5) Determine the probability distribution P_{C_2} of the two-dimensional edge pixels according to the performance parameters and error model of the camera;
6) Represent the relative pose of the laser range finder and camera by a pair of coordinate transformation matrices consisting of a rotation matrix R and a translation matrix T, and project the three-dimensional edge point set L_3 into the camera coordinate system, obtaining a two-dimensional edge point set L_2;
7) Determine the probability distribution P_{L_2} of the two-dimensional edge point set L_2 from the probability distribution P_{L_3} of the three-dimensional edge point set and the projection relation;
8) Compute the symmetric KL distance KL_{LC} between the probability distribution P_{L_2} of the two-dimensional edge point set L_2 and the probability distribution P_{C_2} of the two-dimensional edge pixels; with R and T as parameters and the minimization of KL_{LC} as the optimization objective, obtain the optimal laser range finder to camera relative pose transformation matrices (R, T).
2. The laser range finder and camera relative pose calibration method based on edge matching according to claim 1, characterized in that step 2) is: a) for an unordered point cloud, for each point search all of its nearest neighbors within a radius r, up to at most n points, obtaining a point set N; fit a plane G to N; taking the projected positions of the points of N in plane G as the independent variables and the distances of the points of N to plane G as the function values, fit a binary quadratic function f and obtain its Hessian matrix H; compute the eigenvalues λ_1 and λ_2 of H, where λ_1 > λ_2; if λ_1 > thresh_1 and λ_2 < thresh_2, thresh_1 and thresh_2 being the respective thresholds, the center point of the current search range is considered an edge point of the unordered point cloud; b) for an ordered point cloud, i.e. a depth map, extract the edge point set with the Canny algorithm.
3. The laser range finder and camera relative pose calibration method based on edge matching according to claim 2, characterized in that the method for fitting plane G to point set N is: compute the mean of point set N to obtain the center c_G of plane G; compute the eigenvectors of N^T N, the eigenvector corresponding to the smallest eigenvalue being the normal vector n_G of plane G; the center c_G and normal vector n_G then represent the plane through c_G with normal n_G.
4. The laser range finder and camera relative pose calibration method based on edge matching according to claim 3, characterized in that the method for fitting the binary quadratic function f and obtaining the Hessian matrix H is: for each point q in point set N, let α_G and β_G be the eigenvectors of N^T N corresponding to the other two eigenvalues, and compute a key-value pair with x, y as the independent variables and f_N(x, y) as the value:

x = (q - ((q - c_G)^T n_G) n_G)^T α_G
y = (q - ((q - c_G)^T n_G) n_G)^T β_G
f_N(x, y) = (q - c_G)^T n_G

This finally forms a set of key-value mappings from (x, y) to f_N(x, y), from which the Hessian matrix H is obtained by least squares:

H = argmin_H Σ_{(x, y, f_N(x, y))} ((x, y) H (x, y)^T - f_N(x, y))²

so that f is expressed as

f = (x, y) H (x, y)^T
5. The laser range finder and camera relative pose calibration method based on edge matching according to claim 4, characterized in that the probability distribution P_{L_3} of the three-dimensional edge point set in step 3) is:

P_{L_3}(x) = Σ_q X(x - q, Cov_q)

where X denotes a Gaussian distribution and Cov_q is the uncertainty covariance matrix of point q, which depends on the sensor performance parameters and error model.
6. The laser range finder and camera relative pose calibration method based on edge matching according to claim 1, characterized in that the method for extracting the edge contour of the camera image in step 4) is the Canny algorithm.
7. The laser range finder and camera relative pose calibration method based on edge matching according to claim 1, characterized in that the probability distribution P_{C_2} of the two-dimensional edge pixels in step 5) is:

P_{C_2}(x) = Σ_O X(x - O, Cov_O)

where X is a Gaussian distribution and Cov_O is the uncertainty covariance matrix of pixel O, which depends on the sensor performance parameters and error model.
8. The laser range finder and camera relative pose calibration method based on edge matching according to claim 1, characterized in that the method in step 6) for projecting the three-dimensional edge point set L_3 into the camera coordinate system is: for a point q in the three-dimensional edge point set L_3, the corresponding projection point Q in the camera imaging plane is

Q = [[0, -1, 0], [0, 0, -1]] (Rq - T),

where R and T are the rotation matrix and translation matrix, respectively.
9. The laser range finder and camera relative pose calibration method based on edge matching according to claim 8, characterized in that the method in step 7) for determining the probability distribution P_{L_2} of the two-dimensional edge point set L_2 from the probability distribution P_{L_3} of the three-dimensional edge point set L_3 and the projection relation is:

P_{L_2}(x) = Σ_Q X(x - Q, Cov_Q)

where

Cov_Q = [[0, -1, 0], [0, 0, -1]] R Cov_q R^T [[0, -1, 0], [0, 0, -1]]^T;

here X is a Gaussian distribution, and Cov_q and Cov_Q are the uncertainty covariance matrices of q and Q, respectively.
10. The laser range finder and camera relative pose calibration method based on edge matching according to claim 1, characterized in that the method in step 8) for computing the symmetric KL distance KL_{LC} between the probability distribution P_{L_2} of the two-dimensional edge point set L_2 and the probability distribution P_{C_2} of the two-dimensional edge pixels is:

KL_{LC} = Σ_x P_{L_2}(x) ln(P_{L_2}(x) / P_{C_2}(x)) + Σ_x P_{C_2}(x) ln(P_{C_2}(x) / P_{L_2}(x))

Minimizing this symmetric KL distance yields the relative pose transformation matrices (R, T).
CN201310742582.6A 2013-12-30 2013-12-30 Laser range finder and camera relative pose calibration method based on edge matching Active CN103727930B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310742582.6A CN103727930B (en) 2013-12-30 2013-12-30 Laser range finder and camera relative pose calibration method based on edge matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310742582.6A CN103727930B (en) 2013-12-30 2013-12-30 Laser range finder and camera relative pose calibration method based on edge matching

Publications (2)

Publication Number Publication Date
CN103727930A CN103727930A (en) 2014-04-16
CN103727930B 2016-03-23

Family

ID=50452110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310742582.6A Active CN103727930B (en) 2013-12-30 2013-12-30 Laser range finder and camera relative pose calibration method based on edge matching

Country Status (1)

Country Link
CN (1) CN103727930B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111071B (en) * 2014-07-10 2017-01-18 上海宇航系统工程研究所 High-precision position posture calculating method based on laser ranging and camera visual fusion
CN104484887B (en) * 2015-01-19 2017-07-07 河北工业大学 External parameters calibration method when video camera is used in combination with scanning laser range finder
CN106023198A (en) * 2016-05-16 2016-10-12 天津工业大学 Hessian matrix-based method for extracting aortic dissection of human thoracoabdominal cavity CT image
CN106530345B (en) * 2016-11-07 2018-12-25 江西理工大学 A kind of building three-dimensional laser point cloud feature extracting method under same machine Image-aided
CN106931879B (en) * 2017-01-23 2020-01-21 成都通甲优博科技有限责任公司 Binocular error measurement method, device and system
EP3615955A4 (en) 2017-04-28 2020-05-13 SZ DJI Technology Co., Ltd. Calibration of laser and vision sensors
CN107607095A (en) * 2017-09-22 2018-01-19 义乌敦仁智能科技有限公司 A kind of house measurement method of view-based access control model and laser
CN109059902B (en) * 2018-09-07 2021-05-28 百度在线网络技术(北京)有限公司 Relative pose determination method, device, equipment and medium
CN109782811B (en) * 2019-02-02 2021-10-08 绥化学院 Automatic following control system and method for unmanned model vehicle
CN110148185B (en) * 2019-05-22 2022-04-15 北京百度网讯科技有限公司 Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN110414558B (en) * 2019-06-24 2021-07-20 武汉大学 Feature point matching method based on event camera
US11288814B2 (en) * 2019-07-15 2022-03-29 Mujin, Inc. System and method of object detection based on image data
CN111912346B (en) * 2020-06-30 2021-12-10 成都飞机工业(集团)有限责任公司 Nest hole online detection method suitable for robot drilling and riveting system on surface of airplane
CN112325767B (en) * 2020-10-16 2022-07-26 华中科技大学鄂州工业技术研究院 Spatial plane dimension measurement method integrating machine vision and flight time measurement
CN113465536A (en) * 2021-06-30 2021-10-01 皖江工学院 Laser holder based on camera guide and working method thereof
CN113587829B (en) * 2021-09-03 2023-08-01 凌云光技术股份有限公司 Edge thickness measuring method and device, edge thickness measuring equipment and medium
CN116027269B (en) * 2023-03-29 2023-06-06 成都量芯集成科技有限公司 Plane scene positioning method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2573584A1 (en) * 2011-09-26 2013-03-27 Honeywell International Inc. Generic surface feature extraction from a set of range data

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004106856A1 (en) * 2003-05-29 2004-12-09 Olympus Corporation Device and method of supporting stereo camera, device and method of detecting calibration, and stereo camera system
CN103257342B (en) * 2013-01-11 2014-11-05 大连理工大学 Three-dimension laser sensor and two-dimension laser sensor combined calibration method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2573584A1 (en) * 2011-09-26 2013-03-27 Honeywell International Inc. Generic surface feature extraction from a set of range data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kullback-Leibler Divergence based Graph Pruning in Robotic Feature Mapping; Yue Wang, Rong Xiong, et al.; Mobile Robots (ECMR); 2013-09-25; full text *

Also Published As

Publication number Publication date
CN103727930A (en) 2014-04-16

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20180724

Address after: Room 208, Building 6, No. 1197 Binan Road, Binjiang District, Hangzhou, Zhejiang 310052

Patentee after: Which Hangzhou Science and Technology Co., Ltd.

Address before: No. 38 Zheda Road, Xihu District, Hangzhou, Zhejiang 310027

Patentee before: Zhejiang University