
CN116433740A - Stereo matching method based on laser stripe lines - Google Patents

Stereo matching method based on laser stripe lines

Info

Publication number
CN116433740A
Authority
CN
China
Prior art keywords
laser stripe
image
point
pixel
representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310109230.0A
Other languages
Chinese (zh)
Inventor
陈迪来
吴玉波
孙效杰
李宁洲
卫晓娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Institute of Technology
Original Assignee
Shanghai Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Institute of Technology
Priority to CN202310109230.0A
Publication of CN116433740A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/68 Analysis of geometric attributes of symmetry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/757 Matching configurations of points or features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20228 Disparity calculation for image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30172 Centreline of tubular or elongated structure

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a stereo matching method based on laser stripe lines, comprising the following steps: step 1: acquiring the left and right laser stripe images captured by a left camera and a right camera; step 2: preprocessing the left and right laser stripe images respectively; step 3: extracting the laser stripe center line with an improved Steger algorithm; step 4: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image; step 5: calculating the parallax of the matched points. The method achieves stereo matching of sub-pixel coordinate points with higher precision, which facilitates recovery of the three-dimensional point cloud.

Description

Stereo matching method based on laser stripe lines
Technical Field
The invention relates to the field of computer vision, and in particular to a stereo matching method based on laser stripe lines.
Background
Binocular stereo matching is one of the most important steps in three-dimensional reconstruction. Once the correspondence between three-dimensional space and the images is established, the parallax must be calculated, which requires knowing how each point of three-dimensional space maps onto the left and right images; establishing this correspondence is the goal of stereo matching. Stereo matching clarifies the correspondence of points between the left and right images, from which the parallax is obtained and the three-dimensional information of the points is recovered.
Stereo matching is an important technique in binocular stereo vision, and a large number of practical algorithms have been proposed, many built on a few basic constraint conditions. Applying these constraints in a matching algorithm effectively reduces the matching difficulty and increases the matching speed. A single constraint condition, however, easily produces mismatches. Local, semi-global and global matching algorithms have been proposed on this basis; compared with single-constraint matching, their accuracy is higher, but their efficiency is lower and their speed slower. Moreover, these algorithms operate only on whole image pixels, so their precision cannot reach the sub-pixel level. A stereo matching algorithm that is both accurate and efficient therefore remains to be developed.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a stereo matching method based on laser stripe lines. On the basis of extracting the sub-pixel coordinates of the laser stripe center-line points with a Steger algorithm, it combines single constraint conditions with local cost matching, so that the matched points reach sub-pixel accuracy.
The technical solution adopted to achieve the above object is as follows:
A stereo matching method based on laser stripe lines comprises the following steps:
step 1: acquiring the left and right laser stripe images captured by a left camera and a right camera;
step 2: preprocessing the left and right laser stripe images respectively;
step 3: extracting the laser stripe center line with an improved Steger algorithm;
step 4: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image;
step 5: calculating the parallax of the matched points.
Further, step 2 comprises: applying global thresholding to the left and right laser stripe images to extract the laser stripe regions, calculated as follows:
g(x, y) = f(x, y), if f(x, y) ≥ T
g(x, y) = 0,       if f(x, y) < T
where f(x, y) is the pixel value of the image at coordinates (x, y), g(x, y) is the pixel value after thresholding, and T is the set threshold.
Further, step 3 comprises: extracting the laser stripe center lines of the preprocessed left and right laser stripe images with the Steger algorithm, in the following specific steps:
step 31: compute r_x, r_y, r_xx, r_xy and r_yy at each pixel of the image, by the formulas:
r_x = (∂G(x, y)/∂x) * f(x, y)
r_y = (∂G(x, y)/∂y) * f(x, y)
r_xx = (∂²G(x, y)/∂x²) * f(x, y)
r_xy = (∂²G(x, y)/∂x∂y) * f(x, y)
r_yy = (∂²G(x, y)/∂y²) * f(x, y)
G(x, y) = g(x) · g(y)
(* denotes two-dimensional convolution)
where r_x is the first partial derivative of the image in the x direction, r_y the first partial derivative in the y direction, r_xx the second partial derivative in the x direction, r_xy the second-order mixed partial derivative (first along x, then along y), r_yy the second partial derivative in the y direction, G(x, y) the two-dimensional Gaussian function, and g(x) the one-dimensional Gaussian function;
step 32: compute the eigenvalues and eigenvectors of the Hessian matrix; the eigenvector corresponding to the largest eigenvalue of the Hessian matrix gives the normal direction of the light stripe, denoted by n_x and n_y. The Hessian matrix is expressed as:
H(x, y) = [ r_xx  r_xy ]
          [ r_xy  r_yy ]
step 33: taking the point (x_0, y_0) as the reference point, perform a second-order Taylor expansion of the gray-level distribution function of the stripe cross-section to obtain the sub-pixel coordinates (P_x, P_y) = (x_0 + t·n_x, y_0 + t·n_y), where t is calculated as:
t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)
where n_x and n_y are the components of the eigenvector corresponding to the largest eigenvalue of the Hessian matrix, i.e. the normal direction of the light stripe.
Further, the Steger algorithm spends a large amount of computation on the Hessian matrix: each point requires five two-dimensional Gaussian convolutions (for r_x, r_y, r_xx, r_xy and r_yy), which lowers the computational efficiency and degrades the real-time performance of the system. Using the separability and symmetry of the Gaussian convolution, the two-dimensional Gaussian kernel can be equivalently decomposed into one Gaussian row convolution and one Gaussian column convolution, reducing the computation per point from 5n² multiply-add operations to 10n multiply-add operations for an n×n kernel.
Further, step 4 comprises: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image. The specific method for matching the left and right laser points is as follows:
step 41: find the pixel coordinates containing the sub-pixel coordinate and extract a 1×n region centered on that pixel, where n is the width of the laser stripe center line; if n is even, take n+1; if odd, leave n unchanged;
step 42: using the epipolar constraint, find the corresponding region with the same ordinate on the right laser stripe image. The matching method is as follows: likewise extract a 1×n region from the right laser stripe image, compute the quotient of the pixel values of each pair of corresponding pixels, and calculate the mean of all the quotients, by the formulas:
S_j = Sl_j / Sr_j
S_ave = (1/n) · Σ_{j=1}^{n} S_j
where Sl_j is the pixel value of the j-th pixel in the selected region of the left image, Sr_j is the pixel value of the j-th pixel in the selected region of the right image, j ranges from 1 to n, the center coordinate is the pixel containing the selected sub-pixel coordinate, S_j is the quotient of the corresponding pixel values, and S_ave is the mean of the quotients;
if the difference between every quotient and the mean is within 0.05, the corresponding pixel point has been found; the specific criterion is:
|S_j − S_ave| < 0.05
step 43: if a laser stripe center-line point lies on that pixel, the match succeeds directly; if not, take the nearest laser stripe points above and below the pixel and calculate the coordinates of the corresponding laser stripe line point as follows:
x_L = (x_0 − x_1) × (y − y_1) / (y_0 − y_1) + x_1
where x_L is the sub-pixel abscissa of the sought laser point, x_0 and x_1 are the abscissas of the nearest laser stripe points above and below the pixel, y_0 and y_1 are the corresponding ordinates, and y is the ordinate of the sub-pixel line point.
Compared with the prior art, the invention has the following advantages and positive effects owing to the adopted technical scheme:
1. the invention applies global thresholding to obtain a preliminarily denoised image, and then extracts the sub-pixel coordinates of the laser stripe center line with the Steger algorithm;
2. the invention adopts an improved Steger algorithm that is faster than the traditional Steger algorithm;
3. the invention combines the epipolar constraint with cost matching, so the matching precision is higher while the amount of computation remains relatively small;
4. the stereo matching algorithm integrates image denoising, the improved Steger algorithm, the epipolar constraint and cost matching, greatly improving the accuracy and efficiency of the matching result.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. The drawings described here show only some embodiments of the invention; other drawings may be derived from them by those skilled in the art without inventive effort. In the accompanying drawings:
Fig. 1 is a schematic flow chart of the stereo matching method based on laser stripe lines.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings, which show some, but not all, embodiments of the invention. All other embodiments obtained by those skilled in the art from these embodiments without inventive effort fall within the scope of the invention.
As shown in Fig. 1, this embodiment discloses a stereo matching method based on laser stripe lines, comprising the following steps:
step 1: acquiring the left and right laser stripe images captured by a left camera and a right camera;
step 2: preprocessing the left and right laser stripe images respectively;
Further, step 2 comprises: applying global thresholding to the left and right laser stripe images to extract the laser stripe regions, calculated as follows:
g(x, y) = f(x, y), if f(x, y) ≥ T
g(x, y) = 0,       if f(x, y) < T
where f(x, y) is the pixel value of the image at coordinates (x, y), g(x, y) is the pixel value after thresholding, and T is the set threshold.
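By way of illustration, the thresholding above can be sketched in a few lines of NumPy; the threshold value T = 50 in the example is an arbitrary assumption, since the patent only describes T as a set threshold.

```python
import numpy as np

def threshold_stripe(img: np.ndarray, T: float) -> np.ndarray:
    """Global thresholding: keep pixel values of at least T, zero the rest."""
    out = img.astype(np.float64)
    out[out < T] = 0.0
    return out

# Example: suppress the background of a synthetic grey image
# (T = 50 is an assumed value, not one given by the patent).
img = np.random.randint(0, 256, size=(480, 640)).astype(np.uint8)
stripe_region = threshold_stripe(img, T=50)
```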
step 3: extracting the laser stripe center line with an improved Steger algorithm;
Further, step 3 comprises: extracting the laser stripe center lines of the preprocessed left and right laser stripe images with the Steger algorithm, in the following specific steps:
step 31: compute r_x, r_y, r_xx, r_xy and r_yy at each pixel of the image, by the formulas:
r_x = (∂G(x, y)/∂x) * f(x, y)
r_y = (∂G(x, y)/∂y) * f(x, y)
r_xx = (∂²G(x, y)/∂x²) * f(x, y)
r_xy = (∂²G(x, y)/∂x∂y) * f(x, y)
r_yy = (∂²G(x, y)/∂y²) * f(x, y)
G(x, y) = g(x) · g(y)
(* denotes two-dimensional convolution)
where r_x is the first partial derivative of the image in the x direction, r_y the first partial derivative in the y direction, r_xx the second partial derivative in the x direction, r_xy the second-order mixed partial derivative (first along x, then along y), r_yy the second partial derivative in the y direction, G(x, y) the two-dimensional Gaussian function, and g(x) the one-dimensional Gaussian function;
To reduce the amount of computation, the separability of the Gaussian convolution is used to replace the five two-dimensional convolutions at each image pixel with ten one-dimensional convolutions.
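A minimal sketch of this separable computation is given below, assuming SciPy is available; the Gaussian scale sigma = 1.5 is an assumed parameter that the patent does not fix.

```python
import numpy as np
from scipy.ndimage import convolve1d

def gaussian_kernels(sigma: float):
    """Sample a 1-D Gaussian and its first two derivatives on [-3*sigma, 3*sigma]."""
    radius = int(np.ceil(3.0 * sigma))
    t = np.arange(-radius, radius + 1, dtype=np.float64)
    g = np.exp(-t**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    dg = -t / sigma**2 * g                    # first derivative g'(t)
    ddg = (t**2 - sigma**2) / sigma**4 * g    # second derivative g''(t)
    return g, dg, ddg

def steger_derivatives(img: np.ndarray, sigma: float = 1.5):
    """r_x, r_y, r_xx, r_xy, r_yy via separable 1-D convolutions.

    Axis 0 is y (rows), axis 1 is x (columns).  Sharing the two smoothing
    passes takes 8 one-dimensional convolutions; computing each of the five
    responses independently gives the 10 passes counted in the text.
    """
    g, dg, ddg = gaussian_kernels(sigma)
    f = img.astype(np.float64)
    fy = convolve1d(f, g, axis=0)             # smooth along y
    fx = convolve1d(f, g, axis=1)             # smooth along x
    r_x = convolve1d(fy, dg, axis=1)          # dG/dx * f
    r_y = convolve1d(fx, dg, axis=0)          # dG/dy * f
    r_xx = convolve1d(fy, ddg, axis=1)        # d2G/dx2 * f
    r_yy = convolve1d(fx, ddg, axis=0)        # d2G/dy2 * f
    r_xy = convolve1d(convolve1d(f, dg, axis=0), dg, axis=1)  # d2G/dxdy * f
    return r_x, r_y, r_xx, r_xy, r_yy
```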
Step 32: calculating eigenvalues and eigenvectors by using a Hessian matrix, wherein the eigenvector corresponding to the largest eigenvalue of the Hessian matrix corresponds to the normal direction of the light bar, using n x And n y Expressed, the Hessian matrix is expressed as:
H(x, y) = [ r_xx  r_xy ]
          [ r_xy  r_yy ]
step 33: taking the point (x_0, y_0) as the reference point, perform a second-order Taylor expansion of the gray-level distribution function of the stripe cross-section to obtain the sub-pixel coordinates (P_x, P_y) = (x_0 + t·n_x, y_0 + t·n_y), where t is calculated as:
t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)
where n_x and n_y are the components of the eigenvector corresponding to the largest eigenvalue of the Hessian matrix, i.e. the normal direction of the light stripe.
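Continuing the sketch, steps 32 and 33 can be carried out at a single candidate pixel as follows; the validity test |t·n_x| ≤ 0.5 and |t·n_y| ≤ 0.5 is the usual Steger acceptance condition, added here as an assumption the patent does not spell out.

```python
import numpy as np  # builds on steger_derivatives above

def subpixel_center(r_x, r_y, r_xx, r_xy, r_yy, x0: int, y0: int):
    """Steps 32-33 at one candidate pixel (x0, y0); arrays are indexed [y, x]."""
    H = np.array([[r_xx[y0, x0], r_xy[y0, x0]],
                  [r_xy[y0, x0], r_yy[y0, x0]]])
    eigvals, eigvecs = np.linalg.eigh(H)
    # Normal of the light stripe: eigenvector of the largest-magnitude
    # eigenvalue (negative for a bright stripe on a dark background).
    nx, ny = eigvecs[:, int(np.argmax(np.abs(eigvals)))]
    denom = nx * nx * r_xx[y0, x0] + 2.0 * nx * ny * r_xy[y0, x0] + ny * ny * r_yy[y0, x0]
    if denom == 0.0:
        return None
    t = -(nx * r_x[y0, x0] + ny * r_y[y0, x0]) / denom
    # Accept only if the sub-pixel correction stays inside the pixel.
    if abs(t * nx) <= 0.5 and abs(t * ny) <= 0.5:
        return (x0 + t * nx, y0 + t * ny)
    return None
```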
Further, the Steger algorithm spends a large amount of computation on the Hessian matrix: each point requires five two-dimensional Gaussian convolutions (for r_x, r_y, r_xx, r_xy and r_yy), which lowers the computational efficiency and degrades the real-time performance of the system. Using the separability and symmetry of the Gaussian convolution, the two-dimensional Gaussian kernel can be equivalently decomposed into one Gaussian row convolution and one Gaussian column convolution, reducing the computation per point from 5n² multiply-add operations to 10n multiply-add operations for an n×n kernel.
step 4: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image;
Further, step 4 comprises: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image. The specific method for matching the left and right laser points is as follows:
step 41: find the pixel coordinates containing the sub-pixel coordinate and extract a 1×n region centered on that pixel, where n is the width of the laser stripe center line; if n is even, take n+1; if odd, leave n unchanged;
step 42: using the epipolar constraint, find the corresponding region with the same ordinate on the right laser stripe image. The matching method is as follows: likewise extract a 1×n region from the right laser stripe image, compute the quotient of the pixel values of each pair of corresponding pixels, and calculate the mean of all the quotients, by the formulas:
S_j = Sl_j / Sr_j
S_ave = (1/n) · Σ_{j=1}^{n} S_j
where Sl_j is the pixel value of the j-th pixel in the selected region of the left image, Sr_j is the pixel value of the j-th pixel in the selected region of the right image, j ranges from 1 to n, the center coordinate is the pixel containing the selected sub-pixel coordinate, S_j is the quotient of the corresponding pixel values, and S_ave is the mean of the quotients;
if the difference between every quotient and the mean is within 0.05, the corresponding pixel point has been found; the specific criterion is:
|S_j − S_ave| < 0.05
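A sketch of this quotient test on one epipolar row is given below; eps = 0.05 follows the text, while the bounds check and the guard against zero-valued right-image pixels are added assumptions the patent does not discuss.

```python
import numpy as np

def ratio_match(left_row: np.ndarray, right_row: np.ndarray,
                xl: int, xr: int, n: int, eps: float = 0.05) -> bool:
    """Step 42: compare 1 x n windows centred at xl (left) and xr (right)."""
    if n % 2 == 0:                 # step 41: force an odd window width
        n += 1
    h = n // 2
    if xl - h < 0 or xl + h >= left_row.size or xr - h < 0 or xr + h >= right_row.size:
        return False               # window would leave the image
    sl = left_row[xl - h: xl + h + 1].astype(np.float64)
    sr = right_row[xr - h: xr + h + 1].astype(np.float64)
    if np.any(sr == 0):
        return False               # quotient undefined
    s = sl / sr                    # S_j = Sl_j / Sr_j
    return bool(np.all(np.abs(s - s.mean()) < eps))
```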
step 43: if a laser stripe center-line point lies on that pixel, the match succeeds directly; if not, take the nearest laser stripe points above and below the pixel and calculate the coordinates of the corresponding laser stripe line point as follows:
x_L = (x_0 − x_1) × (y − y_1) / (y_0 − y_1) + x_1
where x_L is the sub-pixel abscissa of the sought laser point, x_0 and x_1 are the abscissas of the nearest laser stripe points above and below the pixel, y_0 and y_1 are the corresponding ordinates, and y is the ordinate of the sub-pixel line point.
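Step 43's interpolation transcribes directly into code:

```python
def interp_stripe_x(x0: float, y0: float, x1: float, y1: float, y: float) -> float:
    """Step 43: linearly interpolate the stripe abscissa at ordinate y between
    the nearest centre-line points (x0, y0) above and (x1, y1) below."""
    return (x0 - x1) * (y - y1) / (y0 - y1) + x1

# Example: points (102.4, 37.0) and (103.1, 39.0) give x ≈ 102.82 at y = 38.2.
x_L = interp_stripe_x(102.4, 37.0, 103.1, 39.0, 38.2)
```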
step 5: calculating the parallax of the matched points.
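The patent does not detail step 5; for a rectified stereo pair the parallax of a matched pair is simply the difference of the sub-pixel abscissas, and converting it to depth uses the standard triangulation formula Z = f·B/d, an assumption added here for completeness.

```python
def disparity(x_left: float, x_right: float) -> float:
    """Parallax of one matched sub-pixel pair on a rectified image row."""
    return x_left - x_right

def depth_from_disparity(d: float, focal_px: float, baseline: float) -> float:
    """Standard triangulation Z = f * B / d (not stated in the patent);
    focal_px is the focal length in pixels, baseline in metres."""
    return focal_px * baseline / d
```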
The present invention is not limited to the above embodiments. Any changes or substitutions that can readily be conceived by those skilled in the art within the technical scope disclosed herein are intended to be covered by the present invention. The protection scope of the present invention is therefore defined by the appended claims.

Claims (5)

1. A stereo matching method based on laser stripe lines, characterized by comprising the following steps:
step 1: acquiring the left and right laser stripe images captured by a left camera and a right camera;
step 2: preprocessing the left and right laser stripe images respectively;
step 3: extracting the laser stripe center line with an improved Steger algorithm;
step 4: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image;
step 5: calculating the parallax of the matched points.
2. The stereo matching method based on laser stripe lines according to claim 1, wherein step 2 comprises: applying global thresholding to the left and right laser stripe images to extract the laser stripe regions, calculated as follows:
g(x, y) = f(x, y), if f(x, y) ≥ T
g(x, y) = 0,       if f(x, y) < T
where f(x, y) is the pixel value of the image at coordinates (x, y), g(x, y) is the pixel value after thresholding, and T is the set threshold.
3. The stereo matching method based on laser stripe lines according to claim 1, wherein step 3 comprises: extracting the laser stripe center lines of the preprocessed left and right laser stripe images with the Steger algorithm, in the following specific steps:
step 31: compute r_x, r_y, r_xx, r_xy and r_yy at each pixel of the image, by the formulas:
r_x = (∂G(x, y)/∂x) * f(x, y)
r_y = (∂G(x, y)/∂y) * f(x, y)
r_xx = (∂²G(x, y)/∂x²) * f(x, y)
r_xy = (∂²G(x, y)/∂x∂y) * f(x, y)
r_yy = (∂²G(x, y)/∂y²) * f(x, y)
G(x, y) = g(x) · g(y)
(* denotes two-dimensional convolution)
where r_x is the first partial derivative of the image in the x direction, r_y the first partial derivative in the y direction, r_xx the second partial derivative in the x direction, r_xy the second-order mixed partial derivative (first along x, then along y), r_yy the second partial derivative in the y direction, G(x, y) the two-dimensional Gaussian function, and g(x) the one-dimensional Gaussian function;
step 32: compute the eigenvalues and eigenvectors of the Hessian matrix; the eigenvector corresponding to the largest eigenvalue of the Hessian matrix gives the normal direction of the light stripe, denoted by n_x and n_y. The Hessian matrix is expressed as:
H(x, y) = [ r_xx  r_xy ]
          [ r_xy  r_yy ]
step 33: taking the point (x_0, y_0) as the reference point, perform a second-order Taylor expansion of the gray-level distribution function of the stripe cross-section to obtain the sub-pixel coordinates (P_x, P_y) = (x_0 + t·n_x, y_0 + t·n_y), where t is calculated as:
t = −(n_x·r_x + n_y·r_y) / (n_x²·r_xx + 2·n_x·n_y·r_xy + n_y²·r_yy)
where n_x and n_y are the components of the eigenvector corresponding to the largest eigenvalue of the Hessian matrix, i.e. the normal direction of the light stripe.
4. The stereo matching method based on laser stripe lines according to claim 3, wherein the Steger algorithm spends a large amount of computation on the Hessian matrix: each point requires five two-dimensional Gaussian convolutions (for r_x, r_y, r_xx, r_xy and r_yy), which lowers the computational efficiency and degrades the real-time performance of the system; using the separability and symmetry of the Gaussian convolution, the two-dimensional Gaussian kernel can be equivalently decomposed into one Gaussian row convolution and one Gaussian column convolution, reducing the computation per point from 5n² multiply-add operations to 10n multiply-add operations for an n×n kernel.
5. The stereo matching method based on laser stripe lines according to claim 1, wherein step 4 comprises: selecting center-line points of the left laser stripe image in sequence and matching them with center-line points in the right laser stripe image, the specific method for matching the left and right laser points being as follows:
step 41: find the pixel coordinates containing the sub-pixel coordinate and extract a 1×n region centered on that pixel, where n is the width of the laser stripe center line; if n is even, take n+1; if odd, leave n unchanged;
step 42: using the epipolar constraint, find the corresponding region with the same ordinate on the right laser stripe image. The matching method is as follows: likewise extract a 1×n region from the right laser stripe image, compute the quotient of the pixel values of each pair of corresponding pixels, and calculate the mean of all the quotients, by the formulas:
S_j = Sl_j / Sr_j
S_ave = (1/n) · Σ_{j=1}^{n} S_j
where Sl_j is the pixel value of the j-th pixel in the selected region of the left image, Sr_j is the pixel value of the j-th pixel in the selected region of the right image, j ranges from 1 to n, the center coordinate is the pixel containing the selected sub-pixel coordinate, S_j is the quotient of the corresponding pixel values, and S_ave is the mean of the quotients;
if the difference between every quotient and the mean is within 0.05, the corresponding pixel point has been found; the specific criterion is:
|S_j − S_ave| < 0.05
step 43: if a laser stripe center-line point lies on that pixel, the match succeeds directly; if not, take the nearest laser stripe points above and below the pixel and calculate the coordinates of the corresponding laser stripe line point as follows:
x_L = (x_0 − x_1) × (y − y_1) / (y_0 − y_1) + x_1
where x_L is the sub-pixel abscissa of the sought laser point, x_0 and x_1 are the abscissas of the nearest laser stripe points above and below the pixel, y_0 and y_1 are the corresponding ordinates, and y is the ordinate of the sub-pixel line point.
CN202310109230.0A 2023-02-13 2023-02-13 Stereo matching method based on laser stripe lines Pending CN116433740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310109230.0A CN116433740A (en) 2023-02-13 2023-02-13 Stereo matching method based on laser stripe lines

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310109230.0A CN116433740A (en) 2023-02-13 2023-02-13 Stereo matching method based on laser stripe lines

Publications (1)

Publication Number Publication Date
CN116433740A (en) 2023-07-14

Family

ID=87084391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310109230.0A Pending CN116433740A (en) 2023-02-13 2023-02-13 Stereo matching method based on laser stripe lines

Country Status (1)

Country Link
CN (1) CN116433740A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808966A (en) * 2023-12-15 2024-04-02 国网江苏省电力有限公司盐城供电分公司 Underground pipeline three-dimensional reconstruction method based on binocular laser scanning



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination