
CN107392950A - Cross-scale cost aggregation stereo matching method based on weak texture detection - Google Patents

Cross-scale cost aggregation stereo matching method based on weak texture detection

Info

Publication number
CN107392950A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710631310.7A
Other languages
Chinese (zh)
Inventor
卢迪
张美玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology filed Critical Harbin University of Science and Technology
Priority to CN201710631310.7A
Publication of CN107392950A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/337 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The cross-scale cost aggregation stereo matching method based on weak texture detection of the present invention belongs to the field of computer vision, and in particular relates to a stereo matching method for weakly textured images. It comprises the following steps: inputting two color images, the two color images being a left image and a right image respectively, and performing weak texture detection and segmentation on the images using the gradient information of the left image; computing the matching cost from the color information and gradient information of the left and right images; taking the above weak texture detection and segmentation result as a reference, performing intra-scale and cross-scale cost aggregation based on Gaussian filtering; computing the disparity with a winner-takes-all strategy; refining the disparity using left-right consistency detection and an adaptive-weight-based method, and outputting the disparity map. On the premise of preserving the matching accuracy of textured regions, the invention improves the matching accuracy of weak texture regions and achieves the technical purpose of obtaining a better disparity map.

Description

Cross-scale cost aggregation stereo matching method based on weak texture detection
Technical Field
The invention discloses a cross-scale cost aggregation stereo matching method based on weak texture detection, which belongs to the field of computer vision and particularly relates to a stereo matching method for weakly textured images.
Background
Binocular stereo vision is an important branch of computer vision. Based on the parallax principle, an imaging device acquires two images of the object under measurement from different positions, and the three-dimensional geometric information of the object is recovered by computing the positional offset between corresponding points of the two images. The quality of the recovered three-dimensional information depends mainly on the accuracy of the disparity map obtained by stereo matching. The main difficulties of stereo matching are external factors such as uneven illumination and overexposure, and image characteristics that are hard for a computer to distinguish, such as occlusion, weak texture and repetitive texture. Although stereo matching has been studied by many scholars for years, matching in weakly textured regions remains a difficulty in the field of image processing. Improving the matching accuracy of weak texture regions, and thereby obtaining a better disparity map, without sacrificing the matching accuracy of textured regions is therefore a major problem.
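For context, the parallax principle mentioned above reduces to the standard rectified-stereo triangulation relation (general background, not stated explicitly in this patent): with focal length f, baseline B and disparity d, the depth of a point is

```latex
% Standard rectified binocular triangulation; general background,
% f, B and d are not symbols defined in this patent.
Z = \frac{f \cdot B}{d}
```

This is why the accuracy of the disparity map directly bounds the accuracy of the recovered three-dimensional geometry.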
Disclosure of Invention
The invention provides a cross-scale cost aggregation stereo matching method based on weak texture detection, which improves the matching accuracy of weak texture regions and obtains a better disparity map on the premise of preserving the matching accuracy of textured regions.
The purpose of the invention is achieved as follows:
a cross-scale cost aggregation stereo matching method based on weak texture detection comprises the following steps:
step a, inputting two color images, the two color images being a left image and a right image respectively, and performing weak texture detection and segmentation on the images using the gradient information of the left image;
step b, computing the matching cost from the color information and the gradient information of the left and right images;
step c, taking the weak texture detection and segmentation result of step a as a reference, performing intra-scale and cross-scale cost aggregation based on Gaussian filtering;
step d, computing the disparity using a winner-takes-all (WTA) strategy;
step e, refining the disparity using left-right consistency detection and an adaptive-weight method, and outputting the disparity map.
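Step d's winner-takes-all rule selects, for each pixel, the disparity hypothesis with the lowest aggregated cost. A minimal sketch in Python; the H x W x D cost-volume layout is an assumption of this sketch, not something the patent specifies:

```python
import numpy as np

def winner_takes_all(cost_volume):
    """Step d: per pixel, pick the disparity index with minimal aggregated cost.

    cost_volume: (H, W, D) array, cost_volume[y, x, d] = aggregated cost of
    assigning disparity d to pixel (x, y). Returns an (H, W) disparity map.
    """
    return np.argmin(cost_volume, axis=2)

# Toy usage: a 2x2 image with 3 disparity hypotheses per pixel.
cost = np.array([[[3., 1., 2.], [0., 5., 4.]],
                 [[2., 2., 1.], [7., 0., 3.]]])
print(winner_takes_all(cost))  # [[1 0]
                               #  [2 1]]
```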
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the weak texture detection and segmentation of step a is specifically:
comparing the gradient value g(x, y) of the pixel at coordinate (x, y) of the left image with a gradient threshold g_T to judge whether the pixel belongs to a weak texture region, the criterion and the gradient being computed as:
g(x, y) < g_T
g(x, y) = (1/M) · Σ_{(u,v)∈N(x,y)} ( |I(u, v) − I(u+1, v)| + |I(u, v) − I(u, v+1)| )
where N(x, y) denotes a window centered on pixel (x, y), M is the number of pixels in the window, and I(x, y) is the gray value of the pixel.
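A minimal sketch of this detector, assuming a grayscale float image; the window size and threshold are illustrative choices, since the patent does not fix g_T or the size of N(x, y):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def weak_texture_mask(gray, win=9, g_threshold=4.0):
    """Flag weak-texture pixels: g(x, y) < g_threshold, where g is the mean
    absolute horizontal + vertical gray difference over a win x win window.
    win and g_threshold are illustrative; the patent does not fix them."""
    gray = gray.astype(np.float64)
    # |I(u, v) - I(u+1, v)| and |I(u, v) - I(u, v+1)| (last row/column repeated)
    gx = np.abs(np.diff(gray, axis=1, append=gray[:, -1:]))
    gy = np.abs(np.diff(gray, axis=0, append=gray[-1:, :]))
    # (1/M) * sum over the window N(x, y) = a box (uniform) filter
    g = uniform_filter(gx + gy, size=win)
    return g < g_threshold  # True = weak texture region
```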
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the computation of the matching cost in step b is specifically:
for the left image I_L and the right image I_R of a stereoscopic color image pair, computing the matching cost C(p, d) as:
C(p, d) = (1 − α) · C_AD(p, d) + α · (C_grad_x(p, d) + C_grad_y(p, d))
C_AD(p, d) = min( (1/3) · Σ_{i=R,G,B} |I_L^i(p) − I_R^i(p, d)|, T_AD )
C_grad_x(p, d) = min( |∇_x I_L(p) − ∇_x I_R(p, d)|, T_grad )
C_grad_y(p, d) = min( |∇_y I_L(p) − ∇_y I_R(p, d)|, T_grad )
where p is a point in the left image; i = R, G, B index the three channels of the color image; T_AD and T_grad are the truncation thresholds for color and gradient respectively; ∇_x and ∇_y are the gradient operators of the picture in the x and y directions respectively; and α is a balance factor between the color difference and the gradient difference.
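A sketch of this cost for a single disparity hypothesis d, with the left image as reference; the values of α, T_AD and T_grad are illustrative (the patent does not specify them), and image borders are handled by a simple wrap-around for brevity:

```python
import numpy as np

def matching_cost(left, right, d, alpha=0.1, t_ad=7.0, t_grad=2.0):
    """C(p, d) = (1 - alpha) * C_AD + alpha * (C_grad_x + C_grad_y).

    left, right: float RGB arrays of shape (H, W, 3); d: integer disparity.
    alpha, t_ad and t_grad are illustrative, not values from the patent.
    """
    # I_R(x - d, y): shift the right image by d pixels (wrap-around at border)
    right_d = np.roll(right, d, axis=1)

    # Truncated absolute color difference, averaged over R, G, B
    c_ad = np.minimum(np.abs(left - right_d).mean(axis=2), t_ad)

    def grad(img, axis):
        """Forward-difference gradient of the gray image along one axis."""
        g = img.mean(axis=2)
        return np.diff(g, axis=axis, append=np.take(g, [-1], axis=axis))

    # Truncated gradient differences in the x and y directions
    c_gx = np.minimum(np.abs(grad(left, 1) - grad(right_d, 1)), t_grad)
    c_gy = np.minimum(np.abs(grad(left, 0) - grad(right_d, 0)), t_grad)

    return (1 - alpha) * c_ad + alpha * (c_gx + c_gy)
```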
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the cost aggregation in step c is specifically:
C̃(p, d) = argmin_z Σ_{q∈N} W(p, q) · ||z − C(q, d)||², i.e. C̃(p, d) = Σ_{q∈N} W(p, q) · C(q, d)
where C̃(p, d) denotes the aggregated matching cost, z is the optimization target value, W is a Gaussian filter kernel, N is the neighborhood window of pixel p, q is a neighborhood pixel of p, and s ∈ {0, 1, ..., S} is the scale parameter; for s = 0, C^0 is the matching cost at the original image scale, and the aggregated cost over the S + 1 scales of the image is
C̃^s(p^s, d^s) = Σ_{q^s∈N_s} W(p^s, q^s) · C^s(q^s, d^s)
Coupling the scales with a regularization term gives
ṽ = argmin_{{z^s}, s=0..S} ( Σ_{s=0}^{S} Σ_{q^s∈N_s} W(p^s, q^s) · ||z^s − C^s(q^s, d^s)||² + λ · Σ_{s=1}^{S} ||z^s − z^{s−1}||² )
where λ is a regularization factor; the optimum v̂ of this objective is obtained by solving the tridiagonal linear system A · v̂ = ṽ, so v̂ = A⁻¹ · ṽ. Finally, with T_high and T_low denoting the detected texture region and weak texture region respectively, and C_1 and C_{1/2} denoting the matching cost at the original image scale and at half scale respectively, Gaussian filtering is performed with windows of different sizes and the results are fused into the final matching cost:
C̃_final(p, d) = T_high · Σ_{q∈N} W_1(p, q) · C_1(q, d) + T_low · Σ_{q∈N} W_2(p, q) · C_{1/2}(q, d)
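Two sketches of this step. The first performs the intra-scale Gaussian aggregation at two window sizes and fuses the results with the weak-texture mask, in the spirit of the C̃_final formula (for brevity the same cost volume stands in for both C_1 and C_{1/2}); the second solves the tridiagonal system A·v̂ = ṽ that couples the scales for one pixel and disparity. The sigma values and λ are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuse_aggregated_costs(cost, weak_mask, sigma_fine=1.0, sigma_coarse=3.0):
    """Intra-scale aggregation = Gaussian filtering of each disparity slice.
    Textured pixels keep the small-window result (W_1), weak-texture pixels
    the large-window result (W_2), echoing the T_high / T_low fusion.
    Sigma values are illustrative, not fixed by the patent."""
    fine = gaussian_filter(cost, sigma=(sigma_fine, sigma_fine, 0))
    coarse = gaussian_filter(cost, sigma=(sigma_coarse, sigma_coarse, 0))
    return np.where(weak_mask[..., None], coarse, fine)

def cross_scale_solve(c_tilde, lam=0.3):
    """Solve A v_hat = v_tilde for one (pixel, disparity) across S+1 scales.

    c_tilde: length-(S+1) vector of per-scale aggregated costs C~^s.
    A is tridiagonal with 1+lam at the corners, 1+2*lam inside and -lam on
    the off-diagonals; lam is an illustrative regularization factor."""
    n = len(c_tilde)
    A = np.zeros((n, n))
    for s in range(n):
        A[s, s] = 1 + (lam if s in (0, n - 1) else 2 * lam)
        if s > 0:
            A[s, s - 1] = -lam
        if s < n - 1:
            A[s, s + 1] = -lam
    return np.linalg.solve(A, np.asarray(c_tilde, dtype=float))
```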
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the disparity refinement in step e is specifically:
|D'_L(p) − D'_R(p − D'_L(p))| < δ
D_LRC(p) = min( D'(p_L), D'(p_R) )
D_w(p) = Σ_q WB_pq(I_L) · D_LRC(q)
WB_pq = exp( −( Δc_pq/σ_c² + Δs_pq/σ_s² ) )
where D'_L(p) is the left-image disparity value at a point p of the disparity map and D'_R(p − D'_L(p)) the corresponding right-image disparity value; δ is the LRC threshold; D'(p_L) is the disparity value of the first non-occluded point to the left and D'(p_R) that of the first non-occluded point to the right; WB_pq(I_L) is the adaptive weight function computed on the left image; Δc_pq and Δs_pq are respectively the color difference and the spatial Euclidean distance of points p and q in the left image; σ_c and σ_s are the adjustment parameters for the color difference and the distance difference; and D_w(p) is the filtered disparity image.
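A sketch of step e: the left-right consistency (LRC) check with min-filling, followed by the adaptive-weight filtering D_w. The threshold δ, window radius and σ values are illustrative, and the weights are normalized here, a common practical choice the text does not spell out:

```python
import numpy as np

def lrc_fill(disp_l, disp_r, delta=1.0):
    """Invalidate p where |D'_L(p) - D'_R(p - D'_L(p))| >= delta, then set
    D_LRC(p) = min(D'(p_L), D'(p_R)): the smaller of the nearest valid
    disparities to the left and to the right on the same row."""
    h, w = disp_l.shape
    xs = np.arange(w)[None, :]
    x_r = np.clip(xs - disp_l.astype(int), 0, w - 1)
    valid = np.abs(disp_l - np.take_along_axis(disp_r, x_r, axis=1)) < delta
    out = disp_l.astype(float)
    for y in range(h):
        for x in np.flatnonzero(~valid[y]):
            lv = out[y, :x][valid[y, :x]]          # valid points to the left
            rv = out[y, x + 1:][valid[y, x + 1:]]  # valid points to the right
            cand = ([lv[-1]] if lv.size else []) + ([rv[0]] if rv.size else [])
            if cand:
                out[y, x] = min(cand)
    return out

def adaptive_weight_filter(disp, guide, radius=5, sigma_c=10.0, sigma_s=9.0):
    """D_w(p) = sum_q WB_pq(I_L) * D_LRC(q), with
    WB_pq = exp(-(dc_pq/sigma_c^2 + ds_pq/sigma_s^2)); normalized sum."""
    acc = np.zeros(disp.shape, dtype=float)
    norm = np.zeros(disp.shape, dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            d_q = np.roll(disp, (dy, dx), axis=(0, 1))
            g_q = np.roll(guide, (dy, dx), axis=(0, 1))
            dc = np.abs(guide.astype(float) - g_q).sum(axis=2)  # color diff
            ds = float(np.hypot(dy, dx))                        # spatial dist
            wgt = np.exp(-(dc / sigma_c**2 + ds / sigma_s**2))
            acc += wgt * d_q
            norm += wgt
    return acc / norm
```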
Advantageous effects:
The invention discloses a cross-scale cost aggregation stereo matching method based on weak texture detection and provides a novel stereo matching algorithm.
Stereo matching image pairs processed by the algorithm of this embodiment obtain good results in both the textured and the weakly textured regions of the image, and the mismatching rate is reduced (5% lower than without the weak texture region segmentation algorithm). On the premise of preserving the matching accuracy of textured regions, the algorithm of this embodiment improves the matching accuracy of weak texture regions and obtains a better disparity map.
Drawings
Fig. 1 is a flowchart of a cross-scale cost aggregation stereo matching method based on weak texture detection.
Fig. 2 is a Bowling1 disparity map.
Fig. 3 is a Lampshade1 disparity map.
Fig. 4 is a Monopoly disparity map.
FIG. 5 is a Plastic disparity map.
Detailed Description
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
Detailed description of the preferred embodiment
A cross-scale cost aggregation stereo matching method based on weak texture detection, as shown in FIG. 1, comprises the following steps:
step a, inputting two color images, the two color images being a left image and a right image respectively, and performing weak texture detection and segmentation on the images using the gradient information of the left image;
step b, computing the matching cost from the color information and the gradient information of the left and right images;
step c, taking the weak texture detection and segmentation result of step a as a reference, performing intra-scale and cross-scale cost aggregation based on Gaussian filtering;
step d, computing the disparity using a winner-takes-all (WTA) strategy;
step e, refining the disparity using left-right consistency detection and an adaptive-weight method, and outputting the disparity map.
Following the above steps, four image pairs are selected for comparison, as shown in Figures 2, 3, 4 and 5.
In FIG. 2, FIG. 2(a) is the left image of Bowling1; FIG. 2(b) is the Bowling1 true disparity map; FIG. 2(c) is the Bowling1 weak texture detection result; FIG. 2(d) is the Bowling1 final disparity map; FIG. 2(e) is the Bowling1 disparity map obtained without weak texture detection.
In FIG. 3, FIG. 3(a) is the left image of Lampshade1; FIG. 3(b) is the Lampshade1 true disparity map; FIG. 3(c) is the Lampshade1 weak texture detection result; FIG. 3(d) is the Lampshade1 final disparity map; FIG. 3(e) is the Lampshade1 disparity map obtained without weak texture detection.
In FIG. 4, FIG. 4(a) is the left image of Monopoly; FIG. 4(b) is the Monopoly true disparity map; FIG. 4(c) is the Monopoly weak texture detection result; FIG. 4(d) is the Monopoly final disparity map; FIG. 4(e) is the Monopoly disparity map obtained without weak texture detection.
In FIG. 5, FIG. 5(a) is the left image of Plastic; FIG. 5(b) is the Plastic true disparity map; FIG. 5(c) is the Plastic weak texture detection result; FIG. 5(d) is the Plastic final disparity map; FIG. 5(e) is the Plastic disparity map obtained without weak texture detection.
The disparity maps in FIGS. 2(a) to 5(e) are evaluated subjectively in terms of visual effect. In FIGS. 2(c) to 5(c), the black regions indicate the detected weak texture regions and the white regions indicate the textured regions. Comparing the disparity maps shows that, in the weak texture regions, the results obtained with the algorithm of this embodiment are clearly better than those obtained without weak texture detection.
The method of the invention is also evaluated with objective indexes.
Table 1 shows the mismatching rate of the two algorithms on the 4 image pairs of the Middlebury image set that have distinct weak texture regions.
TABLE 1
As can be seen from Table 1, in the tests in which the two algorithms process the stereo matching image pairs, the mismatching rate of the algorithm of this embodiment is 5% lower than that of the algorithm without weak texture detection and segmentation. On the premise of preserving the matching accuracy of textured regions, the algorithm of this embodiment improves the matching accuracy of weak texture regions and obtains a better disparity map.
Detailed description of the invention
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the weak texture detection and segmentation of step a is specifically:
comparing the gradient value g(x, y) of the pixel at coordinate (x, y) of the left image with a gradient threshold g_T to judge whether the pixel belongs to a weak texture region, the criterion and the gradient being computed as:
g(x, y) < g_T
g(x, y) = (1/M) · Σ_{(u,v)∈N(x,y)} ( |I(u, v) − I(u+1, v)| + |I(u, v) − I(u, v+1)| )
where N(x, y) denotes a window centered on pixel (x, y), M is the number of pixels in the window, and I(x, y) is the gray value of the pixel.
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the computation of the matching cost in step b is specifically:
for the left image I_L and the right image I_R of a stereoscopic color image pair, computing the matching cost C(p, d) as:
C(p, d) = (1 − α) · C_AD(p, d) + α · (C_grad_x(p, d) + C_grad_y(p, d))
C_AD(p, d) = min( (1/3) · Σ_{i=R,G,B} |I_L^i(p) − I_R^i(p, d)|, T_AD )
C_grad_x(p, d) = min( |∇_x I_L(p) − ∇_x I_R(p, d)|, T_grad )
C_grad_y(p, d) = min( |∇_y I_L(p) − ∇_y I_R(p, d)|, T_grad )
where p is a point in the left image; i = R, G, B index the three channels of the color image; T_AD and T_grad are the truncation thresholds for color and gradient respectively; ∇_x and ∇_y are the gradient operators of the picture in the x and y directions respectively; and α is a balance factor between the color difference and the gradient difference.
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the cost aggregation in step c is specifically:
C̃(p, d) = argmin_z Σ_{q∈N} W(p, q) · ||z − C(q, d)||², i.e. C̃(p, d) = Σ_{q∈N} W(p, q) · C(q, d)
where C̃(p, d) denotes the aggregated matching cost, z is the optimization target value, W is a Gaussian filter kernel, N is the neighborhood window of pixel p, q is a neighborhood pixel of p, and s ∈ {0, 1, ..., S} is the scale parameter; for s = 0, C^0 is the matching cost at the original image scale, and the aggregated cost over the S + 1 scales of the image is
C̃^s(p^s, d^s) = Σ_{q^s∈N_s} W(p^s, q^s) · C^s(q^s, d^s)
Coupling the scales with a regularization term gives
ṽ = argmin_{{z^s}, s=0..S} ( Σ_{s=0}^{S} Σ_{q^s∈N_s} W(p^s, q^s) · ||z^s − C^s(q^s, d^s)||² + λ · Σ_{s=1}^{S} ||z^s − z^{s−1}||² )
where λ is a regularization factor; the optimum v̂ of this objective is obtained by solving the tridiagonal linear system A · v̂ = ṽ, so v̂ = A⁻¹ · ṽ. Finally, with T_high and T_low denoting the detected texture region and weak texture region respectively, and C_1 and C_{1/2} denoting the matching cost at the original image scale and at half scale respectively, Gaussian filtering is performed with windows of different sizes and the results are fused into the final matching cost:
C̃_final(p, d) = T_high · Σ_{q∈N} W_1(p, q) · C_1(q, d) + T_low · Σ_{q∈N} W_2(p, q) · C_{1/2}(q, d)
In the above cross-scale cost aggregation stereo matching method based on weak texture detection, the disparity refinement in step e is specifically:
|D'_L(p) − D'_R(p − D'_L(p))| < δ
D_LRC(p) = min( D'(p_L), D'(p_R) )
D_w(p) = Σ_q WB_pq(I_L) · D_LRC(q)
WB_pq = exp( −( Δc_pq/σ_c² + Δs_pq/σ_s² ) )
where D'_L(p) is the left-image disparity value at a point p of the disparity map and D'_R(p − D'_L(p)) the corresponding right-image disparity value; δ is the LRC threshold; D'(p_L) is the disparity value of the first non-occluded point to the left and D'(p_R) that of the first non-occluded point to the right; WB_pq(I_L) is the adaptive weight function computed on the left image; Δc_pq and Δs_pq are respectively the color difference and the spatial Euclidean distance of points p and q in the left image; σ_c and σ_s are the adjustment parameters for the color difference and the distance difference; and D_w(p) is the filtered disparity image.

Claims (5)

1. A cross-scale cost aggregation stereo matching method based on weak texture detection, characterized by comprising the following steps:
step a, inputting two color images, the two color images being a left image and a right image respectively, and performing weak texture detection and segmentation on the images using the gradient information of the left image;
step b, computing the matching cost from the color information and the gradient information of the left and right images;
step c, taking the weak texture detection and segmentation result of step a as a reference, performing intra-scale and cross-scale cost aggregation based on Gaussian filtering;
step d, computing the disparity using a winner-takes-all strategy;
step e, refining the disparity using left-right consistency detection and an adaptive-weight method, and outputting the disparity map.
2. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the weak texture detection and segmentation of the picture in step a is specifically:
comparing the gradient value g(x, y) of the pixel at coordinate (x, y) of the left image with a gradient threshold g_T to judge whether the pixel belongs to a weak texture region, the computation being:
g(x, y) < g_T
g(x, y) = (1/M) · Σ_{(u,v)∈N(x,y)} ( |I(u, v) − I(u+1, v)| + |I(u, v) − I(u, v+1)| )
where N(x, y) denotes a window centered on pixel (x, y), M is the number of pixels in the window, and I(x, y) is the gray value of the pixel.
3. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the computation of the matching cost in step b is specifically:
for the left image I_L and the right image I_R of a stereoscopic color image pair, computing the matching cost C(p, d) as:
C(p, d) = (1 − α) · C_AD(p, d) + α · (C_grad_x(p, d) + C_grad_y(p, d))
C_AD(p, d) = min( (1/3) · Σ_{i=R,G,B} |I_L^i(p) − I_R^i(p, d)|, T_AD )
C_grad_x(p, d) = min( |∇_x I_L(p) − ∇_x I_R(p, d)|, T_grad )
C_grad_y(p, d) = min( |∇_y I_L(p) − ∇_y I_R(p, d)|, T_grad )
where p is a point in the left image; i = R, G, B index the three channels of the color image; T_AD and T_grad are the truncation thresholds for color and gradient respectively; ∇_x and ∇_y are the gradient operators of the picture in the x and y directions respectively; and α is a balance factor between the color difference and the gradient difference.
4. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the cost aggregation in step c is specifically:
C̃(p, d) = argmin_z Σ_{q∈N} W(p, q) · ||z − C(q, d)||²
C̃(p, d) = Σ_{q∈N} W(p, q) · C(q, d)
ṽ = argmin_{{z^s}, s=0..S} Σ_{s=0}^{S} Σ_{q^s∈N_s} W(p^s, q^s) · ||z^s − C^s(q^s, d^s)||²
where C̃(p, d) denotes the aggregated matching cost, z is the optimization target value, W is a Gaussian filter kernel, N is the neighborhood window of pixel p, q is a neighborhood pixel of p, and s ∈ {0, 1, ..., S} is the scale parameter; for s = 0, C^0 is the matching cost at the original image scale; the aggregated cost over the S + 1 scales of the image is
C̃^s(p^s, d^s) = Σ_{q^s∈N_s} W(p^s, q^s) · C^s(q^s, d^s)
ṽ = argmin_{{z^s}, s=0..S} ( Σ_{s=0}^{S} Σ_{q^s∈N_s} W(p^s, q^s) · ||z^s − C^s(q^s, d^s)||² + λ · Σ_{s=1}^{S} ||z^s − z^{s−1}||² )
where λ is a regularization factor; setting the derivative of the optimization objective with respect to each z^s to zero gives the linear system
(1 + λ) · z^0 − λ · z^1 = C̃^0(p^0, d^0),  s = 0
−λ · z^{s−1} + (1 + 2λ) · z^s − λ · z^{s+1} = C̃^s(p^s, d^s),  s = 1, 2, ..., S − 1
−λ · z^{S−1} + (1 + λ) · z^S = C̃^S(p^S, d^S),  s = S
which in matrix form is A · v̂ = ṽ, with the (S + 1) × (S + 1) tridiagonal matrix
A = [ 1+λ   −λ                      ]
    [ −λ    1+2λ   −λ               ]
    [       ...    ...    ...       ]
    [              −λ     1+2λ  −λ  ]
    [                     −λ    1+λ ]
so that v̂ = A⁻¹ · ṽ; finally, Gaussian filtering is performed with windows of different sizes and the final matching cost is obtained after fusion:
C̃_final(p, d) = T_high · Σ_{q∈N} W_1(p, q) · C_1(q, d) + T_low · Σ_{q∈N} W_2(p, q) · C_{1/2}(q, d)
where T_high and T_low denote the detected texture region and weak texture region respectively, and C_1 and C_{1/2} denote the matching cost at the original image scale and at half scale respectively.
5. The cross-scale cost aggregation stereo matching method based on weak texture detection according to claim 1, characterized in that the disparity refinement in step e is specifically:
|D'_L(p) − D'_R(p − D'_L(p))| < δ
D_LRC(p) = min( D'(p_L), D'(p_R) )
D_w(p) = Σ_q WB_pq(I_L) · D_LRC(q)
WB_pq = exp( −( Δc_pq/σ_c² + Δs_pq/σ_s² ) )
where D'_L(p) is the left-image disparity value at a point p of the disparity map and D'_R(p − D'_L(p)) the corresponding right-image disparity value; δ is the LRC threshold; D'(p_L) is the disparity value of the first non-occluded point to the left and D'(p_R) that of the first non-occluded point to the right; WB_pq(I_L) is the adaptive weight function computed on the left image; Δc_pq and Δs_pq are respectively the color difference and the spatial Euclidean distance of points p and q in the left image; σ_c and σ_s are the adjustment parameters for the color difference and the distance difference; and D_w(p) is the filtered disparity image.
CN201710631310.7A 2017-07-28 2017-07-28 Cross-scale cost aggregation stereo matching method based on weak texture detection Pending CN107392950A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710631310.7A CN107392950A (en) 2017-07-28 2017-07-28 Cross-scale cost aggregation stereo matching method based on weak texture detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710631310.7A CN107392950A (en) 2017-07-28 2017-07-28 Cross-scale cost aggregation stereo matching method based on weak texture detection

Publications (1)

Publication Number Publication Date
CN107392950A true CN107392950A (en) 2017-11-24

Family

ID=60342086

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710631310.7A Pending CN107392950A (en) 2017-07-28 2017-07-28 Cross-scale cost aggregation stereo matching method based on weak texture detection

Country Status (1)

Country Link
CN (1) CN107392950A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107945222A (en) * 2017-12-15 2018-04-20 东南大学 A new stereo matching cost computation and disparity post-processing method
CN108181319A (en) * 2017-12-12 2018-06-19 陕西三星洁净工程有限公司 A dust deposition detection device and method based on stereoscopic vision
CN108510529A (en) * 2018-03-14 2018-09-07 昆明理工大学 A graph-cut stereo matching method based on adaptive weights
CN108596975A (en) * 2018-04-25 2018-09-28 华南理工大学 A stereo matching algorithm for weak texture regions
CN108682026A (en) * 2018-03-22 2018-10-19 辽宁工业大学 A binocular vision stereo matching method based on multi-matching-element fusion
CN108765486A (en) * 2018-05-17 2018-11-06 长春理工大学 A stereo matching method based on color-correlation sparse block aggregation
CN109816782A (en) * 2019-02-03 2019-05-28 哈尔滨理工大学 An indoor scene three-dimensional reconstruction method based on binocular vision
CN109887021A (en) * 2019-01-19 2019-06-14 天津大学 A cross-scale random walk stereo matching method
CN109961417A (en) * 2017-12-26 2019-07-02 广州极飞科技有限公司 Image processing method, device and mobile device control method
CN111191694A (en) * 2019-12-19 2020-05-22 浙江科技学院 Image stereo matching method
CN111508013A (en) * 2020-04-21 2020-08-07 中国科学技术大学 Stereo matching method
CN112070694A (en) * 2020-09-03 2020-12-11 深兰人工智能芯片研究院(江苏)有限公司 Binocular stereo vision disparity map post-processing method and device
WO2021018093A1 (en) * 2019-07-31 2021-02-04 深圳市道通智能航空技术有限公司 Stereo matching method, image processing chip, and moving carrier

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105551035A (en) * 2015-12-09 2016-05-04 深圳市华和瑞智科技有限公司 Stereoscopic vision matching method based on weak edge and texture classification
CN106340036A (en) * 2016-08-08 2017-01-18 东南大学 Binocular stereoscopic vision-based stereo matching method
CN106530336A (en) * 2016-11-07 2017-03-22 湖南源信光电科技有限公司 Stereo matching algorithm based on color information and graph-cut theory

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105551035A (en) * 2015-12-09 2016-05-04 深圳市华和瑞智科技有限公司 Stereoscopic vision matching method based on weak edge and texture classification
CN106340036A (en) * 2016-08-08 2017-01-18 东南大学 Binocular stereoscopic vision-based stereo matching method
CN106530336A (en) * 2016-11-07 2017-03-22 湖南源信光电科技有限公司 Stereo matching algorithm based on color information and graph-cut theory

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
张华 等 [Zhang Hua et al.]: "基于跨尺度变窗口代价聚合的快速立体匹配" [Fast stereo matching based on cross-scale variable-window cost aggregation], 《计算机工程与应用》 [Computer Engineering and Applications] *
曹晓倩 等 [Cao Xiaoqian et al.]: "基于弱纹理检测及视差图融合的立体匹配" [Stereo matching based on weak texture detection and disparity map fusion], 《仪器仪表学报》 [Chinese Journal of Scientific Instrument] *
林雪 [Lin Xue]: "双目立体视觉中立体匹配技术研究" [Research on stereo matching technology in binocular stereo vision], 《中国优秀硕士学位论文全文数据库 信息科技辑》 [China Masters' Theses Full-text Database, Information Science and Technology] *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108181319A (en) * 2017-12-12 2018-06-19 陕西三星洁净工程有限公司 A dust deposition detection device and method based on stereoscopic vision
CN108181319B (en) * 2017-12-12 2020-09-11 陕西三星洁净工程有限公司 Accumulated dust detection device and method based on stereoscopic vision
CN107945222A (en) * 2017-12-15 2018-04-20 东南大学 A new stereo matching cost computation and disparity post-processing method
CN109961417A (en) * 2017-12-26 2019-07-02 广州极飞科技有限公司 Image processing method, device and mobile device control method
CN108510529A (en) * 2018-03-14 2018-09-07 昆明理工大学 A graph-cut stereo matching method based on adaptive weights
CN108682026A (en) * 2018-03-22 2018-10-19 辽宁工业大学 A binocular vision stereo matching method based on multi-matching-element fusion
CN108682026B (en) * 2018-03-22 2021-08-06 江大白 Binocular vision stereo matching method based on multi-matching element fusion
CN108596975A (en) * 2018-04-25 2018-09-28 华南理工大学 A stereo matching algorithm for weak texture regions
CN108596975B (en) * 2018-04-25 2022-03-29 华南理工大学 Stereo matching algorithm for weak texture region
CN108765486A (en) * 2018-05-17 2018-11-06 长春理工大学 A stereo matching method based on color-correlation sparse block aggregation
CN109887021A (en) * 2019-01-19 2019-06-14 天津大学 A cross-scale random walk stereo matching method
CN109887021B (en) * 2019-01-19 2023-06-06 天津大学 Cross-scale-based random walk stereo matching method
CN109816782A (en) * 2019-02-03 2019-05-28 哈尔滨理工大学 An indoor scene three-dimensional reconstruction method based on binocular vision
WO2021018093A1 (en) * 2019-07-31 2021-02-04 深圳市道通智能航空技术有限公司 Stereo matching method, image processing chip, and moving carrier
CN111191694A (en) * 2019-12-19 2020-05-22 浙江科技学院 Image stereo matching method
CN111508013A (en) * 2020-04-21 2020-08-07 中国科学技术大学 Stereo matching method
CN111508013B (en) * 2020-04-21 2022-09-06 中国科学技术大学 Stereo matching method
CN112070694A (en) * 2020-09-03 2020-12-11 深兰人工智能芯片研究院(江苏)有限公司 Binocular stereo vision disparity map post-processing method and device

Similar Documents

Publication Publication Date Title
CN107392950A (en) Cross-scale cost aggregation stereo matching method based on weak texture detection
CN105528785B (en) A kind of binocular vision image solid matching method
CN110473217B (en) Binocular stereo matching method based on Census transformation
CN107578418B (en) Indoor scene contour detection method fusing color and depth information
CN108682026B (en) Binocular vision stereo matching method based on multi-matching element fusion
CN108596975B (en) Stereo matching algorithm for weak texture region
CN104574366B (en) A kind of extracting method in the vision significance region based on monocular depth figure
CN102665086B (en) Method for obtaining parallax by using region-based local stereo matching
US8953873B2 (en) Method for objectively evaluating quality of stereo image
CN104835175B (en) Object detection method in a kind of nuclear environment of view-based access control model attention mechanism
CN109345502B (en) Stereo image quality evaluation method based on disparity map stereo structure information extraction
CN102750695A (en) Machine learning-based stereoscopic image quality objective assessment method
CN106973288B (en) A kind of three-dimensional video-frequency Comfort Evaluation method and device
CN107578430A (en) A kind of solid matching method based on adaptive weight and local entropy
CN108010075B (en) Local stereo matching method based on multi-feature combination
CN103026380A (en) Image processing apparatus and image processing method
CN107610093B (en) Full-reference image quality evaluation method based on similarity feature fusion
CN103325120A (en) Rapid self-adaption binocular vision stereo matching method capable of supporting weight
CN110866882B (en) Layered joint bilateral filtering depth map repairing method based on depth confidence
CN105654142A (en) Natural scene statistics-based non-reference stereo image quality evaluation method
CN114998320B (en) Method, system, electronic device and storage medium for visual saliency detection
CN105898278B (en) A kind of three-dimensional video-frequency conspicuousness detection method based on binocular Multidimensional Awareness characteristic
CN110246111A (en) Based on blending image with reinforcing image without reference stereo image quality evaluation method
CN106530336A (en) Stereo matching algorithm based on color information and graph-cut theory
CN114648482A (en) Quality evaluation method and system for three-dimensional panoramic image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20171124