
CN101447077A - Edge detection method of color textile texture image oriented to textile industry - Google Patents

Edge detection method of color textile texture image oriented to textile industry

Info

Publication number
CN101447077A
CN101447077A · CNA2008101633896A · CN200810163389A
Authority
CN
China
Prior art keywords
color
pixel
window
textile
candidate edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2008101633896A
Other languages
Chinese (zh)
Inventor
陆系群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CNA2008101633896A priority Critical patent/CN101447077A/en
Publication of CN101447077A publication Critical patent/CN101447077A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an edge detection method for color textile texture images oriented to the textile industry. Based on the observation that the human visual system perceives the color of an image as influenced by the colors of the surrounding neighboring pixels, the color differences between the upper and lower halves, and between the left and right halves, of a sliding window centered on each pixel are computed and taken as the pixel's horizontal and vertical partial derivatives. The partial derivatives of the three color components in the horizontal and vertical directions form a 3×2 matrix D; the maximum eigenvalue of the matrix D^T D and the corresponding eigenvector are computed and taken as the pixel's gradient magnitude and gradient direction, respectively. The edge map of the image is then computed and thinned from the pixel gradient magnitudes and directions. The method resists the interference of texture noise in color textile images, improves the accuracy of edge detection of artistic patterns in the images, provides basic edge information for further pattern editing and design, and achieves an automatic tracing function for the textile industry.

Description

Edge detection method for color textile texture images oriented to the textile industry
Technical field
The present invention relates to an edge detection method for color textile texture images oriented to the textile industry.
Background technology
Extracting the edges of artistic patterns in printed and dyed textile cloth is an indispensable preliminary step for pattern editing and design in the textile printing and dyeing industry. However, because color halftone techniques are used to reduce production costs, the digital images obtained by scanning textile cloth samples contain uniformly distributed texture noise. Traditional image edge detection methods, such as the Sobel operator, the Canny operator, the LoG (Laplacian of Gaussian) operator and the Susan operator, consider only the difference between two adjacent horizontal or vertical pixels when computing the image gradient. The texture noise introduces color differences between neighboring pixels even in regions of uniform color, so many spurious edges are detected and become mixed with the true edges of the image, making it very difficult for a computer to automatically extract the edges of artistic patterns in the cloth. Moreover, these traditional operators act only on grayscale images; for color textile texture images they are applied to the luminance component of the input image, losing the correlation between the color components of the image.
Summary of the invention
The first technical problem to be solved by the present invention is to provide an edge detection method for color textile texture images oriented to the textile industry that improves the accuracy of edge detection in textile texture images. To this end, the present invention adopts the following technical solution, which comprises the following steps:
(1) scan a color textile sample cloth to obtain a digital image of the color textile sample cloth, and determine a sliding window that is isotropic about its center point, the width of the window being an integer multiple of the pixel spacing of the digital image;
(2) compute the color difference between the upper half and the lower half of the window centered on a pixel of the digital image, and the color difference between the left half and the right half of the window:
$$d_y^c = \frac{1}{w(2w+1)}\sum_{i=y+1}^{y+w}\sum_{j=x-w}^{x+w} I^c(i,j) \;-\; \frac{1}{w(2w+1)}\sum_{i=y-w}^{y-1}\sum_{j=x-w}^{x+w} I^c(i,j) \qquad (1)$$

$$d_x^c = \frac{1}{w(2w+1)}\sum_{i=y-w}^{y+w}\sum_{j=x+1}^{x+w} I^c(i,j) \;-\; \frac{1}{w(2w+1)}\sum_{i=y-w}^{y+w}\sum_{j=x-w}^{x-1} I^c(i,j) \qquad (2)$$
where i and j denote the spatial coordinates of a pixel, $I^c(i,j)$ denotes the value of color component c at position (i,j) within the sliding window centered on pixel (y,x), the superscript c denotes the three color components R, G, B, w denotes the radius of the sliding window, and the side length or diameter of the window is 2w+1, an odd multiple of the pixel spacing; $d_y^R$, $d_y^G$, $d_y^B$ denote the color differences of the three color components at pixel (y,x) in the vertical direction, and $d_x^R$, $d_x^G$, $d_x^B$ denote the color differences of the three color components at pixel (y,x) in the horizontal direction; they form a 3×2 matrix D,
$$D = \begin{bmatrix} d_y^R & d_x^R \\ d_y^G & d_x^G \\ d_y^B & d_x^B \end{bmatrix} \qquad (3)$$
compute the maximum eigenvalue of the matrix $D^T D$ and the corresponding eigenvector, and take them as the gradient magnitude and gradient direction of pixel (y,x), respectively, where T denotes matrix transpose;
(3) repeat step (2) to compute the gradient magnitude and gradient direction of all pixels in the digital image, or of selected pixels of the digital image;
(4) compute the mean of the pixel gradient magnitudes and set the mean, or a value corresponding to the mean, as a first threshold; if the gradient magnitude of a pixel is greater than or equal to the first threshold, the pixel is judged to be a candidate edge point, yielding an initial image edge map;
(5) remove isolated candidate edge points;
(6) determine the direction of the color difference of each remaining candidate edge point;
(7) remove candidate edge points whose color-difference direction is isolated;
(8) determine a second neighborhood window and a second threshold; for each remaining candidate edge point, compute within the second neighborhood window the difference between the numbers of candidate edge points falling in the upper half and in the lower half of the second neighborhood window, and the difference between the numbers of candidate edge points falling in the left half and in the right half; if the absolute values of both differences are less than the second threshold, confirm the candidate edge point as a final edge point.
By adopting the above technical solution, the present invention can eliminate the texture noise caused by the halftone techniques of the textile industry, improve the edge detection accuracy for textile texture images, and provide technical support for automatic tracing in the textile industry.
Steps (1) to (4) of the above technical solution also provide an initial edge detection method for color textile texture images. It exploits several characteristics of the human visual system, namely that the human visual system acts as a low-pass filter and that, when observing image color, it does not rely on the color of a single pixel but is influenced by the color distribution of the surrounding neighborhood pixels. By using the color distribution of neighborhood pixels, the interference of texture noise can be eliminated effectively and the accuracy of the gradient computation is improved, laying a good foundation for improving the edge detection accuracy of textile texture images.
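For illustration only (not part of the patent text), the following Python sketch transcribes the half-window color differences of equations (1) and (2), the 3×2 matrix D of equation (3), and the eigen-analysis of D^T D for a single pixel; the function and variable names are my own, and the window is assumed to lie fully inside the image.

```python
import numpy as np

def pixel_gradient(img, y, x, w=2):
    """Gradient magnitude and direction at pixel (y, x) of an RGB image
    img (H x W x 3, float), following equations (1)-(3); the window of
    radius w is assumed to lie fully inside the image."""
    norm = 1.0 / (w * (2 * w + 1))
    D = np.zeros((3, 2))
    for c in range(3):  # R, G, B color components
        I = img[:, :, c]
        # vertical color difference: lower half-window minus upper half-window
        d_y = norm * (I[y + 1:y + w + 1, x - w:x + w + 1].sum()
                      - I[y - w:y, x - w:x + w + 1].sum())
        # horizontal color difference: right half-window minus left half-window
        d_x = norm * (I[y - w:y + w + 1, x + 1:x + w + 1].sum()
                      - I[y - w:y + w + 1, x - w:x].sum())
        D[c] = (d_y, d_x)
    # D^T D is a symmetric 2x2 matrix; eigh returns eigenvalues in ascending order
    eigvals, eigvecs = np.linalg.eigh(D.T @ D)
    magnitude = eigvals[-1]      # maximum eigenvalue -> gradient magnitude
    direction = eigvecs[:, -1]   # corresponding eigenvector -> gradient direction
    return magnitude, direction
```

For example, `mag, direction = pixel_gradient(img.astype(float), 100, 100, w=2)` would return the gradient magnitude and the two-component direction vector at pixel (100, 100).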
Description of drawings
Fig. 1 is a process flow diagram of the present invention.
Fig. 2-a is a color textile texture image;
Fig. 2-b is the edge map of Fig. 2-a detected with the Sobel operator;
Fig. 2-c is the edge map of Fig. 2-a detected with the Canny operator;
Fig. 2-d is the edge map of Fig. 2-a detected with the Susan operator;
Fig. 2-e is the edge map of Fig. 2-a detected with the LoG operator;
Fig. 2-f is the edge map of Fig. 2-a detected with the method of the present invention;
Fig. 3-a is another color textile texture image;
Fig. 3-b is the edge map of Fig. 3-a detected with the Sobel operator;
Fig. 3-c is the edge map of Fig. 3-a detected with the Canny operator;
Fig. 3-d is the edge map of Fig. 3-a detected with the Susan operator;
Fig. 3-e is the edge map of Fig. 3-a detected with the LoG operator;
Fig. 3-f is the edge map of Fig. 3-a detected with the method of the present invention;
Embodiment
Embodiment 1. With reference to Fig. 1, Fig. 2-a is subjected to edge detection with the following steps:
(1) Scan the color textile printing and dyeing sample cloth shown in Fig. 2-a to obtain a color textile texture digital image;
(2) The sliding neighborhood window here is square, with radius w set by the user (a typical value is 2). The image is correspondingly extended symmetrically on all four sides: if the width and height of the input image are Width and Height, the width and height of the extended image become Width+2w and Height+2w. The rows of the image may be extended symmetrically first and then the columns, i.e.
$$I_{\mathrm{row\text{-}ext}}^c(i,j) = \begin{cases} I^c(i,\, w+2-j), & 1 \le i \le Height,\; 1 \le j \le w \\ I^c(i,\, j-w), & 1 \le i \le Height,\; w+1 \le j \le w+Width \\ I^c(i,\, 2\,Width+w-j), & 1 \le i \le Height,\; w+Width+1 \le j \le 2w+Width \end{cases} \qquad (1)$$

$$I_{\mathrm{ext}}^c(i,j) = \begin{cases} I_{\mathrm{row\text{-}ext}}^c(w+2-i,\, j), & 1 \le i \le w,\; 1 \le j \le 2w+Width \\ I_{\mathrm{row\text{-}ext}}^c(i-w,\, j), & w+1 \le i \le w+Height,\; 1 \le j \le 2w+Width \\ I_{\mathrm{row\text{-}ext}}^c(2\,Height+w-i,\, j), & w+Height+1 \le i \le 2w+Height,\; 1 \le j \le 2w+Width \end{cases} \qquad (2)$$
Here $I^c$ denotes one color component of the image, i.e. the red, green or blue component in RGB color space, $I_{\mathrm{row\text{-}ext}}^c$ is the symmetric extension of color component $I^c$ in the row direction, and $I_{\mathrm{ext}}^c$ is the image obtained by further extending $I_{\mathrm{row\text{-}ext}}^c$ symmetrically in the column direction.
Alternatively, the columns of the image may be extended symmetrically first and then the rows, i.e.
$$I_{\mathrm{col\text{-}ext}}^c(i,j) = \begin{cases} I^c(w+2-i,\, j), & 1 \le i \le w,\; 1 \le j \le Width \\ I^c(i-w,\, j), & w+1 \le i \le w+Height,\; 1 \le j \le Width \\ I^c(2\,Height+w-i,\, j), & w+Height+1 \le i \le 2w+Height,\; 1 \le j \le Width \end{cases} \qquad (3)$$

$$I_{\mathrm{ext}}^c(i,j) = \begin{cases} I_{\mathrm{col\text{-}ext}}^c(i,\, w+2-j), & 1 \le i \le 2w+Height,\; 1 \le j \le w \\ I_{\mathrm{col\text{-}ext}}^c(i,\, j-w), & 1 \le i \le 2w+Height,\; w+1 \le j \le w+Width \\ I_{\mathrm{col\text{-}ext}}^c(i,\, 2\,Width+w-j), & 1 \le i \le 2w+Height,\; w+Width+1 \le j \le 2w+Width \end{cases} \qquad (4)$$
where $I_{\mathrm{col\text{-}ext}}^c$ is the symmetric extension of color component $I^c$ in the column direction, and $I_{\mathrm{ext}}^c$ is the image obtained by further extending it symmetrically in the row direction; the two orders of symmetric extension yield the same extended image.
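Purely as an illustration (an assumption on my part, not wording from the patent): this mirror extension, which reflects the image about its borders without repeating the border pixel, coincides with NumPy's 'reflect' padding, and padding rows and columns in either order gives the same result.

```python
import numpy as np

def extend_symmetric(img, w=2):
    """Extend an H x W x 3 image by w pixels on each side by mirror
    reflection about the border (the border pixel itself is not repeated),
    matching the row-then-column / column-then-row extension above."""
    return np.pad(img, ((w, w), (w, w), (0, 0)), mode='reflect')

# Both orders of extension coincide:
img = np.random.rand(64, 48, 3)
a = np.pad(np.pad(img, ((0, 0), (2, 2), (0, 0)), mode='reflect'),
           ((2, 2), (0, 0), (0, 0)), mode='reflect')
b = np.pad(np.pad(img, ((2, 2), (0, 0), (0, 0)), mode='reflect'),
           ((0, 0), (2, 2), (0, 0)), mode='reflect')
assert np.array_equal(a, b) and np.array_equal(a, extend_symmetric(img, 2))
```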
(3) The square sliding window of radius w moves over the interior of the extended image in a zigzag scan, from top to bottom and from left to right, i.e. over the part of the extended image excluding a border frame of width equal to the window radius; this interior part has the same size as the input image.
(4) When the sliding window rests on pixel (y,x), compute the color differences between the upper half and the lower half of the sliding window, and between its left half and right half:
$$d_y^c = \frac{1}{w(2w+1)}\sum_{i=y+1}^{y+w}\sum_{j=x-w}^{x+w} I^c(i,j) \;-\; \frac{1}{w(2w+1)}\sum_{i=y-w}^{y-1}\sum_{j=x-w}^{x+w} I^c(i,j) \qquad (5)$$

$$d_x^c = \frac{1}{w(2w+1)}\sum_{i=y-w}^{y+w}\sum_{j=x+1}^{x+w} I^c(i,j) \;-\; \frac{1}{w(2w+1)}\sum_{i=y-w}^{y+w}\sum_{j=x-w}^{x-1} I^c(i,j) \qquad (6)$$
where i and j denote the spatial coordinates of a pixel, $I^c(i,j)$ denotes the value of color component c at position (i,j) within the sliding window centered on pixel (y,x), the superscript c denotes the three color components R, G, B (components of other color spaces may also be used), w denotes the radius of the window, which is isotropic about its center point, and the side length or diameter of the window is 2w+1. $d_y^c$ and $d_x^c$ are taken as the partial derivatives of color component $I^c$ at the current pixel (y,x) in the vertical and horizontal directions, respectively.
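The following sketch (my own illustration, not code from the patent) computes the half-window color differences of equations (5) and (6) for every pixel at once by correlating each color component with a (2w+1)×(2w+1) weight template; the 'mirror' boundary mode of scipy reproduces the symmetric extension of step (2). All names are illustrative.

```python
import numpy as np
from scipy.ndimage import correlate

def halfwindow_derivatives(img, w=2):
    """Return d_y and d_x, each of shape H x W x 3, holding the vertical and
    horizontal half-window color differences of every pixel (eqs. (5)-(6))."""
    k = 2 * w + 1
    norm = 1.0 / (w * k)
    # rows below the center get +norm, rows above get -norm, the center row 0
    T_y = np.zeros((k, k))
    T_y[w + 1:, :] = norm
    T_y[:w, :] = -norm
    T_x = T_y.T  # the same template, transposed, gives the horizontal difference
    d_y = np.stack([correlate(img[:, :, c], T_y, mode='mirror') for c in range(3)], axis=-1)
    d_x = np.stack([correlate(img[:, :, c], T_x, mode='mirror') for c in range(3)], axis=-1)
    return d_y, d_x
```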
(5) From the vertical and horizontal partial derivatives of the three color components of the pixel, form a 3×2 matrix D as follows:
$$D = \begin{bmatrix} d_y^R & d_x^R \\ d_y^G & d_x^G \\ d_y^B & d_x^B \end{bmatrix} \qquad (7)$$
Compute the maximum eigenvalue of the matrix $D^T D$ and the corresponding eigenvector, where the superscript T denotes matrix transpose, and take them as the gradient magnitude and gradient direction of the pixel, respectively.
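Continuing the illustrative sketch, the 2×2 matrices $D^T D$ of all pixels can be assembled and diagonalised in one batched call; the largest eigenvalue and its eigenvector give the gradient magnitude and direction maps. This is a hypothetical vectorised rendering of steps (4) and (5), not the patent's own code.

```python
import numpy as np

def gradient_from_derivatives(d_y, d_x):
    """d_y, d_x: H x W x 3 arrays of per-channel half-window differences.
    Returns the gradient magnitude (H x W) and direction angle in radians (H x W)."""
    # Entries of the symmetric 2x2 matrix D^T D at each pixel
    a = (d_y * d_y).sum(axis=-1)   # sum over channels of (d_y^c)^2
    b = (d_y * d_x).sum(axis=-1)   # sum over channels of d_y^c * d_x^c
    c = (d_x * d_x).sum(axis=-1)   # sum over channels of (d_x^c)^2
    DtD = np.stack([np.stack([a, b], axis=-1),
                    np.stack([b, c], axis=-1)], axis=-2)   # shape H x W x 2 x 2
    eigvals, eigvecs = np.linalg.eigh(DtD)   # batched, eigenvalues ascending
    magnitude = eigvals[..., -1]             # maximum eigenvalue per pixel
    v = eigvecs[..., :, -1]                  # eigenvector of the maximum eigenvalue
    direction = np.arctan2(v[..., 1], v[..., 0])
    return magnitude, direction
```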
(6) After the sliding window has scanned the complete digital image, or the selected region, along the zigzag path, compute the mean of the gradient magnitudes of the image pixels; if the gradient magnitude of a pixel is greater than this mean, the pixel is taken as a candidate edge point, yielding an initial image edge map;
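A minimal illustration of this thresholding, assuming `magnitude` is the gradient-magnitude map from the sketches above (the comparison follows claim 1's greater-than-or-equal convention):

```python
threshold1 = magnitude.mean()            # first threshold: mean gradient magnitude
candidates = magnitude >= threshold1     # boolean H x W map of candidate edge points
```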
(7) According to the initial image edge map, check in the 3×3 neighborhood of each candidate edge point how many of its neighboring pixels are candidate edge points; if the number of neighboring candidate edge points is less than 3, remove these isolated candidate edge points;
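One possible rendering of this pruning step (a sketch under my own assumptions, not the patent's code): count candidate neighbors in each 3×3 neighborhood by correlation with a ones template and keep only candidates with at least 3 candidate neighbors.

```python
import numpy as np
from scipy.ndimage import correlate

def remove_isolated(candidates, min_neighbors=3):
    """candidates: boolean H x W map.  Keep a candidate edge point only if at
    least min_neighbors of its 8 neighbors are also candidates."""
    kernel = np.ones((3, 3))
    kernel[1, 1] = 0                     # do not count the point itself
    neighbor_count = correlate(candidates.astype(int), kernel, mode='constant')
    return candidates & (neighbor_count >= min_neighbors)
```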
(8) For each remaining candidate edge point, compute the color differences along the horizontal, anti-diagonal, vertical and diagonal directions within a square sliding window of radius w centered on that point (a typical value of w is 2), and select the direction with the largest absolute color difference as the direction of the color difference of that candidate edge point. The color differences along the horizontal and vertical directions are computed as in formulas (5) and (6), and the color difference along the diagonal direction is computed as follows:
$$d_d^c = \frac{1}{w(2w+1)}\left\{\, \mathrm{patch}^c(x,y)\; .\!*\; \mathrm{Template}_d^{up} \;-\; \mathrm{patch}^c(x,y)\; .\!*\; \mathrm{Template}_d^{down} \,\right\} \qquad (8)$$
where $\mathrm{patch}^c(x,y)$ denotes the image block of size (2w+1)×(2w+1) of color component $I^c$ centered on candidate edge point (y,x), the operator .* denotes element-wise multiplication of the matrices followed by summation of the resulting elements, and $\mathrm{Template}_d^{up}$ and $\mathrm{Template}_d^{down}$ are two binary templates of size (2w+1)×(2w+1), given as equations (9) and (10) (shown as figures in the original document).
The color difference along the anti-diagonal direction is computed as follows:
$$d_{ad}^c = \frac{1}{w(2w+1)}\left\{\, \mathrm{patch}^c(x,y)\; .\!*\; \mathrm{Template}_{ad}^{up} \;-\; \mathrm{patch}^c(x,y)\; .\!*\; \mathrm{Template}_{ad}^{down} \,\right\} \qquad (11)$$
where $\mathrm{Template}_{ad}^{up}$ and $\mathrm{Template}_{ad}^{down}$ are two binary templates of size (2w+1)×(2w+1), given as equations (12) and (13) (shown as figures in the original document).
(9) In the (2w+1)×(2w+1) neighborhood of each remaining candidate edge point (here w is typically 2), check whether its color-difference direction is consistent with that of more than half of the neighboring candidate edge points; if not, remove this candidate edge point, which is isolated in its color-difference direction;
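An illustrative sketch of step (9), assuming each remaining candidate already carries an integer direction label 0 to 3 (horizontal, anti-diagonal, vertical, diagonal) from step (8); the names and the label encoding are my own assumptions.

```python
import numpy as np
from scipy.ndimage import correlate

def remove_direction_isolated(candidates, direction_label, w=2):
    """candidates: boolean H x W map of remaining candidate edge points.
    direction_label: integer H x W map (0..3) of color-difference directions.
    A candidate survives if its own direction matches more than half of the
    neighboring candidates' directions inside its (2w+1)x(2w+1) window."""
    k = 2 * w + 1
    kernel = np.ones((k, k))
    kernel[w, w] = 0                                  # exclude the center point
    total = correlate(candidates.astype(int), kernel, mode='constant')
    keep = np.zeros_like(candidates)
    for d in range(4):
        same = candidates & (direction_label == d)
        same_count = correlate(same.astype(int), kernel, mode='constant')
        # strictly more than half of the neighboring candidates share direction d
        keep |= candidates & (direction_label == d) & (same_count * 2 > total)
    return keep
```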
(10) For each remaining candidate edge point, within its (2w+1)×(2w+1) neighborhood window (here w is typically 2), compute the difference between the numbers of candidate edge points falling in the upper half and in the lower half of the window, and the difference between the numbers of candidate edge points falling in the left half and in the right half; if the absolute values of both differences are less than 2w+1, confirm this candidate edge point as a final edge point. The concrete result is shown in Fig. 2-f.
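A sketch of the final confirmation of step (10), again illustrative rather than the patent's code: for each surviving candidate, count candidate points in the upper versus lower and left versus right halves of its (2w+1)×(2w+1) neighborhood and confirm the point when both absolute differences are below 2w+1.

```python
import numpy as np
from scipy.ndimage import correlate

def confirm_edges(candidates, w=2):
    """candidates: boolean H x W map of remaining candidate edge points.
    Confirm a point as a final edge point when the candidate counts in the
    upper/lower and left/right halves of its (2w+1)x(2w+1) window are balanced,
    i.e. both absolute differences are below the second threshold 2w+1."""
    k = 2 * w + 1
    T = np.zeros((k, k))
    T[w + 1:, :] = 1.0                  # lower-half candidates count +1
    T[:w, :] = -1.0                     # upper-half candidates count -1
    c = candidates.astype(float)
    vert_diff = correlate(c, T, mode='constant')      # lower minus upper counts
    horiz_diff = correlate(c, T.T, mode='constant')   # right minus left counts
    return candidates & (np.abs(vert_diff) < k) & (np.abs(horiz_diff) < k)
```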
The Sobel operator computes the image gradient with the following two first-order partial-derivative templates. The convolutions of the two templates with the image are the partial derivatives of the image in the horizontal and vertical directions, respectively; the square root of the sum of squares of the two partial derivatives is taken as the gradient magnitude of the image, and the arctangent of the ratio of the vertical partial derivative to the horizontal partial derivative gives the gradient direction. The edge map obtained by applying the Sobel operator to Fig. 2-a is shown in Fig. 2-b.
$$\begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \qquad (14)$$
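For comparison, a standard Sobel gradient using the templates of equation (14) can be sketched as follows (illustrative only; applied to a grayscale image, as the description notes for the traditional operators):

```python
import numpy as np
from scipy.ndimage import correlate

sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
sobel_y = np.array([[-1, -2, -1],
                    [ 0,  0,  0],
                    [ 1,  2,  1]], dtype=float)

def sobel_gradient(gray):
    """Gradient magnitude and direction of a grayscale image computed with the
    first-order templates of equation (14)."""
    gx = correlate(gray, sobel_x, mode='mirror')
    gy = correlate(gray, sobel_y, mode='mirror')
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    direction = np.arctan2(gy, gx)   # arctangent of vertical over horizontal derivative
    return magnitude, direction
```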
The Canny operator first low-pass filters the image with a Gaussian filter, uses the Sobel operator to compute the partial derivatives in the horizontal and vertical directions, performs non-maximum suppression according to the gradient magnitude of the image, and then uses two different thresholds to further determine the positions of the edges: if the gradient magnitude of a pixel is greater than the high threshold, the pixel is judged to be an edge point; if it is less than the low threshold, the pixel is judged not to be an edge point; if the gradient magnitude lies between the two thresholds, it is necessary to check whether edge points exist around the pixel: if so, the pixel is considered an edge point, otherwise it is not. The edge map obtained by applying the Canny operator to Fig. 2-a is shown in Fig. 2-c.
The Susan operator computes the difference between the gray level of each surrounding neighborhood pixel and that of the central pixel, and uses an exponentially decaying function of the gray-level difference x to weigh the similarity between each neighborhood pixel and the central pixel, so that the larger the gray-level difference, the smaller the similarity. If the number of similar neighborhood pixels is less than a threshold, the pixel is considered an edge point. The edge map obtained by applying the Susan operator to Fig. 2-a is shown in Fig. 2-d.
The LoG operator determines edges with a second-derivative template, but also requires the image to be low-pass filtered with a Gaussian filter in advance. The edge map obtained by applying the LoG operator to Fig. 2-a is shown in Fig. 2-e.
These traditional edge detection methods consider only the differences between individual pixels, so they cannot remove the texture noise in textile printing and dyeing images well. Comparing Fig. 2-b, Fig. 2-c, Fig. 2-d, Fig. 2-e and Fig. 2-f, the result detected with the present invention is clearly better than those of the traditional edge detection methods.
Embodiment 2
Fig. 3-a is detected with the same steps as in Embodiment 1; the resulting image edge map is shown in Fig. 3-f. Comparing Fig. 3-b, Fig. 3-c, Fig. 3-d and Fig. 3-e, this verifies once more that the result detected with the present invention is clearly better than those of the traditional edge detection methods.

Claims (5)

1. An edge detection method for color textile texture images oriented to the textile industry, characterized in that it comprises the following steps:
(1) scanning a color textile sample cloth to obtain a digital image of the color textile sample cloth, and determining a sliding window that is isotropic about its center point, the width of the window being an integer multiple of the pixel spacing of the digital image;
(2) calculating the color difference between the upper half and the lower half of the window centered on a pixel of the digital image, and the color difference between the left half and the right half of the window:

$$d_y^c = \frac{1}{w(2w+1)}\sum_{i=y+1}^{y+w}\sum_{j=x-w}^{x+w} I^c(i,j) \;-\; \frac{1}{w(2w+1)}\sum_{i=y-w}^{y-1}\sum_{j=x-w}^{x+w} I^c(i,j) \qquad (1)$$

$$d_x^c = \frac{1}{w(2w+1)}\sum_{i=y-w}^{y+w}\sum_{j=x+1}^{x+w} I^c(i,j) \;-\; \frac{1}{w(2w+1)}\sum_{i=y-w}^{y+w}\sum_{j=x-w}^{x-1} I^c(i,j) \qquad (2)$$

where i and j denote the spatial coordinates of a pixel, $I^c(i,j)$ denotes the value of color component c at position (i,j) within the sliding window centered on pixel (y,x), the superscript c denotes the three color components R, G, B, w denotes the radius of the sliding window, and the side length or diameter of the window is 2w+1, an odd multiple of the pixel spacing; $d_y^R$, $d_y^G$, $d_y^B$ denote the color differences of the three color components at pixel (y,x) in the vertical direction, and $d_x^R$, $d_x^G$, $d_x^B$ denote the color differences of the three color components at pixel (y,x) in the horizontal direction; they form a 3×2 matrix D,

$$D = \begin{bmatrix} d_y^R & d_x^R \\ d_y^G & d_x^G \\ d_y^B & d_x^B \end{bmatrix} \qquad (3)$$

calculating the maximum eigenvalue of the matrix $D^T D$ and the corresponding eigenvector, and taking them as the gradient magnitude and gradient direction of pixel (y,x), respectively, where T denotes matrix transpose;
(3) repeating step (2) to calculate, in order, the gradient magnitude and gradient direction of all pixels in the digital image, or of selected pixels of the digital image;
(4) calculating the mean of the pixel gradient magnitudes and setting the mean, or a value corresponding to the mean, as a first threshold; if the gradient magnitude of a pixel is greater than or equal to the first threshold, judging that pixel to be a candidate edge point, thereby obtaining an initial image edge map;
(5) removing isolated candidate edge points;
(6) determining the direction of the color difference of each remaining candidate edge point;
(7) removing candidate edge points whose color-difference direction is isolated;
(8) determining a second neighborhood window and a second threshold; for each remaining candidate edge point, calculating within the second neighborhood window the difference between the numbers of candidate edge points falling in the upper half and in the lower half of the second neighborhood window, and the difference between the numbers of candidate edge points falling in the left half and in the right half of the second neighborhood window; if the absolute values of both differences are less than the second threshold, confirming the candidate edge point as a final edge point.

2. The edge detection method for color textile texture images oriented to the textile industry according to claim 1, characterized in that the window isotropic about its center point is square or circular.

3. The edge detection method for color textile texture images oriented to the textile industry according to claim 1, characterized in that step (5) comprises: determining a third neighborhood window and a third threshold, checking within the third neighborhood window of each candidate edge point the number of its neighboring pixels that are candidate edge points, and removing those isolated candidate edge points for which the number of neighboring candidate edge points is less than the third threshold.

4. The edge detection method for color textile texture images oriented to the textile industry according to claim 1, characterized in that step (6) comprises: determining a fourth neighborhood window, calculating the color differences along the horizontal, anti-diagonal, vertical and diagonal directions within the fourth neighborhood window centered on each remaining candidate edge point, and selecting the direction with the largest absolute color difference as the direction of the color difference of that candidate edge point.

5. The edge detection method for color textile texture images oriented to the textile industry according to claim 1, characterized in that step (7) comprises: determining a fifth neighborhood window, checking within the fifth neighborhood window of each remaining candidate edge point whether its color-difference direction is consistent with the color-difference direction of the majority of neighboring candidate edge points, and, if not, removing that candidate edge point, which is isolated in its color-difference direction.
CNA2008101633896A 2008-12-18 2008-12-18 Edge detection method of color textile texture image oriented to textile industry Pending CN101447077A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2008101633896A CN101447077A (en) 2008-12-18 2008-12-18 Edge detection method of color textile texture image oriented to textile industry

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2008101633896A CN101447077A (en) 2008-12-18 2008-12-18 Edge detection method of color textile texture image oriented to textile industry

Publications (1)

Publication Number Publication Date
CN101447077A true CN101447077A (en) 2009-06-03

Family

ID=40742744

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2008101633896A Pending CN101447077A (en) 2008-12-18 2008-12-18 Edge detection method of color textile texture image oriented to textile industry

Country Status (1)

Country Link
CN (1) CN101447077A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894295B (en) * 2010-06-04 2014-07-23 北京工业大学 Method for simulating attention mobility by using neural network
CN101894295A (en) * 2010-06-04 2010-11-24 北京工业大学 A Method for Simulating Attention Shifts with Neural Networks
CN101908208A (en) * 2010-07-27 2010-12-08 浙江大学 Adaptive Determination Method of Smoothing Filter Spatial Scale Oriented to Image Edge Detection
CN101908208B (en) * 2010-07-27 2011-11-09 浙江大学 Self-adaptive confirming method of smooth wave-filtering spatial scale facing to picture edge detection
CN103095967A (en) * 2011-10-28 2013-05-08 浙江大华技术股份有限公司 Video noise quantization calculation method and video noise quantization calculation system
CN103955944A (en) * 2014-05-22 2014-07-30 苏州大学 Image edge detection method and device
CN109155065B (en) * 2016-03-31 2022-07-26 眼睛有限公司 System and method for diagnostic image analysis and image quality assessment
CN109155065A (en) * 2016-03-31 2019-01-04 眼睛有限公司 System and method for diagnostic image analysis and image quality measure
CN106409060A (en) * 2016-10-20 2017-02-15 徐次香 Automobile driving simulator
CN106409060B (en) * 2016-10-20 2022-12-06 国网浙江省电力有限公司台州供电公司 An image processing method for a car driving simulator
CN111445491A (en) * 2020-03-24 2020-07-24 山东智翼航空科技有限公司 Three-neighborhood maximum difference value edge detection narrow lane guidance algorithm for micro unmanned aerial vehicle
CN111445491B (en) * 2020-03-24 2023-09-15 山东智翼航空科技有限公司 Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle
CN113435287A (en) * 2021-06-21 2021-09-24 深圳拓邦股份有限公司 Lawn obstacle recognition method and device, mowing robot and readable storage medium
CN114820631A (en) * 2022-07-04 2022-07-29 南通中豪超纤制品有限公司 Fabric defect detection method capable of resisting texture interference

Similar Documents

Publication Publication Date Title
CN101447077A (en) Edge detection method of color textile texture image oriented to textile industry
CN108416766B (en) Double-side light-entering type light guide plate defect visual detection method
CN105913419B (en) TFT-LCD mura defect inspection methods based on ICA study and Multichannel fusion
CN104376548B (en) A kind of quick joining method of image based on modified SURF algorithm
CN111047655B (en) High-definition camera cloth defect detection method based on convolutional neural network
CN103632361B (en) An image segmentation method and a system
CN105675625B (en) A kind of fruit surface defect detection method of Gradient Iteration Threshold segmentation
CN103927750B (en) The detection method of gridiron pattern image angular-point sub-pixel
CN101464998B (en) Non-gauss veins noise smooth filtering method for textile industry
CN109859226A (en) A kind of detection method of the X-comers sub-pix of figure segmentation
CN105338342B (en) The detection method and device of a kind of dead pixel points of images
CN101527043B (en) Video picture segmentation method based on moving target outline information
CN102074017B (en) Method and device for detecting and tracking barbell central point
CN104574381B (en) A kind of full reference image quality appraisement method based on local binary patterns
CN104751458B (en) A kind of demarcation angular-point detection method based on 180 ° of rotation operators
CN104200197A (en) Three-dimensional human body behavior recognition method and device
CN111080574A (en) A fabric defect detection method based on information entropy and visual attention mechanism
CN103873880A (en) System for detecting structured artifacts in video sequences
CN110111711A (en) The detection method and device of screen, computer readable storage medium
CN110807763A (en) Method and system for detecting ceramic tile surface bulge
CN106709952B (en) A kind of automatic calibration method of display screen
CN111429462A (en) License plate positioning method based on edge detection and mathematical morphology
CN110728668A (en) A spatial high-pass filter for shape-preserving small objects
CN105787912A (en) Classification-based step type edge sub pixel localization method
CN102393964B (en) Strip gap detection method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20090603