Summary of the invention
The technical problem to be solved by the present invention is to provide an edge detection method for color textile texture images oriented to the textile industry, so as to improve the precision of edge detection in textile texture images. To this end, the present invention adopts the following technical solution, which comprises the following steps:
(1) Scan a color textile sample cloth to obtain a digital image of the color textile sample cloth, and determine a moving window that is isotropic with respect to its center point, the width of the window being an integral multiple of the pixel spacing of the digital image;
(2) Taking a pixel of the digital image as the center point of the window, calculate the color difference between the upper half and the lower half of the window, and the color difference between the left half and the right half of the window:
V_c(y, x) = \sum_{i=y-w}^{y-1} \sum_{j=x-w}^{x+w} I_c(i, j) - \sum_{i=y+1}^{y+w} \sum_{j=x-w}^{x+w} I_c(i, j)

H_c(y, x) = \sum_{i=y-w}^{y+w} \sum_{j=x-w}^{x-1} I_c(i, j) - \sum_{i=y-w}^{y+w} \sum_{j=x+1}^{x+w} I_c(i, j)

where i and j denote the spatial coordinates of a pixel, I_c(i, j) denotes the value of color component c at the pixel located at (i, j) inside the moving window centered on pixel (y, x), the subscript c denotes one of the three color components R, G and B, and w denotes the radius of the moving window, the side length (diameter) of the window being 2w + 1, an odd multiple of the pixel spacing. V_R, V_G and V_B denote the color differences of the three color components at pixel (y, x) in the vertical direction, and H_R, H_G and H_B denote the color differences of the three color components at pixel (y, x) in the horizontal direction; together they constitute a 3 x 2 matrix D:

D = [ H_R  V_R
      H_G  V_G
      H_B  V_B ]

Compute the maximum eigenvalue of D^T D and the corresponding eigenvector, and take them respectively as the gradient amplitude and the gradient direction of pixel (y, x), where T denotes the matrix transpose (an illustrative code sketch of this step is given after step (8) below);
(3) Repeat step (2) to calculate the gradient amplitude and gradient direction of every pixel of the digital image, or of the selected pixels of the digital image;
(4) Calculate the mean value of the pixel gradient amplitudes and set this mean value, or a value corresponding to the mean value, as a first threshold; if the gradient amplitude of a pixel is greater than or equal to the first threshold, the pixel is judged to be a candidate edge point, thereby obtaining an initial image edge map;
(5) Remove isolated candidate edge points;
(6) Determine the direction of the color difference of the remaining candidate edge points;
(7) Remove candidate edge points whose color-difference direction is isolated;
(8) Determine a second neighborhood window and a second threshold; for each remaining candidate edge point, calculate within the second neighborhood window the difference between the numbers of candidate edge points falling in the upper half and in the lower half of the second neighborhood window, and the difference between the numbers of candidate edge points falling in the left half and in the right half; if the absolute values of both differences are less than the second threshold, the candidate edge point is confirmed as a final edge point.
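As an illustration of step (2), the following minimal sketch shows how the gradient amplitude and gradient direction could be obtained once the six color differences have been computed; the function and variable names are illustrative assumptions and not part of the claimed method.

    import numpy as np

    def gradient_from_differences(H_R, H_G, H_B, V_R, V_G, V_B):
        # Stack the horizontal and vertical color differences of the three
        # color components into the 3 x 2 matrix D of step (2).
        D = np.array([[H_R, V_R],
                      [H_G, V_G],
                      [H_B, V_B]], dtype=float)
        # D^T D is a symmetric 2 x 2 matrix; its largest eigenvalue is taken as
        # the gradient amplitude and the corresponding eigenvector as the
        # gradient direction of the pixel.
        eigenvalues, eigenvectors = np.linalg.eigh(D.T @ D)  # eigenvalues in ascending order
        amplitude = eigenvalues[-1]
        direction = eigenvectors[:, -1]                      # unit vector in the image plane
        return amplitude, direction

For example, gradient_from_differences(12.0, 9.0, 11.0, -1.0, 0.0, 2.0) returns the gradient amplitude and a unit direction vector for a single pixel.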
By adopting the above technical solution of the present invention, the texture noise caused by the halftone technique of the textile printing industry can be eliminated, the edge detection precision of textile texture images is improved, and technical support is provided for automatic tracing in the textile industry.
Steps (1) to (4) of the above technical solution also provide, on their own, an initial edge detection method for color textile texture images. The method exploits several characteristics of the human visual system: the human visual system acts as a low-pass filter, and when observing an image it does not perceive color from a single pixel in isolation but is influenced by the color distribution of the surrounding neighborhood pixels. By making use of the color distribution of the neighborhood pixels, the interference of texture noise can be eliminated effectively and the precision of the gradient calculation is improved, which lays a good foundation for improving the edge detection precision of textile texture images.
Embodiments
Embodiment 1. With reference to Fig. 1, edge detection is performed on Fig. 2-a using the following steps:
(1) Scan the color textile printing and dyeing sample cloth shown in Fig. 2-a to obtain a color textile texture digital image;
(2) The sliding neighborhood window used here is square, with a radius w set by the user, typically w = 2. The image is first extended symmetrically on all four sides: if the width and height of the input image are Width and Height respectively, the width and height of the extended image become Width + 2w and Height + 2w. The rows of the image may be extended symmetrically first and then the columns, i.e.
I_c^r(i, j) = { I_c(1 - i, j),                1 - w <= i <= 0
                I_c(i, j),                     1 <= i <= Height
                I_c(2 Height + 1 - i, j),      Height + 1 <= i <= Height + w }        (1)

I_c^{rc}(i, j) = { I_c^r(i, 1 - j),            1 - w <= j <= 0
                   I_c^r(i, j),                1 <= j <= Width
                   I_c^r(i, 2 Width + 1 - j),  Width + 1 <= j <= Width + w }          (2)

Here I_c denotes a color component of the image, namely the red, green or blue component in the RGB color space; I_c^r is the symmetric extension of the color component I_c in the row direction, and I_c^{rc} is the image obtained by further extending I_c^r symmetrically in the column direction.
Alternatively, the columns of the image may be extended symmetrically first and then the rows, i.e.

I_c^c(i, j) = { I_c(i, 1 - j),                 1 - w <= j <= 0
                I_c(i, j),                     1 <= j <= Width
                I_c(i, 2 Width + 1 - j),       Width + 1 <= j <= Width + w }          (3)

I_c^{cr}(i, j) = { I_c^c(1 - i, j),            1 - w <= i <= 0
                   I_c^c(i, j),                1 <= i <= Height
                   I_c^c(2 Height + 1 - i, j), Height + 1 <= i <= Height + w }        (4)

where I_c^c is the symmetric extension of the color component I_c in the column direction, and I_c^{cr} is the image obtained by further extending I_c^c symmetrically in the row direction; the extended images obtained by the two orders of symmetric extension are identical.
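A minimal sketch of this symmetric extension, assuming NumPy's 'symmetric' padding mode matches the mirror extension intended by formulas (1) to (4); the function also checks that extending the rows first and the columns first give the same result.

    import numpy as np

    def extend_symmetric(image, w=2):
        # image: Height x Width x 3 array; pad w pixels on every side by mirroring.
        rows_first = np.pad(image, ((w, w), (0, 0), (0, 0)), mode='symmetric')           # formula (1)
        rows_then_cols = np.pad(rows_first, ((0, 0), (w, w), (0, 0)), mode='symmetric')  # formula (2)

        cols_first = np.pad(image, ((0, 0), (w, w), (0, 0)), mode='symmetric')           # formula (3)
        cols_then_rows = np.pad(cols_first, ((w, w), (0, 0), (0, 0)), mode='symmetric')  # formula (4)

        # The two extension orders give the same (Height + 2w) x (Width + 2w) image.
        assert np.array_equal(rows_then_cols, cols_then_rows)
        return rows_then_cols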
(3) The square moving window of radius w moves over the interior of the extended image in a zigzag scan, from top to bottom and from left to right; the interior of the extended image is the extended image with a frame of width equal to the moving-window radius removed, so its size is the same as that of the input image.
(4) When the moving window rests on pixel (y, x), calculate the color difference between the upper half and the lower half of the moving window, and between the left half and the right half:
V_c(y, x) = \sum_{i=y-w}^{y-1} \sum_{j=x-w}^{x+w} I_c(i, j) - \sum_{i=y+1}^{y+w} \sum_{j=x-w}^{x+w} I_c(i, j)          (5)

H_c(y, x) = \sum_{i=y-w}^{y+w} \sum_{j=x-w}^{x-1} I_c(i, j) - \sum_{i=y-w}^{y+w} \sum_{j=x+1}^{x+w} I_c(i, j)          (6)

where i and j denote the spatial coordinates of a pixel, I_c(i, j) denotes the value of color component c at the pixel located at (i, j) inside the moving window centered on pixel (y, x), the subscript c denotes one of the three color components R, G and B (components of another color space may also be used), and w denotes the radius of the moving window, which is isotropic with respect to its center point, the side length (diameter) of the window being 2w + 1. H_c(y, x) and V_c(y, x) are taken respectively as the partial derivatives of the color component I_c at the current pixel (y, x) in the horizontal and vertical directions.
(5) From the vertical and horizontal partial derivatives of the three color components of the pixel, form a 3 x 2 matrix D as follows:

D = [ H_R  V_R
      H_G  V_G
      H_B  V_B ]

Compute the maximum eigenvalue of D^T D and the corresponding eigenvector, where the superscript T denotes the matrix transpose, and take them respectively as the gradient amplitude and the gradient direction of the pixel.
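As a sketch of steps (4) and (5), the following function computes the half-window color differences of formulas (5) and (6) at one pixel of the extended image and assembles the 3 x 2 matrix D; the coordinate convention (row y, column x, both referring to the extended image) and the function name are assumptions. The gradient amplitude and direction then follow from D^T D exactly as in the sketch given at the end of the summary of the invention above.

    import numpy as np

    def half_window_matrix_D(ext_image, y, x, w=2):
        # ext_image: symmetrically extended image of shape (Height + 2w, Width + 2w, 3);
        # (y, x): row and column of the window center inside the extended image.
        block = ext_image[y - w:y + w + 1, x - w:x + w + 1, :].astype(float)  # (2w+1, 2w+1, 3)
        # Formula (5): upper half minus lower half, i.e. the vertical color differences V_c.
        V = block[:w, :, :].sum(axis=(0, 1)) - block[w + 1:, :, :].sum(axis=(0, 1))
        # Formula (6): left half minus right half, i.e. the horizontal color differences H_c.
        H = block[:, :w, :].sum(axis=(0, 1)) - block[:, w + 1:, :].sum(axis=(0, 1))
        # Rows of D correspond to the R, G and B components, columns to (H_c, V_c).
        D = np.stack([H, V], axis=1)  # shape (3, 2)
        return D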
(6) When the moving window has scanned the whole digital image, or the selected region, along the zigzag path, calculate the mean value of the gradient amplitudes of the image pixels; if the gradient amplitude of a pixel is greater than this mean value, the pixel is taken as a candidate edge point, thereby obtaining the initial image edge map;
(7) According to the initial image edge map, check within the 3 x 3 neighborhood of each candidate edge point how many of its neighborhood pixels are also candidate edge points; if the number of neighboring candidate edge points is less than 3, remove this isolated candidate edge point;
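A possible implementation of steps (6) and (7), assuming the per-pixel gradient amplitudes have already been computed as above; the use of a convolution to count candidate neighbors is an implementation choice and not part of the claims.

    import numpy as np
    from scipy.ndimage import convolve

    def initial_candidates(grad_amp):
        # Step (6): pixels whose gradient amplitude exceeds the mean become candidate edge points.
        return grad_amp > grad_amp.mean()

    def remove_isolated(candidates):
        # Step (7): count the candidate edge points in the 3 x 3 neighborhood of each pixel
        # (excluding the pixel itself) and keep only candidates with at least 3 such neighbors.
        kernel = np.ones((3, 3))
        kernel[1, 1] = 0
        neighbor_count = convolve(candidates.astype(int), kernel, mode='constant')
        return candidates & (neighbor_count >= 3)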
(8) For each remaining candidate edge point, in a square moving window of radius w centered on it (w typically 2), calculate the color differences along the horizontal, anti-diagonal, vertical and diagonal directions, and select the direction with the largest absolute color difference as the color-difference direction of this candidate edge point. The color differences along the vertical and horizontal directions are computed as in formulas (5) and (6), and the color difference along the diagonal direction is computed as follows:
Diff_c^d(y, x) = Patch_c(y, x) * T_1 - Patch_c(y, x) * T_2

where Patch_c(y, x) denotes the (2w + 1) x (2w + 1) image block of color component I_c centered on the candidate edge point located at (y, x), the operator * denotes element-wise multiplication of two matrices followed by summation of all the resulting elements, and T_1 and T_2 are two binary templates of size (2w + 1) x (2w + 1) whose elements are 1 on one side of the main diagonal (excluding the diagonal itself) and 0 elsewhere; for w = 2:

T_1 = [ 0 1 1 1 1          T_2 = [ 0 0 0 0 0
        0 0 1 1 1                  1 0 0 0 0
        0 0 0 1 1                  1 1 0 0 0
        0 0 0 0 1                  1 1 1 0 0
        0 0 0 0 0 ]                1 1 1 1 0 ]

The color difference along the anti-diagonal (back-diagonal) direction is calculated analogously:

Diff_c^a(y, x) = Patch_c(y, x) * T_3 - Patch_c(y, x) * T_4

where T_3 and T_4 are two binary templates of size (2w + 1) x (2w + 1) whose elements are 1 on one side of the anti-diagonal (excluding the anti-diagonal itself) and 0 elsewhere; for w = 2:

T_3 = [ 1 1 1 1 0          T_4 = [ 0 0 0 0 0
        1 1 1 0 0                  0 0 0 0 1
        1 1 0 0 0                  0 0 0 1 1
        1 0 0 0 0                  0 0 1 1 1
        0 0 0 0 0 ]                0 1 1 1 1 ]
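The following sketch of step (8) determines the color-difference direction of one candidate edge point from its (2w + 1) x (2w + 1) patch. The binary templates are built with NumPy's triangular masks, and the contributions of the R, G and B components are simply summed, which is one possible way of combining them that the description leaves open.

    import numpy as np

    def color_difference_direction(block):
        # block: (2w+1, 2w+1, 3) image patch centered on a candidate edge point.
        n = block.shape[0]
        w = n // 2
        patch = block.astype(float)
        upper = np.triu(np.ones((n, n)), k=1)   # ones strictly above the main diagonal (T_1)
        lower = np.tril(np.ones((n, n)), k=-1)  # ones strictly below the main diagonal (T_2)
        diffs = {
            'vertical':   patch[:w, :].sum() - patch[w + 1:, :].sum(),   # formula (5)
            'horizontal': patch[:, :w].sum() - patch[:, w + 1:].sum(),   # formula (6)
            'diagonal':      (patch * upper[:, :, None]).sum() - (patch * lower[:, :, None]).sum(),
            'anti-diagonal': (patch * np.fliplr(upper)[:, :, None]).sum()
                             - (patch * np.fliplr(lower)[:, :, None]).sum(),  # templates T_3, T_4
        }
        # The direction with the largest absolute color difference is assigned to the point.
        return max(diffs, key=lambda d: abs(diffs[d]))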
(9) For each remaining candidate edge point, check within its (2w+1) * (2w+1) neighborhood (w typically 2) whether its color-difference direction agrees with the color-difference directions of more than half of the neighboring candidate edge points; if it does not, remove this candidate edge point as isolated in terms of color-difference direction;
(10) For each remaining candidate edge point, calculate within a (2w+1) * (2w+1) neighborhood window (w typically 2) the difference between the numbers of candidate edge points falling in the upper half and in the lower half of the window, and the difference between the numbers falling in the left half and in the right half; if the absolute values of both differences are less than (2w+1), the candidate edge point is confirmed as a final edge point. The concrete result is shown in Fig. 2-f.
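A sketch of the final confirmation of step (10), operating on the binary map of the remaining candidate edge points; excluding the center row and column from the half counts is an assumption consistent with the half-window convention used earlier.

    import numpy as np

    def confirm_final_edges(candidates, w=2):
        # candidates: boolean map of the remaining candidate edge points.
        padded = np.pad(candidates.astype(int), w, mode='constant')
        final = np.zeros_like(candidates)
        for y, x in zip(*np.nonzero(candidates)):
            block = padded[y:y + 2 * w + 1, x:x + 2 * w + 1]          # (2w+1) x (2w+1) neighborhood
            up_down = block[:w, :].sum() - block[w + 1:, :].sum()     # upper half minus lower half
            left_right = block[:, :w].sum() - block[:, w + 1:].sum()  # left half minus right half
            # Both absolute differences must be below the second threshold 2w + 1.
            if abs(up_down) < 2 * w + 1 and abs(left_right) < 2 * w + 1:
                final[y, x] = True
        return final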
The Sobel operator computes the image gradient with the following two first-order partial derivative templates:

G_x = [ -1  0  1          G_y = [ -1 -2 -1
        -2  0  2                   0  0  0
        -1  0  1 ]                 1  2  1 ]

The convolutions of the two templates with the image give the partial derivatives of the image in the horizontal and vertical directions respectively, the square root of the sum of the squares of the two directional partial derivatives is taken as the gradient amplitude of the image, and the arctangent of the ratio of the vertical partial derivative to the horizontal partial derivative is taken as the gradient direction. The edge map obtained by applying the Sobel operator to Fig. 2-a is shown in Fig. 2-b.
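For reference, a minimal sketch of the Sobel gradient described above, applied to a gray-level image with NumPy and SciPy:

    import numpy as np
    from scipy.ndimage import convolve

    def sobel_gradient(gray):
        # Standard first-order partial derivative templates of the Sobel operator.
        gx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
        gy = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)
        dx = convolve(gray.astype(float), gx, mode='reflect')   # horizontal partial derivative
        dy = convolve(gray.astype(float), gy, mode='reflect')   # vertical partial derivative
        amplitude = np.sqrt(dx ** 2 + dy ** 2)                  # square root of the sum of squares
        direction = np.arctan2(dy, dx)                          # arctangent of dy / dx
        return amplitude, direction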
The Canny operator first low-pass filters the image with a Gaussian low-pass filter, then uses the Sobel operator to compute the partial derivatives in the horizontal and vertical directions, performs non-maximum suppression according to the gradient amplitude of the image, and then uses two different thresholds to further determine the positions of the edges: if the gradient amplitude of a pixel is greater than the larger threshold, the pixel is judged to be an edge point; if it is less than the smaller threshold, the pixel is judged not to be an edge point; if the gradient amplitude of the pixel lies between the two thresholds, it is necessary to check whether there is an edge point around the pixel; if there is, the pixel is considered to be an edge point, otherwise it is not. The edge map obtained by applying the Canny operator to Fig. 2-a is shown in Fig. 2-c.
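This pipeline (Gaussian smoothing, Sobel derivatives, non-maximum suppression and double-threshold hysteresis) is available ready-made, for example in OpenCV; the file name and threshold values below are purely illustrative.

    import cv2

    gray = cv2.imread('sample_cloth.png', cv2.IMREAD_GRAYSCALE)  # hypothetical file name
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.0)                # Gaussian low-pass filtering
    edges = cv2.Canny(blurred, 50, 150)                          # low and high hysteresis thresholds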
The Susan operator computes the difference between the gray level of each surrounding neighborhood pixel and the gray level of the central pixel, and weighs the number of neighborhood pixels similar to the central pixel with an exponential function of this difference, so that the larger the gray-level difference, the smaller the similarity. If the number of similar neighborhood pixels is less than a threshold, the pixel is considered to be an edge point. The edge map obtained by applying the Susan operator to Fig. 2-a is shown in Fig. 2-d.
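A simplified sketch of this kind of similarity count; the brightness threshold t and the decision threshold are illustrative values, and the sixth-power exponential follows the classical SUSAN formulation (the description above leaves the exact form open).

    import numpy as np

    def susan_edges(gray, radius=2, t=25.0, count_threshold=18.0):
        gray = gray.astype(float)
        h, wdt = gray.shape
        padded = np.pad(gray, radius, mode='reflect')
        edges = np.zeros((h, wdt), dtype=bool)
        for y in range(h):
            for x in range(wdt):
                block = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
                diff = block - gray[y, x]
                # Similarity of each neighborhood pixel to the central pixel:
                # a large gray-level difference gives a similarity close to 0.
                similar = np.exp(-(diff / t) ** 6).sum() - 1.0  # exclude the center pixel
                edges[y, x] = similar < count_threshold
        return edges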
The LoG operator determines edges with a second-derivative template, but it also requires the image to be low-pass filtered with a Gaussian low-pass filter in advance. The edge map obtained by applying the LoG operator to Fig. 2-a is shown in Fig. 2-e.
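The Gaussian low-pass filtering and the second-derivative (Laplacian) template can be applied in one step, for example with SciPy; locating the edges at the zero crossings of the response, as sketched below, is a common follow-up, and the sigma value is an illustrative choice.

    import numpy as np
    from scipy.ndimage import gaussian_laplace

    def log_edges(gray, sigma=2.0):
        response = gaussian_laplace(gray.astype(float), sigma=sigma)  # Laplacian of Gaussian
        sign = response > 0
        edges = np.zeros_like(sign)
        # Mark a pixel where the LoG response changes sign with respect to
        # its right or lower neighbor (a simple zero-crossing test).
        edges[:, :-1] |= sign[:, :-1] != sign[:, 1:]
        edges[:-1, :] |= sign[:-1, :] != sign[1:, :]
        return edges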
These traditional edge detection methods consider only the differences between individual pixels, so they cannot remove the texture noise interference in textile printing and dyeing images well. Comparing Fig. 2-b, Fig. 2-c, Fig. 2-d and Fig. 2-e with Fig. 2-f, the result obtained with the present invention is obviously better than the results of the traditional edge detection methods.
Embodiment 2
Fig. 3-a is detected with the same steps as in Embodiment 1, and the resulting image edge map is shown in Fig. 3-f. Comparing it with Fig. 3-b, Fig. 3-c, Fig. 3-d and Fig. 3-e verifies once again that the result obtained with the present invention is obviously better than the results of the traditional edge detection methods.