Nothing Special   »   [go: up one dir, main page]

CN116071351B - Flour quality visual detection system based on flour bran star identification - Google Patents

Flour quality visual detection system based on flour bran star identification Download PDF

Info

Publication number
CN116071351B
CN116071351B CN202310200588.4A CN202310200588A CN116071351B CN 116071351 B CN116071351 B CN 116071351B CN 202310200588 A CN202310200588 A CN 202310200588A CN 116071351 B CN116071351 B CN 116071351B
Authority
CN
China
Prior art keywords
flour
image
segmented image
pixel point
bran
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310200588.4A
Other languages
Chinese (zh)
Other versions
CN116071351A (en
Inventor
王成美
王斌
侯兴贞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Jinlikang Food Technology Co.,Ltd.
Original Assignee
Shandong Jinlikang Flour Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Jinlikang Flour Co ltd filed Critical Shandong Jinlikang Flour Co ltd
Priority to CN202310200588.4A priority Critical patent/CN116071351B/en
Publication of CN116071351A publication Critical patent/CN116071351A/en
Application granted granted Critical
Publication of CN116071351B publication Critical patent/CN116071351B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V10/763Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Quality & Reliability (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image processing of flour bran star recognition, and provides a flour quality visual detection system based on flour bran star recognition, which divides a flour gray level image into segmented images with equal sizes, calculates gray level entropy of each segmented image and obtains color difference degree of each segmented image; clustering each segmented image to obtain the non-aggregation degree of each pixel point; judging whether the pixel points on the block images are bran star pixel points or not according to the color difference degree of each block image and the non-aggregation degree of each pixel point in the block images, and obtaining the local bran star content of the block images; the total content of the wheat bran star is obtained, and the quality of the wheat flour is detected according to the total content of the wheat bran star. The flour quality visual detection system based on flour bran star identification provided by the invention can be suitable for detecting flour bran stars with any colors and can be used for avoiding the problem of low detection precision when the flour bran stars are similar to flour raw materials.

Description

Flour quality visual detection system based on flour bran star identification
Technical Field
The invention relates to the technical field of image processing of flour bran star identification, in particular to a flour quality visual detection system based on flour bran star identification.
Background
The wheat bran star refers to spot-shaped substances formed by wheat bran which is not separated in the process of processing wheat flour and buckwheat bran which is not cleaned. In the process of flour production and processing, a small amount of flour bran star is acceptable, but the excessive content of flour bran star can influence the health of human bodies, so that the size and content of flour bran star influence the quality of flour.
Typically, the finished flour exhibits a relatively uniform white powder, and although the flour bran star is more pronounced in the white finished flour, the specific level is difficult to quantify. In the prior art, most of flour production factories adopt experts in grain inspection to compare flour to be detected with standard flour, and the quality and flour grade of the flour to be detected are judged according to comparison results. The detection method has great influence on the quality of the flour, has great comparison errors of different inspection experts, and is difficult to accurately detect the content of the bran star and the size of the bran star in the flour. On the other hand, although there are few detecting instruments for wheat bran, such instruments are expensive, and most of the flour factories are not acceptable, so that the detecting method is difficult to be a detection method with high universality.
Disclosure of Invention
The invention provides a flour quality visual detection system based on flour bran star identification, which aims to solve the problems of strong human subjectivity, expensive detection instrument, difficulty in popularization and low whiteness detection precision in the existing bran star detection method in flour, and adopts the following specific technical scheme:
one embodiment of the present invention provides a flour quality visual inspection system based on flour bran star identification, the system comprising:
the data acquisition module acquires a flour image, and the flour image is preprocessed and then grayed to obtain a flour gray image;
the image color difference degree calculation module is used for dividing the flour gray level image into segmented images with equal size, calculating the gray level entropy of each segmented image, obtaining the obvious gray level entropy of each gray level according to the number of each gray level in the segmented image and the gray level entropy of the segmented image, and obtaining the color difference degree of each segmented image according to the obvious gray level entropy of each gray level and the average value of all the obvious gray level entropy, wherein the color difference degree is recorded as first color difference degree;
the pixel point non-aggregation degree acquisition module is used for clustering each segmented image to obtain a clustering center of each iteration of the clustering, and obtaining a measurement distance from each pixel point in the segmented image to the clustering center during each iteration, obtaining a difference value of measurement distances of the same pixel point in two continuous clusters and recording the difference value as a measurement distance change amplitude value, and obtaining the amplitude significance value of each pixel point according to Euclidean distance between two clustering center points of adjacent iteration times in all iterations and the measurement distance change amplitude value of the same pixel point in the two continuous clusters and obtaining the non-aggregation degree of each pixel point according to the difference between the amplitude significance value of each pixel point and the average value of the amplitude significance values of all pixel points;
the local flour bran star content acquisition module is used for acquiring an updating radius of each pixel point according to the non-aggregation degree of each pixel point in the segmented image, the maximum value of the non-aggregation degree and the side length of the segmented image, updating the segmented image according to the updating radius corresponding to each pixel point, calculating the color difference degree of the updated segmented image to be a second color difference degree, obtaining the color change difference degree corresponding to each pixel point according to the difference between the first color difference degree and the second color difference degree, judging whether the pixel point is a bran star pixel point according to the color change difference degree of each pixel point and obtaining a bran star determination factor, and obtaining the local bran star content of the segmented image according to the bran star determination factor;
the flour quality detection module is used for obtaining the local bran star content of each block image, adding the local bran star content of all the block images to obtain the total content of the flour bran star, and detecting the flour quality according to the total content of the flour bran star.
Preferably, the method for dividing the flour gray level image into the segmented images with equal size comprises the following steps:
uniformly dividing the flour image into C square quick-dividing images with equal size, wherein the size of the block image is W.times.W, when the square of the C square.times.W just covers the flour gray level image, a plurality of combinations (C, W) are obtained, and the obtained maximum W is taken as the side length of the block image.
Preferably, the method for obtaining the color difference degree of each segmented image according to the significant gray entropy of each gray level and the average value of all the significant gray entropy comprises the following steps:
Figure SMS_1
in the method, in the process of the invention,
Figure SMS_2
is the gray value of pixel i,
Figure SMS_3
is the significant gray entropy of pixel i, the meaning of M is that there are M unequal gray values in the segmented image c,
Figure SMS_4
is the mean of the M gray scale significant gray scale entropies,
Figure SMS_5
the color difference of the block image c is expressed and is denoted as a first color difference.
Preferably, the method for obtaining the amplitude significance value of each pixel point according to the euclidean distance between two clustering center points of adjacent iteration times in all iterations and the measurement distance change amplitude of the same pixel point in two continuous clusters comprises the following steps:
Figure SMS_6
in the method, in the process of the invention,
Figure SMS_7
is the Euclidean distance between the v iteration and the v+1 iteration corresponding to the cluster center movement,
Figure SMS_8
is the total number of iterations at the end of the clustering of the segmented image c,
Figure SMS_9
the magnitude of the change of the distance between the pixel point f of the v th iteration and the v+1 th iteration and the center point measurement of the two clusters is represented,
Figure SMS_10
representing the magnitude saliency value of the f-th pixel.
Preferably, the method for obtaining the update radius of each pixel point according to the non-aggregation degree of each pixel point in the segmented image, the maximum value of the non-aggregation degree and the side length of the segmented image comprises the following steps:
Figure SMS_11
in the method, in the process of the invention,
Figure SMS_12
is the maximum value of the degree of non-aggregation of the pixel points in the segmented image c,
Figure SMS_13
is the degree of non-aggregation of the pixel point f,
Figure SMS_14
is the side length of the tile image c,
Figure SMS_15
the update radius.
Preferably, the method for updating the segmented image through the update radius corresponding to each pixel point and calculating the color difference of the updated segmented image to be recorded as the second color difference comprises the following steps:
obtaining updated radius
Figure SMS_16
After that, obtain
Figure SMS_17
The center point of the square area is the same as the center point of the segmented image, the gray values of all pixel points in the square area are converted into the gray value of the pixel point f, the gray values of the pixel points belonging to the square area in the segmented image are replaced, an updated segmented image is obtained, the updated segmented image is calculated to obtain new color difference degree, and the new color difference degree is recorded as second color difference degree.
Preferably, the calculating method of the local bran star content of the segmented image comprises the following steps:
Figure SMS_18
in the method, in the process of the invention,
Figure SMS_19
a factor is determined for the bran-star for pixel f,
Figure SMS_20
is the total number of pixels in the tile image c,
Figure SMS_21
the local content of bran star of the c-th segmented image is represented.
The beneficial effects of the invention are as follows: the invention provides a flour quality visual detection system based on flour bran star identification, wherein the traditional whiteness detection method is influenced by a wheat storage factor, and the accuracy of a flour detection result processed by long-term stored wheat is lower. Aiming at the problems of the traditional detection method, the color difference degree is constructed based on the distribution condition of the pixels with different gray levels in the flour image, the color difference degree considers the change condition of gray entropy in the segmented image and the distribution difference between the pixels with different gray levels, and the color significance has the beneficial effect of being applicable to the detection of the bran star with any color. Secondly, according to the characteristics of weak aggregation characteristics of bran-star pixels and remarkable aggregation characteristics of flour pixels in the block diagram, the non-aggregation degree is constructed through the influence of different pixels on the update amplitude of the clustering center, and the non-aggregation degree has the beneficial effects that for flour bran-star with any size, the influence degree of the bran-star pixels on the update of the clustering center can be accurately obtained, so that the problem of low detection precision when the flour bran-star is similar to the flour raw materials is avoided.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings can be obtained according to these drawings without inventive faculty for a person skilled in the art.
Fig. 1 is a schematic flow chart of a flour quality visual detection system based on flour bran star identification according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a flowchart of a flour quality visual inspection system based on identification of bran stars according to an embodiment of the present invention is shown, the system includes:
the data acquisition module, in this embodiment, installs an industrial CCD camera above the raw material powder grinding discharge position, acquires flour images using the camera, and it is noted that the obtained flour images are RGB images. In addition, during the image capturing and transmitting process, interference noise inevitably exists, including interference noise caused by processing machines and during the transmitting process, and the noise not only reduces the image quality, but also affects the detection result of the flour bran star on the flour image. Therefore, in order to reduce noise interference and improve image quality, it is necessary to perform denoising processing on the acquired flour image. In the field of image processing, the commonly used denoising methods are divided into two categories, namely denoising based on a filter and denoising based on a model. The proposal adopts bilateral filtering denoising technology to preprocess the collected flour image. The bilateral filtering denoising technology is a known technology, and the specific process is not described in detail. And carrying out graying treatment on the denoised flour image to obtain a gray level image of the flour image, namely a flour gray level image.
The image color difference calculating module, in this embodiment, the detection object is a flour bran star, and the gray value of the pixel points belonging to the flour bran star in the finished flour image is larger than that of the normal flour pixel points. For processed flours, the coarser the particle size of the flour, the greater the number of bran-star in the flour, the greater the size of the bran-star, and the worse the color of the flour; the smaller the particle size of the flour, the smaller the number of bran stars in the flour, the smaller the body shape of the bran stars, and the even and white color of the flour. Therefore, the invention considers that the entropy value graph is obtained by mapping and transforming the gray entropy of the local area in the collected flour image, and the difference condition of pixels in the local area is primarily judged according to the magnitude of the gray entropy value, and the gray entropy is the image entropy of the gray image.
Firstly, after a flour gray level image is obtained, the flour image is uniformly divided into C segmented images with equal size, the size of each segmented image is W, for all C, W when the two conditions are met, the C squares with W are enabled to cover the flour gray level image exactly, a plurality of combinations (C and W) are obtained, the obtained maximum W is taken as the side length of the segmented image, and for the flour image and the segmented image, flour bran stars are unevenly and randomly distributed. The gray entropy is an image characteristic calculated according to the distribution condition of gray values of pixels in an image, whether the image is a block image or a flour image, compared with the gray values of the pixels of the bran star, the gray values of the pixels of the bran star are smaller, the gray values of the pixels of the flour are larger, and in general, the number of the pixels of the flour in the flour formed by processing raw materials is larger than that of the pixels of the bran star, so that for the block image, the number of the pixels of the flour is larger than that of the pixels of the bran star, if the bran star appears in a large amount of flour, the corresponding gray entropy changes, and the color of the block image also changes.
Further, the gray entropy of each segmented image is obtained, and the formula is as follows:
Figure SMS_22
in the method, in the process of the invention,
Figure SMS_23
meaning that the number of pixels having a gray value u in all pixels in the segmented image c is a ratio,
Figure SMS_24
is the gray entropy of the segmented image c.
It should be noted that, in this embodiment, each of the different gray values is calculated as a gray level. Further, the color difference is obtained by changing all gray levels in the segmented image, so that the significant gray entropy of each gray level in the segmented image is obtained according to the gray entropy of the segmented image, and the formula is as follows:
Figure SMS_25
Figure SMS_26
is that the gray value in the segmented image c is
Figure SMS_27
N is the number of pixels in the segmented image c,
Figure SMS_28
is the significant gray entropy of pixel i, i is the i-th pixel in the segmented image c,
Figure SMS_29
is the gray value of pixel i,
Figure SMS_30
is the gray entropy of the segmented image c.
Further, the color difference degree of the segmented image is obtained through the remarkable gray entropy of each gray level, and the formula is as follows:
Figure SMS_31
where M means that there are M unequal gray values in the segmented image c, each of which corresponds to one gray level,
Figure SMS_32
is the mean of the M gray scale significant gray scale entropies,
Figure SMS_33
the color difference of the block image c is expressed, and the color difference at this time is recorded as a first color difference. The color difference degree is the color difference of pixel points in the segmented image, the gray values of the pixel points in the segmented image are different, the corresponding color difference degree is smaller, the color displayed by the segmented image is smaller, and the possibility of the presence of flour bran is lower.
The non-aggregation degree acquisition module of the pixels is used for blocking the flour image, if the occupation ratio of the flour pixels in the area is larger for each corresponding area of the blocked image, the similarity among the flour pixels is larger, namely the aggregation characteristic of the flour pixels is obvious, the white possibility of the whole area in the flour image is larger, and for the flour bran star distributed randomly, the aggregation characteristic is not obvious in the blocked image.
Firstly, for each segmented image, K-means clustering is used for clustering pixels through the position of an iterative clustering center, for a large number of flour pixels in the segmented image, the pixels have higher similarity, the segmented image has higher local density, in each clustering center point position iteration process, if all the pixels in the segmented image are flour pixels, the clustering is stopped quickly, the updating amplitude of each clustering center point is very small, and if bran star pixels exist in the segmented image, the updating amplitude of each clustering center point is relatively large because the bran star pixels influence the determination of the clustering center point. In this embodiment, the number of initial cluster centers of K-means clustering is 2, the iteration stop condition is that the two clustering results do not change any more, the measurement distance between the pixel point and the cluster center point in the iteration process is the gray level difference value of two pixel points, the K-means clustering is a known technology, and the specific process is not described in detail.
Further, the measurement distance from any pixel point to the clustering center point in any two iterative processes is obtained, wherein the smaller the measurement distance difference is, the smaller the change of the clustering center point is, and the formula for obtaining the measurement distance change amplitude is as follows:
Figure SMS_34
in the method, in the process of the invention,
Figure SMS_35
is the clustering center of the cluster where the pixel point f is located after the v-th iteration in the clustering process,
Figure SMS_36
is the pixel point f after the (v+1) th iteration in the clustering processAt the center of the cluster,
Figure SMS_37
is the pixel point f and the center point after the v th iteration
Figure SMS_38
Is a measure of the distance of (1),
Figure SMS_39
is the pixel point f and the central point after the v+1th iteration
Figure SMS_40
Is a measure of the distance of (1),
Figure SMS_41
the magnitude of the change in the measured distance between the pixel point f of the v th iteration and the v+1 th iteration and the center point of the two clusters is represented.
Further, according to the measurement distance change amplitude value obtained by each pixel point in all iterative processes, the amplitude significance value of each pixel point is obtained, and the formula is as follows:
Figure SMS_42
in the method, in the process of the invention,
Figure SMS_43
is the Euclidean distance between the v iteration and the v+1 iteration corresponding to the cluster center movement,
Figure SMS_44
is the total number of iterations at the end of the clustering of the segmented image c,
Figure SMS_45
the magnitude of the change of the distance between the pixel point f of the v th iteration and the v+1 th iteration and the center point measurement of the two clusters is represented,
Figure SMS_46
representing the magnitude saliency value of the f-th pixel.
Further, according to the average value of the amplitude significant values of all the pixels in the segmented image c, the non-aggregation degree of each pixel is obtained by taking the difference between the amplitude significant value of each pixel and the average value of the amplitude significant values, and the formula is as follows:
Figure SMS_47
in the method, in the process of the invention,
Figure SMS_48
represents the magnitude saliency value of the f-th pixel,
Figure SMS_49
is the average of the amplitude saliency values of all pixels in the segmented image c,
Figure SMS_50
indicating the degree of non-aggregation of the f-th pixel.
The non-aggregation degree reflects the possibility of aggregation characteristics of pixel points in the segmented image, in the process of clustering the segmented image, the update of the clustering center is determined according to the measurement distance from all samples to the clustering center, if the segmented image contains the wheat bran star, the measurement distance between the bran star pixel points and the clustering center is larger, the next update of the clustering center can move to the aggregation area of the bran star pixel points, and because the distribution of the bran star pixel points is scattered, the clustering center moves for a plurality of times,
Figure SMS_51
the value of (c) will be larger,
Figure SMS_52
the larger the value of (2), the more the measured distance change of the bran-star pixel points before and after updating is larger than the moving distance of the clustering center, the non-aggregation degree
Figure SMS_53
The larger the pixel point f has key influence on the updating amplitude of the clustering center, the more likely the pixel point f corresponds to the pixel point in the bran star area of the flour, and the corresponding aggregation characteristicThe less pronounced.
And the local content acquisition module of the wheat bran star is used for respectively acquiring the color difference degree corresponding to each segmented image and the non-aggregation degree of each pixel point in the segmented image according to the steps for the preprocessed wheat flour image. For a segmented image c, the degree of color difference
Figure SMS_54
Reflecting the number of gray levels in the segmented image c and the uniformity of the distribution of each gray level pixel point due to the color difference
Figure SMS_55
The larger the patch image c, the greater the likelihood that the wheat bran star is present. On the other hand, the degree of non-aggregation
Figure SMS_56
The larger the image, the less obvious the aggregate feature corresponding to the pixel point f in the segmented image c, the more likely the pixel point f will be the pixel point in the corresponding bran star area of the flour.
Firstly, selecting an updating radius to update each segmented image, taking a square area with the updating radius as a side length as an updating area, and finding out a pixel point with the largest non-aggregation degree from pixel points in each segmented image, wherein the non-aggregation degree of the pixel points is recorded as
Figure SMS_57
Obtaining a plurality of update radiuses by using the maximum non-aggregation degree and the non-aggregation degree of each pixel point of the segmented image, wherein the update radius obtaining mode is as follows:
Figure SMS_58
in the method, in the process of the invention,
Figure SMS_59
is the maximum value of the degree of non-aggregation of the pixel points in the segmented image c,
Figure SMS_60
is the minimum value of the non-aggregation degree of the pixel points in the segmented image c,
Figure SMS_61
is the degree of non-aggregation of the pixel point f,
Figure SMS_62
is the side length of the tile image c,
Figure SMS_63
that is, update radius, thus obtaining
Figure SMS_64
Is a square region of (1), wherein
Figure SMS_65
The larger the pixel point f is more likely to be the bran-star flour pixel point, the more contrast in the image should be amplified, which is beneficial to improving the accuracy of subsequent detection.
The bran star and color are not fixed in size, for some relatively dull or smaller bran stars, if it is
Figure SMS_66
Setting a threshold value may ignore the above-mentioned non-contrast-significant pixels of bran-star, for example pixels on the intersection edge of bran-star and flour. The purpose of the update is to amplify the contrast of the bran-star pixels. In the updating process, the gray value of the pixel points in a certain range around the bran-star pixel point is consistent with the bran-star pixel point, so that the pixel points on the bran-star which are relatively dim or small in area are more obvious in the image.
Further, it will
Figure SMS_67
The center point of the segmented image is taken as the midpoint of the square area, all pixel points in the square area are updated to be gray values of the pixel point f, and after updating, new color difference is obtained according to the calculation step of the color difference
Figure SMS_68
The color difference is recorded as a second color difference, and a color change difference is obtained, which is calculated as follows:
Figure SMS_69
in the method, in the process of the invention,
Figure SMS_70
a first color difference degree representing a c-th block image,
Figure SMS_71
a second color difference degree representing a c-th block image,
Figure SMS_72
and representing the color change difference degree of the c-th segmented image after the f-th pixel point is updated.
After updating the pixel points, if the pixel points are bran-star pixel points, the more the updated surrounding pixel points consistent with the gray value of the updated pixel points, the color difference degree in the updated divided image
Figure SMS_73
The larger the thus, that is, the degree of variation in color change
Figure SMS_74
The larger the pixel point f, the more likely it is a bran-star pixel point.
If the color difference is the bran-star pixel point, the updated color difference is equivalent to the bran-star pixel point,
Figure SMS_75
if the color is the flour pixel, the updated color is equal to the bran star pixel, the color difference is reduced,
Figure SMS_76
. The bran-star pixel point can be judged more accurately.
Further, normalizing the color change difference degree corresponding to each pixel after updating, and setting colorJudgment threshold
Figure SMS_77
In the present embodiment, it is noted that
Figure SMS_78
0.3, record
Figure SMS_79
Determining a factor for the bran-star of pixel f if
Figure SMS_80
Figure SMS_81
At this time, the pixel point f is considered to be the bran-star pixel point, if
Figure SMS_82
Figure SMS_83
At this time, the pixel point f is considered to be a non-bran-star pixel point. The formula is as follows:
Figure SMS_84
the local content of bran star of the segmented image c can thus be obtained:
Figure SMS_85
in the method, in the process of the invention,
Figure SMS_86
a factor is determined for the bran-star for pixel f,
Figure SMS_87
is the total number of pixels in the tile image c,
Figure SMS_88
the local content of bran star of the c-th segmented image is represented. The local content of the bran-star is the duty ratio of the number of the bran-star pixels in all pixels of the block image, and the bran-star is obtained by matchingAnd replacing and updating pixel points in the block images, calculating the variation of the color difference degree of the block images before and after updating, and determining the value of the bran star determining factor according to the variation. The greater the local content of bran star, the greater the content of flour bran star in the segmented image.
The flour quality detection module is used for respectively calculating the color difference degree of the segmented images and the non-aggregation degree of each pixel point in the segmented images according to the C segmented images obtained by segmenting the flour images in the steps, further, acquiring the local bran star content corresponding to each segmented image according to the calculation results of the two indexes, and respectively marking as
Figure SMS_89
Figure SMS_90
、…、
Figure SMS_91
…、
Figure SMS_92
Adding the calculated results of the C local contents to obtain the average value of the total content of the flour bran star in the flour image, and marking the average value as
Figure SMS_93
. According to the classification of flour grades, obtaining the acceptable bran star content range of flour in the flour of each grade, wherein the bran star content range of the top grade flour is [0,0.05 ]]The bran star content of the inferior flour is in the range of [0.05,1 ]]Respectively detecting the total content of the wheat bran star
Figure SMS_94
Comparing with the bran star content range of the flour of each grade of flour, if
Figure SMS_95
The detected flour is recorded as the top-grade flour within the acceptable bran content range of the top-grade flour if
Figure SMS_96
The detected flour is marked as the inferior flour in the acceptable bran star content range of the inferior flour.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (5)

1. Flour quality visual detection system based on flour bran star discernment, characterized by that, this system includes following module:
the data acquisition module acquires a flour image, and the flour image is preprocessed and then grayed to obtain a flour gray image;
the image color difference degree calculation module is used for dividing the flour gray level image into segmented images with equal size, calculating the gray level entropy of each segmented image, obtaining the obvious gray level entropy of each gray level according to the number of each gray level in the segmented image and the gray level entropy of the segmented image, and obtaining the color difference degree of each segmented image according to the obvious gray level entropy of each gray level and the average value of all the obvious gray level entropy, wherein the color difference degree is recorded as first color difference degree;
the pixel point non-aggregation degree acquisition module is used for clustering each segmented image to obtain a clustering center of each iteration of the clustering, and obtaining a measurement distance from each pixel point in the segmented image to the clustering center during each iteration, obtaining a difference value of measurement distances of the same pixel point in two continuous clusters and recording the difference value as a measurement distance change amplitude value, and obtaining the amplitude significance value of each pixel point according to Euclidean distance between two clustering center points of adjacent iteration times in all iterations and the measurement distance change amplitude value of the same pixel point in the two continuous clusters and obtaining the non-aggregation degree of each pixel point according to the difference between the amplitude significance value of each pixel point and the average value of the amplitude significance values of all pixel points;
the local flour bran star content acquisition module is used for acquiring an updating radius of each pixel point according to the non-aggregation degree of each pixel point in the segmented image, the maximum value of the non-aggregation degree and the side length of the segmented image, updating the segmented image according to the updating radius corresponding to each pixel point, calculating the color difference degree of the updated segmented image to be a second color difference degree, obtaining the color change difference degree corresponding to each pixel point according to the difference between the first color difference degree and the second color difference degree, judging whether the pixel point is a bran star pixel point according to the color change difference degree of each pixel point and obtaining a bran star determination factor, and obtaining the local bran star content of the segmented image according to the bran star determination factor;
the flour quality detection module is used for obtaining the local bran star content of each block image, adding the local bran star content of all the block images to obtain the total content of the bran star of the flour, and detecting the flour quality according to the total content of the bran star of the flour;
the method for obtaining the color difference degree of each segmented image according to the significant gray entropy of each gray level and the average value of all the significant gray entropy comprises the following steps:
Figure QLYQS_1
in the method, in the process of the invention,
Figure QLYQS_2
is the gray value of pixel i, +.>
Figure QLYQS_3
Is the significant gray entropy of the pixel point i, M means that M unequal gray values are shared in the segmented image c and marked as M gray levels, < >>
Figure QLYQS_4
Is the mean value of the M gray scale significant gray scale entropies, < >>
Figure QLYQS_5
The color difference degree of the block image c is expressed and is recorded as a first color difference degree;
the salient gray entropy for each gray level is given by:
Figure QLYQS_6
Figure QLYQS_7
is the gray value of +.>
Figure QLYQS_8
N is the number of pixels in the segmented image c, +.>
Figure QLYQS_9
Is the significant gray entropy of pixel i, i is the i-th pixel in the segmented image c,/and->
Figure QLYQS_10
Is the gray value of pixel i, +.>
Figure QLYQS_11
The gray entropy of the segmented image c;
the method for obtaining the amplitude significant value of each pixel point according to the Euclidean distance between two clustering center points of adjacent iteration times in all iterations and the measured distance change amplitude of the same pixel point in two continuous clusters comprises the following steps:
Figure QLYQS_12
in the method, in the process of the invention,
Figure QLYQS_13
is the Euclidean distance between the v iteration and the v+1 iteration corresponding to the cluster center movement,>
Figure QLYQS_14
is the total number of iterations at the end of clustering of the segmented image c,/->
Figure QLYQS_15
Representing the variation amplitude of the measurement distance between the v th iteration and the v+1th iteration pixel point f and the center point of the two clusters, and +.>
Figure QLYQS_16
Representing the magnitude saliency value of the f-th pixel.
2. A flour quality visual inspection system based on flour bran star identification as claimed in claim 1 wherein the method of dividing the flour gray scale image into equal sized segmented images is:
uniformly dividing the flour image into C square quick-dividing images with equal size, wherein the size of the block image is W.times.W, when the square of the C square.times.W just covers the flour gray level image, a plurality of combinations (C, W) are obtained, and the obtained maximum W is taken as the side length of the block image.
3. A flour quality visual inspection system based on flour bran star identification as claimed in claim 1 wherein the method of obtaining the updated radius of each pixel point from the non-aggregation level of each pixel point in the segmented image, the maximum value of the non-aggregation level and the side length of the segmented image is as follows:
Figure QLYQS_17
in the method, in the process of the invention,
Figure QLYQS_18
is the maximum value of the non-aggregation degree of the pixel points in the segmented image c, < >>
Figure QLYQS_19
Is the minimum value of the non-aggregation degree of the pixel points in the segmented image c, +.>
Figure QLYQS_20
Is the degree of non-aggregation of pixel f, < >>
Figure QLYQS_21
Is the side length of the tile image c, +.>
Figure QLYQS_22
The update radius.
4. The flour quality visual inspection system based on flour bran star identification as claimed in claim 1, wherein the method for updating the segmented image by the update radius corresponding to each pixel point and calculating the color difference of the updated segmented image as the second color difference is as follows:
obtaining updated radius
Figure QLYQS_23
After that, get->
Figure QLYQS_24
The center point of the square area is the same as the center point of the segmented image, the gray values of all pixel points in the square area are converted into the gray value of the pixel point f, the gray values of the pixel points belonging to the square area in the segmented image are replaced, an updated segmented image is obtained, the updated segmented image is calculated to obtain new color difference degree, and the new color difference degree is recorded as second color difference degree.
5. A flour quality visual inspection system based on flour bran star identification as claimed in claim 1 wherein the calculation method of the local bran star content of the segmented image is:
Figure QLYQS_25
in the method, in the process of the invention,
Figure QLYQS_26
determining a factor for the bran of pixel f, < >>
Figure QLYQS_27
Is the total number of pixels in the tile image c, and>
Figure QLYQS_28
the local content of bran star of the c-th segmented image is represented.
CN202310200588.4A 2023-03-06 2023-03-06 Flour quality visual detection system based on flour bran star identification Active CN116071351B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310200588.4A CN116071351B (en) 2023-03-06 2023-03-06 Flour quality visual detection system based on flour bran star identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310200588.4A CN116071351B (en) 2023-03-06 2023-03-06 Flour quality visual detection system based on flour bran star identification

Publications (2)

Publication Number Publication Date
CN116071351A CN116071351A (en) 2023-05-05
CN116071351B true CN116071351B (en) 2023-06-30

Family

ID=86173244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310200588.4A Active CN116071351B (en) 2023-03-06 2023-03-06 Flour quality visual detection system based on flour bran star identification

Country Status (1)

Country Link
CN (1) CN116071351B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117058256B (en) * 2023-10-12 2024-01-12 临沂大学 Rose paste browning degree detection method based on machine vision
CN118096545B (en) * 2024-04-24 2024-06-21 宝鸡源盛实业有限公司 Dough kneading impurity detection method and system
CN118196100B (en) * 2024-05-17 2024-07-23 武汉市巽皇食品有限公司 Dough detection device and method based on dough mixer

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109100350A (en) * 2018-08-21 2018-12-28 珠海市博恩科技有限公司 A kind of flour bran speck detection method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101819158A (en) * 2010-03-18 2010-09-01 浙江工商大学 Automatic detection device of flour bran speck
CN103366365B (en) * 2013-06-18 2016-05-25 西安电子科技大学 SAR image change detection method based on artificial immunity multi-object clustering
IL243113B (en) * 2015-12-15 2020-08-31 Picscout Israel Ltd Logo detection for automatic image search engines
CN107807126A (en) * 2017-10-25 2018-03-16 宝鸡金昱食品机械制造有限公司 A kind of flour bran speck detection method based on PCNN
US11669607B2 (en) * 2019-08-29 2023-06-06 PXL Vision AG ID verification with a mobile device
CN114429476A (en) * 2022-01-25 2022-05-03 惠州Tcl移动通信有限公司 Image processing method, image processing apparatus, computer device, and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109100350A (en) * 2018-08-21 2018-12-28 珠海市博恩科技有限公司 A kind of flour bran speck detection method

Also Published As

Publication number Publication date
CN116071351A (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN116071351B (en) Flour quality visual detection system based on flour bran star identification
CN116385448B (en) Alumina ceramic surface machining defect detection method based on machine vision
CN115829883B (en) Surface image denoising method for special-shaped metal structural member
CN115311270B (en) Plastic product surface defect detection method
CN115018853B (en) Mechanical component defect detection method based on image processing
CN116843678A (en) Hard carbon electrode production quality detection method
CN116735612B (en) Welding defect detection method for precise electronic components
CN108960237B (en) Reading identification method for pointer type oil level indicator
CN115294140A (en) Hardware part defect detection method and system
CN110749598A (en) Silkworm cocoon surface defect detection method integrating color, shape and texture characteristics
CN114119603A (en) Image processing-based snack box short shot defect detection method
CN115994907B (en) Intelligent processing system and method for comprehensive information of food detection mechanism
CN114820625B (en) Automobile top block defect detection method
CN115601368B (en) Sheet metal part defect detection method for building material equipment
CN117252882A (en) Cylinder head quality detection method and system
CN115619793A (en) Power adapter appearance quality detection method based on computer vision
WO2018207648A1 (en) Road surface condition determining method and road surface condition determining device
CN115546155A (en) Rivet size detection method based on machine vision
CN116523913B (en) Intelligent detection method for quality of screw rod
CN115115603A (en) Automobile accessory flywheel surface detection method based on artificial intelligence
CN118134938A (en) Visual detection method and system for surface defects of refractory bricks
CN103942526B (en) Linear feature extraction method for discrete data point set
CN112381826A (en) Binarization method of edge defect image
CN118014994B (en) Rice hull degradable meal box crack detection method
CN111815575B (en) Bearing steel ball part detection method based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 273200 Hua Cun Zhen Bei Zhuang Cun, Sishui County, Jining City, Shandong Province

Patentee after: Shandong Jinlikang Food Technology Co.,Ltd.

Address before: 273200 Hua Cun Zhen Bei Zhuang Cun, Sishui County, Jining City, Shandong Province

Patentee before: Shandong Jinlikang Flour Co.,Ltd.

CP01 Change in the name or title of a patent holder
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A Visual Inspection System for Flour Quality Based on Flour Bran Star Recognition

Effective date of registration: 20231219

Granted publication date: 20230630

Pledgee: Bank of China Limited Sishui sub branch

Pledgor: Shandong Jinlikang Food Technology Co.,Ltd.

Registration number: Y2023980072803

PE01 Entry into force of the registration of the contract for pledge of patent right