CN118052793A - Real-time monitoring system and method for plush toy production process - Google Patents
Real-time monitoring system and method for plush toy production process Download PDFInfo
- Publication number
- CN118052793A CN118052793A CN202410214748.5A CN202410214748A CN118052793A CN 118052793 A CN118052793 A CN 118052793A CN 202410214748 A CN202410214748 A CN 202410214748A CN 118052793 A CN118052793 A CN 118052793A
- Authority
- CN
- China
- Prior art keywords
- plush toy
- feature
- finished
- finished product
- training
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000004519 manufacturing process Methods 0.000 title claims abstract description 98
- 238000012544 monitoring process Methods 0.000 title claims abstract description 47
- 238000000034 method Methods 0.000 title claims abstract description 24
- 238000001514 detection method Methods 0.000 claims abstract description 69
- 238000012545 processing Methods 0.000 claims abstract description 13
- 239000000047 product Substances 0.000 claims description 173
- 239000013598 vector Substances 0.000 claims description 118
- 238000012549 training Methods 0.000 claims description 81
- 238000000605 extraction Methods 0.000 claims description 44
- 238000005259 measurement Methods 0.000 claims description 37
- 238000013527 convolutional neural network Methods 0.000 claims description 30
- 238000007689 inspection Methods 0.000 claims description 16
- 238000003062 neural network model Methods 0.000 claims description 15
- 239000002994 raw material Substances 0.000 claims description 14
- 239000011265 semifinished product Substances 0.000 claims description 14
- 238000012512 characterization method Methods 0.000 claims description 5
- 238000004806 packaging method and process Methods 0.000 claims description 5
- 230000002159 abnormal effect Effects 0.000 claims description 2
- 238000004364 calculation method Methods 0.000 claims description 2
- 238000004891 communication Methods 0.000 claims description 2
- 238000003908 quality control method Methods 0.000 abstract description 11
- 230000002950 deficient Effects 0.000 abstract description 7
- 230000005856 abnormality Effects 0.000 abstract description 6
- 238000010586 diagram Methods 0.000 description 8
- 230000006870 function Effects 0.000 description 8
- 238000011176 pooling Methods 0.000 description 8
- 230000008569 process Effects 0.000 description 7
- 230000004913 activation Effects 0.000 description 5
- 238000009826 distribution Methods 0.000 description 5
- 238000012797 qualification Methods 0.000 description 4
- 230000007547 defect Effects 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 230000002776 aggregation Effects 0.000 description 2
- 238000004220 aggregation Methods 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 2
- 239000000945 filler Substances 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 238000005065 mining Methods 0.000 description 2
- 210000002569 neuron Anatomy 0.000 description 2
- 238000005457 optimization Methods 0.000 description 2
- 238000012546 transfer Methods 0.000 description 2
- 230000004931 aggregating effect Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 239000000306 component Substances 0.000 description 1
- 239000008358 core component Substances 0.000 description 1
- 238000005520 cutting process Methods 0.000 description 1
- 238000013136 deep learning model Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000011049 filling Methods 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000001737 promoting effect Effects 0.000 description 1
- 230000000644 propagated effect Effects 0.000 description 1
- 238000009958 sewing Methods 0.000 description 1
- 238000007493 shaping process Methods 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Biomedical Technology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
The application discloses a real-time monitoring system and a real-time monitoring method for a plush toy production process, which are characterized in that a camera is used for collecting detection images of a plush toy finished product, and an image processing and analyzing algorithm is introduced into the rear end to analyze the detection images, so that whether the quality of the plush toy finished product is qualified or not is automatically detected and judged. Therefore, the automation degree and the quality control level of the plush toy production process can be improved, the number of artificial errors and defective products is reduced, and the production efficiency and the product quality are improved. Meanwhile, an operator can monitor the production condition in real time through the display, timely find abnormality and take corresponding measures, and further improve the efficiency and accuracy of production management.
Description
Technical Field
The application relates to the field of intelligent monitoring, in particular to a real-time monitoring system and method for a plush toy production process.
Background
Plush toy is a popular children toy, and the production process involves a plurality of procedures such as cutting, sewing, filling, shaping, decorating and the like. In order to ensure the quality and safety of the plush toy, strict quality detection is required to avoid the defects of surface breakage, stains, unstable stitching, exposed filler and the like of the plush toy. However, the conventional plush toy quality detection method mainly relies on manual vision and manual operation, which is not only inefficient, but also prone to subjective errors and fatigue errors. Accordingly, a real-time monitoring system for plush toy production processes is desired that is capable of monitoring the plush toy production processes in real time and performing automated quality inspection of finished plush toy products.
Disclosure of Invention
The present application has been made to solve the above-mentioned technical problems. The embodiment of the application provides a real-time monitoring system and a real-time monitoring method for a plush toy production process, which are characterized in that a detection image of a plush toy finished product is acquired through a camera, and an image processing and analyzing algorithm is introduced into the rear end to analyze the detection image, so that whether the quality of the plush toy finished product is qualified or not is automatically detected and judged. Therefore, the automation degree and the quality control level of the plush toy production process can be improved, the number of artificial errors and defective products is reduced, and the production efficiency and the product quality are improved. Meanwhile, an operator can monitor the production condition in real time through the display, timely find abnormality and take corresponding measures, and further improve the efficiency and accuracy of production management.
According to one aspect of the present application, there is provided a real-time monitoring system for a plush toy production process, comprising:
a plurality of sensors deployed in each production link of the plush toy production process and used for detecting the quantity, quality, position and state parameters of raw materials, semi-finished products and finished products of the plush toy;
the cameras are deployed in all production links of the plush toy production process and are used for collecting images of raw materials, semi-finished products and finished products of the plush toy;
The central controller is in communication connection with the plurality of sensors and the plurality of cameras and is used for processing data from the sensors and the cameras and detecting the quality of finished plush toys;
The executor is arranged in each production link of the plush toy production process and is used for receiving the instruction signals of the central controller and carrying, sorting, assembling, checking and packaging the raw materials, semi-finished products and finished products of the plush toy;
And the display is used for displaying the production condition, the work instruction and the abnormal prompt content of the plush toy to an operator according to the received data and the sent data of the central controller.
According to another aspect of the present application, there is provided a real-time monitoring method of a plush toy production process, comprising:
Acquiring a detection image of the finished plush toy to be detected, which is acquired by the camera;
extracting a set of reference images of finished plush toys qualified in quality inspection from a database;
respectively carrying out feature extraction on each reference image in the set of reference images by a plush toy finished product surface state feature extractor based on a deep neural network model to obtain a set of plush toy finished product surface feature reference feature vectors;
Inputting the collection of the plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain plush toy finished product surface feature reference essential feature vectors;
Extracting features of the detection image of the plush toy finished product to be detected by the plush toy finished product surface state feature extractor based on the deep neural network model to obtain a plush toy surface state detection feature vector to be detected;
calculating a state difference semantic measurement coefficient between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected;
And determining whether the quality of the plush toy finished product to be detected is qualified or not based on the comparison between the state difference semantic measurement coefficient and a preset threshold value.
Compared with the prior art, the real-time monitoring system and method for the plush toy production process provided by the application have the advantages that the detection image of the plush toy finished product is acquired through the camera, and the analysis of the detection image is carried out by introducing an image processing and analyzing algorithm at the rear end, so that whether the quality of the plush toy finished product is qualified or not is automatically detected and judged. Therefore, the automation degree and the quality control level of the plush toy production process can be improved, the number of artificial errors and defective products is reduced, and the production efficiency and the product quality are improved. Meanwhile, an operator can monitor the production condition in real time through the display, timely find abnormality and take corresponding measures, and further improve the efficiency and accuracy of production management.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing embodiments of the present application in more detail with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of embodiments of the application and are incorporated in and constitute a part of this specification, illustrate the application and together with the embodiments of the application, and not constitute a limitation to the application. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 is a block diagram of a real-time monitoring system for a plush toy production process in accordance with an embodiment of the present application;
FIG. 2 is a system architecture diagram of a real-time monitoring system for a plush toy production process according to an embodiment of the present application;
FIG. 3 is a block diagram of a training phase of a real-time monitoring system for a plush toy production process in accordance with an embodiment of the present application;
FIG. 4 is a block diagram of a central controller in a real-time monitoring system for a plush toy manufacturing process according to an embodiment of the present application;
fig. 5 is a flowchart of a real-time monitoring method of a plush toy manufacturing process according to an embodiment of the present application.
Detailed Description
Hereinafter, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
As used in the specification and in the claims, the terms "a," "an," "the," and/or "the" are not specific to a singular, but may include a plurality, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the steps and elements are explicitly identified, and they do not constitute an exclusive list, as other steps or elements may be included in a method or apparatus.
Although the present application makes various references to certain modules in a system according to embodiments of the present application, any number of different modules may be used and run on a user terminal and/or server. The modules are merely illustrative, and different aspects of the systems and methods may use different modules.
A flowchart is used in the present application to describe the operations performed by a system according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in order precisely. Rather, the various steps may be processed in reverse order or simultaneously, as desired. Also, other operations may be added to or removed from these processes.
Hereinafter, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Considering that the quality detection of the finished plush toy is particularly important in the real-time monitoring process of the actual plush toy production process, the defects of surface breakage, stains, unstable stitching, exposed fillers and the like of the produced finished plush toy product can be avoided, and the safety and reliability of the plush toy are ensured. Based on the above, in order to realize automatic quality detection of finished plush toy products, so as to improve detection efficiency and intelligent degree, the application has the technical concept that a detection image of the finished plush toy products is acquired through a camera, and an image processing and analyzing algorithm is introduced into the rear end to analyze the detection image, so that whether the quality of the finished plush toy products is qualified or not is automatically detected and judged. Therefore, the automation degree and the quality control level of the plush toy production process can be improved, the number of artificial errors and defective products is reduced, and the production efficiency and the product quality are improved.
In the technical scheme of the application, a real-time monitoring system for the production process of plush toys is provided. Fig. 1 is a block diagram of a real-time monitoring system for a plush toy manufacturing process according to an embodiment of the present application. As shown in fig. 1, a real-time monitoring system 300 for a plush toy manufacturing process according to an embodiment of the present application includes: a plurality of sensors 310 deployed at various production links of the plush toy production process for detecting the number, quality, position and status parameters of raw materials, semi-finished products and finished products of the plush toy; a plurality of cameras 320 disposed at various production links of the plush toy production process, for collecting images of raw materials, semi-finished products and finished products of the plush toy; a central controller 330, which is communicatively connected to the plurality of sensors and the plurality of cameras, and is configured to process data from the respective sensors and the cameras and perform qualification detection on the quality of finished plush toys; actuators 340 provided at each production link of the plush toy production process, for receiving the command signals of the central controller and carrying, sorting, assembling, inspecting and packaging the raw materials, semi-finished products and finished products of the plush toy; and a display 350 for displaying the production condition, the work instruction and the abnormality prompt content of the plush toy to the operator according to the received data and the transmitted data of the central controller.
In particular, the plurality of sensors 310 disposed at various production links of the plush toy production process are used to detect the number, quality, position and state parameters of raw materials, semi-finished products and finished products of the plush toy. It should be understood that detecting the number, quality, location and status parameters of raw materials, semi-finished products and finished products of plush toys plays an important role in quality control, quality traceability, inventory management, location traceability and status monitoring.
In particular, the plurality of cameras 320 disposed at various production links of the plush toy production process are used to collect images of raw materials, semi-finished products and finished products of the plush toy. It should be understood that the quality control of the raw materials, semi-finished products and finished products of the plush toy can be performed by capturing images. Through image analysis and processing, appearance defects, dimensional deviations, assembly problems, and the like of the product can be detected and identified. This helps to find and correct quality problems in time, ensuring that the product meets quality standards.
In particular, the central controller 330 is communicatively connected to the plurality of sensors and the plurality of cameras, and is configured to process data from each of the sensors and the cameras and to perform qualification testing of finished plush toys. In particular, in one specific example of the present application, as shown in fig. 2 and 4, the central controller 330 includes: the detection image acquisition module 331 is used for acquiring detection images of finished plush toys to be detected, which are acquired by the camera; a reference image acquisition module 332, configured to extract a set of reference images of finished plush toys that are qualified for quality inspection from the database; a plush toy finished product surface reference state feature extraction module 333, configured to perform feature extraction on each reference image in the set of reference images by using a plush toy finished product surface state feature extractor based on a deep neural network model, so as to obtain a set of plush toy finished product surface feature reference feature vectors; the plush toy finished product surface reference state essential feature characterization module 334 is used for inputting the set of plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain plush toy finished product surface feature reference essential feature vectors; a plush toy surface detection state feature extraction module 335, configured to perform feature extraction on the detected image of the plush toy finished product to be detected by using the plush toy finished product surface state feature extractor based on the deep neural network model to obtain a plush toy surface state detection feature vector to be detected; a state difference semantic measurement module 336 for calculating a state difference semantic measurement coefficient between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected; a plush toy finished product quality detection module 337 is configured to determine whether the quality of the plush toy finished product to be detected is acceptable based on a comparison between the state difference semantic measurement coefficient and a predetermined threshold.
Specifically, the detection image acquisition module 331 and the reference image acquisition module 332 are configured to acquire a detection image of a finished plush toy to be detected, which is acquired by the camera; and extracting a set of reference images of finished plush toys which are qualified in quality inspection from the database. It should be appreciated that during the production of plush toys, inspected finished plush toy products are considered quality-meeting products, which can be used as references to represent ideal product conditions. Therefore, by extracting the set of reference images of the finished plush toy products qualified in quality inspection from the database, a reliable reference standard can be established, wherein the images of the qualified products of various types, styles and specifications are included for comparison and quality judgment of the finished plush toy products to be detected.
Specifically, the plush toy finished product surface reference state feature extraction module 333 is configured to perform feature extraction on each reference image in the set of reference images by using a plush toy finished product surface state feature extractor based on a deep neural network model to obtain a set of plush toy finished product surface feature reference feature vectors. In particular, the deep neural network model is a convolutional neural network model. That is, in the technical scheme of the application, each reference image in the set of reference images is subjected to feature mining through a plush toy finished product surface state feature extractor based on a convolutional neural network model, so that surface state reference feature information about a plush toy finished product in each reference image is extracted respectively, and thus the set of plush toy finished product surface feature reference feature vectors is obtained. In particular, the surface state feature extractor of the finished plush toy based on the convolutional neural network model is used for processing, so that the surface state feature of the finished plush toy based on quality inspection qualification can be extracted and represented, wherein the reference feature vector of the surface feature of the finished plush toy based on the convolutional neural network model can be regarded as abstract representation of the surface feature of the finished plush toy based on quality inspection qualification, and important reference feature information on the surface state of the finished plush toy is contained, which is beneficial to comparison and quality judgment of the finished plush toy to be inspected. More specifically, each layer of the plush toy finished product surface state characteristic extractor based on the convolutional neural network model is used for respectively carrying out input data in forward transfer of the layer: carrying out convolution processing on input data to obtain a convolution characteristic diagram; pooling the convolution feature images based on a feature matrix to obtain pooled feature images; performing nonlinear activation on the pooled feature map to obtain an activated feature map; the output of the last layer of the plush toy finished product surface state feature extractor based on the convolutional neural network model is a set of the plush toy finished product surface feature reference feature vectors, and the input of the first layer of the plush toy finished product surface state feature extractor based on the convolutional neural network model is each reference image in the set of the reference images.
Convolutional neural networks (Convolutional Neural Network, CNN) are a deep learning model that is specifically used to process data with a grid structure, such as images and speech. The core idea of CNN is to extract the features of the input data by convolution operation and to perform high-level representation and abstraction of the features by layer-by-layer stacking. The following are the basic components and working principles of CNN: convolution layer: the convolutional layer is the core component of the CNN for extracting features of the input data. It performs a convolution operation on the input data by applying a set of learnable convolution kernels (filters). The convolution operation may capture local patterns and features in the input data and generate a series of feature maps; activation function: after the convolutional layer, a nonlinear activation function, such as ReLU, is typically applied. The activation function introduces nonlinear features that enable the network to learn more complex patterns and representations; pooling layer: the pooling layer is used to reduce the size and number of parameters of the feature map and extract the most important features. Common pooling operations include maximum pooling and average pooling; full tie layer: after passing through a series of convolution and pooling layers, some fully connected layers are typically added. The fully connected layer converts the feature mapping of the previous layer into an output result, such as classification or regression; dropout: to prevent overfitting, dropout techniques are often used in CNNs. Dropout discards a part of neurons randomly in the training process so as to reduce the dependency relationship among the neurons and improve the generalization capability of the model. Through a back propagation algorithm, the CNN can automatically learn and extract the characteristics in the input data and optimize according to the training target. During training, the CNN adjusts the network parameters by minimizing the loss function so that the output results are as close as possible to the real labels.
Specifically, the plush toy finished product surface reference state essential feature characterization module 334 is configured to input the set of plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain plush toy finished product surface feature reference essential feature vectors. It should be appreciated that in the real-time monitoring system, the set of the surface feature reference feature vectors of the finished plush toy contains important feature information about the surface states of the finished plush toy that are qualified for quality inspection, so in order to more fully understand and judge the quality of the finished plush toy, it is necessary to further extract and represent the surface state essential features of the finished plush toy that are qualified for quality inspection, and reduce the contribution degree of the surface state reference semantic features that have a low degree of association with the essential features. It is worth mentioning that the essential characteristics can be understood as the common characteristics among the surface state characteristics of the plush toys with qualified quality, and can better reflect the inherent characteristics and qualified quality of the finished plush toys. Based on the above, in the technical scheme of the application, the set of the reference feature vectors of the surface features of the finished plush toy is further input into an essential feature extraction network to obtain the reference essential feature vectors of the surface features of the finished plush toy. Through the processing of the essential feature extraction network, the surface feature reference feature vectors of all plush toy finished products in the set of the surface feature reference feature vectors of all plush toy finished products can be subjected to weighted fusion analysis based on prototype essential contribution degree, so that all the plush toy surface state reference features which are qualified in quality inspection are mapped into a feature space with more distinguishing and expressing capabilities, and therefore, the characterization features of the plush toy surface state essence of the quality qualified products are captured. Thus, the quality of the finished plush toy can be more comprehensively evaluated, and the accuracy and reliability of quality control can be improved. More specifically, the set of the plush toy finished product surface feature reference feature vectors is processed by using the essential feature extraction network in the following essential feature extraction formula to obtain the plush toy finished product surface feature reference essential feature vectors; the essential characteristic extraction formula is as follows: ; wherein/> And/>Referencing the set of feature vectors to the surface features of the finished plush toyAnd/>Surface feature reference feature vector of finished plush toyIs the collection of the surface feature reference feature vectors of the finished plush toy product,/>Is a norm of the vector,/>The number of vectors in the set of the reference feature vectors for the surface features of the finished plush toy is-1,/>Referencing feature values of each position in the essential feature vector for the surface features of the finished plush toyIs the length of the reference essential feature vector of the surface features of the finished plush toyIs the reference essential feature vector of the surface features of the finished plush toyIs an exponential operation.
Specifically, the plush toy surface detection state feature extraction module 335 is configured to perform feature extraction on the detected image of the plush toy product to be detected by using the deep neural network model-based plush toy product surface state feature extractor to obtain a plush toy surface state detection feature vector to be detected, that is, perform feature mining on the detected image of the plush toy product to be detected by using the convolutional neural network model-based plush toy product surface state feature extractor, so as to extract surface state feature information about the plush toy to be detected in the detected image of the plush toy product to be detected, thereby obtaining the plush toy surface state detection feature vector to be detected. More specifically, each layer of the plush toy finished product surface state characteristic extractor based on the convolutional neural network model is used for respectively carrying out input data in forward transfer of the layer: carrying out convolution processing on input data to obtain a convolution characteristic diagram; pooling the convolution feature images based on a feature matrix to obtain pooled feature images; performing nonlinear activation on the pooled feature map to obtain an activated feature map; the output of the last layer of the plush toy finished product surface state feature extractor based on the convolutional neural network model is the plush toy surface state detection feature vector to be detected, and the input of the first layer of the plush toy finished product surface state feature extractor based on the convolutional neural network model is the detection image of the plush toy finished product to be detected.
Specifically, the state difference semantic measurement module 336 is configured to calculate a state difference semantic measurement coefficient between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected. That is, after obtaining the surface state essential characteristics of the reference plush toy finished product and the surface state characteristics of the plush toy finished product to be detected, in order to quantify the degree of difference between the plush toy finished product to be detected and the reference plush toy finished product, thereby more accurately judging whether the quality of the plush toy finished product to be detected is qualified. More specifically, calculating state difference semantic measurement coefficients between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected according to the following semantic measurement formula; wherein, the semantic measurement formula is: ; wherein/> And/>Characteristic values of the respective positions of the plush toy finished product surface characteristic reference essential characteristic vector and the plush toy surface state detection characteristic vector to be detected are respectively represented by/>Is the scale of the reference essential feature vector of the surface features of the finished plush toy and the detection feature vector of the surface state of the plush toy to be detected,/>Is the state difference semantic metric coefficient.
Specifically, the plush toy finished product quality detection module 337 is configured to determine whether the quality of the plush toy finished product to be detected is qualified based on a comparison between the state difference semantic measurement coefficient and a predetermined threshold value. And determining whether the quality of the plush toy finished product to be detected is qualified or not based on the comparison between the state difference semantic measurement coefficient and a preset threshold value. That is, by comparing the state difference semantic measurement coefficient with a predetermined threshold value, it is possible to determine whether the quality of the finished plush toy to be detected meets the requirements. If the calculated measurement coefficient is smaller than the threshold value, the difference between the finished plush toy to be detected and the reference finished plush toy is smaller, and the quality is qualified; otherwise, if the measurement coefficient is larger than the threshold value, the difference between the finished plush toy to be detected and the reference finished plush toy is larger, and the quality is judged to be unqualified. Therefore, the automatic detection of the quality of the finished plush toy can be realized, thereby improving the automation degree and quality control level of the plush toy production process, reducing the number of artificial errors and defective products and improving the production efficiency and the product quality.
In particular, the actuators 340 provided at the respective production links of the plush toy production process are used for receiving the command signals of the central controller and carrying, sorting, assembling, inspecting and packaging the raw materials, semi-finished products and finished products of the plush toy. Thus, the automatic detection of the quality of the finished plush toy can be realized. It should be appreciated that by receiving the command signals from the central controller, the individual operations may be automated for handling, sorting, assembling, inspection, packaging, etc. Therefore, manual operation can be reduced, production efficiency is improved, and human error is reduced.
In particular, the display 350 is configured to display the production situation, the work instruction, and the abnormality indication content of the plush toy to the operator according to the reception data and the transmission data of the central controller. It should be understood that the production data and the status information are fed back to the operator in real time through the display, so that the operator can know the production efficiency, the quality control condition and the like in time, and accordingly adjust and make decisions according to the feedback information.
It will be appreciated that training of the convolutional neural network model-based plush toy finished surface state feature extractor and the essential feature extraction network is required prior to the inference using the neural network model described above. That is, the real-time monitoring system 300 for plush toy production process according to the present application further comprises a training stage 400 for training the surface state feature extractor of the plush toy finished product based on the convolutional neural network model and the essential feature extraction network.
Fig. 3 is a block diagram of a training phase of a real-time monitoring system for a plush toy manufacturing process according to an embodiment of the present application. As shown in fig. 3, a real-time monitoring system 300 for a plush toy production process according to an embodiment of the present application includes: training phase 400, comprising: a training data acquisition unit 410, configured to acquire training data, where the training data includes training detection images of finished plush toys to be detected, which are acquired by the camera; extracting a set of training reference images of the finished plush toy products which are qualified in quality inspection from a database; a training reference state feature extraction unit 420, configured to perform feature extraction on each training reference image in the set of training reference images by using a plush toy finished product surface state feature extractor based on a convolutional neural network model, so as to obtain a set of training plush toy finished product surface feature reference feature vectors; a training reference state essential feature characterization unit 430, configured to input the set of training plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain training plush toy finished product surface feature reference essential feature vectors; an optimizing unit 440, configured to optimize the reference essential feature vector of the surface feature of the finished training plush toy to obtain an optimized reference essential feature vector of the surface feature of the finished training plush toy; a training detection state feature extraction unit 450, configured to perform feature extraction on the training detection image of the finished plush toy to be detected by using the surface state feature extractor of the finished plush toy based on the convolutional neural network model, so as to obtain a training detection feature vector of the surface state of the plush toy to be detected; the training state difference semantic measurement unit 460 is configured to calculate a training state difference semantic measurement coefficient between the optimized training plush toy finished product surface feature reference essential feature vector and the training plush toy surface state detection feature vector to be detected; a loss calculation unit 470, configured to train a difference loss function between the state difference semantic measurement coefficient and the real state difference semantic measurement coefficient; and a training unit 480, configured to train the surface state feature extractor and the essential feature extraction network of the plush toy based on the convolutional neural network model based on the difference loss function.
In the technical scheme of the application, each training plush toy finished product surface feature reference feature vector in the training plush toy finished product surface feature reference feature vector set expresses the image semantic feature of the reference image of the corresponding qualified plush toy finished product, so that when the training plush toy finished product surface feature reference feature vector set is input into an essential feature extraction network, the image semantic feature distribution difference of the training plush toy finished product surface feature reference feature vector caused by the source image semantic difference of each reference image is considered, and after the feature distribution expression based on the specific image semantic essential feature distribution is carried out on each training plush toy finished product surface feature reference feature vector, the problem of insufficient image semantic feature distribution aggregation exists in the training plush toy finished product surface feature reference essential feature vector, thereby influencing the expression effect of the training plush toy finished product surface feature reference essential feature vector and the training speed of a model.
Based on the above, in the model iteration process, namely, each time the difference loss function between the predicted state difference semantic measurement coefficient and the real state difference semantic measurement coefficient is reversely propagated through the training plush toy finished product surface feature reference essential feature vector, the applicant optimizes the training plush toy finished product surface feature reference essential feature vector, and the model iteration process is expressed as follows: optimizing the surface feature reference essential feature vector of the finished training plush toy by using the following optimization formula to obtain the surface feature reference essential feature vector of the finished training plush toy; wherein, the optimization formula is: ; wherein/> Is the reference essential feature vector of the surface features of the finished product of the training plush toy,/>Is the reference essential feature vector/>, of the surface features of the finished product of the training plush toySquare of 1 norm,/>Is the reference essential feature vector of the surface features of the finished product of the training plush toyInverse of square root of 2 norms,/>Is the/> -of the training plush toy finished product surface feature reference essential feature vectorCharacteristic value of individual position,/>Is the reference essential feature vector/>, of the surface features of the finished product of the training plush toyAnd/>Is a scaled hyper-parameter,/>Represents a logarithmic function value based on 2,/>Is the/> -of the optimized training plush toy finished product surface feature reference essential feature vectorCharacteristic values of the individual positions.
Here, the essential feature vector is referenced based on the surface feature of the finished plush toyIs expressed as the reference essential feature vector/>, of the surface features of the finished training plush toy productVoting clusters aggregated by feature value sets of the training plush toy, and referencing essential feature vectors/>, to the surface features of the training plush toy finished productNormalized voting with respect to an information-aware framework to map feature values attributed to the same regression class to a similar set of locally normalized coordinates by aggregating the regression representations of the direction and scale of feature distribution, thereby promoting the training plush toy finished product surface feature reference essential feature vector/>Improving the aggregation effect of the feature set of the training plush toy finished product surface feature reference essential feature vector/>Expression effects of (a) and training speed of the model. Therefore, whether the quality of the finished plush toy is qualified or not can be automatically detected and judged in the production process of the plush toy, the automation degree and the quality control level of the production process of the plush toy are improved, the number of artificial errors and defective products is reduced, and the production efficiency and the product quality are improved.
As described above, the real-time monitoring system 300 for a plush toy manufacturing process according to an embodiment of the present application may be implemented in various wireless terminals, such as a server having a real-time monitoring algorithm for a plush toy manufacturing process, etc. In one possible implementation, the real-time plush toy production process monitoring system 300 according to an embodiment of the present application may be integrated into a wireless terminal as a software module and/or a hardware module. For example, the plush toy production process real time monitoring system 300 may be a software module in the operating system of the wireless terminal or may be an application developed for the wireless terminal; of course, the plush toy production process real time monitoring system 300 could equally be one of the many hardware modules of the wireless terminal.
Alternatively, in another example, the plush toy production process real time monitoring system 300 and the wireless terminal may be separate devices, and the plush toy production process real time monitoring system 300 may be connected to the wireless terminal through a wired and/or wireless network and transmit interactive information in a agreed data format.
Further, a real-time monitoring method for the plush toy production process is also provided.
Fig. 5 is a flowchart of a real-time monitoring method of a plush toy manufacturing process according to an embodiment of the present application. As shown in fig. 5, the real-time monitoring method for the plush toy production process according to the embodiment of the application comprises the following steps: s1, acquiring a detection image of a finished plush toy to be detected, which is acquired by the camera; s2, extracting a set of reference images of the finished plush toy products which are qualified in quality inspection from a database; s3, respectively carrying out feature extraction on each reference image in the set of reference images through a plush toy finished product surface state feature extractor based on a deep neural network model to obtain a set of plush toy finished product surface feature reference feature vectors; s4, inputting the set of the plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain the plush toy finished product surface feature reference essential feature vectors; s5, extracting features of the detection image of the plush toy finished product to be detected through the plush toy finished product surface state feature extractor based on the deep neural network model to obtain a plush toy surface state detection feature vector to be detected; s6, calculating a state difference semantic measurement coefficient between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected; and S7, determining whether the quality of the plush toy finished product to be detected is qualified or not based on the comparison between the state difference semantic measurement coefficient and a preset threshold value.
In summary, the real-time monitoring method for the plush toy production process according to the embodiment of the application is explained, wherein the detection image of the plush toy finished product is acquired through a camera, and an image processing and analyzing algorithm is introduced at the rear end to analyze the detection image, so that whether the quality of the plush toy finished product is qualified or not is automatically detected and judged. Therefore, the automation degree and the quality control level of the plush toy production process can be improved, the number of artificial errors and defective products is reduced, and the production efficiency and the product quality are improved. Meanwhile, an operator can monitor the production condition in real time through the display, timely find abnormality and take corresponding measures, and further improve the efficiency and accuracy of production management.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (9)
1. A real-time monitoring system for plush toy production process, comprising:
a plurality of sensors deployed in each production link of the plush toy production process and used for detecting the quantity, quality, position and state parameters of raw materials, semi-finished products and finished products of the plush toy;
the cameras are deployed in all production links of the plush toy production process and are used for collecting images of raw materials, semi-finished products and finished products of the plush toy;
The central controller is in communication connection with the plurality of sensors and the plurality of cameras and is used for processing data from the sensors and the cameras and detecting the quality of finished plush toys;
The executor is arranged in each production link of the plush toy production process and is used for receiving the instruction signals of the central controller and carrying, sorting, assembling, checking and packaging the raw materials, semi-finished products and finished products of the plush toy;
And the display is used for displaying the production condition, the work instruction and the abnormal prompt content of the plush toy to an operator according to the received data and the sent data of the central controller.
2. The real-time plush toy manufacturing process monitoring system of claim 1, wherein the central controller comprises:
The detection image acquisition module is used for acquiring detection images of finished plush toys to be detected, which are acquired by the camera;
the reference image acquisition module is used for extracting a set of reference images of the finished plush toy products which are qualified in quality inspection from the database;
the plush toy finished product surface reference state feature extraction module is used for respectively carrying out feature extraction on each reference image in the set of reference images through a plush toy finished product surface state feature extractor based on a deep neural network model so as to obtain a set of plush toy finished product surface feature reference feature vectors;
the plush toy finished product surface reference state essential feature representation module is used for inputting the collection of the plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain the plush toy finished product surface feature reference essential feature vectors;
the plush toy surface detection state feature extraction module is used for carrying out feature extraction on the detection image of the plush toy finished product to be detected through the plush toy finished product surface state feature extractor based on the deep neural network model so as to obtain a plush toy surface state detection feature vector to be detected;
The state difference semantic measurement module is used for calculating state difference semantic measurement coefficients between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected;
the plush toy finished product quality detection module is used for determining whether the quality of the plush toy finished product to be detected is qualified or not based on the comparison between the state difference semantic measurement coefficient and a preset threshold value.
3. The real-time monitoring system of plush toy production process according to claim 2, wherein the deep neural network model is a convolutional neural network model.
4. The real-time plush toy production process monitoring system according to claim 3, wherein the plush toy finished product surface reference state essential characteristic characterization module is used for: processing the set of finished plush toy surface feature reference feature vectors with the following essential feature extraction formula by using the essential feature extraction network to obtain the finished plush toy surface feature reference essential feature vectors;
the essential characteristic extraction formula is as follows: ; wherein/> And/>Referencing the set of feature vectors to the surface features of the finished plush toyAnd/>Surface feature reference feature vector of finished plush toyIs the collection of the surface feature reference feature vectors of the finished plush toy product,/>Is a norm of the vector which is the one,The number of vectors in the set of the reference feature vectors for the surface features of the finished plush toy is-1,/>Referencing feature values of each position in the essential feature vector for the surface features of the finished plush toyIs the length of the reference essential feature vector of the surface features of the finished plush toyIs the reference essential feature vector of the surface features of the finished plush toy,Is an exponential operation.
5. The real-time plush toy manufacturing process monitoring system of claim 4, wherein the state difference semantic measurement module is configured to: calculating state difference semantic measurement coefficients between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected according to the following semantic measurement formula;
wherein, the semantic measurement formula is: ; wherein/> And/>Characteristic values of the respective positions of the plush toy finished product surface characteristic reference essential characteristic vector and the plush toy surface state detection characteristic vector to be detected are respectively represented by/>Is the scale of the reference essential feature vector of the surface features of the finished plush toy and the detection feature vector of the surface state of the plush toy to be detected,/>Is the state difference semantic metric coefficient.
6. The real-time plush toy production process monitoring system of claim 5, wherein the plush toy finished product quality detection module is configured to: and determining whether the quality of the plush toy finished product to be detected is qualified or not based on the comparison between the state difference semantic measurement coefficient and a preset threshold value.
7. The real-time plush toy production process monitoring system of claim 6, further comprising a training module for training the convolutional neural network model-based plush toy finished product surface state feature extractor and the intrinsic feature extraction network.
8. The plush toy production process real time monitoring system of claim 7, wherein the training module comprises:
The training data acquisition unit is used for acquiring training data, and the training data comprises training detection images of finished plush toys to be detected, which are acquired by the camera; extracting a set of training reference images of the finished plush toy products which are qualified in quality inspection from a database;
The training reference state feature extraction unit is used for respectively carrying out feature extraction on each training reference image in the training reference image set through a plush toy finished product surface state feature extractor based on a convolutional neural network model so as to obtain a training plush toy finished product surface feature reference feature vector set;
The training reference state essential feature representation unit is used for inputting the set of the surface feature reference feature vectors of the finished training plush toy into an essential feature extraction network to obtain the surface feature reference essential feature vectors of the finished training plush toy;
The optimizing unit is used for optimizing the surface feature reference essential feature vector of the finished training plush toy to obtain an optimized surface feature reference essential feature vector of the finished training plush toy;
the training detection state feature extraction unit is used for extracting features of the training detection image of the plush toy finished product to be detected through the plush toy finished product surface state feature extractor based on the convolutional neural network model so as to obtain training plush toy surface state detection feature vectors to be detected;
The training state difference semantic measurement unit is used for calculating a training state difference semantic measurement coefficient between the surface feature reference essential feature vector of the finished product of the optimized training plush toy and the surface state detection feature vector of the plush toy to be detected;
the loss calculation unit is used for training a difference loss function between the state difference semantic measurement coefficient and the real state difference semantic measurement coefficient;
And the training unit is used for training the plush toy finished product surface state feature extractor based on the convolutional neural network model and the essential feature extraction network based on the difference loss function.
9. A real-time monitoring method for plush toy production process is characterized by comprising the following steps:
Acquiring a detection image of the finished plush toy to be detected, which is acquired by the camera;
extracting a set of reference images of finished plush toys qualified in quality inspection from a database;
respectively carrying out feature extraction on each reference image in the set of reference images by a plush toy finished product surface state feature extractor based on a deep neural network model to obtain a set of plush toy finished product surface feature reference feature vectors;
Inputting the collection of the plush toy finished product surface feature reference feature vectors into an essential feature extraction network to obtain plush toy finished product surface feature reference essential feature vectors;
Extracting features of the detection image of the plush toy finished product to be detected by the plush toy finished product surface state feature extractor based on the deep neural network model to obtain a plush toy surface state detection feature vector to be detected;
calculating a state difference semantic measurement coefficient between the plush toy finished product surface feature reference essential feature vector and the plush toy surface state detection feature vector to be detected;
And determining whether the quality of the plush toy finished product to be detected is qualified or not based on the comparison between the state difference semantic measurement coefficient and a preset threshold value.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410214748.5A CN118052793A (en) | 2024-02-27 | 2024-02-27 | Real-time monitoring system and method for plush toy production process |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410214748.5A CN118052793A (en) | 2024-02-27 | 2024-02-27 | Real-time monitoring system and method for plush toy production process |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118052793A true CN118052793A (en) | 2024-05-17 |
Family
ID=91051587
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410214748.5A Pending CN118052793A (en) | 2024-02-27 | 2024-02-27 | Real-time monitoring system and method for plush toy production process |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118052793A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118212667A (en) * | 2024-05-21 | 2024-06-18 | 杭州名光微电子科技有限公司 | Dynamic identification method and system for palm vein of living body |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016056034A1 (en) * | 2014-10-06 | 2016-04-14 | Bezzi Giorgio | Automatic control system and method for clothing items |
CN109978868A (en) * | 2019-03-29 | 2019-07-05 | 北京百度网讯科技有限公司 | Toy appearance quality determining method and its relevant device |
CN115392649A (en) * | 2022-08-04 | 2022-11-25 | 温州职业技术学院 | Leather shoe production system |
CN116563795A (en) * | 2023-05-30 | 2023-08-08 | 北京天翊文化传媒有限公司 | Doll production management method and doll production management system |
CN117152152A (en) * | 2023-10-31 | 2023-12-01 | 吉林瑞特生物科技有限公司 | Production management system and method for detection kit |
CN117576014A (en) * | 2023-11-10 | 2024-02-20 | 山东盈和电子科技有限公司 | Ceramic substrate quality detection method, system, electronic equipment and storage medium |
-
2024
- 2024-02-27 CN CN202410214748.5A patent/CN118052793A/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016056034A1 (en) * | 2014-10-06 | 2016-04-14 | Bezzi Giorgio | Automatic control system and method for clothing items |
CN109978868A (en) * | 2019-03-29 | 2019-07-05 | 北京百度网讯科技有限公司 | Toy appearance quality determining method and its relevant device |
CN115392649A (en) * | 2022-08-04 | 2022-11-25 | 温州职业技术学院 | Leather shoe production system |
CN116563795A (en) * | 2023-05-30 | 2023-08-08 | 北京天翊文化传媒有限公司 | Doll production management method and doll production management system |
CN117152152A (en) * | 2023-10-31 | 2023-12-01 | 吉林瑞特生物科技有限公司 | Production management system and method for detection kit |
CN117576014A (en) * | 2023-11-10 | 2024-02-20 | 山东盈和电子科技有限公司 | Ceramic substrate quality detection method, system, electronic equipment and storage medium |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118212667A (en) * | 2024-05-21 | 2024-06-18 | 杭州名光微电子科技有限公司 | Dynamic identification method and system for palm vein of living body |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10835930B2 (en) | Sorting system | |
CN109060817B (en) | Artificial intelligence reinspection system and method thereof | |
WO2017084186A1 (en) | System and method for automatic monitoring and intelligent analysis of flexible circuit board manufacturing process | |
CN111461555A (en) | Production line quality monitoring method, device and system | |
US20200402221A1 (en) | Inspection system, image discrimination system, discrimination system, discriminator generation system, and learning data generation device | |
CN107389689A (en) | A kind of AOI management systems and management method for defects detection | |
CN109840900A (en) | A kind of line detection system for failure and detection method applied to intelligence manufacture workshop | |
CN110895716B (en) | Inspection device and machine learning method | |
CN118052793A (en) | Real-time monitoring system and method for plush toy production process | |
CN116972913B (en) | On-line monitoring method and system for running state of cold chain equipment | |
CN114556383A (en) | Model generation device, estimation device, model generation method, and model generation program | |
JP6823025B2 (en) | Inspection equipment and machine learning method | |
KR20220144539A (en) | Master state generation method based on graph neural network for detecting error in real time | |
KR20190127029A (en) | Method and apparatus for managing 3d printing using g-code | |
CN117314829A (en) | Industrial part quality inspection method and system based on computer vision | |
JP7345006B1 (en) | Learning model generation method and testing device | |
Schmidt et al. | An automated optical inspection system for PIP solder joint classification using convolutional neural networks | |
CN118732629B (en) | Intelligent production process recommendation method and system for information cloud sharing of industrial Internet of things | |
CN117078620B (en) | PCB welding spot defect detection method and device, electronic equipment and storage medium | |
CN117969553B (en) | On-line visual detection system for appearance of TPV knitted composite pipe | |
CN117935174B (en) | Intelligent management system and method for vacuum bag film production line | |
CN118502329B (en) | Control management system and method for application bonding silver wire equipment based on Internet of things | |
CN117798654B (en) | Intelligent adjusting system for center of steam turbine shafting | |
CN117828499B (en) | PCBA abnormal part determination method, system, storage medium and electronic equipment | |
Indasyah et al. | Automated Visual Inspection System of Gear Surface Defects Detection Using Faster RCNN |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |