CN118015312B - Image processing method, device and equipment - Google Patents
Image processing method, device and equipment
- Publication number
- CN118015312B (Application CN202410172433.9A)
- Authority
- CN
- China
- Prior art keywords
- feature
- local
- determining
- image
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
- G06V10/763—Non-hierarchical techniques, e.g. based on statistics of modelling distributions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- Medical Informatics (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Human Computer Interaction (AREA)
- Probability & Statistics with Applications (AREA)
- Image Analysis (AREA)
Abstract
The application provides an image processing method, device and equipment. The method comprises the following steps: determining a plurality of local features in a target image, where the confidence of each local feature is greater than or equal to a preset threshold, and determining a matching order of the plurality of local features; according to the matching order, sequentially determining, from a plurality of preset cluster feature sets, the cluster feature set corresponding to each local feature, where each cluster feature set comprises a plurality of cluster features; determining a target subset of the image set according to the plurality of local features, the matching order and the cluster feature set corresponding to each local feature; and matching the target image against the images in the target subset to obtain an image matching result. By hierarchically matching the plurality of local features of the target image, the method narrows the image set to a target subset, so that the target image is compared with fewer images during matching, which shortens the matching time and improves image matching efficiency.
Description
Technical Field
The present application relates to the field of big data technologies, and in particular, to an image processing method, apparatus, and device.
Background
At present, face recognition technology is widely used in people's daily life. For example, at a railway station, passengers can verify their ticket information through face recognition, which shortens the waiting time.
In the related art, after an image acquisition device captures the face image of a target user, every image stored in a database needs to be matched against the face image in turn. If the database contains an image that matches the face image, it is determined that face recognition passes; if no image in the database matches the face image, it is determined that face recognition fails.
However, when the number of images stored in the database is large, it takes a long time to determine the matching result, resulting in low efficiency of image matching.
Disclosure of Invention
The application provides an image processing method, device and equipment, which are used to solve the problem in the related art that, when a large number of images are stored in a database, a long time is required to determine the matching result, resulting in low image matching efficiency.
In a first aspect, the present application provides an image processing method, including:
Determining a plurality of local features in a target image, and determining the matching sequence of the plurality of local features, wherein the confidence of the local features is larger than or equal to a preset threshold;
according to the matching sequence, determining a cluster feature set corresponding to each local feature in a plurality of preset cluster feature sets in sequence, wherein the cluster feature set comprises a plurality of cluster features;
determining a target subset in the image set according to the local features, the matching sequence and the clustering feature set corresponding to each local feature;
And carrying out matching processing on the target image and the images in the target subset to obtain an image matching result.
In one possible implementation manner, according to the matching sequence, in a preset plurality of cluster feature sets, determining a cluster feature set corresponding to each local feature in turn includes:
Determining, for a first local feature of the plurality of local features, the cluster feature set corresponding to the first local feature according to the matching level corresponding to the first local feature, wherein the first local feature is the local feature ranked first in the matching order among the plurality of local features;
and for any second local feature other than the first local feature among the plurality of local features, determining the previous local feature of the second local feature, and determining the cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in its corresponding cluster feature set.
In a possible implementation manner, determining the cluster feature set corresponding to the first local feature according to the matching level corresponding to the first local feature includes:
determining a local part corresponding to the first local feature;
determining a matching level corresponding to the first local feature according to the local part;
And determining the cluster feature set corresponding to the matching level corresponding to the first local feature as the cluster feature set corresponding to the first local feature.
In one possible implementation manner, determining the cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in its corresponding cluster feature set includes:
determining at least one first cluster feature in the cluster feature set corresponding to the previous local feature according to the matching degree between the previous local feature and each cluster feature in that cluster feature set;
and determining that the cluster feature set corresponding to the second local feature includes the next-level cluster feature set corresponding to each first cluster feature.
In one possible implementation manner, determining a target subset in the image set according to the plurality of local features, the matching order, and the cluster feature set corresponding to each local feature includes:
Determining a third local feature from the plurality of local features according to the matching sequence, wherein the third local feature is the last local feature in the plurality of local features;
Obtaining the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
Determining at least one second cluster feature in the cluster feature set corresponding to the third local feature according to the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
and determining the images in the image set corresponding to the second cluster features as the target subset.
In one possible implementation, determining a plurality of local features in a target image includes:
determining a plurality of initial partial images corresponding to a plurality of preset parts and the confidence degree of each initial partial image in the target image, wherein the confidence degree indicates at least one of the following: the integrity, shielding degree and definition of the initial partial image;
Determining an initial partial image with the confidence coefficient being greater than or equal to a preset threshold value as a target partial image;
and determining the image features corresponding to the target local image as the local features.
In one possible implementation, determining the matching order of the plurality of local features includes:
determining a plurality of target parts corresponding to the plurality of local features;
determining a preset order corresponding to a plurality of preset parts, wherein the plurality of preset parts include the plurality of target parts;
and determining the matching order according to the preset order and the plurality of target parts.
In a second aspect, the present application provides an image processing apparatus comprising:
the determining module is used for determining a plurality of local features in the target image and determining the matching sequence of the local features, and the confidence coefficient of the local features is larger than or equal to a preset threshold value;
the determining module is further configured to sequentially determine a cluster feature set corresponding to each local feature from a plurality of preset cluster feature sets according to the matching sequence, where the cluster feature set includes a plurality of cluster features;
the determining module is further configured to determine a target subset in the image set according to the plurality of local features, the matching sequence, and the cluster feature set corresponding to each local feature;
And the matching processing module is used for carrying out matching processing on the target image and the images in the target subset to obtain an image matching result.
In one possible implementation manner, the determining module is specifically configured to:
Determining, for a first local feature of the plurality of local features, the cluster feature set corresponding to the first local feature according to the matching level corresponding to the first local feature, wherein the first local feature is the local feature ranked first in the matching order among the plurality of local features;
and for any second local feature other than the first local feature among the plurality of local features, determining the previous local feature of the second local feature, and determining the cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in its corresponding cluster feature set.
In a possible implementation manner, the determining module is specifically further configured to:
determining a local part corresponding to the first local feature;
determining a matching level corresponding to the first local feature according to the local part;
And determining the cluster feature set corresponding to the matching level corresponding to the first local feature as the cluster feature set corresponding to the first local feature.
In a possible implementation manner, the determining module is specifically further configured to:
Determining at least one first cluster feature in the cluster feature set corresponding to the previous local feature according to the matching degree between the previous local feature and each cluster feature in that cluster feature set;
and determining that the cluster feature set corresponding to the second local feature includes the next-level cluster feature set corresponding to each first cluster feature.
In a possible implementation manner, the determining module is specifically further configured to:
Determining a third local feature from the plurality of local features according to the matching sequence, wherein the third local feature is the last local feature in the plurality of local features;
Obtaining the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
Determining at least one second cluster feature in the cluster feature set corresponding to the third local feature according to the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
and determining the images in the image set corresponding to the second cluster features as the target subset.
In a possible implementation manner, the determining module is specifically further configured to:
determining a plurality of initial partial images corresponding to a plurality of preset parts and the confidence degree of each initial partial image in the target image, wherein the confidence degree indicates at least one of the following: the integrity, shielding degree and definition of the initial partial image;
Determining an initial partial image with the confidence coefficient being greater than or equal to a preset threshold value as a target partial image;
and determining the image features corresponding to the target local image as the local features.
In a possible implementation manner, the determining module is specifically further configured to:
determining a plurality of target parts corresponding to the plurality of local features;
determining a preset order corresponding to a plurality of preset parts, wherein the plurality of preset parts include the plurality of target parts;
and determining the matching order according to the preset order and the plurality of target parts.
In a third aspect, the present application provides an image processing apparatus comprising: a processor, and a memory communicatively coupled to the processor;
The memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of the first aspects.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions for performing the method of any of the first aspects when executed by a processor.
In a fifth aspect, the application provides a computer program product comprising a computer program which, when executed by a processor, implements the method according to any of the first aspects.
The application provides an image processing method, device and equipment, which can determine a plurality of local features in a target image and determine the matching order of the plurality of local features; sequentially determine, according to the matching order, the cluster feature set corresponding to each local feature from a plurality of preset cluster feature sets; determine a target subset of the image set according to the plurality of local features, the matching order and the cluster feature set corresponding to each local feature; and match the target image against the images in the target subset to obtain an image matching result. In this method, after the plurality of local features of the target image are acquired, they are hierarchically matched in the matching order to determine a target subset within the image set, so that the target image is compared with fewer images during image matching, which reduces the matching time and improves the image matching efficiency for the target image.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram of an application scenario provided by the present application;
FIG. 2 is a schematic flow chart of an image processing method according to the present application;
FIG. 3 is a flowchart of another image processing method according to the present application;
FIG. 4 is a schematic diagram of an image processing apparatus according to the present application;
fig. 5 is a schematic diagram of a hardware structure of an image processing apparatus provided by the present application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
Next, referring to fig. 1, an application scenario provided by the present application will be described.
Fig. 1 is a schematic diagram of an application scenario provided by the present application. Referring to fig. 1, the scenario includes an image processing device. The image processing device may acquire a target image, determine a plurality of local features in the target image, determine the matching order of the plurality of local features and the cluster feature set corresponding to each local feature, then determine a target subset in the image set according to the plurality of local features, the matching order and the cluster feature set corresponding to each local feature, and match the target image against the images in the target subset to obtain an image matching result.
In the related art, after an image acquisition device captures the face image of a target user, every image stored in a database needs to be matched against the face image in turn. If the database contains an image that matches the face image, it is determined that face recognition passes; if no image in the database matches the face image, it is determined that face recognition fails. However, when the number of images stored in the database is large, it takes a long time to determine the matching result, resulting in low image matching efficiency.
The application provides an image processing method to solve the above technical problem in the related art. The image processing method can determine a plurality of local features in the target image and determine the matching order of the plurality of local features; sequentially determine, according to the matching order, the cluster feature set corresponding to each local feature from a plurality of preset cluster feature sets; determine a target subset of the image set according to the plurality of local features, the matching order and the cluster feature set corresponding to each local feature; and match the target image against the images in the target subset to obtain an image matching result. In this method, after the plurality of local features of the target image are acquired, they are hierarchically matched in the matching order to determine a target subset within the image set, so that the target image is compared with fewer images during image matching, which reduces the matching time and improves the image matching efficiency for the target image.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a schematic flow chart of an image processing method provided by the application. Referring to fig. 2, the method may include:
The image processing method may be executed by an image processing device, or by an image processing apparatus provided in the image processing device. The image processing apparatus may be implemented by software, or by a combination of software and hardware.
S201, determining a plurality of local features in the target image and determining a matching sequence of the plurality of local features.
The confidence level of the local feature is greater than or equal to a preset threshold.
The target image may be a face image or an image of a target object or a target scene. The target image may be an image acquired by the image acquisition device or an image uploaded by the client. The format of the target image may be a joint photographic experts group (Joint Photographic Experts Group, JPEG) format or a portable network graphics (Portable Network Graphics, PNG) or other image format.
If the target image is a face image, the local features may be face scale features, eye features, nose features, mouth features, ear features, and the like. If the target image is an image of a target object or target scene, the local features may be color features, pattern features, texture features, and the like.
It will be appreciated that in actual use, a different number of local features may be selected for subsequent matching depending on the use scenario. For example, if the accuracy requirement for the matching process of the target image is strict, a larger number of local features (for example, the features of the full face, including the face scale features, the eye features, the nose features, the mouth features and the ear features) may be selected for the matching process, so as to improve the accuracy of the matching process of the target image; if the accuracy requirement for the matching processing of the target image is relatively loose, a smaller number of face features (e.g., typical local features such as eye features or mouth features) can be selected for the matching processing, so as to reduce the calculation amount of the matching processing and improve the matching processing efficiency.
S202, determining a cluster feature set corresponding to each local feature in a plurality of preset cluster feature sets according to the matching sequence.
The cluster feature set includes a plurality of cluster features.
It will be appreciated that the user may store in advance the preset plurality of sets of cluster features in the database. The preset plurality of cluster feature sets may be stored in a database in a hierarchical manner according to a matching level in the form of a table. For example, the preset plurality of cluster feature sets may be hierarchically stored as shown in table 1:
TABLE 1
Assuming that 3 local features are face proportion features, eye features and nose features respectively, please refer to table 1, a face proportion clustering feature set corresponding to the face proportion features comprises 10 face proportion clustering features (face proportion clustering features 1-10); each face proportional-clustering feature can correspond to 5 eye-clustering features (eye-clustering feature 1-eye-clustering feature 5), and the cluster feature set corresponding to the eye features can comprise eye-clustering features corresponding to the face proportional-clustering features; each eye cluster feature may correspond to 2 nose cluster features (nose cluster feature 1 and nose cluster feature 2), and the set of cluster features corresponding to the nose features may include nose cluster features corresponding to the eye cluster features.
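For illustration only, the hierarchical organization described by Table 1 could be represented as a nested structure such as the following Python sketch; the names (face_ratio_*, eye_*, nose_*), the 128-dimensional random placeholder vectors and the image_ids field are assumptions introduced for this example and are not prescribed by the application.
```python
# Illustrative sketch (not from the patent) of a hierarchical cluster feature store:
# level 1 holds 10 face-proportion cluster features, each owning 5 level-2 eye
# cluster features, each of which owns 2 level-3 nose cluster features.
import numpy as np

def random_feature(dim=128):
    # placeholder feature vector; a real system would use extracted descriptors
    return np.random.rand(dim)

cluster_store = {
    f"face_ratio_{i}": {
        "feature": random_feature(),
        "children": {
            f"eye_{j}": {
                "feature": random_feature(),
                "children": {
                    f"nose_{k}": {"feature": random_feature(), "image_ids": []}
                    for k in range(1, 3)      # 2 nose cluster features per eye cluster
                },
            }
            for j in range(1, 6)              # 5 eye cluster features per face cluster
        },
    }
    for i in range(1, 11)                     # 10 face-proportion cluster features
}
```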
S203, determining a target subset in the image set according to the local features, the matching sequence and the clustering feature set corresponding to each local feature.
Optionally, the clustering feature and the image set corresponding to the clustering feature may be correspondingly stored in the database, for example, the correspondence between the clustering feature and the image set may be as shown in table 2:
TABLE 2
For example, assume that there are 3 local features, namely a face scale feature, an eye feature and a nose feature, and the matching sequence of the 3 local features is as follows: face scale features > eye features > nose features. Matching the face proportional feature with 10 face proportional clustering features in the face proportional clustering feature set, and determining a face proportional clustering feature 1 and a face proportional clustering feature 2 which have the highest matching degree with the face proportional feature; matching 5 eye clustering features corresponding to the eye features and the face proportional clustering feature 1, and matching 5 eye clustering features corresponding to the eye features and the face proportional clustering feature 2 to determine an eye clustering feature 5 corresponding to the face proportional clustering feature 1 with the highest matching degree of the eye features; and carrying out matching processing on the nose features and 2 nose cluster features corresponding to the eye cluster features 5, determining nose cluster features 1 corresponding to the eye cluster features 5 with the highest matching degree of the nose features, and determining a set of image compositions corresponding to the nose cluster features 1 corresponding to the eye cluster features 5 as a target subset.
According to table 2, the image set corresponding to the face proportional cluster feature 1-eye cluster feature 5-nose cluster feature 1 includes face images 8001 to 9000, and the target subset includes face images 8001 to 9000.
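As a non-authoritative sketch of how the hierarchical narrowing of S201-S203 might be implemented, the function below walks the nested cluster_store from the previous sketch level by level. Cosine similarity stands in for the matching-degree calculation (the application does not fix a particular matching-degree algorithm), and the 0.8 threshold and the keep-at-least-the-best fallback are illustrative assumptions.
```python
# Sketch of the hierarchical narrowing in S201-S203 over the nested cluster_store
# layout sketched above. Cosine similarity is an illustrative matching degree.
import numpy as np

def matching_degree(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_target_subset(local_features, cluster_store, threshold=0.8):
    """local_features: list of feature vectors ordered by the matching order."""
    level_nodes = cluster_store                      # level-1 cluster features
    for depth, feature in enumerate(local_features):
        scored = {name: matching_degree(feature, node["feature"])
                  for name, node in level_nodes.items()}
        kept = [name for name, score in scored.items() if score >= threshold] \
               or [max(scored, key=scored.get)]      # keep at least the best match
        if depth == len(local_features) - 1:         # last local feature: collect images
            return [img for name in kept for img in level_nodes[name]["image_ids"]]
        # merge the next-level cluster feature sets of every kept cluster feature
        level_nodes = {f"{name}/{child}": child_node
                       for name in kept
                       for child, child_node in level_nodes[name]["children"].items()}
    return []
```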
S204, matching the target image with the images in the target subset to obtain an image matching result.
The image matching result may be matching success or matching failure.
Assuming there are M images in the target subset, the target image is matched against each of the M images in turn; if one of the M images matches the target image, the image matching result is determined to be a matching success; if none of the M images matches the target image, the image matching result is determined to be a matching failure, where M is a positive integer.
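A minimal sketch of S204, assuming one stored feature vector per image in the target subset, a cosine-similarity matching degree and a 0.9 decision threshold; all three are assumptions for illustration.
```python
import numpy as np

def match_against_subset(target_feature, subset_features, threshold=0.9):
    """subset_features: {image_id: feature_vector} for the M images in the target subset."""
    for image_id, feature in subset_features.items():
        degree = float(np.dot(target_feature, feature) /
                       (np.linalg.norm(target_feature) * np.linalg.norm(feature)))
        if degree >= threshold:                      # first hit ends the scan
            return {"result": "match_success", "image_id": image_id}
    return {"result": "match_failure", "image_id": None}
```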
The image processing method provided by the application can determine a plurality of local features in the target image and the matching order of the plurality of local features, hierarchically match the plurality of local features in that order to determine a target subset within the image set, and then match the target image against the images in the target subset to obtain a matching result. In this way, the target image is compared with fewer images during image matching, which reduces the matching time and improves the image matching efficiency for the target image.
Fig. 3 is a flow chart of another image processing method provided by the application. Referring to fig. 3, the method may include:
The image processing method may be executed by an image processing device, or by an image processing apparatus provided in the image processing device. The image processing apparatus may be implemented by software, or by a combination of software and hardware.
S301, determining a plurality of initial partial images corresponding to a plurality of preset parts and the confidence coefficient of each initial partial image in the target image.
If the target image is a face image, the preset portion may be a face portion, an eye portion, a nose portion, a mouth portion, or the like. After the target image is acquired, clipping processing or extraction processing may be performed on the target image, so as to determine a plurality of initial partial images corresponding to a plurality of preset parts in the target image.
For example, if the preset part is the face part, the face contour can be extracted from the face image, the positions of the facial features within the face can be determined, and finer details can be blurred, so as to obtain an initial partial image corresponding to the face part. For example, the eyes, nose and mouth may appear blurred in the extracted face contour.
For example, the preset part is an eye part, and the face image can be cut to obtain an initial partial image corresponding to the eye part.
In the target image, a confidence level for each initial partial image may also be determined. The confidence may indicate at least one of: integrity, occlusion degree and definition of the initial partial image. It will be appreciated that the higher the integrity, the less occlusion, and the higher the sharpness of the initial partial image, the higher the confidence of the initial partial image.
For example, if the face is occluded by a mask or a hat, the extracted initial partial image corresponding to the face part is incomplete, and the confidence of this initial partial image is low.
S302, determining the initial partial image with the confidence coefficient larger than or equal to a preset threshold value as a target partial image.
For example, assume that 3 initial partial images and the confidence of each initial partial image are obtained by processing the target image in step S301, where the confidences of initial partial image 1, initial partial image 2 and initial partial image 3 are 96%, 91% and 60% respectively, and the preset threshold is 90%; the target partial images are then initial partial image 1 and initial partial image 2.
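A minimal sketch of the confidence filtering in S301-S302; the PartialImage fields and the way confidence is represented are assumptions for illustration, not a data layout defined by the application.
```python
from dataclasses import dataclass
from typing import Any

@dataclass
class PartialImage:
    part: str          # e.g. "face", "eye", "nose"
    pixels: Any        # cropped or extracted image data
    confidence: float  # reflects completeness, occlusion and sharpness

def select_target_partial_images(initial_partials, threshold=0.90):
    # S302: keep only the partial images whose confidence reaches the preset threshold
    return [p for p in initial_partials if p.confidence >= threshold]

# With confidences 0.96, 0.91 and 0.60 and a 0.90 threshold, only the first two survive.
```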
S303, determining the image features corresponding to the target partial image as a plurality of partial features.
The number of the target partial images can be multiple, and feature extraction processing can be performed on each target partial image in the multiple target partial images to obtain the image feature corresponding to each target partial image. The image feature corresponding to each target partial image may be determined as 1 partial feature.
For example, there are 3 target partial images, and 3 partial features can be obtained.
By performing the steps S301 to S303 as shown above, a plurality of local features can be determined in the target image, and next, the matching order of the plurality of local features can be determined by performing the steps S304 to S306 as shown below.
S304, determining a plurality of target parts corresponding to the plurality of local features.
For example, if the local feature is an image feature of an eye portion, the target portion is determined to be the eye portion.
S305, determining a preset sequence corresponding to the preset parts.
The plurality of preset parts include the plurality of target parts.
Alternatively, the preset order of the plurality of preset parts may be determined according to the image processing requirements of the user. For example, when the plurality of preset parts are the face part, the eye part, the nose part and the mouth part, the preset order may be: face part > eye part > nose part > mouth part.
S306, determining the matching order according to the preset order and the plurality of target parts.
For example, if the preset order is face part > eye part > nose part > mouth part, and the target parts are the face part and the eye part, the matching order is: face part > eye part.
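A small sketch of S304-S306, assuming the preset order from the example above; the part names are illustrative.
```python
# Illustrative preset order mirroring the example above.
PRESET_ORDER = ["face", "eye", "nose", "mouth"]

def matching_order(target_parts):
    """target_parts: the parts for which a usable local feature was extracted."""
    rank = {part: i for i, part in enumerate(PRESET_ORDER)}
    return sorted(target_parts, key=lambda part: rank.get(part, len(PRESET_ORDER)))

# matching_order(["eye", "face"]) returns ["face", "eye"], matching the example.
```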
S307, for a first local feature of the plurality of local features, determining the cluster feature set corresponding to the first local feature according to the matching level corresponding to the first local feature.
The first local feature may be the local feature ranked first in the matching order among the plurality of local features. For example, assume that there are 3 local features and their matching order is: face proportion feature > eye feature > nose feature; the first local feature is then the face proportion feature.
In one implementation, for a first local feature of the plurality of local features, a matching level corresponding to the first local feature may be determined by executing steps S3071-S3073 shown below, and a cluster feature set corresponding to the first local feature may be determined.
S3071, determining a local part corresponding to the first local feature.
For example, if the first local feature is the face proportion feature, the local part is determined to be the face part.
S3072, determining a matching level corresponding to the first local feature according to the local part.
The corresponding relation between the preset part and the preset matching level can be obtained, and the matching level corresponding to the first local feature is determined according to the local part corresponding to the first local feature and the corresponding relation.
For example, the correspondence between preset parts and preset matching levels may be as shown in Table 3:
Table 3
For example, if the first local feature is the face proportion feature, the local part corresponding to the face proportion feature is the face part, and according to the correspondence in Table 3, the matching level corresponding to the face part is level one.
S3073, determining a cluster feature set corresponding to the matching level corresponding to the first local feature as the cluster feature set corresponding to the first local feature.
Continuing the example, the matching level corresponding to the first local feature is level one, and the cluster feature set corresponding to level one is the face proportion cluster feature set among the preset cluster feature sets shown in Table 1; this set is therefore determined as the cluster feature set corresponding to the first local feature.
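A minimal sketch of S3071-S3073, assuming the part-to-level correspondence of Table 3 is held in a dictionary; both dictionaries are illustrative stand-ins for the database tables.
```python
# Illustrative stand-ins for the correspondence of Table 3 and the per-level sets of Table 1.
PART_TO_LEVEL = {"face": 1, "eye": 2, "nose": 3}

def first_level_cluster_set(first_part, level_to_cluster_set):
    """level_to_cluster_set: {matching_level: cluster feature set} as stored in the database."""
    level = PART_TO_LEVEL[first_part]          # S3071-S3072: local part -> matching level
    return level_to_cluster_set[level]         # S3073: matching level -> cluster feature set
```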
S308, for any second local feature other than the first local feature among the plurality of local features, determining the previous local feature of the second local feature, and determining the cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in its corresponding cluster feature set.
In one implementation, for any second local feature other than the first local feature in the plurality of local features, the set of clustered features corresponding to the second local feature may be determined by performing steps S3081 and S3082 as shown below.
S3081, determining at least one first cluster feature in the cluster feature set corresponding to the previous local feature according to the matching degree between the previous local feature and each cluster feature in that cluster feature set.
Optionally, in the cluster feature set corresponding to the previous local feature, the cluster features whose matching degree is higher than a preset matching degree may be determined as the first cluster features.
Illustratively, the previous local feature is the face proportion feature, the face proportion cluster feature set corresponding to it includes 10 face proportion cluster features, and the matching degrees between the face proportion feature and the 10 face proportion cluster features are shown in Table 4:
Table 4
For example, assuming the preset matching degree is 80%, according to Table 4 there are two first cluster features in the cluster feature set corresponding to the face proportion feature, namely face proportion cluster feature 1 and face proportion cluster feature 2.
When performing the cluster feature matching processing, a specific matching mode can be selected according to the service scenario. For example, if the time limit for the matching processing is strict, multiple threads can be started to match the previous local feature against the cluster features in its corresponding cluster feature set in parallel, so as to improve matching efficiency. If the matching processing is required to occupy fewer computing resources, the previous local feature can be matched against the cluster features in its corresponding cluster feature set sequentially; then, when the next local feature after the previous local feature is processed, it can be matched, according to the matching degree of each first cluster feature, against the cluster features in the next-level cluster feature set corresponding to each first cluster feature.
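A sketch of S3081 together with the optional thread-based parallel matching discussed above; cosine similarity again stands in for the unspecified matching-degree algorithm, and the 80% default threshold mirrors the example.
```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def matching_degree(a, b):
    # cosine similarity as an illustrative matching-degree measure
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_first_cluster_features(prev_feature, cluster_set, min_degree=0.80,
                                  parallel=False):
    """cluster_set: {cluster_name: cluster_feature_vector}."""
    names = list(cluster_set)
    if parallel:                                     # strict time limit: score in parallel
        with ThreadPoolExecutor() as pool:
            scores = list(pool.map(
                lambda name: matching_degree(prev_feature, cluster_set[name]), names))
    else:                                            # lower resource usage: score sequentially
        scores = [matching_degree(prev_feature, cluster_set[name]) for name in names]
    return [name for name, score in zip(names, scores) if score >= min_degree]
```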
S3082, determining that the cluster feature set corresponding to the second local feature includes the next-level cluster feature set corresponding to each first cluster feature.
For example, if the first cluster features are face proportion cluster feature 1 and face proportion cluster feature 2, then in Table 1 the next-level cluster feature set corresponding to face proportion cluster feature 1 includes eye cluster features 1-5, and the next-level cluster feature set corresponding to face proportion cluster feature 2 also includes eye cluster features 1-5; the cluster feature set corresponding to the second local feature therefore includes the 5 eye cluster features corresponding to face proportion cluster feature 1 and the 5 eye cluster features corresponding to face proportion cluster feature 2.
In one implementation, the subset of targets may be determined by performing steps S309-S312 as shown below.
S309, determining a third local feature in the local features according to the matching sequence.
The third local feature may be a last local feature of the plurality of local features.
For example, if the plurality of local features are the face proportion feature, the eye feature and the nose feature, and the matching order of the 3 local features is: face proportion feature > eye feature > nose feature, then the third local feature is the nose feature.
S310, obtaining the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set.
Alternatively, a matching degree algorithm in the related art may be used to calculate the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set, and the present application is not limited to the type of the matching degree algorithm.
S311, determining at least one second cluster feature in the cluster feature set corresponding to the third local feature according to the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set.
Optionally, in the cluster feature set corresponding to the third local feature, a cluster feature with a matching degree higher than a preset matching degree may be determined as the second cluster feature.
S312, determining the images in the image set corresponding to the second cluster features as the target subset.
For example, in Table 2, assume that the second cluster feature corresponds to the path face proportion cluster feature 10 - eye cluster feature 1 - nose cluster feature 1; the images corresponding to this second cluster feature are face images 90001 to 91000, and the target subset therefore includes face images 90001 to 91000.
S313, matching the target image with the images in the target subset to obtain an image matching result.
Note that, the specific execution process of S313 may refer to the specific execution process of S204, which is not described herein.
The image processing method provided by the application can determine a plurality of local features in the target image and the matching order of the plurality of local features, hierarchically match the plurality of local features in that order to determine a target subset within the image set, and then match the target image against the images in the target subset to obtain a matching result. In this way, the target image is compared with fewer images during image matching, which reduces the matching time and improves the image matching efficiency for the target image.
Optionally, the image processing method provided by the application further relates to a process of storing a plurality of preset cluster feature sets in a database, and the storing of the plurality of preset cluster feature sets in the database may include the following steps:
(1) N images to be stored are acquired.
The image to be stored may be a face image, and the image to be stored may also be an image of a target object or a target scene. The image to be stored can be an image sent by the client or an image acquired in a preset storage space.
Wherein N is a positive integer.
(2) And determining a plurality of local features corresponding to the preset parts in each image to be stored.
Local features may be represented by feature vectors.
If the image to be stored is a face image, the plurality of preset positions may include a face position, an eye position, a nose position, a mouth position, an ear position, and the like, and the plurality of local features may be face scale features, eye features, nose features, mouth features, ear features, and the like.
(3) And classifying the plurality of local features of each image to be stored in the N images to be stored according to the preset parts to obtain a plurality of feature sets corresponding to the preset parts.
For the feature set corresponding to any preset part, the feature set may include the N local features corresponding to that preset part. For example, assuming the preset part is the eye part, if there are 100,000 images to be stored, the local features corresponding to the eye part of each of the 100,000 images to be stored can be grouped into one feature set, and this feature set then includes 100,000 eye features.
(4) And clustering N local features in the feature set corresponding to each preset position according to the matching level corresponding to each preset position to obtain a clustering feature set corresponding to each preset position.
The set of cluster features may include a plurality of cluster features.
Alternatively, a K-Means clustering algorithm may be used to cluster the local features of each level for the plurality of images to be stored. The K-Means clustering algorithm comprises the following steps: step 1, randomly select k center points in the data set; step 2, calculate the distance from each data point to each of the k center points, and assign each data point to a class according to these distances; step 3, calculate the mean position of the data points in each class as the updated center point of that class; step 4, with the updated center points, repeat step 2; if the sum of the distances between each data point and the center point of its class no longer changes, the algorithm ends.
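A minimal NumPy sketch of the K-Means steps described above; a production system would more likely rely on a library implementation, and the convergence test shown here (unchanged center points) is a simplification of step 4.
```python
import numpy as np

def kmeans(features, k, max_iter=100, seed=0):
    """features: (n, d) array of local feature vectors; returns (centers, labels)."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=k, replace=False)]   # step 1
    for _ in range(max_iter):
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                                      # step 2
        new_centers = np.array([features[labels == j].mean(axis=0)         # step 3
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):                              # step 4 (simplified)
            break
        centers = new_centers
    return centers, labels
```
The resulting cluster centers can serve as the cluster features of the corresponding level, as described above.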
For example, there are 3 preset parts, respectively, a face part, an eye part and a nose part, wherein the matching levels corresponding to the 3 preset parts are respectively a first level, a second level and a third level. Clustering N face proportion features in the feature set corresponding to the face part to obtain a plurality of face proportion clustering features corresponding to the face part; clustering N eye features in the feature set corresponding to the eye part to obtain a plurality of eye clustering features corresponding to the eye part; and clustering N nose features corresponding to the nose parts to obtain a plurality of nose clustering features corresponding to the nose parts.
(5) And classifying and storing each image to be stored in the plurality of images to be stored according to the cluster characteristics corresponding to the local characteristics of the N images to be stored.
For example, the N images to be stored may be classified and stored in the manner shown in table 2.
It can be understood that, in the initial stage, when the number of the images to be stored in the preset multiple clustering feature sets is small, the multiple images to be stored can be stored according to the methods (1) - (5) so as to more accurately obtain the clustering features corresponding to the preset parts. When the number of the images to be stored in the preset multiple clustering feature sets reaches a preset level, the local features corresponding to the preset positions of the images to be stored can be sequentially matched with the multiple clustering features in the clustering feature sets corresponding to the preset positions to determine the clustering feature with the highest matching degree corresponding to the preset positions, and the images to be stored are stored in a grading mode according to the clustering feature with the highest matching degree corresponding to the preset positions.
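A sketch of this incremental storage path, reusing the nested cluster_store layout from the earlier sketch; the routing-by-best-match logic and the cosine-similarity matching degree are assumptions consistent with the description above.
```python
import numpy as np

def _degree(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def store_image_incrementally(image_id, local_features, cluster_store):
    """Route a new image down the cluster hierarchy (nested layout sketched earlier)
    and record its id under the leaf cluster feature with the highest matching degree."""
    level_nodes = cluster_store
    path = []
    for depth, feature in enumerate(local_features):    # one level per preset part
        best = max(level_nodes,
                   key=lambda name: _degree(feature, level_nodes[name]["feature"]))
        path.append(best)
        if depth == len(local_features) - 1:             # leaf level: store the image id
            level_nodes[best].setdefault("image_ids", []).append(image_id)
        else:
            level_nodes = level_nodes[best]["children"]
    return path                                          # e.g. ["face_ratio_3", "eye_2", "nose_1"]
```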
Fig. 4 is a schematic structural diagram of an image processing apparatus according to the present application. Referring to fig. 4, the image processing apparatus 10 may include:
A determining module 11, configured to determine a plurality of local features in a target image, and determine a matching order of the plurality of local features, where a confidence level of the local features is greater than or equal to a preset threshold;
The determining module 11 is further configured to sequentially determine, according to the matching sequence, a cluster feature set corresponding to each local feature from a plurality of preset cluster feature sets, where the cluster feature set includes a plurality of cluster features;
the determining module 11 is further configured to determine a target subset in the image set according to the plurality of local features, the matching order, and the cluster feature set corresponding to each local feature;
And the matching processing module 12 is used for performing matching processing on the target image and the images in the target subset to obtain an image matching result.
The image processing device provided by the embodiment of the application can execute the technical scheme shown in the embodiment of the method, and the implementation principle and the beneficial effects are similar, and are not repeated here.
In one possible implementation, the determining module 11 is specifically configured to:
Determining, for a first local feature of the plurality of local features, the cluster feature set corresponding to the first local feature according to the matching level corresponding to the first local feature, wherein the first local feature is the local feature ranked first in the matching order among the plurality of local features;
and for any second local feature other than the first local feature among the plurality of local features, determining the previous local feature of the second local feature, and determining the cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in its corresponding cluster feature set.
In a possible implementation, the determining module 11 is specifically further configured to:
determining a local part corresponding to the first local feature;
determining a matching level corresponding to the first local feature according to the local part;
And determining the cluster feature set corresponding to the matching level corresponding to the first local feature as the cluster feature set corresponding to the first local feature.
In a possible implementation, the determining module 11 is specifically further configured to:
Determining at least one first cluster feature in the cluster feature set corresponding to the previous local feature according to the matching degree between the previous local feature and each cluster feature in that cluster feature set;
and determining that the cluster feature set corresponding to the second local feature includes the next-level cluster feature set corresponding to each first cluster feature.
In a possible implementation, the determining module 11 is specifically further configured to:
Determining a third local feature from the plurality of local features according to the matching sequence, wherein the third local feature is the last local feature in the plurality of local features;
Obtaining the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
Determining at least one second cluster feature in the cluster feature set corresponding to the third local feature according to the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
and determining the images in the image set corresponding to the second cluster features as the target subset.
In a possible implementation, the determining module 11 is specifically further configured to:
determining a plurality of initial partial images corresponding to a plurality of preset parts and the confidence degree of each initial partial image in the target image, wherein the confidence degree indicates at least one of the following: the integrity, shielding degree and definition of the initial partial image;
Determining an initial partial image with the confidence coefficient being greater than or equal to a preset threshold value as a target partial image;
and determining the image features corresponding to the target local image as the local features.
In a possible implementation, the determining module 11 is specifically further configured to:
determining a plurality of target positions corresponding to the local features;
Determining a preset sequence corresponding to a plurality of preset parts, wherein the plurality of preset parts comprise the plurality of target parts;
and determining the matching sequence according to the preset sequence and the target positions.
Fig. 5 is a schematic diagram of a hardware structure of an image processing apparatus provided by the present application. Referring to fig. 5, the image processing apparatus 20 may include a processor 21 and a memory 22. Wherein the processor 21 and the memory 22 may communicate; the processor 21 and the memory 22 are in communication via a communication bus 23, as an example.
The memory 22 is used for storing computer-executable instructions;
The processor 21 is configured to execute computer-executable instructions stored in the memory 22, so that the processor 21 executes the image processing method as shown in the above-described method embodiment.
Optionally, the image processing device 20 may further comprise a communication interface, which may comprise a transmitter and/or a receiver.
Alternatively, the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the present application may be embodied directly as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
The present application provides a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, are adapted to carry out the image processing method according to any of the embodiments described above.
The present application provides a computer program product comprising a computer program which, when executed by a processor, causes the computer to perform the above-described image processing method.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be appreciated that the various numerals referred to in the embodiments of the present application are merely for ease of description and are not intended to limit the scope of the embodiments. In the present disclosure, the term "include" and its variations denote non-limiting inclusion, and the term "or" and its variations may mean "and/or". The terms "first", "second" and the like are used to distinguish between similar objects and do not necessarily describe a particular sequential or chronological order. In the present application, "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (9)
1. An image processing method, comprising:
determining a plurality of local features in a target image, and determining a matching sequence of the plurality of local features, wherein the confidence of each local feature is greater than or equal to a preset threshold;
according to the matching sequence, determining a cluster feature set corresponding to each local feature in a plurality of preset cluster feature sets in sequence, wherein the cluster feature set comprises a plurality of cluster features;
determining a target subset in an image set according to the plurality of local features, the matching sequence and the cluster feature set corresponding to each local feature;
matching the target image with the images in the target subset to obtain an image matching result;
wherein the determining a target subset in the image set according to the plurality of local features, the matching sequence and the cluster feature set corresponding to each local feature comprises:
Determining a third local feature from the plurality of local features according to the matching sequence, wherein the third local feature is the last local feature in the plurality of local features;
Obtaining the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
Determining at least one second cluster feature in the cluster feature set corresponding to the third local feature according to the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set;
and determining the images corresponding to the at least one second cluster feature in the image set as the target subset.
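Read procedurally, claim 1 narrows the search to a small target subset before any full image matching is performed. The Python sketch below illustrates that flow under stated assumptions: cosine similarity stands in for the matching degree, a fixed number of clusters is retained, and every identifier is invented for illustration rather than taken from the application.

```python
from typing import Dict, List, Sequence, Tuple

import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Assumed matching degree between a local feature and a cluster feature."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def select_target_subset(
    ordered_features: Sequence[Tuple[str, np.ndarray]],  # already in matching sequence
    cluster_sets: Sequence[Dict[str, np.ndarray]],       # one cluster feature set per local feature
    images_by_cluster: Dict[str, List[str]],             # cluster id -> image ids in the image set
    keep: int = 3,                                       # number of second cluster features kept (assumed)
) -> List[str]:
    """Return the image ids forming the target subset.

    Only the last local feature in the matching sequence (the "third local
    feature") selects the subset; the earlier features have already decided
    which cluster feature set that last feature is compared against."""
    _part, last_feature = ordered_features[-1]
    last_cluster_set = cluster_sets[-1]

    # Matching degree between the last local feature and every cluster feature.
    scored = sorted(
        last_cluster_set.items(),
        key=lambda kv: cosine(last_feature, kv[1]),
        reverse=True,
    )
    second_cluster_ids = [cid for cid, _ in scored[:keep]]

    subset: List[str] = []
    for cid in second_cluster_ids:
        subset.extend(images_by_cluster.get(cid, []))
    return subset
```

Matching the target image only against this subset, rather than against the whole image set, is where the reduction in comparisons comes from.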
2. The method according to claim 1, wherein the determining, in the plurality of preset cluster feature sets, a cluster feature set corresponding to each local feature in sequence according to the matching sequence comprises:
determining a cluster feature set corresponding to a first local feature according to a matching level corresponding to the first local feature, wherein the first local feature is the first one, in the matching sequence, of the plurality of local features;
and for any second local feature, other than the first local feature, of the plurality of local features, determining a previous local feature of the second local feature, and determining a cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in the cluster feature set corresponding to the previous local feature.
3. The method of claim 2, wherein the determining a cluster feature set corresponding to the first local feature according to the matching level corresponding to the first local feature comprises:
determining a local part corresponding to the first local feature;
determining a matching level corresponding to the first local feature according to the local part;
And determining the cluster feature set corresponding to the matching level corresponding to the first local feature as the cluster feature set corresponding to the first local feature.
4. The method of claim 2, wherein the determining a cluster feature set corresponding to the second local feature according to the matching degree between the previous local feature and each cluster feature in the cluster feature set corresponding to the previous local feature comprises:
determining at least one first cluster feature in the cluster feature set corresponding to the previous local feature according to the matching degree between the previous local feature and each cluster feature in that cluster feature set;
and determining the next-level cluster feature set corresponding to each first cluster feature as the cluster feature set corresponding to the second local feature.
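Claims 2 to 4 describe a coarse-to-fine drill-down: the best-matching clusters of the previous local feature decide which next-level cluster feature set the current local feature is compared against. A hedged sketch of that selection follows; the tree layout, the similarity measure and all names are assumptions.

```python
from typing import Dict, List, Sequence, Tuple

import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def cluster_sets_in_order(
    ordered_features: Sequence[Tuple[str, np.ndarray]],  # (part, feature), in matching sequence
    level_of_part: Dict[str, int],                       # local part -> matching level (claim 3)
    top_level_sets: Dict[int, Dict[str, np.ndarray]],    # matching level -> cluster feature set
    children: Dict[str, Dict[str, np.ndarray]],          # cluster id -> next-level cluster feature set (claim 4)
    keep: int = 2,                                       # number of first cluster features kept (assumed)
) -> List[Dict[str, np.ndarray]]:
    """Return one cluster feature set per local feature, in matching sequence."""
    first_part, _ = ordered_features[0]
    current = top_level_sets[level_of_part[first_part]]
    per_feature_sets: List[Dict[str, np.ndarray]] = [current]

    for _, prev_feature in ordered_features[:-1]:
        # Best-matching clusters of the previous local feature ("first cluster features") ...
        best = sorted(current.items(),
                      key=lambda kv: cosine(prev_feature, kv[1]),
                      reverse=True)[:keep]
        # ... and the union of their next-level cluster feature sets becomes
        # the cluster feature set of the following local feature.
        nxt: Dict[str, np.ndarray] = {}
        for cid, _vec in best:
            nxt.update(children.get(cid, {}))
        per_feature_sets.append(nxt)
        current = nxt
    return per_feature_sets
```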
5. The method of any of claims 1-4, wherein determining a plurality of local features in the target image comprises:
determining, in the target image, a plurality of initial local images corresponding to a plurality of preset parts and the confidence of each initial local image, wherein the confidence indicates at least one of the following: the completeness, the degree of occlusion and the sharpness of the initial local image;
determining an initial local image whose confidence is greater than or equal to a preset threshold as a target local image;
and determining the image features corresponding to the target local images as the local features.
6. The method of any of claims 1-4, wherein determining a matching order of the plurality of local features comprises:
determining a plurality of target parts corresponding to the plurality of local features;
determining a preset sequence corresponding to a plurality of preset parts, wherein the plurality of preset parts comprise the plurality of target parts;
and determining the matching sequence according to the preset sequence and the plurality of target parts.
7. An image processing apparatus, comprising:
a determining module, used for determining a plurality of local features in a target image and determining a matching sequence of the plurality of local features, wherein the confidence of each local feature is greater than or equal to a preset threshold;
the determining module is further configured to sequentially determine a cluster feature set corresponding to each local feature from a plurality of preset cluster feature sets according to the matching sequence, where the cluster feature set includes a plurality of cluster features;
the determining module is further configured to determine a target subset in the image set according to the plurality of local features, the matching sequence, and the cluster feature set corresponding to each local feature;
a matching processing module, used for matching the target image with the images in the target subset to obtain an image matching result;
The determining module is specifically configured to: determine a third local feature from the plurality of local features according to the matching sequence, wherein the third local feature is the last local feature in the plurality of local features; obtain the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set; determine at least one second cluster feature in the cluster feature set corresponding to the third local feature according to the matching degree between the third local feature and each cluster feature in the corresponding cluster feature set; and determine the images corresponding to the at least one second cluster feature in the image set as the target subset.
8. An image processing apparatus, characterized by comprising: a processor, and a memory communicatively coupled to the processor;
The memory stores computer-executable instructions;
The processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1 to 6.
9. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410172433.9A CN118015312B (en) | 2024-02-06 | 2024-02-06 | Image processing method, device and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118015312A (en) | 2024-05-10 |
CN118015312B (en) | 2024-08-06 |
Family
ID=90942565
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410172433.9A Active CN118015312B (en) | 2024-02-06 | 2024-02-06 | Image processing method, device and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118015312B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101523412A (en) * | 2006-10-11 | 2009-09-02 | 惠普开发有限公司 | Face-based image clustering |
CN115019360A (en) * | 2022-04-29 | 2022-09-06 | 以萨技术股份有限公司 | Matching method and device, nonvolatile storage medium and computer equipment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110956195B (en) * | 2019-10-11 | 2023-06-02 | 平安科技(深圳)有限公司 | Image matching method, device, computer equipment and storage medium |
CN112966752B (en) * | 2021-03-09 | 2024-05-28 | 厦门市公安局 | Image matching method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||