
CN117078685B - Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis - Google Patents

Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis

Info

Publication number
CN117078685B
CN117078685B (application CN202311338178.2A)
Authority
CN
China
Prior art keywords
image
data
color
facial
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311338178.2A
Other languages
Chinese (zh)
Other versions
CN117078685A (en)
Inventor
杜一杰
孟宏
朱文驿
宋帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taihe Kangmei Beijing Research Institute of Traditional Chinese Medicine Co Ltd
Original Assignee
Taihe Kangmei Beijing Research Institute of Traditional Chinese Medicine Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taihe Kangmei Beijing Research Institute of Traditional Chinese Medicine Co Ltd filed Critical Taihe Kangmei Beijing Research Institute of Traditional Chinese Medicine Co Ltd
Priority to CN202311338178.2A
Publication of CN117078685A
Application granted
Publication of CN117078685B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30196: Human being; Person
    • G06T 2207/30201: Face
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a cosmetic efficacy evaluation method, device, equipment and medium based on image analysis, wherein the method comprises the following steps: acquiring first facial data before a cosmetic is applied to the face and second facial data after it is applied; constructing a first facial thermal image containing a color corresponding to the first facial data and a second facial thermal image containing a color corresponding to the second facial data; and determining the efficacy of the cosmetic by comparing the difference information between the first color in the first facial thermal image and the second color in the second facial thermal image. Changes in data values are replaced by changes of color in the facial thermal images, so that a user can clearly see how his or her face changed before and after using the cosmetic, and thereby determine its efficacy.

Description

Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis
Technical Field
The application relates to the technical field of cosmetic efficacy testing, in particular to a cosmetic efficacy evaluation method, device, equipment and medium based on image analysis.
Background
Cosmetics are chemical or fine-chemical products applied to any part of the surface of the human body, such as the skin, hair, nails, lips and teeth, by smearing, spraying or similar means, in order to clean, maintain, beautify, modify or change the appearance, or to correct body odor and keep the body in a good state.
Faced with a dazzling array of cosmetic products, consumers often prefer those with better efficacy. In the prior art, testing the efficacy of a cosmetic and interpreting the result both require professional technicians, and ordinary consumers cannot intuitively tell how good or bad a cosmetic's efficacy is.
Disclosure of Invention
In view of the foregoing, it is an object of the present application to provide a method, apparatus, device and medium for evaluating efficacy of cosmetics based on image analysis, so as to overcome the problems in the prior art.
In a first aspect, embodiments of the present application provide a method for evaluating efficacy of a cosmetic based on image analysis, the method comprising:
acquiring first face data before applying cosmetics to the face and second face data after applying cosmetics;
constructing a first facial thermal image containing a color corresponding to the first facial data and a second facial thermal image containing a color corresponding to the second facial data;
and determining the efficacy of the cosmetic by comparing the difference information of the first color in the first facial thermal image with the second color in the second facial thermal image.
In some embodiments of the present application, the constructing a first facial thermal image including a color corresponding to the first facial data and a second facial thermal image including a color corresponding to the second facial data includes:
acquiring a first initial image of the face before the face is applied with the cosmetics and a second initial image of the face after the face is applied with the cosmetics;
constructing a first color image according to the first facial data and a preset color mapping relation, and constructing a second color image according to the second facial data and a preset color mapping relation;
and fusing the first initial image and the first color image to obtain the first facial thermal image, and fusing the second initial image and the second color image to obtain the second facial thermal image.
In some technical solutions of the present application, the constructing a first color image according to the first facial data and a preset color mapping relationship, and constructing a second color image according to the second facial data and a preset color mapping relationship, includes:
constructing a blank image corresponding to the face, and detecting key points in the blank image;
performing color filling on a first blank image according to the first face data of the key points and a preset color mapping relation to obtain a first color image;
and performing color filling on a second blank image according to the second face data of the key points and a preset color mapping relation to obtain the second color image.
In some embodiments of the present application, the above method performs color filling by:
classifying the first facial data and the second facial data according to each facial area to obtain first sub-data and second sub-data under each facial area;
performing color filling on each face area in the first blank image according to the first sub-data of the key points and a preset color mapping relation; the face area filled with the color is integrated, so that the first color image is obtained;
performing color filling on each face area in the second blank image according to the second sub-data of the key points and a preset color mapping relation; and integrating the face areas filled with the colors to obtain the second color image.
In some technical solutions of the present application, the method further includes:
and adjusting the first face data and the second face data by using the standard distribution interval to obtain the adjusted first face data and second face data.
In some technical solutions of the present application, the method performs color filling on the first blank image by:
constructing a coordinate system of the first blank image, and determining coordinates of each key point;
constructing a solid circle by taking the coordinates of the key points as circle centers and taking the first face data of the key points as pixel values and preset pixel values as radiuses;
and filling gaps among the solid circles by using an interpolation method according to the preset color mapping to obtain a first color image.
In some technical solutions of the present application, the method further includes:
normalizing the first facial data to obtain the first facial data in a preset range interval;
and taking the first face data in a preset range interval as a pixel value.
In a second aspect, embodiments of the present application provide a device for cosmetic efficacy testing, the device comprising:
an acquisition module for acquiring first face data before the application of the cosmetic to the face and second face data after the application of the cosmetic;
a building module for building a first facial thermal image comprising a color corresponding to the first facial data and a second facial thermal image comprising a color corresponding to the second facial data;
and the comparison module is used for determining the efficacy of the cosmetics by comparing the difference information of the first color in the first facial thermal image and the second color in the second facial thermal image.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the steps of the method for evaluating efficacy of a cosmetic based on image analysis are implemented when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having a computer program stored thereon, which when executed by a processor performs the steps of the above-described cosmetic efficacy evaluation method based on image analysis.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
the method comprises the steps of acquiring first facial data before the face is applied with cosmetics and second facial data after the face is applied with cosmetics; constructing a first facial thermal image containing a color corresponding to the first facial data and a second facial thermal image containing a color corresponding to the second facial data; and determining the efficacy of the cosmetic by comparing the difference information of the first color in the first facial thermal image with the second color in the second facial thermal image.
The method and device use changes of color shade in the facial thermal images, rather than changes in raw data values, so that a user can clearly see the change in his or her face before and after using the cosmetic, and thereby determine its efficacy.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a method for evaluating efficacy of a cosmetic based on image analysis according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a test point according to an embodiment of the present application;
fig. 3 is a schematic diagram showing 81 coordinate points returned by the face detection tool according to the embodiment of the present application;
FIG. 4 is a schematic diagram of a device for testing the efficacy of cosmetics according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the accompanying drawings in the present application are only for the purpose of illustration and description, and are not intended to limit the protection scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to the flow diagrams and one or more operations may be removed from the flow diagrams as directed by those skilled in the art.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but not to exclude the addition of other features.
The oil and water content of the skin are important indicators of skin health and aesthetics. Oil content generally refers to sebum secretion; either excessive or insufficient sebum secretion affects skin health. The skin also needs to retain enough moisture, otherwise it becomes dry and rough and problems such as wrinkles and fine lines easily appear. Knowing the moisture and oil content, and maintaining moderate sebum secretion and moisture, therefore helps keep the skin healthy and attractive.
In the prior art, when efficacy evaluation during cosmetic research and development examines changes in facial moisture content and oil content, a contact-type device is usually used for testing and a bar graph is used to present the changes in the data. This form of presentation, however, is difficult for consumers to understand. The main reason is that consumers do not know the value range of the indicator: a bar graph shows only whether the data increased or decreased, not what level the indicator rose or fell to.
Based on the above, the embodiment of the application provides a cosmetic efficacy evaluation method, device, equipment and medium based on image analysis, and the description is given below by way of example.
Fig. 1 shows a schematic flow chart of a method for evaluating the efficacy of cosmetics based on image analysis according to an embodiment of the present application, wherein the method includes steps S101 to S103. Specifically:
S101, acquiring first facial data before a cosmetic is applied to the face and second facial data after it is applied;
s102, constructing a first facial thermal image containing a color corresponding to the first facial data and a second facial thermal image containing a color corresponding to the second facial data;
s103, determining the efficacy of the cosmetic by comparing the difference information of the first color in the first facial thermal image and the second color in the second facial thermal image.
The method and device use changes of color shade in the facial thermal images, rather than changes in raw data values, so that a user can clearly see the change in his or her face before and after using the cosmetic, and thereby determine its efficacy.
Some embodiments of the present application are described in detail below. The following embodiments and features of the embodiments may be combined with each other without conflict.
The embodiment of the application provides a way of testing the efficacy of cosmetics; the test can be performed during the production of a cosmetic, or by a consumer after purchasing a cosmetic sample or product.
In S101 above, to test the efficacy of a cosmetic, this embodiment acquires first facial data before the cosmetic is used and second facial data after it is used. The facial data (both the first and the second) comprise moisture data or oil data. The facial data in this embodiment are collected from the same user: a volunteer when the test is run during cosmetic production, or the consumer when the test is run personally. To ensure the test effect, the embodiment of the application selects users whose oil content exceeds 100. The user's facial moisture or oil data are acquired with a collection device before the cosmetic is used, and measured again after the user has used the cosmetic for a period of time (typically seven days).
In an alternative embodiment, the first and second facial data are collected at preset test points with a CK skin index tester. CK skin index testers are specialized electronic instruments commonly used to measure various skin indices, such as oil content, moisture content and elasticity. The instrument measures these indices through changes in current and resistance, and has the advantages of high test speed and high accuracy. The test points are shown in Fig. 2: three points are selected on the forehead (the midline point Q1, a point left of center Q2 and the eyebrow point Q3), two on the nose (the mid-bridge point Q4 and the nose tip Q5), one at the middle of the cheek (Q6), two on the chin (the center point Q7 and a point left of center Q8), and one near the eyes (Q9). That is, the method acquires moisture data or oil data at points Q1 to Q9.
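As an illustration only, the nine test points could be held in a simple mapping from point ID to facial region; the region labels below are assumptions inferred from the point descriptions above, not values given in the patent.

```python
# Hypothetical mapping of the nine test points of Fig. 2 to facial regions.
TEST_POINTS = {
    "Q1": "forehead", "Q2": "forehead", "Q3": "forehead",
    "Q4": "nose", "Q5": "nose",
    "Q6": "cheek",
    "Q7": "chin", "Q8": "chin",
    "Q9": "periocular",
}

def points_in_region(region):
    """Return the IDs of all test points belonging to one facial region."""
    return sorted(p for p, r in TEST_POINTS.items() if r == region)
```

A reading set keyed by these IDs can then be filtered per region before the color-filling steps described below.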
After the first face data and the second face data are obtained, the embodiment of the present application constructs the first face thermal image and the second face thermal image through S102 described above.
In constructing the first facial thermal image, the embodiment of the application acquires a first initial image, which is a facial image captured before the user uses the cosmetic, for example a facial photograph taken with a VISIA-CR device. After the first initial image is acquired, a first color image is constructed, and the first color image and the first initial image are fused to obtain the first facial thermal image.
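The text does not specify how the color image and the initial photograph are fused; one plausible reading is a per-pixel weighted blend, the kind of operation OpenCV's addWeighted performs on whole images. The alpha weight below is an assumed parameter, not one from the patent.

```python
def fuse(initial_px, color_px, alpha=0.6):
    """Blend one RGB pixel of the initial face photo with the corresponding
    pixel of the color (heat) image.  alpha is an assumed blending weight."""
    return tuple(round(alpha * c + (1 - alpha) * i)
                 for i, c in zip(initial_px, color_px))
```

Applying `fuse` to every pixel pair of the two images would yield one facial thermal image.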
To construct the first color image, the embodiment of the application creates a first blank image according to the user's facial contour features and detects the key points in it. Key point detection on the blank image is completed by a face detection model. Specifically, the face detection tool library dlib is adopted; given an input face image, dlib's face detection returns 81 coordinate points covering the eyes, face contour, mouth and nose, as shown in Fig. 3. The embodiment inputs the first initial image to the dlib face detection model, which outputs the key points in the first initial image. Because the first blank image and the first initial image both come from the face of the same user, their positions correspond, so the key points of the first blank image are obtained at the same time.
After the key points of the first blank image are obtained, color filling is performed around each key point to turn the first blank image into the first color image. The specific filling is as follows: a coordinate system is constructed for the first blank image and the coordinates of each key point are determined; a solid circle is constructed with the key point's coordinates as the center, a preset number of pixels as the radius, and the first facial data of that key point as the pixel value; and gaps between the solid circles are filled by interpolation according to the preset color mapping, yielding the first color image.
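The solid-circle step can be sketched on a small stand-in grid; the grid size and the sample value 137 below are arbitrary illustrations, not values from the patent.

```python
def fill_circle(img, cx, cy, radius, value):
    """Set every pixel within `radius` of (cx, cy) to `value` on a 2-D
    grid img[y][x]; a sketch of the solid-circle filling step."""
    h, w = len(img), len(img[0])
    for y in range(max(0, cy - radius), min(h, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(w, cx + radius + 1)):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                img[y][x] = value
    return img

blank = [[None] * 50 for _ in range(50)]   # small stand-in for the blank image
fill_circle(blank, 25, 25, 8, 137)         # key point at (25, 25), reading 137
```

Pixels left as `None` are the gaps that the interpolation step fills afterwards.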
In an alternative embodiment, the preset color map is a color list that changes from dark blue to dark red: the RGB values of dark blue and dark red are fixed, the intermediate gradient RGB values are filled in by linear interpolation, and the color list contains 201 RGB color values in total. When the first facial data are moisture data, colors from dark blue to dark red indicate decreasing moisture content; when they are oil data, colors from dark blue to dark red indicate increasing oil content. After the coordinates of the key points are determined, a solid circle is drawn with each coordinate as the center, 80 pixels as the radius, and the moisture or oil value as the pixel value. This yields solid circles of different colors on the first blank image; the gaps between them are filled by IDW (inverse distance weighting) interpolation to obtain the first color image.
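The 201-entry color list can be reproduced by linear interpolation between the two endpoint colors. The exact RGB endpoints below (standard "darkblue" and "darkred") are assumptions; the patent fixes only the list length and the two hue names.

```python
def build_colormap(n=201, start=(0, 0, 139), end=(139, 0, 0)):
    """Linearly interpolate n RGB values from dark blue to dark red.
    Endpoint RGB values are assumptions, not given in the patent."""
    cmap = []
    for i in range(n):
        t = i / (n - 1)
        cmap.append(tuple(round(s + t * (e - s)) for s, e in zip(start, end)))
    return cmap

cmap = build_colormap()
# A normalized reading Y in [0, 200] indexes the list directly: cmap[int(Y)]
```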
In constructing the second facial thermal image, the embodiment of the application acquires a second initial image, which is a facial image captured after the user has applied the cosmetic, for example a facial photograph taken with a VISIA-CR device. After the second initial image is acquired, a second color image is constructed, and the second color image and the second initial image are fused to obtain the second facial thermal image.
To construct the second color image, the embodiment of the application creates a second blank image according to the user's facial contour features and detects the key points in it. Key point detection on the blank image is completed by a face detection model. Specifically, the face detection tool library dlib is adopted; given an input face image, dlib's face detection returns 81 coordinate points covering the eyes, face contour, mouth and nose, as shown in Fig. 3. The embodiment inputs the second initial image to the dlib face detection model, which outputs the key points in the second initial image. Because the second blank image and the second initial image both come from the face of the same user, their positions correspond, so the key points of the second blank image are obtained at the same time.
After the key points of the second blank image are obtained, color filling is performed around each key point to turn the second blank image into the second color image. The specific filling is as follows: a coordinate system is constructed for the second blank image and the coordinates of each key point are determined; a solid circle is constructed with the key point's coordinates as the center, a preset number of pixels as the radius, and the second facial data of that key point as the pixel value; and gaps between the solid circles are filled by interpolation according to the preset color mapping, yielding the second color image.
In an alternative embodiment, the preset color map is a second color list that changes from dark blue to dark red: the RGB values of dark blue and dark red are fixed, the intermediate gradient RGB values are filled in by linear interpolation, and this second color list also contains 201 RGB color values in total. When the second facial data are moisture data, colors from dark blue to dark red indicate decreasing moisture content; when they are oil data, colors from dark blue to dark red indicate increasing oil content. After the coordinates of the key points are determined, a solid circle is drawn with each coordinate as the center, 80 pixels as the radius, and the moisture or oil value as the pixel value. This yields solid circles of different colors on the second blank image; the gaps between them are filled by IDW (inverse distance weighting) interpolation to obtain the second color image.
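The IDW interpolation named above can be sketched as a plain inverse-distance-weighted estimate. The power parameter is an assumption; the patent does not specify one.

```python
def idw(x, y, samples, power=2):
    """Inverse-distance-weighted estimate at (x, y) from
    samples = [(xi, yi, vi), ...]; a sketch of the gap-filling step."""
    num = den = 0.0
    for xi, yi, vi in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return vi                      # exactly on a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * vi
        den += w
    return num / den
```

Evaluating `idw` at every unfilled pixel, with the solid-circle pixels as samples, fills the gaps between the circles.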
In an optional embodiment, considering that the first and second facial data may contain large values, this embodiment normalizes the first and second facial data into a preset range before using them as pixel values, and then uses the normalized values as the pixel values. Specifically, the moisture content and oil content are normalized to the interval [0, 200] via k = 200/(Max - Min) and Y = k(X - Min); the Y value is used as the pixel value.
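The normalization formula maps directly to code:

```python
def normalize(values, lo=0, hi=200):
    """Map raw moisture/oil readings onto [lo, hi] using
    k = (hi - lo) / (Max - Min) and Y = k * (X - Min).
    Assumes the readings are not all identical (Max > Min)."""
    mn, mx = min(values), max(values)
    k = (hi - lo) / (mx - mn)
    return [k * (x - mn) for x in values]
```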
To ensure the accuracy of the first and second facial data, they are adjusted with a standard distribution interval before use, yielding adjusted first and second facial data. Specifically, the embodiment compares the data against the standard distribution interval to find values that are abnormally large or small; an abnormally large value is replaced by the upper boundary of the standard distribution interval, and an abnormally small value by the lower boundary. The adjusted first facial data are then used to construct the first color image, and the adjusted second facial data to construct the second color image.
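A minimal sketch of this boundary-value adjustment, assuming the standard distribution interval is given as a pair [lower, upper]:

```python
def clamp_outliers(values, lower, upper):
    """Replace readings outside the standard distribution interval
    [lower, upper] with the nearest boundary value."""
    return [min(max(v, lower), upper) for v in values]
```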
In an alternative embodiment, facial data are collected at each test point, and these collection points reflect the moisture and oil information of the facial T-zone (the forehead and nose, which secrete oil easily and together resemble a capital T, hence the name), the periocular area, the cheeks and the chin. Therefore, to achieve a better result when constructing the first and second color images, the face is divided into several facial areas: the T-zone, the periocular area, the cheek area and the chin area.
After the first and second facial data are acquired, they are classified by facial area to obtain the first sub-data and second sub-data for each area. Each facial area in the first blank image is color-filled according to the first sub-data of its key points and the preset color mapping relation, and each facial area in the second blank image according to the second sub-data of its key points and the preset color mapping relation.
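The classification step can be sketched as grouping {point: value} readings by facial area; the region assignment below is a hypothetical one based on Fig. 2, not part of the patent.

```python
# Hypothetical assignment of the nine test points to the four facial areas.
REGION_OF = {"Q1": "T-zone", "Q2": "T-zone", "Q3": "T-zone",
             "Q4": "T-zone", "Q5": "T-zone", "Q6": "cheek",
             "Q7": "chin", "Q8": "chin", "Q9": "periocular"}

def split_by_region(readings):
    """Group {point: value} readings into sub-data per facial area."""
    out = {}
    for point, value in readings.items():
        out.setdefault(REGION_OF[point], {})[point] = value
    return out
```

The resulting per-area dictionaries play the role of the first (or second) sub-data used for area-wise color filling.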
In S103, after the first and second facial thermal images are obtained, the user can intuitively determine the efficacy of the cosmetic by comparing the first color in the first facial thermal image with the second color in the second facial thermal image.
By combining the face image with a heat map, the method establishes a suitable color interval and uses two hues to represent the health status of an indicator; changes in indicator values are replaced by changes of color, so a consumer can clearly see the distribution of moisture and oil on his or her face and whether the skin is short of water or secreting excessive oil.
Fig. 4 is a schematic structural view of a device for testing efficacy of cosmetics according to an embodiment of the present application, the device includes:
an acquisition module for acquiring first face data before the application of the cosmetic to the face and second face data after the application of the cosmetic;
a building module for building a first facial thermal image comprising a color corresponding to the first facial data and a second facial thermal image comprising a color corresponding to the second facial data;
and the comparison module is used for determining the efficacy of the cosmetics by comparing the difference information of the first color in the first facial thermal image and the second color in the second facial thermal image.
Constructing a first facial thermal image containing a color corresponding to the first facial data and a second facial thermal image containing a color corresponding to the second facial data includes:
acquiring a first initial image of the face before the face is applied with the cosmetics and a second initial image of the face after the face is applied with the cosmetics;
constructing a first color image according to the first facial data and a preset color mapping relation, and constructing a second color image according to the second facial data and a preset color mapping relation;
and fusing the first initial image and the first color image to obtain the first facial thermal image, and fusing the second initial image and the second color image to obtain the second facial thermal image.
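A minimal sketch of the fusion step, assuming a weighted blend of the initial photograph and the color image; the weight 0.6 is an illustrative choice, equivalent in spirit to OpenCV's `cv2.addWeighted`:

```python
import numpy as np

def fuse(initial, color_map, alpha=0.6):
    """Blend the raw face photo with its pseudo-color overlay.
    `alpha` weights the photo; (1 - alpha) weights the overlay."""
    initial = np.asarray(initial, dtype=float)
    color_map = np.asarray(color_map, dtype=float)
    fused = alpha * initial + (1.0 - alpha) * color_map
    return np.clip(fused, 0, 255).astype(np.uint8)
```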
The constructing a first color image according to the first facial data and a preset color mapping relation, and constructing a second color image according to the second facial data and a preset color mapping relation, includes:
constructing a blank image corresponding to the face, and detecting key points in the blank image;
performing color filling on a first blank image according to the first face data of the key points and a preset color mapping relation to obtain a first color image;
and performing color filling on a second blank image according to the second face data of the key points and a preset color mapping relation to obtain the second color image.
Color filling is performed by:
classifying the first facial data and the second facial data according to each facial area to obtain first sub-data and second sub-data under each facial area;
performing color filling on each face area in the first blank image according to the first sub-data of the key points and a preset color mapping relation, and integrating the face areas filled with the colors to obtain the first color image;
performing color filling on each face area in the second blank image according to the second sub-data of the key points and a preset color mapping relation; and integrating the face areas filled with the colors to obtain the second color image.
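The classification of per-keypoint readings into per-region sub-data can be sketched as follows; the grouping of keypoint indices into named regions is hypothetical, since the real layout depends on the face detection model used:

```python
import numpy as np

# Hypothetical grouping of keypoint indices into facial regions;
# actual keypoint layouts depend on the face detection model.
REGIONS = {"forehead": [0, 1], "left_cheek": [2, 3], "right_cheek": [4, 5]}

def split_by_region(readings):
    """Classify per-keypoint readings into per-region sub-data,
    mirroring the first/second sub-data step described above."""
    return {name: [readings[i] for i in idx] for name, idx in REGIONS.items()}
```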
The device further comprises an adjusting module, wherein the adjusting module is used for adjusting the first facial data and the second facial data by using the standard distribution interval to obtain the adjusted first facial data and second facial data.
Color filling the first blank image by:
constructing a coordinate system of the first blank image, and determining coordinates of each key point;
constructing a solid circle by taking the coordinates of the key points as circle centers and taking the first face data of the key points as pixel values and preset pixel values as radiuses;
and filling gaps among the solid circles by using an interpolation method according to the preset color mapping to obtain a first color image.
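A minimal NumPy sketch of this filling procedure: each keypoint's reading is stamped as a solid circle of pixels, and the gaps are then filled by nearest-keypoint interpolation, used here as a stand-in for whatever interpolation method the application intends:

```python
import numpy as np

def keypoints_to_map(shape, points, values, radius=3):
    """Stamp a solid circle of each keypoint's reading onto a blank
    canvas, then fill the remaining gaps by nearest-keypoint
    interpolation. `points` are (row, col) coordinates; `values`
    are the facial readings used as pixel values."""
    h, w = shape
    canvas = np.full((h, w), np.nan)
    rows, cols = np.mgrid[0:h, 0:w]
    for (r, c), v in zip(points, values):
        mask = (rows - r) ** 2 + (cols - c) ** 2 <= radius ** 2
        canvas[mask] = v  # solid circle, pixel value = facial reading
    # Nearest-neighbour gap filling between the circles.
    pts = np.array(points, dtype=float)
    vals = np.array(values, dtype=float)
    gaps = np.isnan(canvas)
    gr, gc = rows[gaps], cols[gaps]
    d2 = (gr[:, None] - pts[:, 0]) ** 2 + (gc[:, None] - pts[:, 1]) ** 2
    canvas[gaps] = vals[np.argmin(d2, axis=1)]
    return canvas
```

The resulting scalar map can then be passed through the preset color mapping to yield the first color image.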
The device further comprises a normalization module, wherein the normalization module is used for normalizing the first facial data to obtain the first facial data in a preset range interval;
and taking the first face data in a preset range interval as a pixel value.
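The normalization step can be sketched as a min-max rescaling into a preset pixel range; the 0–255 range here is an illustrative choice:

```python
import numpy as np

def normalise(readings, lo=0.0, hi=255.0):
    """Min-max scale raw facial readings into a preset range
    so they can be used directly as pixel values."""
    readings = np.asarray(readings, dtype=float)
    span = readings.max() - readings.min()
    if span == 0:
        return np.full_like(readings, lo)
    return lo + (readings - readings.min()) / span * (hi - lo)
```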
As shown in fig. 5, an embodiment of the present application provides an electronic device for performing the image analysis-based cosmetic efficacy evaluation method in the present application, where the device includes a memory, a processor, a bus, and a computer program stored on the memory and executable on the processor, and the steps of the image analysis-based cosmetic efficacy evaluation method are implemented when the processor executes the computer program.
Specifically, the above-mentioned memory and processor may be general-purpose memory and processor, and are not particularly limited herein, and the above-mentioned cosmetic efficacy evaluation method based on image analysis can be performed when the processor runs a computer program stored in the memory.
Corresponding to the method for evaluating the efficacy of the cosmetics based on the image analysis in the application, the embodiment of the application also provides a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and the computer program is executed by a processor to execute the steps of the method for evaluating the efficacy of the cosmetics based on the image analysis.
Specifically, the storage medium can be a general-purpose storage medium, such as a removable disk, a hard disk, or the like, on which a computer program is executed to perform the above-described cosmetic efficacy evaluation method based on image analysis.
In the embodiments provided in this application, it should be understood that the disclosed systems and methods may be implemented in other ways. The system embodiments described above are merely illustrative; for example, the division of the units is merely a logical functional division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interfaces, systems, or units, and may be in electrical, mechanical, or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments provided in the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
It should be noted that like reference numerals and letters denote like items in the following figures; thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures. Furthermore, the terms "first," "second," "third," etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing examples are merely specific embodiments of the present application, intended to illustrate rather than limit its technical solutions; the scope of protection of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art will appreciate that any person familiar with the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features within the technical scope disclosed herein. Such modifications, changes, or substitutions do not depart from the spirit and scope of the corresponding technical solutions and are intended to be encompassed within the scope of this application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A method for evaluating efficacy of a cosmetic based on image analysis, the method comprising:
acquiring first face data before applying cosmetics to the face and second face data after applying cosmetics; wherein the first facial data comprises moisture data or grease data and the second facial data comprises moisture data or grease data;
constructing a first facial thermal image containing a color corresponding to the first facial data and a second facial thermal image containing a color corresponding to the second facial data;
determining the efficacy of the cosmetic by comparing the difference information of the first color in the first facial thermal image with the second color in the second facial thermal image;
the first facial thermal image is obtained according to a first color image and a first initial image; wherein the first initial image is a facial image acquired when the user does not use cosmetics; the first color image is obtained by a first blank image containing key points, wherein the key points are obtained by inputting the first initial image into a face detection model; the method comprises the following steps:
constructing a coordinate system of the first blank image, and determining coordinates of each key point;
constructing a solid circle by taking the coordinates of the key points as the circle center, taking a preset pixel value as the radius and taking the first face data of the key points as the pixel value of the circle;
and filling gaps among the solid circles by using an interpolation method according to the preset color mapping to obtain a first color image.
2. The method of claim 1, wherein the constructing a first facial thermal image comprising a color corresponding to the first facial data and a second facial thermal image comprising a color corresponding to the second facial data comprises:
acquiring a first initial image of the face before the face is applied with the cosmetics and a second initial image of the face after the face is applied with the cosmetics;
constructing a first color image according to the first facial data and a preset color mapping relation, and constructing a second color image according to the second facial data and a preset color mapping relation;
and fusing the first initial image and the first color image to obtain the first facial thermal image, and fusing the second initial image and the second color image to obtain the second facial thermal image.
3. The method of claim 2, wherein constructing a first color image from the first facial data and a preset color mapping relationship, and constructing a second color image from the second facial data and a preset color mapping relationship, comprises:
constructing a blank image corresponding to the face, and detecting key points in the blank image;
performing color filling on a first blank image according to the first face data of the key points and a preset color mapping relation to obtain a first color image;
and performing color filling on a second blank image according to the second face data of the key points and a preset color mapping relation to obtain the second color image.
4. The method according to claim 3, wherein the color filling is performed by:
classifying the first facial data and the second facial data according to each facial area to obtain first sub-data and second sub-data under each facial area;
performing color filling on each face area in the first blank image according to the first sub-data of the key points and a preset color mapping relation, and integrating the face areas filled with the colors to obtain the first color image;
performing color filling on each face area in the second blank image according to the second sub-data of the key points and a preset color mapping relation; and integrating the face areas filled with the colors to obtain the second color image.
5. The method according to claim 4, wherein the method further comprises:
and adjusting the first face data and the second face data by using a standard distribution interval to obtain the adjusted first face data and second face data.
6. The method according to claim 1, wherein the method further comprises:
normalizing the first facial data to obtain the first facial data in a preset range interval;
and taking the first face data in a preset range interval as a pixel value.
7. A device for testing the efficacy of a cosmetic product, said device comprising:
an acquisition module for acquiring first face data before the application of the cosmetic to the face and second face data after the application of the cosmetic; wherein the first facial data comprises moisture data or grease data and the second facial data comprises moisture data or grease data;
a building module for building a first facial thermal image comprising a color corresponding to the first facial data and a second facial thermal image comprising a color corresponding to the second facial data;
the comparison module is used for determining the efficacy of the cosmetics by comparing the difference information of the first color in the first facial thermal image and the second color in the second facial thermal image;
the first facial thermal image is obtained according to a first color image and a first initial image; wherein the first initial image is a facial image acquired when the user does not use cosmetics; the first color image is obtained by a first blank image containing key points, wherein the key points are obtained by inputting the first initial image into a face detection model; the construction module is specifically used for:
constructing a coordinate system of the first blank image, and determining coordinates of each key point;
constructing a solid circle by taking the coordinates of the key points as the circle center, taking a preset pixel value as the radius and taking the first face data of the key points as the pixel value of the circle;
and filling gaps among the solid circles by using an interpolation method according to the preset color mapping to obtain a first color image.
8. An electronic device, comprising: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating over the bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the image analysis based cosmetic efficacy assessment method according to any one of claims 1 to 6.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the image analysis-based cosmetic efficacy evaluation method according to any one of claims 1 to 6.
CN202311338178.2A 2023-10-17 2023-10-17 Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis Active CN117078685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311338178.2A CN117078685B (en) 2023-10-17 2023-10-17 Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis


Publications (2)

Publication Number Publication Date
CN117078685A CN117078685A (en) 2023-11-17
CN117078685B true CN117078685B (en) 2024-02-27

Family

ID=88708409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311338178.2A Active CN117078685B (en) 2023-10-17 2023-10-17 Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis

Country Status (1)

Country Link
CN (1) CN117078685B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011217987A (en) * 2010-04-12 2011-11-04 Hiromi Hatanaka Image display device with skin moisture measuring instrument
CN108257084A (en) * 2018-02-12 2018-07-06 北京中视广信科技有限公司 A kind of automatic cosmetic method of lightweight face based on mobile terminal
KR20190093372A (en) * 2018-02-01 2019-08-09 주식회사 엘지생활건강 Make-up evaluation system and operating method thereof
CN110161027A (en) * 2019-03-06 2019-08-23 上海商路网络科技有限公司 A kind of cosmetic industry evaluation method based on image analysis
CN113487573A (en) * 2021-07-08 2021-10-08 杭州德肤修生物科技有限公司 Cosmetic efficacy quantitative evaluation method based on accurate image comparison
TW202234341A (en) * 2021-02-23 2022-09-01 大陸商北京市商湯科技開發有限公司 Image processing method and device, electronic equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Application of image analysis in the efficacy evaluation of cosmetics; Zhao Xiaomin; Zhao Yunshan; Qu Xin; Detergent & Cosmetics (01); full text *
Evaluating the whitening efficacy of a whitening cosmetic using the Lab color system; Li Ling, Su Jin, Li Zhu, Zhou Hua, Song Weimin; Journal of Environmental and Occupational Medicine (01); full text *

Also Published As

Publication number Publication date
CN117078685A (en) 2023-11-17

Similar Documents

Publication Publication Date Title
JP4761924B2 (en) Skin condition diagnosis system and beauty counseling system
EP2731072A1 (en) Face impression analysis method, cosmetic counseling method, and face image generation method
TWI452998B (en) System and method for establishing and analyzing skin parameters using digital image multi-area analysis
KR102485256B1 (en) Customized Skin diagnostic and Managing System
US20080304736A1 (en) Method of estimating a visual evaluation value of skin beauty
KR20100105627A (en) Skin color evaluation method, skin color evaluation apparatus, skin color evaluation program and recording medium with the program recorded thereon
CN108403105B (en) Display method and display device for electrocardio scatter points
CN109891519A (en) Information processing unit, information processing method and program
CN104970797B (en) Skin classification method, the recommended method of cosmetics and skin classification card
JP5651385B2 (en) Face evaluation method
CN117078685B (en) Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis
JP2017120595A (en) Method for evaluating state of application of cosmetics
CN115699113A (en) Intelligent system for skin testing, custom formulation and cosmetic production
CN117078675B (en) Cosmetic efficacy evaluation method, device, equipment and medium based on image analysis
JP7349125B2 (en) Beauty evaluation method, sensory evaluation method, and system
KR102239575B1 (en) Apparatus and Method for skin condition diagnosis
KR20140057934A (en) Apparatus and method for makeup simulation using measurement data
JP2872912B2 (en) How to choose point makeup cosmetics
JP2016159114A (en) Skin analysis system, skin analysis device, and skin analysis program
KR102425873B1 (en) Personal color diagnostic method and system based on machine learning and augmented reality
KR100370271B1 (en) A Measure System for Skin condition
CN106030659A (en) Aging analysis method and aging analysis device
CN112086193A (en) Face recognition health prediction system and method based on Internet of things
TW201721487A (en) Visualization method of skin feature
KR20020022265A (en) A Measure System for Skin condition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant