
CN110135929B - System, method and storage medium for implementing virtual makeup application - Google Patents

System, method and storage medium for implementing virtual makeup application

Info

Publication number: CN110135929B
Authority: CN (China)
Prior art keywords: makeup, effects, mode, templates, user
Legal status: Active
Application number: CN201811308170.0A
Other languages: Chinese (zh)
Other versions: CN110135929A (en)
Inventors: 周澄, 颜宗芃, 吴介中
Current Assignee: Individual
Original Assignee: Individual
Application filed by: Individual
Publication of application: CN110135929A
Application granted; publication of grant: CN110135929B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06Q 30/0601 Electronic shopping [e-shopping]
    • G06Q 30/0641 Shopping interfaces
    • G06Q 30/0643 Graphical representation of items or shoppers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/04 Context-preserving transformations, e.g. by using an importance map

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Human Computer Interaction (AREA)
  • Development Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A system, method, and storage medium for implementing a virtual makeup application. A computing device obtains multimedia content associated with a user of the computing device and generates a user interface. In a first mode of operation, the user interface displays a plurality of graphic thumbnails, each depicting a makeup result and corresponding to a makeup template, where each makeup template includes a list of makeup effects for achieving the respective makeup result. The computing device obtains a selection of one or more graphic thumbnails from the user to select one or more makeup templates. In response to operating in a second mode of operation, the corresponding lists of makeup effects of the one or more selected makeup templates are displayed and a selection of one or more of the displayed makeup effects is obtained. In response to ending the second mode of operation, the one or more selected makeup effects of the one or more selected makeup templates are applied to each makeup template to generate updated makeup templates.

Description

System, method and storage medium for implementing virtual makeup application
Technical Field
The present invention relates to media editing, and more particularly, to a system, method and storage medium for implementing a virtual makeup application.
Background
With the widespread use of smart phones, tablets, and other display devices, people can view or edit digital content at any time, and applications for editing digital content on smart phones and other portable display devices have become popular. While people increasingly rely on portable devices for their computing needs, the displays of portable devices are relatively small compared to desktop monitors or televisions, which limits the amount of information visible on the display. Accordingly, there is a need for an improved platform for efficiently editing media content.
Disclosure of Invention
In one embodiment, a computing device obtains multimedia content associated with a user of the computing device and generates a user interface. In a first mode of operation, the user interface displays a plurality of graphic thumbnails, each graphic thumbnail depicting a makeup result and corresponding to a makeup template, where each makeup template includes a list of makeup effects for achieving the corresponding makeup result. The computing device obtains a selection of one or more graphic thumbnails from the user for selecting one or more makeup templates. The computing device obtains a user input to initiate a second mode of operation. In response to operating in the second mode of operation, the list of makeup effects corresponding to the one or more selected makeup templates is displayed, and a selection of one or more of the displayed makeup effects is obtained. The computing device obtains a second user input to end the second mode of operation. In response to ending the second mode of operation, the one or more selected makeup effects of the one or more selected makeup templates are applied to each makeup template to generate an updated makeup template.
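As a rough illustration of the workflow just summarized, the sketch below models a makeup template as a named list of makeup effects and shows how ending the second mode of operation propagates the selected effects to every template. This is a minimal sketch only, not the claimed implementation; the names MakeupEffect, MakeupTemplate, and apply_pinned_effects are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class MakeupEffect:
    effect_type: str       # e.g. "lipstick", "eyeliner"
    product: str = ""      # each listed effect corresponds to a makeup product
    color: str = ""

@dataclass
class MakeupTemplate:
    name: str                                   # represented by a graphic thumbnail
    effects: List[MakeupEffect] = field(default_factory=list)

def apply_pinned_effects(templates: List[MakeupTemplate],
                         pinned: List[MakeupEffect]) -> List[MakeupTemplate]:
    """Ending the second mode applies the selected (pinned) effects to every template."""
    pinned_types = {e.effect_type for e in pinned}
    updated = []
    for t in templates:
        # keep effects whose type is not overridden, then append the pinned ones
        kept = [e for e in t.effects if e.effect_type not in pinned_types]
        updated.append(MakeupTemplate(t.name, kept + list(pinned)))
    return updated
```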
Preferably, the multimedia content associated with the user includes at least one of: a video including the user and an image including the user.
Preferably, in the second mode of operation, the makeup effects displayed in the corresponding list of the selected makeup template are selected by default, and the one or more selected makeup effects in the selection result include a makeup effect to be cancelled.
Preferably, in the second mode of operation, the makeup effects displayed in the corresponding list of the selected makeup template are deselected by default, and the one or more selected makeup effects in the selection result include a makeup effect to be selected.
Preferably, the method implemented in the computing device further comprises: in the second mode of operation, in response to obtaining a selection result in which the same makeup effect is displayed for a plurality of makeup templates selected in the first mode of operation, identifying the makeup template corresponding to the most recently selected instance of that makeup effect, wherein the most recently selected makeup effect of the identified makeup template is displayed in a preview window in the second mode of operation, and wherein the most recently selected makeup effect of the identified makeup template is applied to each makeup template to generate the updated makeup template.
Preferably, the method implemented in the computing device further comprises: in response to obtaining a selection of a plurality of makeup templates in the first mode of operation, displaying the makeup effects of the list corresponding to each selected makeup template, wherein the makeup effects are categorized by makeup effect type or by makeup template.
Preferably, the user input to initiate the second mode of operation comprises switching a switch provided on the user interface to an on state.
Preferably, the second user input to end the second mode of operation comprises switching the switch provided on the user interface to an off state.
Preferably, the method implemented in the computing device further comprises: in response to operating in the second mode of operation, displaying the one or more selected makeup effects in a preview window, applied to a facial region depicted in the multimedia content associated with the user.
Preferably, the method implemented in a computing device further comprises: in response to ending the second mode of operation, the makeup result depicted on each of the graphic thumbnails is updated to generate an updated makeup template according to the one or more selected makeup effects.
Preferably, the method implemented in a computing device further comprises: the updated makeup template is stored in a database of the computing device.
Preferably, each of the makeup effects listed in each of the makeup templates corresponds to a makeup product.
In another embodiment, a computing device obtains a multimedia content associated with a user of the computing device, generates a user interface, wherein in a first mode of operation, the user interface displays a plurality of different types of cosmetic effects, obtains a selection of one or more types of cosmetic effects from the user, obtains a selection of a parameter of the selected type of cosmetic effect from the user, and obtains a user input to initiate a second mode of operation.
Another embodiment is a system comprising a display, a memory storing a plurality of instructions, and a processor coupled to the memory. The processor is configured by the instructions to obtain multimedia content related to a user of the system and generate a user interface, wherein, in a first mode of operation, the user interface displays a plurality of graphic thumbnails, each graphic thumbnail depicting a makeup result and corresponding to a makeup template, each makeup template including a list of makeup effects for achieving the corresponding makeup result. The processor obtains a selection of one or more graphic thumbnails from the user for selecting one or more makeup templates. The processor obtains a user input to initiate a second mode of operation. In response to operating in the second mode of operation, the list of makeup effects corresponding to each of the one or more selected makeup templates is displayed, and a selection of one or more of the displayed makeup effects is obtained. The processor obtains a second user input to end the second mode of operation. In response to ending the second mode of operation, the one or more selected makeup effects of the one or more selected makeup templates are applied to each makeup template to generate an updated makeup template.
Preferably, the user input to initiate the second mode of operation comprises switching a switch provided on the user interface to an on state.
Preferably, the second user input to end the second mode of operation comprises switching the switch provided on the user interface to an off state.
Preferably, in response to operating in the second mode of operation, the processor is further configured to display a preview window to display one or more selected cosmetic effects for use with a facial region depicted in the multimedia content associated with the user.
Preferably, in response to ending the second mode of operation, the processor is further configured to update the makeup result depicted on each of the graphic thumbnails based on the one or more selected makeup effects to generate an updated makeup template.
Preferably, the processor is further configured to store the updated makeup template in a database in the system.
Preferably, each of the makeup effects listed in each of the makeup templates corresponds to a makeup product.
In yet another embodiment, a non-transitory computer-readable storage medium stores instructions for execution by a computing device having a processor. When the instructions are executed by the processor, the computing device obtains multimedia content associated with a user of the computing device and generates a user interface. In a first mode of operation, the user interface displays a plurality of graphic thumbnails, each graphic thumbnail depicting a makeup result and corresponding to a makeup template, each makeup template including a list of makeup effects for achieving the corresponding makeup result. The computing device obtains a selection of one or more graphic thumbnails from the user for selecting one or more makeup templates. The computing device obtains a user input to initiate a second mode of operation. In response to operating in the second mode of operation, the list of makeup effects corresponding to the one or more selected makeup templates is displayed, and a selection of one or more of the displayed makeup effects is obtained. The computing device obtains a second user input to end the second mode of operation. In response to ending the second mode of operation, the one or more selected makeup effects of the one or more selected makeup templates are applied to each makeup template to generate an updated makeup template.
Preferably, the user input to initiate the second mode of operation comprises switching a switch provided on the user interface to an on state.
Preferably, the second user input to end the second mode of operation comprises switching the switch provided on the user interface to an off state.
Preferably, in response to ending the second mode of operation, the processor is further configured to update the makeup result depicted on each of the graphic thumbnails in accordance with the selected one or more makeup effects to generate an updated makeup template.
Preferably, the processor is further configured to store the updated makeup template in a database of the computing device.
Preferably, each of the makeup effects listed in each of the makeup templates corresponds to a makeup product.
For a better understanding of the features and technical content of the present invention, reference should be made to the following detailed description of the invention and accompanying drawings, which are provided for purposes of illustration and description only and are not intended to limit the invention.
Drawings
FIG. 1 is a block diagram of a computing device implementing a pin mechanism on a virtual makeup application platform according to various embodiments of the invention.
FIG. 2 is a schematic diagram of a computing device of various embodiments of the invention shown in FIG. 1.
FIG. 3 is a top-level flowchart of a portion of the functionality performed by the computing device of FIG. 1 in a virtual makeup application platform implementing a pin mechanism, according to various embodiments of the invention.
FIG. 4 is an exemplary diagram of a computing device of FIG. 1 providing a user interface on a display in a first mode of operation according to various embodiments of the invention.
FIG. 5 is an exemplary illustration of a user interface for operation of various embodiments of the present invention in a second mode of operation.
FIG. 6 is an exemplary illustration of a user interface associated with FIGS. 4 and 5 in various embodiments of the invention.
FIG. 7 is a schematic diagram illustrating how selected cosmetic effects may be applied to a cosmetic template in various embodiments of the present invention.
Fig. 8 is an exemplary illustration of a user interface associated with fig. 5 and 7 in various embodiments of the invention.
FIG. 9 is another exemplary diagram of a user interface provided by the display of the computing device of various embodiments of the invention of FIG. 1 in relation to a plurality of selected makeup templates during operation in the first mode of operation.
FIG. 10 is an exemplary illustration of a first type of user interface in relation to a plurality of selected makeup templates in operation in a second mode of operation in accordance with various embodiments of the invention.
FIG. 11 is an exemplary illustration of a second type of user interface in relation to a plurality of selected makeup templates during operation in a second mode of operation in accordance with various embodiments of the invention.
FIG. 12 is another exemplary diagram of a user interface provided by a display of the computing device of various embodiments of the invention of FIG. 1 in a first mode of operation.
FIG. 13 is a diagram illustrating the selection of cosmetic types and corresponding parameters shown in the UI of FIG. 12 according to various embodiments of the invention.
Detailed Description
The following describes, by way of a number of specific examples, an improved media editing platform that allows the public to efficiently access or customize a makeup template, where the makeup template details the desired makeup result that can be achieved after one or more makeup products are applied. In particular, embodiments of a virtual makeup application platform implementing a pin mechanism are described. In some embodiments, a system for implementing a pin mechanism in a virtual makeup application platform is provided, wherein multimedia content about a user is obtained by a computing device. In the present disclosure, the multimedia content about the user is a video including the user and/or an image including the user. A user interface is generated, whereby, in the first mode of operation, the user interface displays image thumbnails depicting various makeup results that can be achieved using different makeup products. The user selects a desired makeup result and initiates a second mode of operation, in which the user may select one or more makeup effects to further customize the selected makeup result. By way of example, the customization may include removing or cancelling certain makeup effects (e.g., cancelling the lipstick) or simply selecting additional effects (e.g., adding the eyeliner) while keeping certain desired makeup effects. The selected makeup effects are rendered as virtual makeup on the facial region depicted in the multimedia content about the user, and after the user ends the second mode of operation the selected makeup effects are automatically applied to the other makeup templates as well, as described in detail below.
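One hedged way to picture this customization step (all effects selected by default, with the user cancelling or re-selecting individual effects) is a simple selection-state map, reusing the hypothetical dataclasses from the earlier sketch; the function names here are assumptions for illustration, not the claimed UI logic.

```python
def default_selection(template: MakeupTemplate, selected_by_default: bool = True) -> dict:
    """Second mode: initial selection state for the effect list of a chosen template."""
    return {effect: selected_by_default for effect in template.effects}

def toggle(selection: dict, effect: MakeupEffect) -> dict:
    """The user taps an effect in the list to cancel it (or to re-select it)."""
    selection[effect] = not selection[effect]
    return selection

def pinned_effects(selection: dict) -> list:
    """Effects still selected when the user ends the second mode,
    e.g. lipstick cancelled while eye shadow and eyeliner are kept."""
    return [effect for effect, selected in selection.items() if selected]
```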
The following describes in detail a system in which a virtual makeup application platform implements a pin mechanism; the operation of each component in the system is described afterwards. FIG. 1 is a block diagram of a computing device 102 implementing a pin mechanism for a virtual makeup application platform. Specifically, the computing device 102 may be, but is not limited to, a smart phone, a desktop computing device, a notebook computer, and the like.
A makeup simulator 104 is implemented on a processor in the computing device 102, the makeup simulator 104 including a camera interface 106, a User Interface (UI) generator 108, a template management server 110 and an image editor 112. The camera interface 106 is configured to obtain multimedia content about a user of the computing device 102, which can be captured by a front-facing camera built into the computing device 102. Additionally, the camera interface 106 can obtain multimedia content from a digital recording device coupled to the computing device 102 or another computing device with digital recording capabilities.
As known to those skilled in the art, the multimedia content can be encoded in file formats such as, but not limited to: JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), PNG (Portable Network Graphics), GIF (Graphics Interchange Format), BMP (Bitmap), or other types of digital file formats. In addition, the multimedia content can also be obtained by format conversion of a still image or a moving video in formats such as, but not limited to: MPEG-1 (Motion Picture Experts Group-1), MPEG-2, MPEG-4, H.264, 3GPP (Third Generation Partnership Project), 3GPP-2, SD-Video (Standard-Definition Video), HD-Video (High-Definition Video), DVD (Digital Video Disc) multimedia, VCD (Video Compact Disc) multimedia, HD-DVD (High-Definition Digital Video Disc) multimedia, DTV/HDTV (Digital Television/High-Definition Television) multimedia, AVI (Audio Video Interleave), DV (Digital Video), QuickTime (QT), WMV (Windows Media Video), various other audio and video file formats, a 3D Scan Model, or other kinds of digital formats.
The user interface generator 108 is configured to generate a user interface for the virtual application of makeup products. In the first mode of operation, the user interface displays a series of graphic thumbnails, each depicting a makeup result, where different makeup results are achieved by applying one or more makeup effects corresponding to makeup products. The makeup effects for achieving a makeup result are defined in a makeup template 118; each makeup template 118 corresponds to a graphic thumbnail representing the makeup result, and each makeup template 118 includes a list of one or more makeup effects for achieving the makeup result corresponding to that makeup template 118.
In the second mode of operation, the user interface generator 108 forms a user interface that displays the makeup effects listed in the selected makeup template 118, from which a user of the computing device 102 may select one or more of the displayed makeup effects. In particular, the user may switch between the first mode of operation and the second mode of operation through a switch on the user interface.
The template management server 110 is configured to apply the makeup effects selected by the user to each makeup template 118 to generate updated makeup templates 118. The template management server 110 may then store the updated makeup templates 118 in a database 116. The image editor 112 is configured to apply a simulated rendering of the makeup effects to the facial region depicted in the multimedia content about the user according to the updated makeup templates 118.
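A rough sketch of the template-management role could look like the following; a JSON file stands in for database 116, which is purely an assumption for illustration, and the helpers reuse the hypothetical dataclasses from the earlier sketch.

```python
import json
from dataclasses import asdict

def store_templates(templates, path="makeup_templates.json"):
    """Persist the updated makeup templates (file-based stand-in for database 116)."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump([asdict(t) for t in templates], f, ensure_ascii=False, indent=2)

def load_templates(path="makeup_templates.json"):
    """Reload the stored templates, e.g. when the makeup simulator 104 starts."""
    with open(path, encoding="utf-8") as f:
        raw = json.load(f)
    return [MakeupTemplate(r["name"], [MakeupEffect(**e) for e in r["effects"]])
            for r in raw]
```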
The computing device 102 may be coupled to a network 120, and the network 120 may be the Internet, an intranet, an extranet, a wide area network (WAN), a local area network (LAN), a wired network, a wireless network, another suitable network, or a combination of two or more such networks. The computing device 102 may communicate with other computing devices via the network 120; for example, the computing device 102 may communicate with a makeup template server device 122 to retrieve new and/or updated makeup templates.
Fig. 2 is a block diagram of the computing device 102 of fig. 1. Computing device 102 may be embodied as various types of wired and/or wireless computing devices, such as a desktop computer, a portable computer, a dedicated server computer, a multi-processor computing device, a smart phone, a tablet, and so forth. As shown in fig. 2, the computing device 102 includes a memory 214, a processing device 202, a plurality of Input/Output interfaces (I/O interfaces) 204, a network Interface 206, a display 208, a peripheral Interface 211, and a mass storage 226, wherein each component of the computing device 102 is connected to a Local Data Bus (Local Data Bus) 210.
The processing device 202 may include any custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor associated with the computing device 102, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application-specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and other well-known electrical configurations comprising discrete elements, both individually and in various combinations, for coordinating the overall operation of the computing system.
The memory 214 may include any volatile memory elements or nonvolatile memory elements. For example, the volatile memory elements include random access memory (RAM), such as dynamic random access memory (DRAM) or static random access memory (SRAM). The nonvolatile memory elements can be read-only memory (ROM), a hard disk, magnetic tape, or compact disc read-only memory (CD-ROM). The memory 214 generally includes a native operating system 216 and one or more native applications, emulation systems, or emulated applications for any kind of operating system and/or emulated hardware platform or emulated operating system. For example, the aforementioned applications (i.e., native applications or emulated applications) may comprise specific software including some or all of the components of the computing device 102 in FIG. 1. In such embodiments, the components are stored in the memory 214 and executed by the processing device 202, so that the processing device 202 can carry out the operations/functions of the pin mechanism of the present invention. The components in the memory 214 are known to those skilled in the art and are not described in detail in the specification. In some embodiments, the components in the computing device 102 may be implemented in hardware and/or software.
The input/output interface 204 provides a plurality of interfaces to input or output information. For example, when the computing device 102 comprises a personal computer, the aforementioned components may be connected to one or more input/output interfaces 204, such as a keyboard and mouse, as shown in FIG. 2. The display 208 may include a computer monitor, a plasma screen of a personal computer, a liquid crystal display of a hand held device, a touch screen, or other display device.
In the present disclosure, a non-transitory computer readable storage medium stores a program for use by or in connection with an instruction execution system, apparatus, or device. More specifically, specific examples of the computer-readable storage medium may include, but are not limited to, a Portable computer diskette, a random access Memory, a Read-Only Memory, an Erasable Programmable Read-Only Memory (EPROM, EEPROM, or Flash Memory), and a Portable Compact Disc Read-Only Memory (CDROM).
Referring to FIG. 3, FIG. 3 is a flowchart 300 illustrating a pin mechanism implemented by the computing device 102 of FIG. 1 on a virtual makeup application platform according to various embodiments. The flowchart 300 in FIG. 3 is merely an example of the different types of functional arrangements that may be used to implement the operation of the various components of the computing device 102. In other words, the flowchart 300 of FIG. 3 may be considered to describe one or more embodiments of a method practiced in the computing device 102.
Although the flowchart 300 of FIG. 3 discloses a particular order of execution of the steps, the order is provided merely to aid understanding, and the actual order of operation may differ from that described. For example, the order of execution of two or more blocks may be rearranged relative to the order shown. Also, two or more blocks shown in succession in FIG. 3 may be executed concurrently or with partial concurrence. Such modifications and alterations are still within the scope of the present disclosure.
In block 310, the computing device 102 obtains a multimedia content from a user of the computing device 102, wherein the multimedia content may include images of the user captured by a front-facing camera built into the computing device 102. Alternatively, the multimedia content may be obtained through a digital recording device externally coupled to the computing device 102 or another computing device with digital recording capabilities.
In block 320, the computing device 102 generates a user interface that, in a first mode of operation, displays a plurality of graphic thumbnails each depicting a makeup result, each graphic thumbnail corresponding to a makeup template, and each makeup template including a list of makeup effects to achieve the corresponding makeup result. In some embodiments, each of the cosmetic effects listed in the cosmetic template corresponds to a cosmetic product.
At block 330, the computing device 102 obtains a selection from the user regarding the graphic thumbnail for selecting a makeup template. In some embodiments, in response to obtaining a selection of a plurality of makeup templates in the first mode of operation, the computing device 102 may display a makeup effect corresponding to the list in each makeup template, wherein the makeup effects are categorized according to a makeup effect category or according to a makeup template. In block 340, the computing device 102 obtains a user input to initiate a second mode of operation. In some embodiments, the user input to initiate operation of the second operating mode includes setting a switch on the user interface to switch to an on state.
In block 350, in response to operating in the second mode of operation, the computing device 102 displays the corresponding list of makeup effects in a selected makeup template and obtains a selection result regarding one or more of the displayed makeup effects. In some embodiments, in the second mode of operation, a preview window displays the one or more selected makeup effects applied to the facial region depicted in the multimedia content associated with the user. In some embodiments, in the second mode of operation, the makeup effects of the list corresponding to the selected makeup template are selected by default; in this embodiment, the one or more displayed makeup effects in the obtained selection result include a makeup effect that can be cancelled. In some embodiments, in the second mode of operation, the makeup effects of the list corresponding to the selected makeup template are deselected by default; in this embodiment, the one or more displayed makeup effects in the obtained selection result include a makeup effect that can be selected.
In some embodiments, in the second mode of operation, a selection result indicating the same makeup effect is obtained for a plurality of makeup templates selected in the first mode of operation. The computing device 102 identifies the makeup template corresponding to the most recently selected instance of that makeup effect, and the most recently selected makeup effect of the identified makeup template is displayed in a preview window during the second mode of operation. In this embodiment, the most recently selected makeup effect of the identified makeup template is applied to each of the makeup templates to generate the updated makeup templates.
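One way to realize this "most recent selection wins" rule is to keep the picks in the order the user made them and deduplicate by effect type, keeping the last occurrence. This ordering-based resolution is an assumption for illustration, and resolve_latest is a hypothetical helper name.

```python
def resolve_latest(picks):
    """picks: (template_name, effect) pairs in the order the user selected them.
    When the same effect type was picked from several templates, keep only the
    most recent pick, mirroring the 'latest selection wins' behaviour above."""
    latest = {}
    for template_name, effect in picks:
        latest[effect.effect_type] = (template_name, effect)  # later picks overwrite earlier ones
    return list(latest.values())

# Example: effect #2 picked from template #3, then again from template #4,
# leaves only template #4's version of effect #2.
```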
In block 360, the computing device 102 obtains a second user input to end the second mode of operation. In some embodiments, the second user input to end the second mode of operation includes setting a switch on the user interface to transition to an off state. In some embodiments, the multimedia content about the user may be a video including the user and/or an image including the user. In some embodiments, after the second operation mode is ended, the computing device 102 updates the makeup result depicted by each of the graphic thumbnails according to the selected one or more makeup effects to generate an updated makeup template. Next, the computing device 102 stores the updated makeup template in a database 116 (FIG. 1) in the computing device 102.
In block 370, in response to the end of the second mode of operation, the computing device 102 applies the selected one or more makeup effects of the selected makeup templates to each of the makeup templates to generate updated makeup templates. Accordingly, the flow chart of the computing device 102 of fig. 3 ends.
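Blocks 310 through 370 can be strung together as a single driver routine. The sketch below is non-authoritative: it reuses the hypothetical helpers from the earlier sketches (apply_pinned_effects, resolve_latest, store_templates) and reduces user interaction to plain function arguments.

```python
def run_pin_flow(all_templates, chosen_template_names, cancelled_effect_types):
    """Blocks 310-370 in miniature, with user interaction reduced to arguments."""
    # Block 330: the user picks one or more templates via their graphic thumbnails.
    chosen = [t for t in all_templates if t.name in chosen_template_names]
    # Blocks 340-360: in the second mode, list the effects of the chosen templates,
    # drop the ones the user cancels, and resolve duplicates by the latest pick.
    picks = [(t.name, e) for t in chosen for e in t.effects
             if e.effect_type not in cancelled_effect_types]
    pinned = [effect for _, effect in resolve_latest(picks)]
    # Block 370: ending the second mode applies the pinned effects to every template.
    updated = apply_pinned_effects(all_templates, pinned)
    store_templates(updated)  # stand-in for storing in database 116
    return updated
```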
Having described the basic architecture of a system for implementing a pin mechanism on a virtual makeup application platform, reference is now made to the following figures, which illustrate various features of various embodiments of the invention. FIG. 4 is a schematic diagram of a user interface 402 provided on a display of the computing device 102 in the first mode of operation. The computing device 102 may be embodied as a smart phone, a desktop computing device, and so on. As shown in FIG. 4, the user interface 402 includes a switch 404 that can be toggled between an on state and an off state. Also shown in FIG. 4 are a plurality of graphic thumbnail representations 406, each corresponding to a makeup template that defines a particular makeup result. The user may select a desired makeup result by selecting a graphic thumbnail representation 406. A preview window 408 on the user interface 402 depicts the multimedia content related to the user and applies the selected makeup result to the facial region depicted in that multimedia content. The user may transition from the first mode of operation to the second mode of operation by clicking or pressing the switch 404.
FIG. 5 shows the user interface 402 in the second mode of operation. In the second mode of operation, the user interface 402 displays to the user a list of the makeup effects required to achieve the selected makeup result. In some embodiments, all of the makeup effects required to achieve the selected makeup result are selected by default, and the user may cancel some of the makeup effects to exclude them from the final makeup result according to preference. In other embodiments, none of the makeup effects are selected by default, and the user may select the desired makeup effects according to preference. The user interface 402 may include two windows: a preview window ("after makeup") that previews the result of applying the selected makeup effects to the facial region depicted in the multimedia content about the user, and a viewing window ("before makeup") that shows the facial region in the multimedia content about the user without any makeup effect applied.
FIG. 6 illustrates an example of the user interface 402 depicted in FIGS. 4 and 5. As shown in FIG. 6, a desired makeup result may be selected by clicking or otherwise marking one of the graphic thumbnail representations 604. The selected makeup result is applied to the facial region depicted in the multimedia content for the user. The user may initiate the second mode of operation by clicking or otherwise selecting the pin tool 602 at the upper end of the user interface 402. In the second mode of operation, the user may select the desired makeup effects from the listed makeup effects that achieve the selected makeup result. In the example shown, the user chooses to exclude the lip gloss and blush effects. The selected makeup effects (i.e., the eye shadow, eyeliner, eyelash, and eyebrow makeup effects) are applied to the facial region depicted in the multimedia content for the user and displayed in the preview window labeled "after makeup" on the right of the user interface 402.
Fig. 7 illustrates how selected cosmetic effects may be applied to the cosmetic template. Referring back to fig. 5, once the user has selected the desired cosmetic effect, the user may click or press the switch 404 to end the second mode of operation. Additionally, another user interface control (e.g., a "store" button) may be provided to the user interface 402 for the user to end the second mode of operation. Referring to FIG. 7, the selected or "pinned" makeup effect is automatically applied to the makeup template. In addition, the graphical thumbnail representation 406 may thus be updated and depicted with the selected cosmetic effect. As shown in the exemplary example, it is assumed that the user selects a makeup effect #3 (e.g., an eyelash makeup effect) and a makeup effect #4 (e.g., an eyebrow makeup effect) from the makeup template # 4.
The selected makeup effects 702 are then automatically applied to all of the makeup templates, and the updated makeup templates are stored. As shown in FIG. 7, the corresponding graphic thumbnail representations 406 are also updated accordingly, so that when a graphic thumbnail representation 406 is updated, the selected makeup effect is depicted. In the example of FIG. 7, assume that the user has selected makeup effect #3 (e.g., a lip gloss effect using a No. 416 lipstick); makeup effect 702 is automatically applied to all of the makeup templates so that every makeup template contains that lip gloss effect, thereby producing updated makeup templates.
FIG. 8 illustrates an example of the user interface 402 depicted in FIGS. 5 and 7. As shown in FIG. 8, in the second mode of operation the user selects a makeup effect (i.e., a face painting effect), and the user can end the second mode of operation by clicking or pressing the switch or by pressing a "store" button. This operation causes the selected makeup effect to be automatically applied to all makeup templates, and, as shown in FIG. 8, all graphic thumbnail representations are updated accordingly.
FIG. 9 illustrates another example of the results of the display of the user interface 902 provided by the computing device of various embodiments of the invention of FIG. 1 when the first mode of operation involves multiple selected cosmetic templates. In some embodiments, the user is not limited to selecting a single makeup template, and may select multiple makeup templates at a time. As shown in the example, the user has selected two graphical thumbnail representations, each of which corresponds to a makeup template depicting a particular makeup result.
FIG. 10 illustrates an example of a first type of user interface 1002 relating to a plurality of selected makeup templates in the second mode of operation. In the second mode of operation, the user interface 1002 displays to the user the makeup effects listed in each of the makeup templates of FIG. 10 that are used to achieve the selected makeup results. In some embodiments, all makeup effects used to achieve the selected makeup result are selected by default, and the user may cancel makeup effects according to preference to exclude them from the final makeup result. In some embodiments, none of the makeup effects are selected by default, and the user may select the desired makeup effects according to preference. As shown in the exemplary user interface 1002, the makeup effects are categorized according to the makeup template selected by the user. According to some embodiments, if the user selects the same makeup effect (e.g., makeup effect #2) from multiple makeup templates (e.g., template #3 and template #4), the computing device 102 keeps only the most recently selected version of that makeup effect (makeup effect #2). In this embodiment, the preview window displays the most recent user selection.
FIG. 11 illustrates an example of a second type of user interface 1102 relating to a plurality of selected makeup templates in the second mode of operation. In the second mode of operation, the user interface 1102 displays to the user the makeup effects listed in each selected makeup template of FIG. 11 that are used to achieve the selected makeup results. As shown in the second type of user interface 1102, the selected makeup effects are categorized according to the type of makeup effect (e.g., eye shadow effect type, eyeliner effect type, lip gloss effect type). Similarly, if the user selects the same makeup effect (e.g., makeup effect #2) from a plurality of makeup templates (e.g., template #3 and template #4), the computing device 102 keeps only the most recently selected version of that makeup effect (makeup effect #2). In this embodiment, the preview window displays the most recent user selection.
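The two list layouts of FIG. 10 and FIG. 11 amount to grouping the same selection data by a different key. The following is a hedged sketch using the same hypothetical (template_name, effect) pairs as before; the grouping functions are assumptions for illustration, not the claimed UI code.

```python
from collections import defaultdict

def group_by_template(picks):
    """FIG. 10 layout: effects listed under the makeup template they came from."""
    grouped = defaultdict(list)
    for template_name, effect in picks:
        grouped[template_name].append(effect)
    return dict(grouped)

def group_by_effect_type(picks):
    """FIG. 11 layout: effects listed under their type (eye shadow, eyeliner, lip gloss, ...)."""
    grouped = defaultdict(list)
    for template_name, effect in picks:
        grouped[effect.effect_type].append((template_name, effect))
    return dict(grouped)
```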
FIG. 12 illustrates another example of the display results of the user interface 1202 provided by the computing device 102 of FIG. 1 in the first mode of operation according to various embodiments of the present invention. As shown in FIG. 12, the user interface 1202 includes multiple types of makeup effects 1206, and the user interface 1202 may also include parameters 1204 that can be selected for the different types of makeup effects 1206. In the exemplary embodiment shown, the parameters 1204 include different colors. The user interface 1202 includes a switch 1203 that can be toggled between an on state and an off state.
FIG. 13 illustrates the selection of a makeup effect type and corresponding parameters in the user interface 1202 of FIG. 12 according to various embodiments of the present invention. In the exemplary embodiment shown, the user selects a makeup effect type 1206 (lipstick) and a corresponding color, and the user may repeat the selection any number of times. For example, on the second repetition the user may select another makeup effect type 1206 (e.g., eyeliner) and a corresponding color. Once the user has finished selecting the desired makeup effect types 1206, the user can initiate the second mode of operation by means of the switch 1203. When the user ends the second mode of operation, the selected makeup effects and corresponding parameters are automatically applied to all of the makeup templates.
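The repeat-select loop of FIG. 12 and FIG. 13 could be captured by accumulating (type, color) pairs and turning them into pinned effects when switch 1203 is turned off. This is again only a sketch under the assumptions of the earlier hypothetical helpers; none of these names come from the patent.

```python
def pick_type_and_color(picks, effect_type, color):
    """The user repeatedly selects an effect type (e.g. lipstick) and a color parameter."""
    picks.append(MakeupEffect(effect_type=effect_type, color=color))
    return picks

def end_second_mode(all_templates, picks):
    """Turning switch 1203 off applies every picked type/color pair to all makeup templates."""
    return apply_pinned_effects(all_templates, picks)

# Usage sketch:
# picks = []
# pick_type_and_color(picks, "lipstick", "red")
# pick_type_and_color(picks, "eyeliner", "black")
# updated = end_second_mode(templates, picks)
```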
The foregoing disclosure describes only preferred embodiments of the invention and is not intended to limit the scope of the claims; all equivalent technical changes and modifications made using the contents of the specification and drawings are included within the scope of the claims.

Claims (19)

1. A method for executing a virtual cosmetic application, comprising:
obtaining a multimedia content relating to a user of a computing device;
generating a user interface, wherein in a first operation mode, the user interface displays a plurality of graphic thumbnails, each of the graphic thumbnails depicts a makeup result, each of the graphic thumbnails corresponds to a makeup template, each of the makeup templates includes a list of makeup effects, and the list of makeup effects is used to achieve the makeup result corresponding to each of the makeup templates;
obtaining a selection result of the plurality of graphic thumbnails from the user to select the plurality of makeup templates;
obtaining a user input to initiate a second mode of operation;
in response to the second operation mode, displaying the makeup effects of the lists corresponding to the selected makeup templates, wherein the makeup effects are categorized according to a makeup effect type or according to the makeup templates, and wherein, in the second operation mode, the makeup effects of the lists corresponding to the selected makeup templates are selected by default, and the one or more selected makeup effects in the selection result comprise a makeup effect for cancellation;
obtaining a second user input to end the second mode of operation;
in response to ending the second operation mode, applying the selected one or more makeup effects of the plurality of selected makeup templates to each of the makeup templates to generate an updated makeup template; and
updating the user interface, and displaying the updated makeup template in a preview window.
2. The method of claim 1, wherein the multimedia content for the user comprises at least one of: including a video of the user and including an image of the user.
3. The method of claim 1, further comprising: in the second operation mode, in response to obtaining a selection result in which the same makeup effect is displayed for the plurality of makeup templates selected in the first operation mode, identifying the makeup template corresponding to the most recently selected instance of that makeup effect, wherein the most recently selected makeup effect of the identified makeup template is displayed in a preview window in the second operation mode, and wherein the most recently selected makeup effect of the identified makeup template is applied to each of the makeup templates to generate the updated makeup template.
4. The method of claim 1, wherein the step of user input to initiate the second mode of operation comprises: a switch is arranged on the user interface to be switched to an on state.
5. The method of claim 4, wherein the step of obtaining a second user input to end the second mode of operation comprises: the switch is configured to switch to an off state at the user interface.
6. The method of claim 1, further comprising:
in response to operating in the second operation mode, displaying the one or more selected makeup effects in a preview window, applied to a facial region depicted in the multimedia content with respect to the user.
7. The method of claim 1, further comprising:
storing the updated makeup template in a database of the computing device.
8. The method of claim 1, wherein each of the makeup effects listed in each of the makeup templates corresponds to a makeup product.
9. A system for implementing a virtual cosmetic application, comprising:
a display;
a memory, said memory storing a plurality of instructions; and
a processor coupled to the memory and configured with a plurality of the instructions, the plurality of instructions comprising:
obtaining a multimedia content relating to a user of the system;
generating a user interface, wherein in a first operation mode, the user interface displays a plurality of graphic thumbnails, each of the graphic thumbnails depicts a makeup result, each of the graphic thumbnails corresponds to a makeup template, each of the makeup templates includes a list of makeup effects, and the list of makeup effects is used to achieve the makeup result corresponding to each of the makeup templates;
obtaining a selection result of the plurality of graphic thumbnails from the user to select the plurality of makeup templates;
obtaining a user input to initiate a second mode of operation;
in response to the second operation mode, displaying the makeup effects of the lists corresponding to the selected makeup templates, wherein the makeup effects are categorized according to a makeup effect type or according to the makeup templates, and wherein, in the second operation mode, the makeup effects of the lists corresponding to the selected makeup templates are selected by default, and the selected makeup effects in the selection result comprise makeup effects for cancellation;
obtaining a second user input to end the second mode of operation;
in response to ending the second operation mode, applying the plurality of selected makeup effects of the plurality of selected makeup templates to each of the makeup templates to generate an updated makeup template; and
updating the user interface, and displaying the updated makeup template in a preview window.
10. The system of claim 9, wherein the step of user input to initiate the second mode of operation comprises:
a switch is arranged on the user interface to be switched to an on state.
11. The system of claim 10, wherein the step of the second user input to end the second mode of operation comprises:
the switch is configured to switch to an off state at the user interface.
12. The system of claim 9, wherein in response to operating in the second operation mode, the processor is further configured to display a preview window showing the plurality of selected makeup effects applied to a facial region depicted in the multimedia content associated with the user.
13. The system of claim 9, wherein the processor is further configured to store the updated makeup template in a database in the system.
14. The system of claim 9, wherein each of the makeup effects of the list in each of the makeup templates corresponds to a makeup product.
15. A non-transitory computer readable storage medium storing instructions, the instructions being executable by a computing device having a processor, wherein when the instructions are executed by the processor, the computing device at least performs:
obtaining a multimedia content related to a user of the computing device;
generating a user interface, wherein in a first operation mode, the user interface displays a plurality of graphic thumbnails, each of the graphic thumbnails depicts a makeup result, each of the graphic thumbnails corresponds to a makeup template, each of the makeup templates includes a list of makeup effects, and the list of makeup effects is used to achieve the makeup result corresponding to each of the makeup templates;
obtaining a selection result of the plurality of graphic thumbnails from the user to select the plurality of makeup templates;
obtaining a user input to initiate a second mode of operation;
in response to the second operation mode, displaying the makeup effects of the lists corresponding to the selected makeup templates, wherein the makeup effects are categorized according to a makeup effect type or according to the makeup templates, and wherein, in the second operation mode, the makeup effects of the lists corresponding to the selected makeup templates are selected by default, and the plurality of selected makeup effects in the selection result comprise makeup effects for cancellation;
obtaining a second user input to end the second mode of operation;
in response to ending the second operation mode, applying the plurality of selected makeup effects of the plurality of selected makeup templates to each of the makeup templates to generate an updated makeup template; and
updating the user interface, and displaying the updated makeup template in a preview window.
16. The non-transitory computer readable storage medium of claim 15, wherein the step of user input to initiate the second mode of operation comprises:
a switch is arranged on the user interface to be switched to an on state.
17. The non-transitory computer readable storage medium of claim 16, wherein the step of the second user input to end the second mode of operation comprises:
the switch is configured to switch to an off state at the user interface.
18. The non-transitory computer readable storage medium of claim 15, wherein the processor is further configured to store the updated makeup template in a database of the computing device.
19. The non-transitory computer readable storage medium of claim 15, wherein each of the makeup effects in the list of makeup effects in each of the makeup templates corresponds to a makeup product.
CN201811308170.0A 2018-02-02 2018-11-05 System, method and storage medium for implementing virtual makeup application Active CN110135929B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862625405P 2018-02-02 2018-02-02
US62/625,405 2018-02-02

Publications (2)

Publication Number Publication Date
CN110135929A (en) 2019-08-16
CN110135929B (en) 2022-05-06

Family

ID=67568233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811308170.0A Active CN110135929B (en) 2018-02-02 2018-11-05 System, method and storage medium for implementing virtual makeup application

Country Status (1)

Country Link
CN (1) CN110135929B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210373752A1 (en) * 2019-11-28 2021-12-02 Boe Technology Group Co., Ltd. User interface system, electronic equipment and interaction method for picture recognition
CN112819718A (en) * 2021-02-01 2021-05-18 深圳市商汤科技有限公司 Image processing method and device, electronic device and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870821A (en) * 2014-04-10 2014-06-18 上海影火智能科技有限公司 Virtual make-up trial method and system
EP3201834B1 (en) * 2014-09-30 2021-05-12 TCMS Transparent Beauty LLC Precise application of cosmetic looks from over a network environment
WO2016182223A1 (en) * 2015-05-08 2016-11-17 스타일미러 주식회사 Mirror system and method enabling photo and video sharing by means of bidirectional communication
US10324739B2 (en) * 2016-03-03 2019-06-18 Perfect Corp. Systems and methods for simulated application of cosmetic effects
CN106251191A (en) * 2016-07-15 2016-12-21 深圳市金立通信设备有限公司 The display control method of a kind of terminal screen and terminal
CN106204691A (en) * 2016-07-19 2016-12-07 马志凌 Virtual make up system
CN107220960B (en) * 2017-05-27 2021-01-05 无限极(中国)有限公司 Make-up trial method, system and equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Experiencing the practicality of virtual makeup through Meitu Makeup Show (美图化妆秀); Meitu Makeup Show; PConline; 2010-11-09; pages 1-4 *

Also Published As

Publication number Publication date
CN110135929A (en) 2019-08-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant