
CN110198437B - Image processing method and device, storage medium and electronic device - Google Patents


Info

Publication number
CN110198437B
CN110198437B (application CN201810164716.3A / CN201810164716A; publication CN110198437A)
Authority
CN
China
Prior art keywords
color
target
attribute
color matching
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810164716.3A
Other languages
Chinese (zh)
Other versions
CN110198437A (en)
Inventor
谢奕
张辉
程诚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810164716.3A priority Critical patent/CN110198437B/en
Publication of CN110198437A publication Critical patent/CN110198437A/en
Application granted granted Critical
Publication of CN110198437B publication Critical patent/CN110198437B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/001Texturing; Colouring; Generation of texture or colour
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Color Image Communication Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image processing method and apparatus, a storage medium, and an electronic device. The method comprises the following steps: displaying color matching information in response to a target operation instruction generated by a target operation, where the color matching information indicates that a first target image is to be color-matched according to the color corresponding to that information; selecting first target color matching information from the color matching information; and color-matching the first target image according to the first target color corresponding to the first target color matching information, to obtain a color-matched first target image. The invention solves the technical problem in the related art of low efficiency in color matching of images.

Description

Image processing method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of image processing, and in particular, to a method and an apparatus for processing an image, a storage medium, and an electronic apparatus.
Background
At present, tools on the market for producing creative static marketing images mostly optimize a picture by applying filters, which uniformly adjust values such as color, brightness, saturation, and contrast across the whole picture to improve its visual effect. Color matching is usually still done manually; there is no function for intelligently matching colors to the effect of the user's main image.
These tools lack the ability to intelligently process the visual effect of a creative marketing picture, and lack the ability to optimize its visuals locally and in detail. Manual color matching is time-consuming and its effect is uncertain, so color matching of pictures is inefficient.
In view of this problem of inefficient color matching of images, no effective solution has yet been proposed.
Disclosure of Invention
The embodiments of the present invention provide an image processing method and apparatus, a storage medium, and an electronic device, so as to at least solve the technical problem in the related art of low efficiency in color matching of images.
According to one aspect of the embodiments of the present invention, a method of processing an image is provided. The method comprises the following steps: displaying color matching information in response to a target operation instruction generated by a target operation, where the color matching information indicates that a first target image is to be color-matched according to the color corresponding to that information; selecting first target color matching information from the color matching information; and color-matching the first target image according to the first target color corresponding to the first target color matching information, to obtain a color-matched first target image.
According to another aspect of the embodiments of the present invention, an image processing apparatus is also provided. The apparatus includes: a display unit, configured to display color matching information in response to a target operation instruction generated by a target operation, where the color matching information indicates that a first target image is to be color-matched according to the color corresponding to that information; a selection unit, configured to select first target color matching information from the color matching information; and a color matching unit, configured to color-match the first target image according to the first target color corresponding to the first target color matching information, to obtain a color-matched first target image.
According to another aspect of the embodiments of the present invention, a storage medium is also provided. A computer program is stored in the storage medium and is arranged to perform the image processing method of an embodiment of the present invention when executed.
According to another aspect of the embodiments of the present invention, an electronic device is also provided. The electronic device comprises a memory and a processor; the memory stores a computer program, and the processor is configured to execute the image processing method of an embodiment of the present invention through the computer program.
In the embodiments of the present invention, color matching information is displayed in response to a target operation instruction generated by a target operation, where the color matching information indicates that a first target image is to be color-matched according to the color corresponding to that information; first target color matching information is selected from the color matching information; and the first target image is color-matched according to the first target color corresponding to the first target color matching information, to obtain a color-matched first target image. By selecting the first target color matching information from the displayed color matching information, one-click intelligent color matching of the first target image is achieved. This avoids manual color matching, which lacks an intelligent color matching function and leads to time-consuming color matching with an uncertain effect. The efficiency of color matching images is thereby improved, achieving the technical effect of more efficient image color matching and thus solving the technical problem in the related art of low efficiency in color matching of images.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a hardware environment for a method of processing an image according to an embodiment of the invention;
FIG. 2 is a flow chart of a method of processing an image according to an embodiment of the invention;
FIG. 3 is a flow chart of another image processing method according to an embodiment of the invention;
FIG. 4 is a schematic illustration of dominant color recognition according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a K-means algorithm according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a same color extraction algorithm according to an embodiment of the present invention;
FIG. 7 is a schematic illustration of a principal layer element coloring according to an embodiment of the present invention;
FIG. 8 is a schematic illustration of extended layer element coloring according to an embodiment of the invention;
FIG. 9 is a diagram illustrating the effect of matching dominant color scenes in a same color family color operation according to an embodiment of the present invention;
FIG. 10 is a schematic illustration of approximate color sampling according to an embodiment of the invention;
FIG. 11 is a schematic illustration of another primary layer element coloring according to an embodiment of the present invention;
FIG. 12 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention;
FIG. 13 is a diagram illustrating an adaptation effect of dominant color scenes for an alternative homogeneous color operation according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of another approximate color sampling algorithm according to an embodiment of the present invention;
FIG. 15 is a schematic illustration of another primary layer element coloring according to an embodiment of the present invention;
FIG. 16 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention;
FIG. 17 is a diagram illustrating an adaptation effect of dominant color scenes for an alternative homogeneous color operation according to an embodiment of the present invention;
FIG. 18 is a schematic diagram of a tristimulus algorithm according to an embodiment of the present invention;
FIG. 19 is a schematic illustration of another primary layer element coloring according to an embodiment of the present invention;
FIG. 20 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention;
FIG. 21 is a diagram illustrating an adaptation effect of dominant color scenes for an alternative homogeneous color operation according to an embodiment of the present invention;
FIG. 22 is a schematic diagram of another tristimulus algorithm in accordance with embodiments of the invention;
FIG. 23 is a schematic illustration of another primary layer element coloring according to an embodiment of the present invention;
FIG. 24 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention;
FIG. 25 is a diagram illustrating an adaptation effect of dominant color scenes for an alternative homogeneous color operation according to an embodiment of the present invention;
FIG. 26 is a schematic diagram of core logic at the product application level, according to an embodiment of the present invention;
FIG. 27 is a diagram of an effective canvas composition according to an embodiment of the present invention;
FIG. 28 is a schematic illustration of a user's initial marketing creative design effect, in accordance with an embodiment of the present invention;
FIG. 29 is a schematic diagram of a variation of a creative main image, according to an embodiment of the present invention;
FIG. 30 is a schematic diagram of an intelligent color matching function according to an embodiment of the invention;
FIG. 31 is a schematic diagram of an effect of intelligent color matching validation according to an embodiment of the invention;
FIG. 32 is a schematic illustration of a user replacing marketing creative design effect, in accordance with an embodiment of the present invention;
FIG. 33 is a schematic diagram of an apparatus for processing an image according to an embodiment of the present invention; and
fig. 34 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of embodiments of the present invention, there is provided an embodiment of a method of processing an image.
Optionally, in this embodiment, the image processing method described above may be applied to a hardware environment formed by the server 102 and the terminal 104 as shown in fig. 1. Fig. 1 is a schematic diagram of a hardware environment of a method for processing an image according to an embodiment of the present invention. As shown in fig. 1, the server 102 is connected to the terminal 104 via a network, which includes but is not limited to a wide area network, a metropolitan area network, or a local area network; the terminal 104 includes but is not limited to a PC, a mobile phone, a tablet computer, and the like. The image processing method according to the embodiment of the present invention may be executed by the server 102, by the terminal 104, or by both the server 102 and the terminal 104 together. When the terminal 104 executes the method, it may do so through a client installed on it.
Fig. 2 is a flowchart of a method of processing an image according to an embodiment of the present invention. As shown in fig. 2, the method may include the steps of:
step S202, responding to a target operation instruction generated by the target operation, and displaying color matching information.
In the technical solution provided in the above step S202 of the present application, color matching information is displayed in response to a target operation instruction generated by a target operation, where the color matching information is used to instruct to perform color matching on the first target image according to a color corresponding to the color matching information.
In this embodiment, a creative production tool for making image material, such as an intelligent color matching tool, may display the color matching information in response to the target operation instruction generated by the target operation. The target operation may be a single-click or double-click on a target function button in the creative production tool, or an operation triggered by hovering over a preset target area for some time; no limitation is imposed here. The target function button may be an intelligent color matching function button displayed on the operation interface of the creative production tool. The target operation generates the target operation instruction, which triggers display of the color matching information. The color matching information indicates that the first target image is to be color-matched according to the color corresponding to that information; it may consist of several different items of color matching information indicating different intelligent color matching tendencies, displayed as a list on the operation interface of the creative production tool. The first target image may be a pre-uploaded target picture, such as a static advertisement image, displayed in the current effective canvas area of the creative production tool. The first target image may be color-matched according to a particular color matching style, for example the same color system, approximate color matching, or three-primary-color matching.
Step S204, selecting first target color matching information from the color matching information.
In the technical solution provided in the above step S204 of the present application, the color matching information includes first target color matching information. After the color matching information is displayed in response to the target operation instruction generated by the target operation, the first target color matching information is selected from it: the user may perform a selection operation on the color matching information according to his or her color matching requirement for the first target image, generating an operation instruction that selects the first target color matching information from the color matching information. The color corresponding to the first target color matching information can then harmoniously color-match the first target image. Optionally, when the first target color matching information is selected, a target identifier is displayed in a region associated with it to indicate that it has been selected, for example, without limitation, a check mark "√" or a filled circle.
And step S206, performing color matching on the first target image according to the first target color corresponding to the first target color matching information to obtain a color-matched first target image.
In the technical solution provided in the above step S206 of the present application, the color matching information corresponds to a target color used to color-match the first target image. After the first target color matching information is selected from the color matching information, the first target image is color-matched according to the first target color corresponding to it, for example the first target color of the same color system, of the approximate color matching, or of the three-primary-color matching, to obtain the color-matched first target image. This color-matched first target image is the target image produced by the intelligent color matching function. It satisfies the user's color matching requirement for the first target image while avoiding the time cost and uncertain effect of manual color matching, realizes one-click intelligent color matching, gives the first target image a better visual effect, and improves the efficiency of color matching images.
Through the above steps S202 to S206, color matching information is displayed in response to a target operation instruction generated by the target operation, where the color matching information indicates that the first target image is to be color-matched according to the color corresponding to that information; first target color matching information is selected from the color matching information; and the first target image is color-matched according to the first target color corresponding to the first target color matching information, to obtain a color-matched first target image. By selecting the first target color matching information from the displayed color matching information, one-click intelligent color matching of the first target image is achieved. This avoids manual color matching, which lacks an intelligent color matching function and leads to time-consuming color matching with an uncertain effect, thereby improving the efficiency of color matching images and solving the technical problem in the related art of low efficiency in color matching of images.
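Steps S202 to S206 can be sketched as a minimal interactive flow: list the color matching options, accept a selection, and apply it. This is only an illustrative sketch, not the patent's implementation; all function names, the option keys, and the stub return value are hypothetical.

```python
# Hypothetical sketch of steps S202-S206; names and structures are
# illustrative assumptions, not the patented implementation.

COLOR_MATCHING_OPTIONS = {
    "same_color_system": "match colors from the same (warm or cold) color family",
    "approximate_color": "match colors whose values fall within a target threshold",
    "three_primary_colors": "match colors derived from the red/yellow/blue primaries",
}

def on_target_operation() -> list[str]:
    """Step S202: respond to the target operation instruction by
    displaying the list of color matching information items."""
    return list(COLOR_MATCHING_OPTIONS)

def select_option(options: list[str], choice: str) -> str:
    """Step S204: select the first target color matching information."""
    if choice not in options:
        raise ValueError(f"unknown color-matching option: {choice}")
    return choice

def apply_color_matching(image, option: str):
    """Step S206: color-match the image per the selected option (stub)."""
    return {"image": image, "applied": option}

options = on_target_operation()
chosen = select_option(options, "same_color_system")
result = apply_color_matching("first_target_image", chosen)
```

A real tool would render the options as a list on the operation interface and mark the chosen one with a target identifier; the stub here only records which option was applied.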
As an alternative embodiment, the color matching information includes at least one of: the same color matching information is used for indicating that the first target image is matched according to the colors of the same color system; approximate color adaptation information, wherein the approximate color adaptation information is used for indicating color matching of the first target image according to colors of which the color values are within a target threshold; and the three primary color adaptation information is used for indicating that the first target image is subjected to color matching according to the three primary colors.
In this embodiment, the color matching information includes several different items, which may include same color adaptation information, approximate color adaptation information, three primary color adaptation information, and so on. The same color adaptation information indicates that the first target image is to be color-matched with colors of the same color system, for example a warm color system or a cold color system. The approximate color adaptation information indicates color matching with colors whose color values lie within a target threshold; the target threshold measures whether colors are similar, i.e. colors within the target threshold are regarded as similar colors. The three primary color adaptation information indicates color matching according to the three primary colors red, yellow, and blue, from which all colors can be mixed in proportion.
It should be noted that limiting the color matching information to same color adaptation information, approximate color adaptation information, and three primary color adaptation information is only a preferred embodiment of the present invention; any color matching information that can be used for color matching the first target image falls within the scope of the embodiments of the present invention, and further examples are omitted here.
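The "approximate color" criterion above, colors whose values lie within a target threshold, can be illustrated with a simple distance check. The Euclidean RGB metric and the threshold value below are assumptions chosen for illustration; the patent only requires that color values lie within a target threshold, without fixing a metric.

```python
import math

def is_approximate_color(c1, c2, target_threshold=40.0):
    """Treat two RGB colors as 'approximate' when their Euclidean
    distance in RGB space lies within the target threshold.
    (Illustrative metric and threshold; the patent does not fix either.)"""
    return math.dist(c1, c2) <= target_threshold

# A light red and a slightly darker red are approximate colors...
print(is_approximate_color((220, 60, 60), (200, 50, 55)))  # True
# ...while red and blue are not.
print(is_approximate_color((220, 60, 60), (60, 60, 220)))  # False
```

A perceptual color space such as HSV or CIELAB would give a check closer to human similarity judgments, at the cost of a conversion step.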
As an alternative embodiment, after the first target color matching information is selected from the color matching information in step S204, the method further includes: displaying first prompt information, wherein the first prompt information is used for prompting whether color matching is carried out on the first target image according to a first target color corresponding to the first target color matching information; performing color matching on the first target image according to a first target color corresponding to the first target color matching information, and obtaining the color-matched first target image includes: and under the condition that the determination information is obtained, performing color matching on the first target image according to a first target color corresponding to the first target color matching information to obtain a color-matched first target image, wherein the determination information is used for indicating that the first target image is determined to be color-matched according to the first target color corresponding to the first target color matching information.
In this embodiment, prompt information may be displayed in the operation interface of the creative production tool. After the first target color matching information is selected from the color matching information, first prompt information is displayed asking whether to color-match the first target image according to the first target color corresponding to the first target color matching information. This gives the user sufficient time to consider and prevents the first target color matching information from being selected through misoperation. Optionally, the first prompt information of this embodiment may be a text prompt, an icon prompt, a voice prompt, and the like, which is not limited here.
After the first prompt information is displayed, and in the case that determination information is obtained, the first target image is color-matched according to the first target color corresponding to the first target color matching information. The determination information instructs that the first target image is confirmed to be color-matched in this way. For example, when the user operates a "determine" function button displayed on the operation interface, the determination information is obtained, meaning the first target color matching information was not selected by mistake; the first target image is then color-matched according to the first target color corresponding to the first target color matching information, obtaining the color-matched first target image.
As an optional implementation manner, after displaying the first prompt message, the method further includes: and under the condition that the cancellation information is obtained, color matching of the first target image according to the first target color corresponding to the first target color matching information is cancelled, wherein the cancellation information is used for indicating cancellation of color matching of the first target image according to the first target color corresponding to the first target color matching information.
In this embodiment, after the first prompt information is displayed, color matching of the first target image according to the first target color corresponding to the first target color matching information may be cancelled. After displaying the first prompt information, cancellation information may be obtained that instructs this cancellation. For example, if the user operates a "cancel" function button displayed on the operation interface, that is, the first target color matching information is not the color matching information the user intended to select for the first target image, the cancellation information is obtained, and color matching of the first target image according to the first target color corresponding to the first target color matching information is cancelled.
As an alternative implementation manner, in step S206, after the first target image is color-matched according to the first target color corresponding to the first target color matching information, the method further includes: acquiring a second target image; responding to a target operation instruction generated by the target operation, and displaying color matching information; selecting second target color matching information from the color matching information; and carrying out color matching on the second target image according to a second target color corresponding to the second target color matching information to obtain a color-matched second target image.
In this embodiment, after the color-matched first target image is obtained, color matching may continue on a second target image, which may differ from the first target image. In response to the target operation instruction generated by the target operation, a plurality of different items of color matching information may be displayed, for example same color adaptation information, approximate color adaptation information, and three primary color adaptation information. After the color matching information is displayed, second target color matching information is selected from it according to the user's color matching requirement for the second target image; it may be the same as or different from the first target color matching information. The second target image is then color-matched according to the second target color corresponding to the second target color matching information, obtaining a color-matched second target image.
As an alternative implementation manner, in step S204, after the first target color matching information is selected from the color matching information and before the first target image is color-matched according to the first target color corresponding to the first target color matching information to obtain a color-matched first target image, the method further includes: and displaying the target text or the target picture, wherein the target text and the target picture are both used for indicating the first target image after color matching.
In this embodiment, the color matching effect of the color-matched first target image may be previewed. After the first target color matching information is selected from the color matching information, and before the color-matched first target image is obtained, a target text or a target picture may be displayed on the operation interface of the creative production tool. The target text may comprise experience-type descriptive words describing the color-matched first target image, and the target picture may provide a color-effect preview of the color-matched first target image. This guides the user to understand the effect of the color-matched first target image more intuitively, optimizes the user's operation experience, and improves the efficiency of color matching the image.
As an alternative implementation manner, in step S206, color matching the first target image according to the first target color corresponding to the first target color matching information, and obtaining the color-matched first target image includes: the method comprises the steps of obtaining a color value of a main image layer of a first target image, an image layer element of the first target image and a color attribute of the image layer element, wherein the image layer element comprises a plurality of image layers, the plurality of image layers comprise the main image layer, first target color matching information is associated with a color value of the main image layer, the image layer element of the first target image and the color attribute of the image layer element, and the color attribute comprises a hue attribute, a saturation attribute and a brightness attribute of the first target image; determining a first target color corresponding to the color value, the layer element and the color attribute of the main picture layer; and carrying out color matching on the first target image according to the first target color to obtain the color-matched first target image.
In this embodiment, the first target image includes a layer element, the layer element includes a plurality of picture layers, and the picture layers include a main picture layer. Acquiring the color value of the main picture layer may comprise acquiring the dominant color of the main picture layer within the effective canvas area of the tool's operation interface and obtaining its HSB color value, where H (hue) represents the hue, S (saturation) represents the saturation, and B (brightness) represents the brightness. This embodiment may also acquire the layer element of the first target image by disassembling the image into layers: within the operation interface of the creative production tool, the layer elements of the creative work in the effective canvas are identified. The layer element includes a plurality of picture layers, for example a background layer, a main layer, a decorative layer, a text layer and an interaction layer, as well as the design elements of different types contained in those layers that are used when color matching the first target image. The embodiment may further acquire the color attribute of the layer element, which may be an attribute color-value parameter item of a design element of the layer element; it may define the layer attributes and intelligent assignment parameters of the operation area of the creative production tool, including the hue attribute, saturation attribute and brightness attribute of the first target image, so that a parameter calculation rule may be applied according to the HSB color of the layer element and the colors in the marketing creative picture.
After the color value of the main picture layer of the first target image, the layer element of the first target image and the color attribute of the layer element are acquired, a first target color corresponding to them is determined. The first target color can express a style tendency for the intelligent color matching of the first target image; the user may select such a style tendency randomly or freely, and different style tendencies are associated with the corresponding HSB color-taking rules and algorithms for the layer design elements.
After the first target color is determined, the first target image is color-matched according to it: the color value of each color attribute calculated through the HSB parameters is assigned to the corresponding attribute of each picture layer of the layer element, and the picture effect is generated intelligently. This completes the intelligent color matching of the first target image and yields the color-matched first target image, avoiding the time consumption and uncertain results of manual color matching; the layer elements of the first target image are color-matched intelligently with one key, so that the first target image achieves a better visual effect.
As an optional implementation manner, the layer element further includes at least one of the following: background picture layer, decoration picture layer, text picture layer, interactive picture layer.
In this embodiment, the first target image is disassembled into layers to obtain the layer element, which is the layer element of the creative work in the effective canvas of the operation interface of the creative production tool. Besides the main picture layer, the layer element may also include a background picture layer, a decoration picture layer, a text picture layer, an interactive picture layer and so on, as well as the design elements contained in layers of different types. Attribute color-value parameter items may then be defined for the different design elements, so that the first target image is color-matched intelligently and the efficiency of color matching the image is improved.
As an optional implementation, acquiring the color value of the main picture layer of the first target image includes: acquiring an image of a first target image in a first color mode, wherein the color of the first target image is represented by color values of a red channel, a green channel and a blue channel in the first color mode respectively; converting the image in the first color mode into an image in a second color mode, wherein the second color mode represents colors through a hue attribute, a saturation attribute and a brightness attribute respectively; respectively obtaining a weighted value of the hue attribute, a weighted value of the saturation attribute and a weighted value of the brightness attribute, wherein the weighted value of the hue attribute is used for indicating the proportion occupied by the hue attribute on the visual color of the first target image, the weighted value of the saturation attribute is used for indicating the proportion occupied by the saturation attribute on the visual color, and the weighted value of the brightness attribute is used for indicating the proportion occupied by the brightness attribute on the visual color; clustering the images of the second color mode according to a plurality of randomly selected clustering centers respectively through the weighted value of the hue attribute, the weighted value of the saturation attribute and the weighted value of the brightness attribute to obtain a plurality of clustering results, wherein the clustering centers correspond to the clustering results one to one; determining a clustering result with the lowest dissimilarity degree in the plurality of clustering results as a target clustering result; and determining the average color value of the color category with the largest number of pixel points in the target clustering result as the color value of the main picture layer.
In this embodiment, when the color value of the main picture layer of the first target image is acquired, an image of the first target image in a first color mode may first be obtained, where the first color mode represents colors through the color values of a red channel, a green channel and a blue channel; the first color mode may be the RGB color mode. Generally, RGB color values differ greatly from what human vision perceives: two colors represented by similar RGB values may still look very different to the human eye. This embodiment therefore converts the image in the first color mode into an image in a second color mode, where the second color mode represents colors through a hue attribute, a saturation attribute and a brightness attribute. The hue attribute describes the basic attribute of the color of the first target image and takes a value of 0-360; the saturation attribute controls the color purity of the first target image (the higher the saturation, the purer the color; the lower the saturation, the grayer the color) and takes a value of 0-100%; the brightness attribute controls the color brightness of the first target image (the higher the brightness, the brighter the color; the lower the brightness, the darker the color) and takes a value of 0-100%. The second color mode may be the HSB color mode. An image in the HSB color mode better conforms to human visual characteristics; that is, the HSB color system is closer than the RGB color system to the way people describe colors, so this embodiment may use HSB colors as the clustering color vectors to obtain a more ideal visual color effect.
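As a concrete illustration of the conversion from the first to the second color mode, the following sketch uses Python's standard `colorsys` module. `colorsys` calls the model HSV, which corresponds to the HSB model described here; the value ranges (hue 0-360, saturation and brightness 0-100%) follow the text, and the helper name is chosen for illustration.

```python
import colorsys

def rgb_to_hsb(r, g, b):
    """Convert 0-255 RGB channel values to HSB as described in the text."""
    # colorsys works on 0-1 floats; its HSV triple is the HSB triple here
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # scale to the ranges given in the text: hue 0-360, saturation/brightness 0-100
    return h * 360.0, s * 100.0, v * 100.0

print(rgb_to_hsb(255, 0, 0))  # pure red: (0.0, 100.0, 100.0)
```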
In this embodiment, the hue attribute, the saturation attribute, and the brightness attribute of the second color mode have different roles in visual perception, and thus, a weighting method is used to calculate the color space distance, and the weighting value of the hue attribute, the weighting value of the saturation attribute, and the weighting value of the brightness attribute can be obtained respectively. The visual color of the embodiment may be a color visually perceived by human vision on the first target image, the weight value of the hue attribute is used to indicate a proportion occupied by the hue attribute on the visual color of the first target image, the weight value of the saturation attribute is used to indicate a proportion occupied by the saturation attribute on the visual color, and the weight value of the brightness attribute is used to indicate a proportion occupied by the brightness attribute on the visual color.
This embodiment may acquire the color value of the main picture layer using the K-means algorithm, a clustering algorithm whose main idea is to divide a data set into different categories through an iterative process so that a criterion function evaluating the clustering performance is optimal, each generated cluster being internally compact and independent of the others. K seed points are randomly selected as the initial centroids of K clusters; the dissimilarity from each remaining element to each of the K centroids is calculated, and each element is assigned to the cluster with the lowest dissimilarity. According to the clustering result, the center of each of the K clusters is recalculated as the mean of all its elements in each dimension, and the elements are re-clustered according to the new centers. These two steps, recalculating the centers and re-clustering, are repeated until the clustering result no longer changes.
In this embodiment, the selection of the cluster centers has a large influence on the K-means clustering result, so multiple clustering runs are used to prevent this influence; the cluster centers, which are the seed points and initial points of the clustering algorithm, are selected randomly. After the weight value of the hue attribute, the weight value of the saturation attribute and the weight value of the brightness attribute of the first target image are obtained, the image in the second color mode is clustered according to a plurality of randomly selected sets of cluster centers to obtain a plurality of clustering results, for example by clustering the image in the second color mode with fifteen randomly selected cluster centers. In the original algorithm, the dissimilarity between sample points is generally calculated with the Euclidean distance formula, which assumes by default that each component of the color vector contributes equally to the color difference, that is, that a change of one unit in hue and a change of one unit in saturation affect the visual color identically. The hue component, however, often has a greater effect on the visual color than the saturation, and thus this embodiment employs a formula for calculating a weighted dissimilarity, that is,
d_ij = √( w_h·(H_i − H_j)² + w_s·(S_i − S_j)² + w_b·(B_i − B_j)² )

where d_ij denotes the weighted dissimilarity between sample point i and cluster center j, H, S and B denote the hue, saturation and brightness components of the two points, and w_h, w_s and w_b are the weight values of the hue, saturation and brightness attributes.
After the image in the second color mode is clustered according to the plurality of randomly selected sets of cluster centers to obtain a plurality of clustering results, the clustering result with the lowest overall weighted dissimilarity among them is determined as the target clustering result, so as to eliminate the influence of the choice of cluster centers. The average color value of the color category with the largest number of pixels in the target clustering result is then determined as the color value of the main picture layer. Optionally, in order to analyze the color of the first target image reasonably, this embodiment may set the number of cluster categories of the target clustering result to 10, that is, K equals 10 and 10 color categories are finally obtained. Pixel statistics and average-value calculation are performed on the target clustering result: the number of pixels and the average color value of each color category are counted, the categories are sorted in descending order of pixel count, and the average color value of the first category, the one with the most pixels, is the sought color value, namely the dominant color of the main picture layer.
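The dominant-color extraction described above (weighted dissimilarity, multiple randomly seeded runs, keeping the run with the lowest overall dissimilarity, then taking the mean of the largest cluster) can be sketched as follows. NumPy is assumed, and the weight values are illustrative placeholders, since the text leaves their actual values to be learned from samples:

```python
import numpy as np

def dominant_color(pixels_hsb, k=10, restarts=10, weights=(2.0, 1.0, 1.0), seed=0):
    """Weighted K-means over HSB pixel vectors; returns the mean HSB value of
    the cluster holding the most pixels (the dominant color)."""
    rng = np.random.default_rng(seed)
    w = np.asarray(weights, dtype=float)
    best = (np.inf, None, None)  # (overall weighted dissimilarity, labels, centers)
    for _ in range(restarts):  # multiple runs with randomly selected seed points
        centers = pixels_hsb[rng.choice(len(pixels_hsb), size=k, replace=False)]
        for _ in range(30):  # standard Lloyd iterations
            d = (((pixels_hsb[:, None, :] - centers[None, :, :]) ** 2) * w).sum(-1)
            labels = d.argmin(axis=1)
            new_centers = np.array([pixels_hsb[labels == j].mean(axis=0)
                                    if np.any(labels == j) else centers[j]
                                    for j in range(k)])
            if np.allclose(new_centers, centers):
                break
            centers = new_centers
        cost = d.min(axis=1).sum()  # keep the run with the lowest total dissimilarity
        if cost < best[0]:
            best = (cost, labels, centers)
    counts = np.bincount(best[1], minlength=k)
    return best[2][counts.argmax()]  # mean color of the most populous category

# demo: most pixels cluster near H=120, S=80, B=80, so that is the dominant color
demo_rng = np.random.default_rng(1)
pixels = np.vstack([demo_rng.normal([120, 80, 80], 2.0, (200, 3)),
                    demo_rng.normal([30, 40, 40], 2.0, (20, 3))])
print(dominant_color(pixels, k=2, restarts=5))
```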
As an optional implementation manner, the obtaining the weight value of the hue attribute, the weight value of the saturation attribute, and the weight value of the brightness attribute respectively includes: determining a weighted value of the hue attribute, a weighted value of the saturation attribute and a weighted value of the brightness attribute through a target model, the hue attribute, the saturation attribute and the brightness attribute, wherein the target model is obtained by training a predetermined model through a target sample, and the target sample comprises the hue attributes of a plurality of images, the saturation attributes of the plurality of images, the brightness attributes of the plurality of images, the weighted values of the hue attributes of the plurality of images, the weighted values of the saturation attributes of the plurality of images and the weighted values of the brightness attributes of the plurality of images which are obtained in advance.
In this embodiment, when the weight value of the hue attribute, the weight value of the saturation attribute and the weight value of the brightness attribute are acquired, they may be finally determined through observation, training and debugging over a large number of samples. The weight values are determined through a target model together with the hue attribute, the saturation attribute and the brightness attribute, where the target model is obtained by training a predetermined model with a target sample. The target sample comprises the hue attributes, saturation attributes and brightness attributes of a plurality of images, together with the pre-obtained weight values of the hue, saturation and brightness attributes of those images. The predetermined model may be an initially established detection model, for example an initial neural network model described on the basis of a mathematical model of neurons, which is established by collecting the hue attributes, saturation attributes and brightness attributes of the plurality of images and the weight values of those attributes.
Optionally, when the target model is obtained by training the predetermined model, the embodiment analyzes hue attributes of the plurality of images, saturation attributes of the plurality of images, brightness attributes of the plurality of images, weight values of the hue attributes of the plurality of images, weight values of the saturation attributes of the plurality of images, and weight values of the brightness attributes of the plurality of images through machine learning. The method comprises the steps of preprocessing hue attributes of a plurality of images, saturation attributes of the plurality of images, brightness attributes of the plurality of images, weight values of the hue attributes of the plurality of images, weight values of the saturation attributes of the plurality of images and weight values of the brightness attributes of the plurality of images according to algorithms such as a distribution consistency algorithm, denoising and sampling, then performing feature extraction, feature transformation, feature normalization, feature combination and the like on preprocessed data to obtain features for training a preset model, and further processing the features through an optimization algorithm, an assumed function, a loss function, a decision boundary, a convergence speed, an iteration strategy and the like to obtain a target model. 
Finally, the target model may be evaluated through cross validation, target evaluation, and checks for overfitting and underfitting, so that the trained target model is determined. The weight value of the hue attribute, the weight value of the saturation attribute and the weight value of the brightness attribute are then determined through the target model and the hue, saturation and brightness attributes. Using these weight values, the image in the second color mode is clustered according to a plurality of randomly selected sets of cluster centers to obtain a plurality of clustering results, the clustering result with the lowest dissimilarity among them is determined as the target clustering result, and the average color value of the color category with the largest number of pixels in the target clustering result is determined as the color value of the main picture layer.
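The text determines the HSB weights with a trained neural-network model. As a much simpler illustration of the same idea, fitting weights so that weighted HSB differences match perceptual distance ratings, the following sketch uses ordinary least squares on fabricated data; both the color pairs and the ratings are made up here (the ratings are generated from assumed "true" weights so the fit is exact), and this is not the patent's actual training procedure:

```python
import numpy as np

# Fabricated training data: squared per-channel HSB differences for four color
# pairs, and squared perceptual distance "ratings" generated from the assumed
# true weights (2.0, 0.5, 1.0).
diffs_sq = np.array([[900.0,  25.0,  16.0],   # (dH^2, dS^2, dB^2) per pair
                     [100.0, 400.0,   9.0],
                     [2500.0,  4.0,  64.0],
                     [400.0, 100.0, 225.0]])
true_w = np.array([2.0, 0.5, 1.0])
rated_sq = diffs_sq @ true_w  # stands in for human perceptual ratings

# Solve rated^2 ≈ w_h*dH^2 + w_s*dS^2 + w_b*dB^2 for the three weights.
weights, *_ = np.linalg.lstsq(diffs_sq, rated_sq, rcond=None)
print(weights)  # hue carries the largest weight, matching the observation in the text
```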
As an optional implementation manner, obtaining the color attribute of the layer element includes: acquiring the color attribute of the layer element by at least one of the following steps: the color attributes of the shapes in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the shapes; the color attributes of the text in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the text; and the dominant colors in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes.
In this embodiment, when the color attribute of the layer element is acquired, a color attribute Shape(s) of a shape in the layer element may be acquired to describe the color value of that shape. The color attribute of a shape in the layer element includes a hue attribute h(s), a saturation attribute s(s), a brightness attribute b(s) and a transparency attribute a(s); a plurality of shapes are described in sequence by h(s1), s(s1), b(s1), a(s1), h(s2), s(s2), b(s2), a(s2), and so on.
The color attribute of the layer element obtained in this embodiment may also be a color attribute text (t) of a text in the layer element, and may be used to describe a color value of a font in the layer element. The color attribute of the text in the layer element comprises a hue attribute h (t), a saturation attribute s (t), a brightness attribute b (t) and a transparency attribute a (t) of the text.
The color attribute of the obtained layer element in this embodiment may also be a Primary color (Primary) in the layer element, and the Primary color value may be obtained by an image recognition module in the effective canvas area, where the Primary color includes a hue attribute h (p), a saturation attribute s (p), a brightness attribute b (p), and a transparency attribute a (p), and may be expressed as: h (p), s (p), b (p).
Optionally, this embodiment may apply the following color-taking rules based on the dominant color h(p). For same-color extraction, the main layer elements, e.g. shape(s1) through shape(s3) and text(t), remain unchanged at the dominant color h(p) as the main rule, while the expansion layer elements take h(p)+30 degrees and h(p)+330 degrees. For the approximate-color sampling algorithm, sampling +30 degrees forward from the dominant color h(p): the main layer elements, e.g. shape(s1) through shape(s3) and text(t), take h(p)+30 degrees as the main rule, and the expansion layer elements take h(p)+20 degrees and h(p)+10 degrees; sampling −30 degrees backward from h(p): the main layer elements take h(p)−30 degrees as the main rule, and the expansion layer elements take h(p)−20 degrees and h(p)−10 degrees. For the ternary-color sampling algorithm, sampling +120 degrees forward from the dominant color h(p) with approximate diffusion of ±30 degrees around that value: the main layer elements take h(p)+120 degrees as the main rule, and the expansion layer elements take h(p)+90 degrees and h(p)+150 degrees; sampling +240 degrees forward with approximate diffusion of ±30 degrees: the main layer elements take h(p)+240 degrees as the main rule, and the expansion layer elements take h(p)+210 degrees and h(p)+270 degrees, and so on.
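The hue rotations in these rules reduce to modular arithmetic on the 360-degree hue circle. A minimal sketch, with the offsets taken from the rules above (the function and key names are chosen for illustration):

```python
def scheme_hues(dominant_h):
    """Candidate hues (degrees) derived from the dominant hue h(p) by rotation."""
    def mod(h):
        return h % 360  # keep every hue on the 0-359 circle
    return {
        "same":        [mod(dominant_h)],                           # main layers unchanged
        "same_expand": [mod(dominant_h + 30), mod(dominant_h + 330)],
        "approximate": [mod(dominant_h + 30), mod(dominant_h - 30)],
        "ternary":     [mod(dominant_h + 120), mod(dominant_h + 240)],
    }

print(scheme_hues(200))
# {'same': [200], 'same_expand': [230, 170], 'approximate': [230, 170], 'ternary': [320, 80]}
```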
As an optional implementation manner, obtaining the color attribute of the layer element includes: and acquiring the color attribute of each picture layer according to the priorities of the plurality of picture layers included in the picture layer elements.
In this embodiment, for the intelligent processing of the color of the first target image, a layer-element priority identifier may further be added to indicate the priorities of the plurality of picture layers, and the color attribute of each picture layer is acquired according to the priorities of the picture layers included in the layer element, so that the color attributes of the layer element are acquired with differentiated processing.
According to the embodiment, the images of the marketing creative production tool can be subjected to layered processing, and the layer attributes of different types are automatically assigned through color value rules of color HSB according to the dominant color identification of the images, so that high-quality marketing creative pictures are intelligently generated.
The key techniques involved in this embodiment are identifying the dominant color of the current image, disassembling the image into layers and defining their attributes, applying the HSB color-taking method and the corresponding parameter calculation rules, and assigning the computed values to the layer attributes to generate the picture intelligently.
In this embodiment, an image dominant-color identification module merges the image content of the effective region of the user canvas of the tool product, identifies the color of the main picture layer, and obtains its HSB color value. A layer disassembling and attribute defining module defines the layer attributes and intelligent assignment parameters of the tool operation area. The HSB color-taking parameter calculation method applies scene-specific parameter calculation rules according to the HSB color and the colors in the marketing creative picture, and a layer attribute assignment module assigns the attribute color values calculated from the HSB parameters to the corresponding layer attributes to generate the picture effect intelligently. By combining the image dominant-color recognition technique with the color HSB color-value rules, a high-quality creative picture is generated for the user automatically and intelligently, which solves the problem that non-professional designers cannot select harmonious color schemes when matching colors manually, leading to poor visual effects in marketing creative pictures.
The technical solution of the present invention will be described below with reference to preferred embodiments.
According to the embodiment, through the application of image identification and HSB color-taking rules in intelligent assignment of layer parameters, a layer element color matching scheme matched with the visual environment of the current marketing creative picture of the user is intelligently generated, and therefore creative production efficiency and effect are greatly improved.
The technical implementation flow of the embodiment of the present invention is described below.
FIG. 3 is a flow diagram of another image processing method according to an embodiment of the invention. As shown in fig. 3, the method comprises the following steps:
and step S31, creating creative pictures.
And step S32, intelligently matching colors of the creative pictures.
Wherein, the intelligent color matching comprises step S321 and step S322. Step S321 includes steps S3211 to S3217:
in step S3211, an original RGB image is obtained by the dominant color recognition module.
In step S3212, the original RGB image is converted into an HSB image.
In step S3213, an initial point of the clustering operation is randomly selected.
In step S3214, a weighted dissimilarity degree is calculated.
This embodiment may repeat steps S3213 and S3214 10 times.
Step S3215, selecting an optimal clustering result.
Step S3216, counting the pixels and calculating an average value.
Step S3217, sorting and outputting the main color values.
Step S322 includes steps S3221 to S3223:
in step S3221, the intelligent color matching application module describes various elements on the canvas through JSON metadata elements.
In step S3222, the HSB color value is calculated.
In step S3223, element attribute value replacement and effect application.
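Steps S3221 through S3223 can be sketched as follows. The JSON schema here is invented for illustration (the text says only that canvas elements are described via JSON metadata): each layer's hue is recomputed from the body layer's dominant hue and written back into the element attributes.

```python
import json

# Hypothetical JSON metadata describing the canvas elements (schema invented here)
canvas = json.loads("""
{"layers": [
  {"type": "body",       "hsb": [200, 60, 70]},
  {"type": "background", "hsb": [0, 0, 100]},
  {"type": "text",       "hsb": [0, 0, 0]}
]}
""")

def apply_color_scheme(canvas, hue_shift_by_type):
    """Replace each layer's hue with a rotation of the body layer's dominant hue
    (the HSB calculation and attribute-value replacement of S3222/S3223)."""
    body_hue = next(l["hsb"][0] for l in canvas["layers"] if l["type"] == "body")
    for layer in canvas["layers"]:
        shift = hue_shift_by_type.get(layer["type"], 0)  # unknown types keep the body hue
        layer["hsb"][0] = (body_hue + shift) % 360
    return canvas

# e.g. an approximate-color scheme: background rotated +30 degrees, text -30 degrees
styled = apply_color_scheme(canvas, {"background": 30, "text": -30})
print([layer["hsb"][0] for layer in styled["layers"]])  # [200, 230, 170]
```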
The following describes dominant color recognition according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of dominant color recognition according to an embodiment of the present invention. As shown in fig. 4, dominant color recognition of the image may use an improved K-means clustering algorithm. Conventional RGB color values differ greatly from human visual perception; that is, two colors represented by similar RGB values may still look very different. Therefore, the clustering process of this embodiment first converts the RGB image into an HSB image that better conforms to human visual characteristics. Because the three basic color attributes of the HSB image affect visual perception differently, this embodiment calculates the color-space distance in a weighted manner, allocating different weights to the three basic HSB color attributes according to the size of their influence on the visual color, with the weights finally determined through training on a large number of samples.
Optionally, to reduce the influence of seed-point selection on the K-means clustering result, this embodiment clusters multiple times with randomly selected seed points. The number of pixel points and the average color value of each of the ten resulting color categories are then counted, and the category with the largest number of pixel points is selected; its average color is the dominant color required by this scheme.
In this embodiment, the K-means algorithm is a clustering algorithm that divides the data set into different categories through an iterative process, so that a criterion function evaluating the clustering performance is optimized and each generated cluster is compact and independent.
FIG. 5 is a schematic diagram of a K-means algorithm according to an embodiment of the present invention. As shown in fig. 5, K points may be randomly selected as the initial centroids of the K clusters, for example A, B, C, D, E. The dissimilarity from each remaining element to each of the K cluster centers is calculated (for example, the dissimilarities to A, B, C, D, E indicated by arrows in the figure), and each element is assigned to the cluster with the lowest dissimilarity. According to the clustering result, the center of each of the K clusters is recalculated as the mean of all elements in the cluster along each dimension, and re-clustering is performed with the new centers. This assignment and re-centering is repeated until the clustering result no longer changes.
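The assignment/re-centering loop described above can be sketched as follows. This is a generic illustration using plain squared Euclidean distance; the weighted HSB distance the embodiment actually uses is described in the following paragraphs.

```python
import random

def kmeans(points, k, iterations=20, seed=None):
    """Minimal K-means sketch: assign each point to its nearest centroid,
    recompute each centroid as the per-dimension mean of its cluster,
    and repeat until the centroids stop changing."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # random initial centroids
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        new_centroids = [
            tuple(sum(dim) / len(members) for dim in zip(*members))
            if members else centroids[i]  # keep old centroid for empty cluster
            for i, members in enumerate(clusters)
        ]
        if new_centroids == centroids:
            break
        centroids = new_centroids
    return centroids, clusters
```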
In this embodiment, HSB colors, which are closer to people's visual description of colors, can be used as the cluster color vector to obtain a more desirable visual color effect. In the original algorithm, the dissimilarity between sample points is generally calculated with the Euclidean distance formula, that is,
dij = √[(hi − hj)² + (si − sj)² + (bi − bj)²]
This formula assumes that each component of the color vector contributes equally to the color difference, i.e., that a change of one unit in hue and a change of one unit in saturation have the same effect on the visual color. In practice, however, the hue component usually influences the visual color more than the saturation component does. This embodiment therefore employs a weighted dissimilarity calculation formula, that is,
dij = √[wkh·(hi − hj)² + wks·(si − sj)² + wkb·(bi − bj)²]
where dij denotes the degree of dissimilarity, and wk = (wkh, wks, wkb) denotes the weight values of the hue, saturation and lightness components respectively; the specific values can be obtained by observing, training and debugging on a large number of samples. Here k denotes the k-th clustering center and n denotes the number of clustering centers.
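A direct transcription of the weighted formula follows. The concrete weight values below are placeholders: the patent obtains them by sample training and does not publish them.

```python
import math

# Placeholder weights for (hue, saturation, brightness); the real values
# are obtained by training on a large number of samples.
W_H, W_S, W_B = 0.6, 0.25, 0.15

def weighted_dissimilarity(p, q, w=(W_H, W_S, W_B)):
    """dij = sqrt(w_h*(h_i-h_j)^2 + w_s*(s_i-s_j)^2 + w_b*(b_i-b_j)^2),
    where p and q are (h, s, b) color vectors."""
    return math.sqrt(sum(wk * (a - b) ** 2 for wk, a, b in zip(w, p, q)))
```

With w = (1, 1, 1) this reduces to the plain Euclidean distance of the previous formula.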
In this embodiment, since the initial point selection of the K-means algorithm has a large influence on the clustering result, the clustering operation may be performed through fifteen randomly selected initial points, and the result with the lowest overall weighted dissimilarity among the obtained clustering results is taken as the final result, so as to eliminate the influence of the initial value.
In order to analyze the image colors reasonably, this embodiment may set the number of clustering categories to 10, that is, K = 10. Pixel statistics and average-value calculation are performed on the final clustering result, and the categories are then sorted in descending order by the number of pixels they contain; the average value of the first category is the dominant color value sought, and so on.
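The statistics of steps S3216 and S3217 — count the pixels in each category, average each category's color, and sort in reverse order of pixel count — can be sketched as:

```python
def dominant_colors(labels, pixels, top=10):
    """Group pixels by cluster label, average each group's color components,
    and return (count, mean_color) pairs sorted by count, descending.
    The first entry's mean_color is the dominant color."""
    groups = {}
    for lab, px in zip(labels, pixels):
        groups.setdefault(lab, []).append(px)
    result = []
    for pxs in groups.values():
        mean = tuple(sum(dim) / len(pxs) for dim in zip(*pxs))
        result.append((len(pxs), mean))
    result.sort(key=lambda t: t[0], reverse=True)
    return result[:top]
```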
The following describes a color extraction algorithm according to an embodiment of the present invention.
Compared with the RGB color system, the HSB color system is closer to how people describe colors. Hue describes the basic attribute of a color and takes a value of 0-360; Saturation controls color purity (the higher, the purer; the lower, the lighter) and takes a value of 0-100%; Brightness controls color brightness (the higher, the brighter; the lower, the darker) and takes a value of 0-100%. The color extraction algorithm in this embodiment applies color extraction rules in the HSB color system.
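The color matching rules below rotate the hue by fixed offsets (e.g. +30, +120, +330 degrees) on the 0-360 color wheel. A small helper illustrates the wraparound arithmetic; the exact boundary behavior is an assumption, since the patent does not spell it out:

```python
def rotate_hue(h, degrees):
    """Rotate a hue value on the 0-360 color wheel, wrapping around
    so the result always stays in [0, 360)."""
    return (h + degrees) % 360
```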
The correspondence between colors and layer elements according to the embodiment of the present invention is described below.
Primary (p): the dominant color value, obtained through the image recognition module over the effective canvas region, can be expressed as h(p), s(p), b(p); alpha (a) is a transparency parameter item. Sharp (s) describes the color values of shapes in layer elements: a color value can be expressed as h(s), s(s), b(s), with transparency expressed as a(s); multiple shapes are described in order as (s1), (s2), (s3) …. Text (t) describes the color values of the font in layer elements: a color value can be expressed as h(t), s(t), b(t), with transparency expressed as a(t).
Fig. 6 is a schematic diagram of a same color family extraction algorithm according to an embodiment of the present invention. As shown in fig. 6, in the text (text), h(t)=0, s(t)=100, b(t)=40, a(t)=100%; in shape 1 (sharp1), h(s1)=0, s(s1)=10, b(s1)=100, a(s1)=95%; in shape 2 (sharp2), h(s2)=0, s(s2)=50, b(s2)=100, a(s2)=60%; in shape 3 (sharp3), h(s3)=0, s(s3)=30, b(s3)=100, a(s3)=95%; in shape 4 (sharp4), h(s4)=0, s(s4)=100, b(s4)=100, a(s4)=60%; in the primary color (primary), h(p)=0, s(p)=100, b(p)=100, a(p)=100%. The same color family may vary between +30 degrees and -30 degrees of the dominant color.
FIG. 7 is a schematic illustration of primary layer element coloring according to an embodiment of the invention. As shown in fig. 7, the primary layer elements, i.e., sharp1(s1) to sharp3(s3) and text(t), take their colors with the dominant color hue h(p) kept unchanged as the primary rule. The dominant color is h(p), s(p), b(p), a(p); shape 1 is h(s1)=h(p), s(s1)=s(p)×0.1, b(s1)=100, a(s1)=95%; shape 2 is h(s2)=h(p), s(s2)=s(p)×0.5, b(s2)=100, a(s2)=60%; the text is h(t)=h(p), s(t)=s(p), b(t)=b(p)×0.4, a(t)=100%; shape 3 is h(s3)=h(p), s(s3)=s(p)×0.3, b(s3)=100, a(s3)=95%; shape 4 is h(s4)=h(p), s(s4)=s(p), b(s4)=100, a(s4)=60%.
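The Fig. 7 rule table can be transcribed directly into code. The dictionary keys and the (h, s, b, alpha) 4-tuple layout are illustrative choices, not the patent's actual data format:

```python
def color_primary_layers(dominant):
    """Apply the Fig. 7 same-color-family rule: every element keeps the
    dominant hue h(p); saturation/brightness are scaled per element.
    `dominant` is (h, s, b, alpha); alpha is given as a fraction."""
    h, s, b, a = dominant
    return {
        "shape1": (h, s * 0.1, 100, 0.95),
        "shape2": (h, s * 0.5, 100, 0.60),
        "shape3": (h, s * 0.3, 100, 0.95),
        "shape4": (h, s,       100, 0.60),
        "text":   (h, s,       b * 0.4, 1.00),
    }
```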
FIG. 8 is a schematic diagram of extended layer element coloring according to an embodiment of the invention. As shown in fig. 8, when the extended layer elements are colored, the dominant color hues h(p)+30 and h(p)+330 degrees are used as the main rule. Shape 5 is h(s5)=h(p)+30, s(s5)=s(p)×0.2, b(s5)=100, a(s5)=95%; shape 6 is h(s6)=h(p)+30, s(s6)=s(p)×0.5, b(s6)=100, a(s6)=60%; shape 7 is h(s7)=h(p)+30, s(s7)=s(p)×0.4, b(s7)=100, a(s7)=95%; shape 8 is h(s8)=h(p)+330, s(s8)=s(p)×0.2, b(s8)=100, a(s8)=95%; shape 9 is h(s9)=h(p)+330, s(s9)=s(p)×0.5, b(s9)=100, a(s9)=60%; shape 10 is h(s10)=h(p)+330, s(s10)=s(p)×0.4, b(s10)=100, a(s10)=95%.
FIG. 9 is a schematic diagram of a same color family color operation of the effect of adapting each dominant color scene according to an embodiment of the present invention. As shown in fig. 9, the images shown in the figure are in the same color family, and the same color family can indicate the adaptation effect of the color operation of each dominant color scene.
FIG. 10 is a schematic diagram of approximate color extraction according to an embodiment of the invention. As shown in fig. 10, in the approximate color matching algorithm, hues are shifted +30 degrees in the positive direction from the dominant color h(p). In the text (text), h(t)=60, s(t)=100, b(t)=40, a(t)=100%; in shape 1 (sharp1), h(s1)=90, s(s1)=20, b(s1)=100, a(s1)=95%; in shape 2 (sharp2), h(s2)=90, s(s2)=50, b(s2)=100, a(s2)=60%; in shape 3 (sharp3), h(s3)=90, s(s3)=40, b(s3)=100, a(s3)=95%; in shape 4 (sharp4), h(s4)=90, s(s4)=100, b(s4)=100, a(s4)=60%; in the primary color (primary), h(p)=60, s(p)=100, b(p)=100, a(p)=100%.
FIG. 11 is a schematic illustration of another primary layer element coloring according to an embodiment of the invention. As shown in fig. 11, when the primary layer elements are colored, sharp1(s1) to sharp3(s3) and text(t) use the dominant color hue h(p)+30 as the main rule. The dominant color is h(p), s(p), b(p), a(p); shape 1 is h(s1)=h(p)+30, s(s1)=s(p)×0.2, b(s1)=100, a(s1)=95%; shape 2 is h(s2)=h(p)+30, s(s2)=s(p)×0.5, b(s2)=100, a(s2)=60%; the text is h(t)=h(p), s(t)=s(p), b(t)=b(p)×0.4, a(t)=100%; shape 3 is h(s3)=h(p)+30, s(s3)=s(p)×0.4, b(s3)=100, a(s3)=95%; shape 4 is h(s4)=h(p)+30, s(s4)=s(p), b(s4)=100, a(s4)=60%.
FIG. 12 is a schematic diagram of another extended layer element coloring according to an embodiment of the invention. As shown in fig. 12, when the extended layer elements are colored, the dominant color hues h(p)+20 and h(p)+10 are used as the main rules. Shape 5 is h(s5)=h(p)+20, s(s5)=s(p)×0.2, b(s5)=100, a(s5)=95%; shape 6 is h(s6)=h(p)+20, s(s6)=s(p)×0.5, b(s6)=100, a(s6)=60%; shape 7 is h(s7)=h(p)+20, s(s7)=s(p)×0.4, b(s7)=100, a(s7)=95%; shape 8 is h(s8)=h(p)+10, s(s8)=s(p)×0.2, b(s8)=100, a(s8)=95%; shape 9 is h(s9)=h(p)+10, s(s9)=s(p)×0.5, b(s9)=100, a(s9)=60%; shape 10 is h(s10)=h(p)+10, s(s10)=s(p)×0.4, b(s10)=100, a(s10)=95%.
FIG. 13 is a diagram illustrating an adaptation effect of each dominant color scene in an alternative homogeneous color operation according to an embodiment of the present invention. As shown in fig. 13, the images shown in the figure are in the same color family, and the same color family can indicate the adaptation effect of the color operation of each dominant color scene.
FIG. 14 is a schematic diagram of another approximate color extraction algorithm according to an embodiment of the invention. As shown in fig. 14, in this approximate color extraction, hues are shifted 30 degrees in the negative direction (-30) from the dominant color h(p). In the text (text), h(t)=60, s(t)=100, b(t)=40, a(t)=100%; in shape 1 (sharp1), h(s1)=30, s(s1)=20, b(s1)=100, a(s1)=95%; in shape 2 (sharp2), h(s2)=30, s(s2)=50, b(s2)=100, a(s2)=60%; in shape 3 (sharp3), h(s3)=30, s(s3)=40, b(s3)=100, a(s3)=95%; in shape 4 (sharp4), h(s4)=30, s(s4)=100, b(s4)=100, a(s4)=60%; in the primary color (primary), h(p)=60, s(p)=100, b(p)=100, a(p)=100%.
FIG. 15 is a schematic illustration of another primary layer element coloring according to an embodiment of the invention. As shown in fig. 15, when the primary layer elements are colored, sharp1(s1) to sharp3(s3) and text(t) use the dominant color hue h(p)-30 degrees as the primary rule. The dominant color is h(p), s(p), b(p), a(p); shape 1 is h(s1)=h(p)-30, s(s1)=s(p)×0.2, b(s1)=100, a(s1)=95%; shape 2 is h(s2)=h(p)-30, s(s2)=s(p)×0.5, b(s2)=100, a(s2)=60%; the text is h(t)=h(p), s(t)=s(p), b(t)=b(p)×0.4, a(t)=100%; shape 3 is h(s3)=h(p)-30, s(s3)=s(p)×0.4, b(s3)=100, a(s3)=95%; shape 4 is h(s4)=h(p)-30, s(s4)=s(p), b(s4)=100, a(s4)=60%.
FIG. 16 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention. As shown in fig. 16, when the extended layer elements are colored, the dominant color hues h(p)-20 degrees and h(p)-10 degrees are used as the main rules. Shape 5 is h(s5)=h(p)-20, s(s5)=s(p)×0.2, b(s5)=100, a(s5)=95%; shape 6 is h(s6)=h(p)-20, s(s6)=s(p)×0.5, b(s6)=100, a(s6)=60%; shape 7 is h(s7)=h(p)-20, s(s7)=s(p)×0.4, b(s7)=100, a(s7)=95%; shape 8 is h(s8)=h(p)-10, s(s8)=s(p)×0.2, b(s8)=100, a(s8)=95%; shape 9 is h(s9)=h(p)-10, s(s9)=s(p)×0.5, b(s9)=100, a(s9)=60%; shape 10 is h(s10)=h(p)-10, s(s10)=s(p)×0.4, b(s10)=100, a(s10)=95%.
FIG. 17 is a diagram illustrating an adaptation effect of each dominant color scene in an alternative homogeneous color operation according to an embodiment of the present invention. As shown in fig. 17, the images shown in the figure are in the same color family, and the same color family can indicate the adaptation effect of the color operation of each dominant color scene.
Fig. 18 is a schematic diagram of a three-primary-color algorithm according to an embodiment of the present invention. As shown in fig. 18, hues are shifted +120 degrees in the positive direction from the dominant color h(p), and the extended colors are diffused by ±30 degrees around the shifted color values. In the text (text), h(t)=210, s(t)=100, b(t)=40, a(t)=100%; in shape 1 (sharp1), h(s1)=210, s(s1)=20, b(s1)=100, a(s1)=95%; in shape 2 (sharp2), h(s2)=210, s(s2)=50, b(s2)=100, a(s2)=60%; in shape 3 (sharp3), h(s3)=210, s(s3)=40, b(s3)=100, a(s3)=95%; in shape 4 (sharp4), h(s4)=90, s(s4)=100, b(s4)=100, a(s4)=60%; in the primary color (primary), h(p)=90, s(p)=100, b(p)=100, a(p)=100%.
FIG. 19 is a schematic illustration of another primary layer element coloring according to an embodiment of the invention. As shown in fig. 19, sharp1(s1) to sharp3(s3) and text(t) use the dominant color hue h(p)+120 degrees as the main rule. The dominant color is h(p), s(p), b(p), a(p); shape 1 is h(s1)=h(p)+120, s(s1)=s(p)×0.2, b(s1)=100, a(s1)=95%; shape 2 is h(s2)=h(p)+120, s(s2)=s(p)×0.5, b(s2)=100, a(s2)=60%; the text is h(t)=h(p)+120, s(t)=s(p), b(t)=b(p)×0.4, a(t)=100%; shape 3 is h(s3)=h(p)+120, s(s3)=s(p)×0.4, b(s3)=100, a(s3)=95%; shape 4 is h(s4)=h(p)+120, s(s4)=s(p), b(s4)=100, a(s4)=60%.
FIG. 20 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention. As shown in fig. 20, when the extended layer elements are colored, the dominant color hues h(p)+90 degrees and h(p)+150 degrees may be used as the main rules. Shape 5 is h(s5)=h(p)+90, s(s5)=s(p)×0.2, b(s5)=100, a(s5)=95%; shape 6 is h(s6)=h(p)+90, s(s6)=s(p)×0.5, b(s6)=100, a(s6)=60%; shape 7 is h(s7)=h(p)+90, s(s7)=s(p)×0.4, b(s7)=100, a(s7)=95%; shape 8 is h(s8)=h(p)+150, s(s8)=s(p)×0.2, b(s8)=100, a(s8)=95%; shape 9 is h(s9)=h(p)+150, s(s9)=s(p)×0.5, b(s9)=100, a(s9)=60%; shape 10 is h(s10)=h(p)+150, s(s10)=s(p)×0.4, b(s10)=100, a(s10)=95%.
FIG. 21 is a diagram illustrating an adaptation effect of each dominant color scene in an alternative homogeneous color operation according to an embodiment of the present invention. As shown in fig. 21, the images shown in the figure are of the same color family, and the same color family can indicate the adaptation effect of the color operation of each dominant color scene.
Fig. 22 is a schematic diagram of another three-primary-color algorithm according to an embodiment of the present invention. As shown in fig. 22, hues are shifted +240 degrees in the positive direction from the dominant color h(p), and the extended colors are diffused by ±30 degrees around the shifted color values. In the text (text), h(t)=330, s(t)=100, b(t)=40, a(t)=100%; in shape 1 (sharp1), h(s1)=330, s(s1)=20, b(s1)=100, a(s1)=95%; in shape 2 (sharp2), h(s2)=330, s(s2)=50, b(s2)=100, a(s2)=60%; in shape 3 (sharp3), h(s3)=330, s(s3)=40, b(s3)=100, a(s3)=95%; in shape 4 (sharp4), h(s4)=330, s(s4)=100, b(s4)=100, a(s4)=60%; in the primary color (primary), h(p)=90, s(p)=100, b(p)=100, a(p)=100%.
FIG. 23 is a schematic illustration of another primary layer element coloring according to an embodiment of the invention. As shown in fig. 23, sharp1(s1) to sharp3(s3) and text(t) use the dominant color hue h(p)+240 degrees as the main rule. The dominant color is h(p), s(p), b(p), a(p); shape 1 is h(s1)=h(p)+240, s(s1)=s(p)×0.2, b(s1)=100, a(s1)=95%; shape 2 is h(s2)=h(p)+240, s(s2)=s(p)×0.5, b(s2)=100, a(s2)=60%; the text is h(t)=h(p)+240, s(t)=s(p), b(t)=b(p)×0.4, a(t)=100%; shape 3 is h(s3)=h(p)+240, s(s3)=s(p)×0.4, b(s3)=100, a(s3)=95%; shape 4 is h(s4)=h(p)+240, s(s4)=s(p), b(s4)=100, a(s4)=60%.
FIG. 24 is a schematic illustration of another extended layer element coloring according to an embodiment of the invention. As shown in fig. 24, when the extended layer elements are colored, the dominant color hues h(p)+210 degrees and h(p)+270 degrees may be used as the main rules. Shape 5 is h(s5)=h(p)+210, s(s5)=s(p)×0.2, b(s5)=100, a(s5)=95%; shape 6 is h(s6)=h(p)+210, s(s6)=s(p)×0.5, b(s6)=100, a(s6)=60%; shape 7 is h(s7)=h(p)+210, s(s7)=s(p)×0.4, b(s7)=100, a(s7)=95%; shape 8 is h(s8)=h(p)+270, s(s8)=s(p)×0.2, b(s8)=100, a(s8)=95%; shape 9 is h(s9)=h(p)+270, s(s9)=s(p)×0.5, b(s9)=100, a(s9)=60%; shape 10 is h(s10)=h(p)+270, s(s10)=s(p)×0.4, b(s10)=100, a(s10)=95%.
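Taken together, Figs. 6-24 reduce each color matching style to a fixed hue offset applied to the dominant color before the per-element scaling. A sketch of that mapping follows; the style names are illustrative labels, not the patent's terminology:

```python
# Hue offsets (degrees) implied by Figs. 6-24; names are illustrative.
SCHEME_OFFSETS = {
    "same_family": 0,          # Figs. 6-8: keep h(p)
    "approximate_plus": 30,    # Figs. 10-12: h(p)+30
    "approximate_minus": -30,  # Figs. 14-16: h(p)-30
    "triadic_plus": 120,       # Figs. 18-20: h(p)+120
    "triadic_minus": 240,      # Figs. 22-24: h(p)+240
}

def scheme_hue(dominant_hue, scheme):
    """Hue used for primary layer elements under each color matching style."""
    return (dominant_hue + SCHEME_OFFSETS[scheme]) % 360
```

For example, with a dominant hue of 90 the triadic style colors the primary shapes at hue 210, matching the values shown in Fig. 18.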
FIG. 25 is a diagram illustrating an adaptation effect of each dominant color scene in an alternative homogeneous color operation according to an embodiment of the present invention. As shown in fig. 25, the images shown in the figure are of the same color family, and the same color family can indicate the adaptation effect of the color operation of each dominant color scene.
In this embodiment, when intelligently matching colors, the elements on the canvas may be described by metadata; for example, the various elements on the canvas may be described using custom JSON metadata.
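The JSON metadata definition itself appears only as an embedded image in the original document, so the following is a purely hypothetical sketch of what such element metadata might look like; every field name is an assumption:

```python
import json

# Hypothetical canvas-element metadata; the patent shows its real JSON
# definition only as an image, so all field names here are assumptions.
canvas = {
    "layers": [
        {"type": "text",  "id": "t",  "color": {"h": 0, "s": 100, "b": 40},  "alpha": 1.0},
        {"type": "shape", "id": "s1", "color": {"h": 0, "s": 10,  "b": 100}, "alpha": 0.95},
    ]
}
metadata = json.dumps(canvas)
```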
The application environment of this embodiment of the present invention may refer to the application environment in the above embodiments and is not described herein again. The embodiment of the invention provides an optional specific application for implementing the image processing method.
The product logic at the application level of embodiments of the present invention is described below.
FIG. 26 is a schematic diagram of core logic at the product application level according to an embodiment of the present invention. As shown in fig. 26: the dominant color of the user's main picture layer in the effective layout area of the current tool operation interface is obtained as an HSB color value; the layers composing the creative are disassembled and defined, the layer elements of the creative production in the effective canvas are identified, and attribute color value parameter items of the different design elements are defined; the user can freely select the style tendency of intelligent color matching, such as same color family, approximate color, or three primary colors, each style being associated with corresponding HSB layer design element color selection rules and algorithms; finally, the HSB color value operation results computed for the design elements contained in the layers are assigned and applied, thereby completing the intelligent color matching scheme.
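Under the simplifying assumption that each disassembled layer records its own saturation/brightness scale factors, the product-level flow can be sketched end to end; the data layout is an illustrative assumption:

```python
def smart_color_match(dominant_hsb, layers, hue_offset):
    """Hypothetical product-level flow: shift the dominant hue by the chosen
    style's offset, then scale saturation and brightness per layer element."""
    h, s, b = dominant_hsb
    recolored = []
    for layer in layers:
        recolored.append({
            **layer,
            "h": (h + hue_offset) % 360,               # style hue rule
            "s": s * layer.get("s_scale", 1.0),        # per-element saturation scale
            "b": b * layer.get("b_scale", 1.0),        # per-element brightness scale
        })
    return recolored
```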
Figure 27 is a diagram illustrating an efficient canvas composition according to an embodiment of the present invention. As shown in fig. 27, the layer of the drawing is disassembled and defined, and the layer elements of the creative production in the effective canvas are identified, which may include design elements contained in an interaction layer, a text layer, a decoration layer, a body layer, a background layer, and different types of layers.
The following describes the practice of the embodiments of the invention in a product.
When a user produces static advertisement image material on the tool platform, the method of this embodiment avoids the time cost and uncertain results of manual color matching: based on the image content of the current effective canvas area, layer element color matching is achieved intelligently with one click, yielding a better visual effect.
In terms of product application practice, the method can be built into tool products for creative production.
Firstly, the user creates an initial marketing creative design.
FIG. 28 is a diagram illustrating the design effect of a user's initial marketing creative, according to an embodiment of the present invention. As shown in fig. 28, the user's initial marketing creative is a design for a co-branded series of shoes.
Secondly, the main creative map is changed.
FIG. 29 is a schematic diagram of a change in the main creative map, according to an embodiment of the present invention. As shown in fig. 29, the main creative map is changed to top-quality fashion apparel.
Thirdly, the intelligent color matching function is clicked.
FIG. 30 is a schematic diagram of an intelligent color matching function according to an embodiment of the invention. As shown in fig. 30, clicking the "intelligent color matching" button on the operation interface pops up a panel of intelligent color matching options, including same-color-family matching, approximate-color matching, and three-primary-color matching, from which the user selects and confirms an option to trigger intelligent color matching.
Fourthly, intelligent color matching takes effect.
Fig. 31 is a schematic diagram of effects of intelligent color matching validation according to an embodiment of the invention. As shown in fig. 31, when the approximate color matching is selected, the image shown in fig. 31 is an effect of the approximate color matching performed on the image (indicated by diagonal lines in fig. 31).
FIG. 32 is a diagram illustrating the effect of a user replacing a marketing creative design, according to an embodiment of the present invention. As shown in fig. 32, in the operation interface, the marketing creative may be replaced, and the displayed target image may be re-color-matched according to the operations shown in fig. 30 and 31.
By extracting the dominant color of the picture and combining a visual color matching method with numerical color calculation, this embodiment converts perceptual visual design into an automated processing method expressible in machine language. This solves the problem that users cannot quickly complete color matching with a good visual effect during creative material production, and thereby improves the efficiency and effect of producing marketing creative materials.
In this embodiment, for the intelligent color processing effect, priority-level identifiers of layer elements can be added for differentiated processing. For user-facing application of the intelligent color matching tool, selection can be guided through perceptual wording and color-effect previews, promoting a more intuitive understanding of the intelligent color operation rules and optimizing the user's operation experience.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
According to another aspect of the embodiments of the present invention, there is also provided an image processing apparatus for implementing the above-described image processing method. Fig. 33 is a schematic diagram of an image processing apparatus according to an embodiment of the present invention. As shown in fig. 33, the apparatus 330 may include: a display unit 10, a selection unit 20 and a color matching unit 30.
And a display unit 10 for displaying color matching information in response to a target operation instruction generated by the target operation, wherein the color matching information is used for indicating that the first target image is color-matched according to the color corresponding to the color matching information.
And a selecting unit 20 for selecting the first target color matching information from the color matching information.
And the color matching unit 30 is configured to perform color matching on the first target image according to a first target color corresponding to the first target color matching information, so as to obtain a color-matched first target image.
It should be noted that the display unit 10 in this embodiment may be configured to execute step S202 in embodiment 1 of this application, the selection unit 20 in this embodiment may be configured to execute step S204 in embodiment 1 of this application, and the color matching unit 30 in this embodiment may be configured to execute step S206 in embodiment 1 of this application.
It should be noted here that the above units are the same as the examples and application scenarios realized by the corresponding steps, but are not limited to the disclosure of the above embodiment 1. It should be noted that the above units as a part of the apparatus may operate in a hardware environment as shown in fig. 1, may be implemented by software, and may also be implemented by hardware, where the hardware environment includes a network environment.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device for implementing the image processing method.
Fig. 34 is a block diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 34, the electronic device may include a memory 341 and a processor 343, the memory 341 having stored therein a computer program, and the processor 343 being arranged to perform the steps of any of the method embodiments described above by means of the computer program. Optionally, as shown in fig. 34, the electronic device may further include a transmission device 345 and an input-output device 347.
Optionally, in this embodiment, the electronic apparatus may be located in at least one network device of a plurality of network devices of a computer network.
Optionally, in this embodiment, the processor 343 may be configured to execute the following steps by a computer program:
responding to a target operation instruction generated by target operation, and displaying color matching information, wherein the color matching information is used for indicating that the first target image is subjected to color matching according to the color corresponding to the color matching information;
selecting first target color matching information from the color matching information;
and performing color matching on the first target image according to the first target color corresponding to the first target color matching information to obtain a color-matched first target image.
Processor 343 is further configured to perform the following steps: after first target color matching information is selected from the color matching information, displaying first prompt information, wherein the first prompt information is used for prompting whether color matching is determined to be carried out on a first target image according to a first target color corresponding to the first target color matching information; performing color matching on the first target image according to a first target color corresponding to the first target color matching information, and obtaining the color-matched first target image includes: and under the condition that the determination information is obtained, performing color matching on the first target image according to a first target color corresponding to the first target color matching information to obtain a color-matched first target image, wherein the determination information is used for indicating that the first target image is determined to be color-matched according to the first target color corresponding to the first target color matching information.
Processor 343 is further configured to perform the following steps: after the first prompt information is displayed, under the condition that cancel information is obtained, color matching of the first target image according to the first target color corresponding to the first target color matching information is cancelled, wherein the cancel information is used for indicating cancellation of color matching of the first target image according to the first target color corresponding to the first target color matching information.
Processor 343 is further configured to perform the following steps: obtaining a second target image after color matching is carried out on the first target image according to a first target color corresponding to the first target color matching information to obtain the first target image after color matching; responding to a target operation instruction generated by the target operation, and displaying color matching information; selecting second target color matching information from the color matching information; and carrying out color matching on the second target image according to a second target color corresponding to the second target color matching information to obtain a color-matched second target image.
Processor 343 is further configured to perform the following steps: and after the first target color matching information is selected from the color matching information and the first target image is subjected to color matching according to the first target color corresponding to the first target color matching information, and before the color-matched first target image is obtained, displaying a target text or a target picture, wherein the target text and the target picture are both used for indicating the color-matched first target image.
Processor 343 is further configured to perform the following steps: the method comprises the steps of obtaining a color value of a main image layer of a first target image, an image layer element of the first target image and a color attribute of the image layer element, wherein the image layer element comprises a plurality of image layers, the plurality of image layers comprise the main image layer, first target color matching information is associated with a color value of the main image layer, the image layer element of the first target image and the color attribute of the image layer element, and the color attribute comprises a hue attribute, a saturation attribute and a brightness attribute of the first target image; determining a first target color corresponding to the color value, the layer element and the color attribute of the main picture layer; and carrying out color matching on the first target image according to the first target color to obtain the color-matched first target image.
Processor 343 is further configured to perform the following steps: acquiring an image of a first target image in a first color mode, wherein the color of the first target image is represented by color values of a red channel, a green channel and a blue channel in the first color mode respectively; converting the image in the first color mode into an image in a second color mode, wherein the second color mode represents colors through a hue attribute, a saturation attribute and a brightness attribute respectively; respectively obtaining a weighted value of the hue attribute, a weighted value of the saturation attribute and a weighted value of the brightness attribute, wherein the weighted value of the hue attribute is used for indicating the proportion occupied by the hue attribute on the visual color of the first target image, the weighted value of the saturation attribute is used for indicating the proportion occupied by the saturation attribute on the visual color, and the weighted value of the brightness attribute is used for indicating the proportion occupied by the brightness attribute on the visual color; clustering the images of the second color mode according to a plurality of randomly selected clustering centers respectively through the weighted value of the hue attribute, the weighted value of the saturation attribute and the weighted value of the brightness attribute to obtain a plurality of clustering results, wherein the clustering centers correspond to the clustering results one to one; determining a clustering result with the lowest dissimilarity degree in the plurality of clustering results as a target clustering result; and determining the average color value of the color category with the largest number of pixel points in the target clustering result as the color value of the main picture layer.
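The dominant-color step described above (convert RGB to a hue/saturation/brightness representation, weight the three attributes, cluster from several random center sets, keep the run with the lowest dissimilarity, and average the largest color category) can be sketched as follows. This is a minimal illustration, not the patented implementation: the weight values, the number of cluster centers `k`, the restart count, and the use of plain k-means are all assumptions for demonstration.

```python
import numpy as np

def rgb_to_hsv(pixels):
    """pixels: (N, 3) RGB floats in [0, 1] -> (N, 3) HSV floats in [0, 1]."""
    r, g, b = pixels[:, 0], pixels[:, 1], pixels[:, 2]
    mx, mn = pixels.max(axis=1), pixels.min(axis=1)
    diff = mx - mn
    h = np.zeros_like(mx)
    mask = diff > 0
    rm = mask & (mx == r)
    gm = mask & (mx == g) & ~rm
    bm = mask & ~rm & ~gm
    h[rm] = ((g - b)[rm] / diff[rm]) % 6.0
    h[gm] = (b - r)[gm] / diff[gm] + 2.0
    h[bm] = (r - g)[bm] / diff[bm] + 4.0
    s = np.divide(diff, mx, out=np.zeros_like(mx), where=mx > 0)
    return np.stack([h / 6.0, s, mx], axis=1)

def dominant_color(pixels_rgb, weights=(0.6, 0.25, 0.15), k=5, restarts=4,
                   iters=20, seed=0):
    """Weighted k-means over HSV pixels; returns the mean RGB of the
    most populous cluster of the best (lowest-dissimilarity) restart.
    The weight triple is a placeholder for the learned H/S/B weights."""
    rng = np.random.default_rng(seed)
    x = rgb_to_hsv(pixels_rgb) * np.asarray(weights)  # weight each attribute
    best_labels, best_inertia = None, np.inf
    for _ in range(restarts):  # several randomly selected center sets
        centers = x[rng.choice(len(x), k, replace=False)]
        for _ in range(iters):
            d = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = x[labels == j].mean(axis=0)
        inertia = ((x - centers[labels]) ** 2).sum()  # dissimilarity measure
        if inertia < best_inertia:
            best_inertia, best_labels = inertia, labels
    counts = np.bincount(best_labels, minlength=k)
    # average color of the color category with the most pixels
    return pixels_rgb[best_labels == counts.argmax()].mean(axis=0)
```

On an image where most pixels share one color, the returned value approximates that color, which then serves as the color value of the main picture layer.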
Processor 343 is further configured to perform the following steps: determining a weighted value of the hue attribute, a weighted value of the saturation attribute and a weighted value of the brightness attribute through a target model, the hue attribute, the saturation attribute and the brightness attribute, wherein the target model is obtained by training a predetermined model through a target sample, and the target sample comprises the hue attributes of a plurality of images, the saturation attributes of the plurality of images, the brightness attributes of the plurality of images, the weighted values of the hue attributes of the plurality of images, the weighted values of the saturation attributes of the plurality of images and the weighted values of the brightness attributes of the plurality of images which are obtained in advance.
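The document does not specify the form of the target model that maps attributes to weight values. As one concrete possibility, the sketch below fits a linear least-squares model from per-image (hue, saturation, brightness) features to annotated weight triples; the linear form, the feature choice, and the final renormalization so the three weights sum to one are all assumptions for illustration.

```python
import numpy as np

def fit_weight_model(features, weights):
    """features: (n, 3) mean H/S/B per training image.
    weights:  (n, 3) pre-obtained weight triples (each row sums to 1).
    Returns a (4, 3) coefficient matrix (3 features + bias)."""
    X = np.hstack([features, np.ones((len(features), 1))])  # bias column
    coef, *_ = np.linalg.lstsq(X, weights, rcond=None)
    return coef

def predict_weights(coef, feature):
    """Predict the H/S/B weight triple for one image's features."""
    x = np.append(feature, 1.0)
    w = np.clip(x @ coef, 1e-6, None)  # keep weights positive
    return w / w.sum()                 # renormalize to sum to 1
```

A production system would likely use a richer model and far more training images; the point here is only the interface: train on annotated (attributes, weights) pairs, then predict weights for a new image before clustering.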
The processor 343 is further configured to obtain the color attribute of the layer element by at least one of: the color attributes of the shapes in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the shapes; the color attributes of the text in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the text; and the dominant colors in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes.
Processor 343 is further configured to perform the following steps: and acquiring the color attribute of each picture layer according to the priorities of the plurality of picture layers included in the picture layer elements.
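Collecting color attributes in priority order can be sketched as below. The concrete ranking is not fixed by the document; the sketch assumes a hypothetical order (body layer first, then background, decoration, text, and interactive layers, matching the layer kinds named in the claims) purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical priority ranking; lower number = visited first.
LAYER_PRIORITY = {"body": 0, "background": 1, "decoration": 2,
                  "text": 3, "interactive": 4}

@dataclass
class Layer:
    kind: str            # e.g. "body", "text"
    hue: float           # H, S, B in [0, 1]
    saturation: float
    brightness: float
    transparency: float = 1.0

def collect_color_attributes(layers):
    """Return (kind, (H, S, B, transparency)) tuples in priority order;
    unknown kinds sort last."""
    ordered = sorted(layers,
                     key=lambda l: LAYER_PRIORITY.get(l.kind, len(LAYER_PRIORITY)))
    return [(l.kind, (l.hue, l.saturation, l.brightness, l.transparency))
            for l in ordered]
```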
Optionally, it can be understood by those skilled in the art that the structure shown in fig. 34 is merely illustrative, and the electronic device may also be a terminal device such as a smartphone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, and the like. Fig. 34 does not limit the structure of the electronic device; for example, the electronic device may also include more or fewer components (e.g., a network interface, a display device, etc.) than shown in fig. 34, or have a different configuration from that shown in fig. 34.
The memory 341 may be configured to store software programs and modules, such as program instructions/modules corresponding to the image processing method and apparatus in the embodiment of the present invention, and the processor 343 executes various functional applications and data processing by running the software programs and modules stored in the memory 341, that is, implements the image processing method described above. The memory 341 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 341 may further include memory located remotely from processor 343, which may be connected to the terminal through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 345 is used for receiving or transmitting data via a network. Specific examples of the network may include wired and wireless networks. In one example, the transmission device 345 includes a Network Interface Controller (NIC), which can be connected to other network devices and a router via a network cable so as to communicate with the internet or a local area network. In another example, the transmission device 345 is a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
Specifically, the memory 341 is used to store an application program.
The embodiment of the invention provides a scheme for processing images: displaying color matching information in response to a target operation instruction generated by a target operation, wherein the color matching information is used for indicating that a first target image is to be color-matched according to the color corresponding to the color matching information; selecting first target color matching information from the color matching information; and performing color matching on the first target image according to a first target color corresponding to the first target color matching information to obtain the color-matched first target image. By selecting the first target color matching information from the displayed color matching information, one-click intelligent color matching of the first target image is achieved. This avoids manual color matching, which lacks intelligence, is time-consuming, and produces uncertain results, thereby improving the efficiency of color matching, achieving the technical effect of improving image color matching efficiency, and solving the technical problem of low image color matching efficiency in the related art.
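The homochromatic ("same color system") adaptation rule that the claims describe, adopting the dominant color's hue while leaving each element's own saturation and brightness unchanged, can be sketched in a few lines. This is a toy illustration using Python's standard-library `colorsys` module, not the embodiment's actual implementation.

```python
import colorsys

def same_color_adapt(element_rgb, dominant_hue):
    """One-click recolor of a layer element: take the dominant color's
    hue, keep the element's own saturation and brightness (value)."""
    _, s, v = colorsys.rgb_to_hsv(*element_rgb)
    return colorsys.hsv_to_rgb(dominant_hue, s, v)
```

For example, pushing a green element toward a red dominant hue preserves its vividness and lightness while unifying the overall color scheme, which is the effect the same-color-system adaptation aims for.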
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Optionally, in this embodiment, the storage medium may be configured to store a computer program for executing the following steps:
responding to a target operation instruction generated by target operation, and displaying color matching information, wherein the color matching information is used for indicating that the first target image is subjected to color matching according to the color corresponding to the color matching information;
selecting first target color matching information from the color matching information;
and performing color matching on the first target image according to the first target color corresponding to the first target color matching information to obtain a color-matched first target image.
Optionally, the storage medium is further arranged to store program code for performing the steps of: after first target color matching information is selected from the color matching information, displaying first prompt information, wherein the first prompt information is used for prompting whether color matching is determined to be carried out on a first target image according to a first target color corresponding to the first target color matching information; performing color matching on the first target image according to a first target color corresponding to the first target color matching information, and obtaining the color-matched first target image includes: and under the condition that the determination information is obtained, performing color matching on the first target image according to a first target color corresponding to the first target color matching information to obtain a color-matched first target image, wherein the determination information is used for indicating that the first target image is determined to be color-matched according to the first target color corresponding to the first target color matching information.
Optionally, the storage medium is further arranged to store program code for performing the steps of: after the first prompt information is displayed, under the condition that cancel information is obtained, color matching of the first target image according to the first target color corresponding to the first target color matching information is cancelled, wherein the cancel information is used for indicating cancellation of color matching of the first target image according to the first target color corresponding to the first target color matching information.
Optionally, the storage medium is further arranged to store program code for performing the steps of: obtaining a second target image after color matching is carried out on the first target image according to a first target color corresponding to the first target color matching information to obtain the first target image after color matching; responding to a target operation instruction generated by the target operation, and displaying color matching information; selecting second target color matching information from the color matching information; and carrying out color matching on the second target image according to a second target color corresponding to the second target color matching information to obtain a color-matched second target image.
Optionally, the storage medium is further arranged to store program code for performing the steps of: and after the first target color matching information is selected from the color matching information and the first target image is subjected to color matching according to the first target color corresponding to the first target color matching information, and before the color-matched first target image is obtained, displaying a target text or a target picture, wherein the target text and the target picture are both used for indicating the color-matched first target image.
Optionally, the storage medium is further arranged to store program code for performing the steps of: the method comprises the steps of obtaining a color value of a main image layer of a first target image, an image layer element of the first target image and a color attribute of the image layer element, wherein the image layer element comprises a plurality of image layers, the plurality of image layers comprise the main image layer, first target color matching information is associated with a color value of the main image layer, the image layer element of the first target image and the color attribute of the image layer element, and the color attribute comprises a hue attribute, a saturation attribute and a brightness attribute of the first target image; determining a first target color corresponding to the color value, the layer element and the color attribute of the main picture layer; and carrying out color matching on the first target image according to the first target color to obtain the color-matched first target image.
Optionally, the storage medium is further arranged to store program code for performing the steps of: acquiring an image of a first target image in a first color mode, wherein the color of the first target image is represented by color values of a red channel, a green channel and a blue channel in the first color mode respectively; converting the image in the first color mode into an image in a second color mode, wherein the second color mode represents colors through a hue attribute, a saturation attribute and a brightness attribute respectively; respectively obtaining a weighted value of the hue attribute, a weighted value of the saturation attribute and a weighted value of the brightness attribute, wherein the weighted value of the hue attribute is used for indicating the proportion occupied by the hue attribute on the visual color of the first target image, the weighted value of the saturation attribute is used for indicating the proportion occupied by the saturation attribute on the visual color, and the weighted value of the brightness attribute is used for indicating the proportion occupied by the brightness attribute on the visual color; clustering the images of the second color mode according to a plurality of randomly selected clustering centers respectively through the weighted value of the hue attribute, the weighted value of the saturation attribute and the weighted value of the brightness attribute to obtain a plurality of clustering results, wherein the clustering centers correspond to the clustering results one to one; determining a clustering result with the lowest dissimilarity degree in the plurality of clustering results as a target clustering result; and determining the average color value of the color category with the largest number of pixel points in the target clustering result as the color value of the main picture layer.
Optionally, the storage medium is further arranged to store program code for performing the steps of: determining a weighted value of the hue attribute, a weighted value of the saturation attribute and a weighted value of the brightness attribute through a target model, the hue attribute, the saturation attribute and the brightness attribute, wherein the target model is obtained by training a predetermined model through a target sample, and the target sample comprises the hue attributes of a plurality of images, the saturation attributes of the plurality of images, the brightness attributes of the plurality of images, the weighted values of the hue attributes of the plurality of images, the weighted values of the saturation attributes of the plurality of images and the weighted values of the brightness attributes of the plurality of images which are obtained in advance.
Optionally, the storage medium is further arranged to store program code for obtaining the color attribute of the layer element by at least one of the following: the color attributes of the shapes in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the shapes; the color attributes of the text in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the text; and the dominant colors in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes.
Optionally, the storage medium is further arranged to store program code for performing the steps of: and acquiring the color attribute of each picture layer according to the priorities of the plurality of picture layers included in the picture layer elements.
Optionally, the storage medium is further configured to store a computer program for executing the steps included in the method in the foregoing embodiment, which is not described in detail in this embodiment.
Optionally, in this embodiment, a person skilled in the art may understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and such improvements and refinements should also fall within the protection scope of the present invention.

Claims (15)

1. A method of processing an image, comprising:
responding to a target operation instruction generated by target operation, and displaying color matching information, wherein the color matching information is used for indicating that a first target image is subjected to color matching according to the color corresponding to the color matching information;
selecting first target color matching information from the color matching information;
matching colors for the first target image according to a first target color corresponding to the first target color matching information to obtain the first target image after color matching, which specifically comprises:
acquiring a dominant color of a main image layer of the first target image, an image layer element of the first target image and a color attribute of the image layer element in an effective canvas area;
and when the first target color matching information is homochromatic adaptation information, selecting colors for main layer elements according to a rule of keeping the hue attribute of the dominant color unchanged, wherein the main layer elements refer to the shapes and texts in the layer elements.
2. The method of claim 1, wherein the color matching information comprises at least one of:
the homochromatic adaptation information is used for indicating that the first target image is matched according to colors of the same color system;
approximate color adaptation information, wherein the approximate color adaptation information is used for indicating that the first target image is color-matched according to a color of which the color value is within a target threshold value;
the three primary color adaptation information is used for indicating that the first target image is subjected to color matching according to three primary colors.
3. The method of claim 1,
after selecting the first target color matching information from the color matching information, the method further comprises: displaying first prompt information, wherein the first prompt information is used for prompting whether color matching is carried out on the first target image according to the first target color corresponding to the first target color matching information or not;
under the condition that the determination information is obtained, performing color matching on the first target image according to the first target color corresponding to the first target color matching information to obtain the color-matched first target image, wherein the determination information is used for indicating that the first target image is determined to be color-matched according to the first target color corresponding to the first target color matching information.
4. The method of claim 3, wherein after displaying the first prompt message, the method further comprises:
and under the condition that cancellation information is obtained, color matching of the first target image according to the first target color corresponding to the first target color matching information is cancelled, wherein the cancellation information is used for indicating cancellation of color matching of the first target image according to the first target color corresponding to the first target color matching information.
5. The method of claim 1, wherein after the first target image is color-matched according to a first target color corresponding to the first target color matching information, resulting in the color-matched first target image, the method further comprises:
acquiring a second target image;
selecting second target color matching information from the color matching information;
and performing color matching on the second target image according to a second target color corresponding to the second target color matching information to obtain the second target image after color matching.
6. The method of any of claims 1 to 5, wherein after obtaining the color-matched first target image, the method further comprises:
displaying target text or a target picture, wherein the target text comprises a perceptual description word for describing the color-matched first target image, and the target picture is used for indicating a color effect preview of the color-matched first target image.
7. The method of claim 1, wherein the layer elements comprise a plurality of layer layers including the body layer, and wherein the color attributes comprise the hue attribute, the saturation attribute, and the brightness attribute.
8. The method of claim 7, wherein the plurality of picture layers further comprises at least one of: background picture layer, decoration picture layer, text picture layer, interactive picture layer.
9. The method of claim 1, wherein obtaining the dominant color of the subject picture layer of the first target image within the active canvas area comprises:
acquiring an image of the first target image in a first color mode, wherein the color of the first color mode is represented by color values of a red channel, a green channel and a blue channel respectively;
converting the image in the first color mode into an image in a second color mode, wherein the second color mode represents colors through a hue attribute, a saturation attribute and a brightness attribute respectively;
respectively obtaining a weight value of the hue attribute, a weight value of the saturation attribute and a weight value of the brightness attribute, wherein the weight value of the hue attribute is used for indicating the proportion of the hue attribute to the visual color of the first target image, the weight value of the saturation attribute is used for indicating the proportion of the saturation attribute to the visual color, and the weight value of the brightness attribute is used for indicating the proportion of the brightness attribute to the visual color;
clustering the images of the second color mode according to a plurality of randomly selected clustering centers respectively through the weight values of the hue attributes, the weight values of the saturation attributes and the weight values of the brightness attributes to obtain a plurality of clustering results, wherein the clustering centers correspond to the clustering results one to one;
determining the clustering result with the lowest dissimilarity degree in the plurality of clustering results as a target clustering result;
and determining the average color value of the color category with the largest number of pixel points in the target clustering result as the dominant color of the main picture layer.
10. The method according to claim 9, wherein the obtaining the weight value of the hue attribute, the weight value of the saturation attribute, and the weight value of the brightness attribute respectively comprises:
determining a weight value of the hue attribute, a weight value of the saturation attribute and a weight value of the brightness attribute through a target model, the hue attribute, the saturation attribute and the brightness attribute, wherein the target model is obtained by training a predetermined model through a target sample, and the target sample comprises the hue attributes of a plurality of images, the saturation attributes of the images, the brightness attributes of the images, the weight values of the hue attributes of the images, the weight values of the saturation attributes of the images and the weight values of the brightness attributes of the images which are obtained in advance.
11. The method according to claim 1, wherein the color attributes of the shapes in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the shapes;
the color attributes of the text in the layer elements comprise hue attributes, saturation attributes, brightness attributes and transparency attributes of the text.
12. The method according to claim 1, wherein the obtaining the color attribute of the layer element comprises:
and acquiring the color attribute of each picture layer according to the priorities of the plurality of picture layers included in the picture layer elements.
13. An apparatus for processing an image, comprising:
the display unit is used for responding to a target operation instruction generated by target operation and displaying color matching information, wherein the color matching information is used for indicating that the first target image is subjected to color matching according to the color corresponding to the color matching information;
the selection unit is used for selecting first target color matching information from the color matching information;
the color matching unit is used for performing color matching on the first target image according to a first target color corresponding to the first target color matching information to obtain the first target image after color matching, and is specifically configured to: acquire a dominant color of a main image layer of the first target image, a layer element of the first target image and a color attribute of the layer element in an effective canvas area; and when the first target color matching information is homochromatic adaptation information, select colors for main layer elements according to a rule of keeping the hue attribute of the dominant color unchanged, wherein the main layer elements refer to the shapes and texts in the layer elements.
14. A computer-readable storage medium, in which a computer program is stored, wherein the computer program is arranged to execute a method of processing an image as claimed in any one of claims 1 to 12 when executed.
15. An electronic device comprising a memory and a processor, characterized in that the memory has stored therein a computer program, and the processor is arranged to execute the method of processing an image as claimed in any one of claims 1 to 12 by means of the computer program.
CN201810164716.3A 2018-02-27 2018-02-27 Image processing method and device, storage medium and electronic device Active CN110198437B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810164716.3A CN110198437B (en) 2018-02-27 2018-02-27 Image processing method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN110198437A CN110198437A (en) 2019-09-03
CN110198437B true CN110198437B (en) 2021-11-05

Family

ID=67751368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810164716.3A Active CN110198437B (en) 2018-02-27 2018-02-27 Image processing method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN110198437B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110969170B (en) * 2019-12-03 2024-03-08 北京奇艺世纪科技有限公司 Image theme color extraction method and device and electronic equipment
CN111091607B (en) * 2019-12-24 2023-08-29 厦门美图之家科技有限公司 Color matching method and device, electronic equipment and storage medium
CN111191424B (en) * 2019-12-31 2023-03-03 北京华为数字技术有限公司 Page color matching method and device, storage medium and chip
CN111292394A (en) * 2020-02-07 2020-06-16 腾讯科技(深圳)有限公司 Image color matching relationship determination method and device
US20210264191A1 (en) * 2020-02-24 2021-08-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and device for picture generation, electronic device, and storage medium
CN111737800B (en) * 2020-06-23 2024-01-12 广联达科技股份有限公司 Primitive selection method and device and electronic equipment
CN112748829B (en) * 2020-07-14 2024-08-06 腾讯数码(天津)有限公司 Picture editing method, device, equipment and storage medium
CN111563937B (en) * 2020-07-14 2020-10-30 成都四方伟业软件股份有限公司 Picture color extraction method and device
CN111984164B (en) * 2020-08-31 2023-05-02 Oppo广东移动通信有限公司 Wallpaper generation method, device, terminal and storage medium
CN112035210B (en) * 2020-09-30 2024-05-07 北京百度网讯科技有限公司 Method, apparatus, device and medium for outputting color information
CN114625292B (en) * 2020-11-27 2024-10-11 华为技术有限公司 Icon setting method and electronic equipment
CN112598761B (en) * 2020-12-30 2023-06-16 东华大学 Collaborative design and intelligent recommendation method for color matching of textile and clothing
CN113658287B (en) * 2021-07-14 2024-05-03 支付宝(杭州)信息技术有限公司 User interface color matching processing method, device and equipment
CN115024561B (en) * 2022-05-30 2023-07-21 广东时谛智能科技有限公司 Automatic adjustment method and device for associated color matching in shoe body design process
CN117528045B (en) * 2024-01-04 2024-03-22 深圳市云影天光科技有限公司 Video image processing method and system based on video fog-penetrating anti-reflection technology

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6999617B1 (en) * 1998-07-24 2006-02-14 Canon Kabushiki Kaisha Image processing method and apparatus
CN103123581B (en) * 2011-11-21 2015-11-04 腾讯科技(深圳)有限公司 A kind of application interface replacing options of mobile terminal and device and mobile terminal
CN105549928B (en) * 2015-12-02 2017-11-14 广州阿里巴巴文学信息技术有限公司 The color matching method and device of display content

Also Published As

Publication number Publication date
CN110198437A (en) 2019-09-03

Similar Documents

Publication Publication Date Title
CN110198437B (en) Image processing method and device, storage medium and electronic device
US9396560B2 (en) Image-based color palette generation
US10915744B2 (en) Method for evaluating fashion style using deep learning technology and system therefor
CN109711345B (en) Flame image identification method and device and storage medium thereof
CN108052765A (en) Scheme of colour automatic generation method and device based on personality impression
CN107993131B (en) Putting-through recommendation method, device, server and storage medium
CN107945175A (en) Evaluation method, device, server and the storage medium of image
US11682143B2 (en) System and method for hair analysis of user
US10871884B1 (en) Product image characteristic detection and manipulation
CN107391150A (en) The adaptive variation of mobile terminal topic tone and system
US20100110100A1 (en) Method and System For Extracting and Applying Colour Schemes Across Domains
CN108846869A (en) A kind of clothes Automatic color matching method based on natural image color
CN112218006B (en) Multimedia data processing method and device, electronic equipment and storage medium
CN110444181A (en) Display methods, device, terminal and computer readable storage medium
Koshy et al. A complexion based outfit color recommender using neural networks
KR102429863B1 (en) Method for servicing personal styling
CN111274145A (en) Relationship structure chart generation method and device, computer equipment and storage medium
CN115147259A (en) Image color migration method, system and computer medium
CN114647984A (en) Intelligent clothing design method and system based on customer preference
CN113298921A (en) Theme template color matching method and device, electronic equipment and storage medium
KR20220097255A (en) Method and apparatus for image creation
CN115335864A (en) Method for determining a representative color from at least one digital color image
CN113658287A (en) User interface color matching processing method, device and equipment
CN114120002A (en) Image color extraction method and device, electronic equipment and storage medium
CN109472840B (en) Color replacement method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant