CN110944163A - Image processing method and electronic equipment - Google Patents
- Publication number: CN110944163A
- Application number: CN201911150964.3A
- Authority: CN (China)
- Prior art keywords: sky, image, target image, module, target
- Prior art date: 2019-11-21
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
All within H—ELECTRICITY › H04—ELECTRIC COMMUNICATION TECHNIQUE › H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION:
- H04N9/68 — Circuits for processing colour signals for controlling the amplitude of colour signals, e.g. automatic chroma control circuits (under H04N9/00 Details of colour television systems › H04N9/64 Circuits for processing colour signals)
- H04N23/80 — Camera processing pipelines; components thereof (under H04N23/00 Cameras or camera modules comprising electronic image sensors; control thereof)
- H04N9/643 — Hue control means, e.g. flesh tone control (under H04N9/64)
- H04N9/646 — Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters (under H04N9/64)
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
Abstract
The embodiment of the invention discloses an image processing method and electronic equipment, relates to the technical field of communication, and aims to solve the problem that images shot by electronic equipment have dull colors. The method comprises the following steps: determining a sky area of a target image; judging whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhancing the color saturation of the sky area when the target image is determined to be a sunset scene image or a sunrise scene image.
Description
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an image processing method and electronic equipment.
Background
With the continuous development of electronic device and photographing technology, users' expectations for photographing quality keep rising.
However, because the photographing module of an electronic device is constrained by its design volume and cost, the photographing effect of the electronic device may not match that of a professional camera. In some special scenes in particular, the captured image usually has dull colors.
Disclosure of Invention
The embodiment of the invention provides an image processing method and electronic equipment, and aims to solve the problem that images shot by electronic equipment have dull colors.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, where the method includes: determining a sky area of a target image; judging whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhancing the color saturation of the sky area under the condition that the target image is determined to be a sunset scene image or a sunrise scene image.
In a second aspect, an embodiment of the present invention provides an electronic device, including: the device comprises a determining module, a judging module and an enhancing module; the determining module is used for determining a sky area of the target image; the judging module is used for judging whether the target image is a sunset scene image or a sunrise scene image according to the sky area determined by the determining module; the enhancement module is used for enhancing the color saturation of the sky area under the condition that the judgment module determines that the target image is a sunset scene image or a sunrise scene image.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image processing method as in the first aspect.
In the embodiment of the invention, the electronic device may determine the sky area of a target image; judge whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhance the color saturation of the sky area when the target image is determined to be a sunset scene image or a sunrise scene image. With this scheme, when the target image is judged to be a sunset scene image or a sunrise scene image according to its sky area, the color saturation of the sky area is enhanced, so that the colors of the target image become more vivid and vibrant, which further improves the overall effect of the target image.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
FIG. 2 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 3 is a second flowchart of an image processing method according to an embodiment of the present invention;
FIG. 4 is a third flowchart of an image processing method according to an embodiment of the present invention;
FIG. 5 is a fourth flowchart of an image processing method according to an embodiment of the present invention;
FIG. 6 is a fifth flowchart of an image processing method according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 8 is a hardware schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "and/or" herein is an association relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. The symbol "/" herein denotes a relationship in which the associated object is or, for example, a/B denotes a or B.
The terms "first," "second," "third," and "fourth," etc. in the description and in the claims of the present application are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input, the second input, the third input, the fourth input, etc. are used to distinguish between different inputs, rather than to describe a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of processing units means two or more processing units; plural elements means two or more elements, and the like.
The embodiment of the invention provides an image processing method, in which an electronic device can determine the sky area of a target image; judge whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhance the color saturation of the sky area when the target image is determined to be a sunset scene image or a sunrise scene image. With this scheme, when the target image is judged to be a sunset scene image or a sunrise scene image according to its sky area, the color saturation of the sky area is enhanced, so that the colors of the target image become more vivid and vibrant, which further improves the overall effect of the target image.
The following describes a software environment to which the image processing method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer, provided they comply with its development principles.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image processing method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image processing method may operate based on the android operating system shown in fig. 1. Namely, the processor or the electronic device can implement the image processing method provided by the embodiment of the invention by running the software program in the android operating system.
The electronic device in the embodiment of the invention can be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile electronic device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
An execution subject of the image processing method provided in the embodiment of the present invention may be the electronic device (including a mobile electronic device and a non-mobile electronic device), or may also be a functional module and/or a functional entity capable of implementing the method in the electronic device, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited. The following takes an electronic device as an example to exemplarily describe the image processing method provided by the embodiment of the present invention.
Referring to fig. 2, an embodiment of the present invention provides an image processing method, which may include steps 201 to 203 described below.
Step 201, the electronic device determines a sky area of a target image.
Optionally, the electronic device may determine the sky area in the target image by using an edge detection method.
For example, the electronic device may divide the target image into a plurality of regions, extract a first region in the target image, and perform color edge detection on the first region to obtain an edge image; then, carrying out binarization processing on the edge image to obtain a processed fine edge image; and finally, detecting the fine edge image, and if the fine edge image meets a first preset condition (determined according to actual use requirements), judging that a first area corresponding to the fine edge image is a sky area. For the specific edge detection process, reference may be made to related technologies, which are not described herein in detail.
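As a rough illustration of this edge-based check, a minimal sketch follows, assuming OpenCV and NumPy are available; the upper-third crop and the edge-density threshold are hypothetical stand-ins for the unspecified "first region" and "first preset condition".

```python
# A minimal sketch of edge-based sky detection, assuming OpenCV/NumPy.
# The upper-third crop and the edge-density threshold are hypothetical
# stand-ins for the patent's "first region" and "first preset condition".
import cv2
import numpy as np

def looks_like_sky(region_bgr: np.ndarray, max_edge_density: float = 0.02) -> bool:
    """Sky tends to be smooth: few fine color/intensity edges."""
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                             # color edge detection
    _, fine = cv2.threshold(edges, 127, 255, cv2.THRESH_BINARY)  # binarized fine-edge image
    return np.count_nonzero(fine) / fine.size < max_edge_density

image = cv2.imread("target.jpg")
first_region = image[: image.shape[0] // 3]                      # e.g. the upper third of the frame
print("sky candidate:", looks_like_sky(first_region))
```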
Optionally, the electronic device may also determine the sky region in combination with the depth image and the two-dimensional image of the content of the target image.
For example, the electronic device may determine the sky region according to a depth value and a depth gradient value of each pixel point of the depth image and a gray value of each pixel point of the two-dimensional image. The detailed process can refer to the related art, and is not described herein.
Optionally, the electronic device may also determine the sky region based on the sky segmentation network model.
For example, the electronic device first trains and generates a sky segmentation network model according to a large amount of sky sample data, and then inputs a target image into the sky segmentation network model to obtain a sky region.
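The sketch below illustrates this inference step under stated assumptions: the model file "sky_seg.h5", the 256×256 input size, and the 0.5 cut-off are illustrative choices, not values from the patent.

```python
# A hedged sketch of sky-region inference with a trained segmentation model.
# "sky_seg.h5", the 256x256 input size and the 0.5 cut-off are assumptions.
import cv2
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("sky_seg.h5")     # hypothetical trained model
image = cv2.imread("target.jpg")
inp = cv2.resize(image, (256, 256)).astype(np.float32) / 255.0
prob = model.predict(inp[np.newaxis])[0, ..., 0]     # per-pixel sky probability
mask = cv2.resize((prob > 0.5).astype(np.uint8),     # back to original resolution
                  (image.shape[1], image.shape[0]))
sky_region = image[mask.astype(bool)]                # pixels classified as sky
```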
Step 202, the electronic device judges whether the target image is a sunset scene image or a sunrise scene image according to the sky area.
It can be understood that if yes, step 203 is performed; otherwise, the target image is not processed.
Optionally, the electronic device may determine whether the target image is a sunset scene image or a sunrise scene image according to a proportion of yellow pixel points in the sky region.
Alternatively, the electronic device may calculate a color temperature of the sky region, and then determine whether the target image is a sunset scene image or a sunrise scene image according to the color temperature.
For example, the electronic device may divide the sky region into M × N blocks, say 25 × 25, and count the basic information of each block (the number of white pixels and the mean of each of the R/G/B channels). Then, according to the basic information of each block, all white blocks in the image are found, and the color temperature of each white block is judged against a color temperature curve, so as to obtain all possible color temperatures in the image. For example, among the 25 × 25 = 625 blocks, 100 effective white blocks are found in total, and 80 of them indicate a color temperature of about 4500 K, so the color temperature of the sky region is essentially 4500 K. When the color temperature of the sky area falls within a preset range, the target image can be judged to be a sunset scene image or a sunrise scene image; otherwise, it is not. The value of the preset range can be determined according to actual use requirements, and the embodiment of the invention is not limited. For the specific process of calculating the color temperature of the sky region and judging according to the color temperature, reference may be made to the related art, which is not described herein again.
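A minimal sketch of these block statistics follows; the "white block" test and the ratio-to-Kelvin mapping are toy assumptions standing in for the patent's unspecified color temperature curve.

```python
# A sketch of the block statistics described above. The "white block" test and
# the ratio-to-Kelvin mapping are toy stand-ins for the patent's color
# temperature curve, which is not specified here.
import numpy as np

def estimate_sky_color_temperature(sky_rgb: np.ndarray, m: int = 25, n: int = 25) -> float:
    h, w, _ = sky_rgb.shape
    temps = []
    for i in range(m):
        for j in range(n):
            block = sky_rgb[i * h // m:(i + 1) * h // m, j * w // n:(j + 1) * w // n]
            if block.size == 0:
                continue
            mean_r, mean_g, mean_b = block.reshape(-1, 3).mean(axis=0)  # R/G/B channel means
            # assumed "white" criterion: bright and roughly neutral
            if min(mean_r, mean_g, mean_b) > 180 and \
               max(mean_r, mean_g, mean_b) - min(mean_r, mean_g, mean_b) < 25:
                temps.append(6500.0 * mean_b / mean_r)  # toy curve: warm light -> lower Kelvin
    # the dominant temperature among the effective white blocks
    return float(np.median(temps)) if temps else 0.0
```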
Illustratively, the step 202 may be specifically implemented by the following steps 202a to 202c.
Step 202a, the electronic device calculates a target proportion.
The target ratio is: the ratio of the number of yellow pixel points in the sky area to the number of total pixel points in the sky area.
It can be understood that, in the embodiment of the present invention, taking the RGB color space as an example, the electronic device may determine whether a pixel is a yellow pixel according to the R, G, B value of the pixel.
For example, the electronic device may set the R, G, B value ranges belonging to the yellow pixels in advance, and if the R, G, B value of one pixel falls within the corresponding ranges, it may be determined that the one pixel is the yellow pixel, otherwise, it is not.
Step 202b, the electronic device determines that the target image is a sunset scene image or a sunrise scene image when the target ratio is greater than or equal to a preset threshold.
Step 202c, the electronic device determines that the target image is not a sunset scene image or a sunrise scene image when the target proportion is smaller than a preset threshold.
The preset threshold can be set according to actual use requirements, and the embodiment of the invention is not limited.
It is understood that the electronic device determines whether the target ratio is greater than or equal to a preset threshold. If the target proportion is larger than or equal to a preset threshold value, determining that the target image is a sunset scene image or a sunrise scene image; otherwise, the target image is determined not to be a sunset scene image or a sunrise scene image.
In the embodiment of the invention, the electronic equipment calculates the target proportion, and judges whether the target image is a sunset scene image or a sunrise scene image according to the target proportion, so that the calculation amount is small, and the accuracy is high.
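A minimal sketch of steps 202a to 202c is shown below; the yellow value ranges and the preset threshold of 0.4 are illustrative assumptions, since the text above leaves both to actual use requirements.

```python
# A minimal sketch of steps 202a-202c. The yellow ranges and the preset
# threshold of 0.4 are illustrative assumptions left open by the text above.
import numpy as np

def is_sunset_or_sunrise(sky_rgb: np.ndarray, preset_threshold: float = 0.4) -> bool:
    r = sky_rgb[..., 0].astype(int)
    g = sky_rgb[..., 1].astype(int)
    b = sky_rgb[..., 2].astype(int)
    # assumed yellow ranges: strong red and green, clearly weaker blue
    yellow = (r > 150) & (g > 100) & (b < 120) & (r > b + 40)
    target_ratio = np.count_nonzero(yellow) / yellow.size   # step 202a
    return target_ratio >= preset_threshold                 # steps 202b/202c
```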
Step 203, the electronic device enhances the color saturation of the sky area when it is determined that the target image is a sunset scene image or a sunrise scene image.
It can be understood that, in the embodiment of the present invention, the electronic device can alleviate the problem of dull colors in a sunset scene image or a sunrise scene image by enhancing the color saturation of the pixels in the sky region.
For example, the electronic device may increase the S (saturation) value of a pixel point in the HSI (or HSV) color space. If the sky area is not an image in the HSI (or HSV) color space, the electronic device may convert the sky area into an image in the HSI (or HSV) color space, and the specific conversion process may refer to related technologies, which are not described herein again.
It should be noted that, in the embodiment of the present invention, the electronic device may enhance the color saturation of the sky area according to a predetermined rule, and the embodiment of the present invention is not limited. For example, the electronic device may enhance the color saturation of the sky region by a predetermined ratio.
The predetermined rule may be determined according to actual usage requirements, and the embodiment of the present invention is not limited.
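A sketch of this enhancement step in HSV space follows, assuming OpenCV; the gain of 1.3 is one possible "predetermined ratio" and is an assumption, not a value from the patent.

```python
# A sketch of the enhancement step in HSV space, assuming OpenCV. The gain of
# 1.3 is one possible "predetermined ratio"; the patent leaves it open.
import cv2
import numpy as np

def enhance_sky_saturation(image_bgr: np.ndarray, sky_mask: np.ndarray,
                           gain: float = 1.3) -> np.ndarray:
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    sky = sky_mask.astype(bool)
    hsv[..., 1][sky] = np.clip(hsv[..., 1][sky] * gain, 0, 255)  # boost S in the sky only
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```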
For ease of understanding, the following explains the concept of color space, and illustrates several color spaces.
Color is usually described by three relatively independent attributes; the three independent variables together form a spatial coordinate system, namely a color space. The color object being described is itself objective; different color spaces merely measure the same object from different angles. Color spaces can be divided into two main categories according to their basic mechanism: primary color spaces (also referred to as trichromatic color spaces) and color/luminance-separated color spaces. The two categories can be converted into each other; for the specific conversion process, reference may be made to the related art, which is not described herein again.
Primary color space: what color a pixel point exhibits is determined by the three primary color components. For example, one pixel includes three sub-pixels of red (R), green (G), and blue (B), and the three primary color components are the R value, the G value, and the B value. For another example, if a pixel includes three sub-pixels of cyan (C), magenta (M), and yellow (Y), the three primary color components may be C, M, and Y values. In the present embodiment, the component values are preferably RGB component values or CMY component values, and may be other color component values.
It should be noted that, in the embodiment of the present invention, one pixel point may be formed by only the three sub-pixels, and certainly, a fourth sub-pixel, for example, a white (W) sub-pixel, may also be included. At this time, since white does not belong to a primary color, reference is still made to the above description for the three primary color components of such a pixel.
Color, brightness separation color space: i.e., a color space consisting of luminance, which may also be referred to as gray scale or gray scale, and chrominance, which includes hue and saturation. In the embodiment of the present invention, the color-luminance separation color space may be a YUV color space, a YCbCr color space, an HSI color space, an HSV color space, or the like.
In the following, a few common color spaces including luminance and chrominance are briefly described.
YUV: Y denotes Luminance (or Luma), i.e., the gray scale, and U and V denote Chrominance (or Chroma). In YUV, the Y value is the gray value.
YCbCr: a scaled and offset version of YUV. Y has the same meaning as in YUV; Cb and Cr denote chrominance, differing from U and V only in representation. Within the YUV family, YCbCr is the member most used in computer systems, and its application field is very wide; JPEG and MPEG both adopt this format, and "YUV" mostly refers to YCbCr in practice. In YCbCr, the Y value is the gray value.
HSI: reflects the way the human visual system perceives color, describing it with three basic feature quantities: hue (H), saturation (S), and intensity (I). In HSI, the I value is the gray value.
HSV: a color space created according to the visual characteristics of color; its parameters are hue (H), saturation (S), and value (V). In HSV, the V value is the gray value.
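As a concrete illustration of moving between a primary color space and the color/luminance-separated spaces above, the short example below converts one RGB pixel to HSV with the Python standard library and to full-range JPEG (BT.601) YCbCr from its defining formula.

```python
# Worked one-pixel conversions illustrating the spaces above: HSV via the
# standard library, and full-range (JPEG/BT.601) YCbCr from its defining formula.
import colorsys

r, g, b = 200, 120, 60                                # a warm, sunset-like pixel
h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
print(f"HSV: H={h * 360:.0f} deg, S={s:.2f}, V={v:.2f}")

y = 0.299 * r + 0.587 * g + 0.114 * b                 # Y: the gray value
cb = -0.1687 * r - 0.3313 * g + 0.5 * b + 128
cr = 0.5 * r - 0.4187 * g - 0.0813 * b + 128
print(f"YCbCr: Y={y:.0f}, Cb={cb:.0f}, Cr={cr:.0f}")
```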
The embodiment of the invention provides an image processing method, in which an electronic device can determine the sky area of a target image; judge whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhance the color saturation of the sky area when the target image is determined to be a sunset scene image or a sunrise scene image. With this scheme, when the target image is judged to be a sunset scene image or a sunrise scene image according to its sky area, the color saturation of the sky area is enhanced, so that the colors of the target image become more vivid and vibrant, which further improves the overall effect of the target image.
Optionally, the electronic device may determine a sky region of the target image based on the sky segmentation network model.
Exemplarily, with reference to fig. 2, as shown in fig. 3, the step 201 may be specifically implemented by the following step 201a.
Step 201a, the electronic device determines the sky area of the target image based on a sky segmentation network model.
The sky segmentation network model is generated from sky sample images.
It is to be appreciated that the electronic device can input the target image into the sky segmentation network model to obtain a sky region.
In the embodiment of the invention, the electronic equipment determines the sky area of the target image based on the sky segmentation network model, so that the processing speed can be increased, and the target image can be rapidly processed to improve the display effect of the target image.
Optionally, before step 201a, the electronic device needs to generate the sky segmentation network model.
Illustratively, in conjunction with fig. 3, as shown in fig. 4, before step 201a described above, the image processing method provided by the embodiment of the present invention may further include steps 204 and 205 described below.
Step 204, the electronic device acquires sky sample data.
In the embodiment of the present invention, the sky sample data is a large number of sample images that include a sky area.
For example, the electronic device may capture a plurality of sample images including a sky area, or download such sample images from a network, and the like.
And step 205, the electronic device generates the sky segmentation network model according to the sky sample data and the artificial intelligence learning model.
The artificial intelligence learning model is established according to an artificial intelligence learning algorithm. The artificial intelligence learning algorithm includes a machine learning algorithm and others, and specific reference is made to the related art, which is not described herein again. The machine learning algorithm further includes a regression algorithm, a bayesian algorithm, a decision tree algorithm, a deep learning algorithm, and others, which are specifically referred to the related art and will not be described herein. Therefore, the artificial intelligence learning model includes a machine learning model and the like, and the machine learning model includes a regression algorithm model, a bayesian algorithm model, a decision tree algorithm model, a deep learning model and the like.
For example, in the embodiment of the present invention, the artificial intelligence learning model may be a TensorFlow model or another multitask deep learning model, and the embodiment of the present invention is not limited.
In the following, an artificial intelligence learning model is taken as an example to explain how the electronic device generates a sky segmentation network model according to the sky sample data and the artificial intelligence learning model.
A basic model (deep learning model) can be established according to a deep learning algorithm, and then the basic model is continuously trained by using collected sky sample data to generate a model meeting the target requirement, namely a sky segmentation network model.
The target requirement is a criterion for judging the accuracy of the generated sky region. For example, the accuracy (or score) of the sky region generated by the sky segmentation network model, compared with the actual sky region, is counted; when the accuracy (or score) is greater than or equal to a threshold, the target requirement is judged to be met. For the specific process of generating the sky segmentation network model, reference may be made to the related art, which is not described herein again.
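A hedged sketch of steps 204 and 205 with tf.keras follows; the tiny architecture, the 256×256 input size, the epoch count, and the file name are all illustrative assumptions rather than the patent's method.

```python
# A hedged sketch of steps 204-205 with tf.keras; the tiny architecture, the
# 256x256 size, the epoch count and file name are all illustrative assumptions.
import tensorflow as tf

def build_sky_segmentation_model() -> tf.keras.Model:
    inputs = tf.keras.Input(shape=(256, 256, 3))
    x = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    outputs = tf.keras.layers.Conv2D(1, 1, activation="sigmoid")(x)  # per-pixel sky probability
    return tf.keras.Model(inputs, outputs)

model = build_sky_segmentation_model()                # the basic model
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# images: (N, 256, 256, 3) sky sample data; masks: (N, 256, 256, 1) labels
# model.fit(images, masks, epochs=20)                 # train until the target requirement is met
# model.save("sky_seg.h5")
```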
Optionally, before step 201, the electronic device may download the target image from a network or obtain the target image from a gallery of the electronic device, or the electronic device may capture the target image through a camera application of the electronic device.
Illustratively, in conjunction with fig. 2, as shown in fig. 5, before step 201 described above, the image processing method provided by the embodiment of the present invention may further include steps 206 and 207 described below.
Step 206, the electronic device receives an input of a user.
Optionally, the input of the user is an input for triggering the electronic device to shoot the target image, and may be a touch screen input on a shooting preview interface, a shortcut key input on a screen locking interface, or the like.
Optionally, the input of the user may be a click input for triggering the electronic device to shoot the target image, a slide input for triggering the electronic device to shoot the target image, or other feasibility inputs, which is not limited in the embodiment of the present invention.
Illustratively, the click input may be a click input of any number of clicks or a multi-finger click input, such as a single-click input, a double-click input, a triple-click input, a two-finger click input, or a three-finger click input; the slide input may be a slide input in any direction or a multi-finger slide input, for example, an upward, downward, leftward, or rightward slide input, a two-finger slide input, or a three-finger slide input.
Step 207, the electronic device shoots the target image in response to the input of the user.
In the embodiment of the present invention, if, in the process of shooting the target image, the electronic device judges according to the sky area that the target image is a sunset scene image or a sunrise scene image, the electronic device can enhance the color saturation of the sky area. This improves the display effect of the target image and makes it more vivid; moreover, compared with the traditional shooting technology, the user does not need to process the image with image processing software after shooting, which improves both the quality and the efficiency of image capture.
Optionally, after step 203, the electronic device may further display the processed target image, so as to facilitate the user to preview the processing effect.
Illustratively, referring to fig. 5, as shown in fig. 6, after the step 203, the image processing method provided by the embodiment of the present invention may further include a step 208 described below.
And step 208, the electronic equipment displays the processed target image.
The processed target image is the target image after the color saturation of the sky area is enhanced.
In the embodiment of the invention, the user can trigger the electronic equipment to store the processed target image through input according to the display effect of the processed target image, or trigger the electronic equipment to process the target image again through input, so that a better display effect is achieved.
The flows shown in the drawings of the embodiments of the present invention are each illustrated with an independent embodiment; in specific implementation, each flow may also be combined with any other flow that can be combined with it, and the embodiments of the present invention are not limited thereto. For example, with reference to fig. 3, before step 201a, the image processing method provided in the embodiment of the present invention may further include steps 206 and 207 described above.
As shown in fig. 7, an embodiment of the present invention provides an electronic device 120, where the electronic device 120 includes: a determining module 121, a judging module 122 and an enhancing module 123; the determining module 121 is configured to determine a sky region of the target image; the judging module 122 is configured to judge whether the target image is a sunset scene image or a sunrise scene image according to the sky area determined by the determining module 121; the enhancing module 123 is configured to enhance the color saturation of the sky area when the determining module 122 determines that the target image is a sunset scene image or a sunrise scene image.
Optionally, the determining module 122 is specifically configured to calculate a target ratio, where the target ratio is: the ratio of the number of yellow pixel points in the sky area to the number of total pixel points in the sky area; determining the target image as a sunset scene image or a sunrise scene image under the condition that the target proportion is greater than or equal to a preset threshold value; and determining that the target image is not a sunset scene image or a sunrise scene image in the case that the target proportion is smaller than the preset threshold.
Optionally, the determining module 121 is specifically configured to determine a sky region of the target image based on a sky segmentation network model, where the sky segmentation network model is generated according to a sky sample image.
Optionally, the electronic device 120 further includes: an acquisition module 124 and a generation module 125; the obtaining module 124 is configured to obtain sky sample data before the determining module 121 determines the sky region of the target image based on the sky segmentation network model; the generating module 125 is configured to generate the sky segmentation network model according to the sky sample data and the artificial intelligence learning model acquired by the acquiring module 124.
Optionally, the electronic device 120 further includes: a receiving module 126 and a photographing module 127; the receiving module 126 is configured to receive a user input before the determining module 121 determines the sky area of the target image; the photographing module 127 is configured to photograph the target image in response to the user input received by the receiving module 126.
It should be noted that, as shown in fig. 7, modules that are necessarily included in the electronic device 120 are indicated by solid line boxes, such as a determining module 121, a determining module 122, and an enhancing module 123; modules that may or may not be included in the electronic device 120 are illustrated with dashed boxes, such as an acquisition module 124, a generation module 125, a reception module 126, and a capture module 127.
The electronic device provided in the embodiment of the present invention is capable of implementing each process shown in any one of fig. 2 to 6 in the above method embodiments, and details are not described here again to avoid repetition.
The embodiment of the invention provides an electronic device, which can determine the sky area of a target image; judge whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhance the color saturation of the sky area when the target image is determined to be a sunset scene image or a sunrise scene image. With this scheme, when the target image is judged to be a sunset scene image or a sunrise scene image according to its sky area, the color saturation of the sky area is enhanced, so that the colors of the target image become more vivid and vibrant, which further improves the overall effect of the target image.
Fig. 8 is a schematic hardware structure diagram of an electronic device implementing various embodiments of the present application. As shown in fig. 8, the electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 8 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 110 is configured to determine a sky region of the target image; judging whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhancing the color saturation of the sky area under the condition that the target image is determined to be a sunset scene image or a sunrise scene image.
According to the electronic device provided by the embodiment of the invention, the electronic device can determine the sky area of a target image; judge whether the target image is a sunset scene image or a sunrise scene image according to the sky area; and enhance the color saturation of the sky area when the target image is determined to be a sunset scene image or a sunrise scene image. With this scheme, when the target image is judged to be a sunset scene image or a sunrise scene image according to its sky area, the color saturation of the sky area is enhanced, so that the colors of the target image become more vivid and vibrant, which further improves the overall effect of the target image.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 8, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, which may include the processor 110 shown in fig. 8, the memory 109, and a computer program stored on the memory 109 and capable of being executed on the processor 110, where the computer program, when executed by the processor 110, implements each process of the image processing method shown in any one of fig. 2 to fig. 6 in the foregoing method embodiments, and can achieve the same technical effect, and details are not described here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the image processing method shown in any one of fig. 2 to 6 in the foregoing method embodiments, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (10)
1. An image processing method applied to an electronic device, the method comprising:
determining a sky area of a target image;
judging whether the target image is a sunset scene image or a sunrise scene image according to the sky area;
enhancing color saturation of the sky region if the target image is determined to be a sunset scene image or a sunrise scene image.
2. The method of claim 1, wherein the determining whether the target image is a sunset scene image or a sunrise scene image according to the sky area comprises:
calculating a target proportion, wherein the target proportion is as follows: the ratio of the number of yellow pixel points in the sky region to the number of total pixel points in the sky region;
determining the target image as a sunset scene image or a sunrise scene image under the condition that the target proportion is greater than or equal to a preset threshold value;
determining that the target image is not a sunset scene image or a sunrise scene image if the target proportion is less than the preset threshold.
3. The method of claim 1, wherein the determining the sky region of the target image comprises:
determining a sky region of the target image based on a sky segmentation network model, wherein the sky segmentation network model is generated according to a sky sample image.
4. The method of claim 3, wherein determining the sky region of the target image based on the sky segmentation network model is preceded by:
acquiring sky sample data;
and generating the sky segmentation network model according to the sky sample data and an artificial intelligence learning model.
5. The method of claim 1, wherein prior to determining the sky region of the target image, further comprising:
receiving an input of a user;
and shooting the target image in response to the input of the user.
6. An electronic device, characterized in that the electronic device comprises: the device comprises a determining module, a judging module and an enhancing module;
the determining module is used for determining a sky area of the target image;
the judging module is used for judging whether the target image is a sunset scene image or a sunrise scene image according to the sky area determined by the determining module;
the enhancement module is used for enhancing the color saturation of the sky area under the condition that the judgment module determines that the target image is a sunset scene image or a sunrise scene image.
7. The electronic device according to claim 6, wherein the determining module is specifically configured to calculate a target ratio, where the target ratio is: the ratio of the number of yellow pixel points in the sky region to the number of total pixel points in the sky region; determining the target image as a sunset scene image or a sunrise scene image under the condition that the target proportion is greater than or equal to a preset threshold value; determining that the target image is not a sunset scene image or a sunrise scene image if the target proportion is less than the preset threshold.
8. The electronic device of claim 6, wherein the determining module is specifically configured to determine the sky region of the target image based on a sky segmentation network model, and the sky segmentation network model is generated from a sky sample image.
9. The electronic device of claim 8, further comprising: the device comprises an acquisition module and a generation module;
the acquisition module is used for acquiring sky sample data before the determination module determines the sky area of the target image based on a sky segmentation network model;
the generation module is used for generating the sky segmentation network model according to the sky sample data and the artificial intelligence learning model acquired by the acquisition module.
10. The electronic device of claim 6, further comprising: the device comprises a receiving module and a shooting module;
the receiving module is used for receiving the input of a user before the determining module determines the sky area of the target image;
the shooting module is used for responding to the input of the user received by the receiving module and shooting the target image.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN201911150964.3A | 2019-11-21 | 2019-11-21 | Image processing method and electronic equipment |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN201911150964.3A | 2019-11-21 | 2019-11-21 | Image processing method and electronic equipment |
Publications (1)
| Publication Number | Publication Date |
| --- | --- |
| CN110944163A (en) | 2020-03-31 |
Family
ID=69907227
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN201911150964.3A (pending; published as CN110944163A) | Image processing method and electronic equipment | 2019-11-21 | 2019-11-21 |
Country Status (1)
| Country | Link |
| --- | --- |
| CN | CN110944163A (en) |
Patent Citations (6)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN103067661A (en) * | 2013-01-07 | 2013-04-24 | Huawei Device Co., Ltd. | Image processing method, image processing device and shooting terminal |
| CN103942523A (en) * | 2013-01-18 | 2014-07-23 | Huawei Device Co., Ltd. | Sunshine scene recognition method and device |
| CN103945088A (en) * | 2013-01-21 | 2014-07-23 | Huawei Device Co., Ltd. | Method and device for scene recognition |
| CN107948538A (en) * | 2017-11-14 | 2018-04-20 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Imaging method, device, mobile terminal and storage medium |
| CN108024105A (en) * | 2017-12-14 | 2018-05-11 | Zhuhai Juntian Electronic Technology Co., Ltd. | Image color adjusting method, device, electronic equipment and storage medium |
| CN108600630A (en) * | 2018-05-10 | 2018-09-28 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Photographic method, device and terminal device |
Cited By (4)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN113034514A (en) * | 2021-03-19 | 2021-06-25 | Arashi Vision Inc. (Insta360) | Sky region segmentation method and device, computer equipment and storage medium |
| CN114286005A (en) * | 2021-12-29 | 2022-04-05 | Hozon New Energy Automobile Co., Ltd. | Image display method and device for vehicle skylight |
| CN114510187A (en) * | 2022-01-28 | 2022-05-17 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Image display method and apparatus, electronic device, and medium |
| CN114510187B (en) * | 2022-01-28 | 2023-06-23 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Image display method and device, electronic equipment and medium |
Legal Events
| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20200331 |