
CN111510698A - Image processing method, device, storage medium and mobile terminal - Google Patents


Info

Publication number
CN111510698A
CN111510698A
Authority
CN
China
Prior art keywords
image
mobile terminal
target
original image
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010327508.8A
Other languages
Chinese (zh)
Inventor
吴泰云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huizhou TCL Mobile Communication Co Ltd
Original Assignee
Huizhou TCL Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huizhou TCL Mobile Communication Co Ltd filed Critical Huizhou TCL Mobile Communication Co Ltd
Priority to CN202010327508.8A priority Critical patent/CN111510698A/en
Publication of CN111510698A publication Critical patent/CN111510698A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/68Circuits for processing colour signals for controlling the amplitude of colour signals, e.g. automatic chroma control circuits

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method, an image processing device, a storage medium and a mobile terminal. The method is applied to the mobile terminal and comprises the following steps: acquiring an original image captured by the mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel; determining the brightest area in the original image and performing exposure processing on that area to obtain intermediate image data; and performing logarithmic conversion on the intermediate image data to obtain a target image. In the embodiments of the application, no gamma correction is applied to the intermediate image data; logarithmic conversion is used instead, which preserves more detail in both the bright and dark parts of the image. In addition, because exposure is metered on the brightest area, overexposure and the resulting loss of detail are avoided.

Description

Image processing method, device, storage medium and mobile terminal
Technical Field
The present application relates to the field of communications technologies, and in particular, to an image processing method and apparatus, a storage medium, and a mobile terminal.
Background
At present, when a mobile terminal captures image or video data, the processing pipeline includes a gamma correction stage. Its main purpose is to compensate for the characteristics of human vision, so that the data bits (or bandwidth) representing luminance are used as efficiently as possible given human perception of light. Specifically, the dark and bright portions of the image signal are detected and the ratio between them is increased, which improves image contrast to match the perceptual characteristics of human vision. However, because of how gamma correction works, the brightest and darkest portions of the image may not be well represented, so many detail features are lost, and the resulting image or video data performs poorly in post-processing.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and a mobile terminal, which can enable the mobile terminal to realize a brand-new shooting mode, and enable shot image data to keep more detailed characteristics.
The embodiment of the application provides an image processing method, which comprises the following steps:
acquiring an original image acquired by a mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel;
determining the brightest area in the original image, and carrying out exposure processing on the brightest area to obtain intermediate image data;
and carrying out logarithmic conversion on the intermediate image data to obtain a target image.
An embodiment of the present application further provides an image transmission device, including:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring an original image acquired by a mobile terminal, and the color depth of the original image is at least 10 bits per pixel;
the processing unit is used for determining the brightest area in the original image and carrying out exposure processing on the brightest area to obtain intermediate image data;
and the conversion unit is used for carrying out logarithmic conversion on the intermediate image data to obtain the target image.
Further, the obtaining unit is further configured to obtain a preset color index table corresponding to the color depth; and the conversion unit is also used for carrying out color gamut conversion on the target image according to the preset color index table so as to obtain a converted image.
Further, the obtaining unit is specifically configured to obtain a display mode determined by a user; and determining a preset color index table corresponding to the color depth according to the display mode.
Further, a conversion unit, specifically configured to normalize pixel values of the intermediate image data; carrying out logarithmic conversion on the normalized pixel value by using a logarithmic function to obtain an intermediate pixel value; and converting the intermediate pixel value into a target pixel value corresponding to the color depth, and taking an image corresponding to the target pixel value as a target image.
Further, an exposure unit, in particular for dividing the original image into a plurality of image blocks; calculating the average value of pixels of a plurality of image blocks; and taking the area corresponding to the image block with the highest pixel average value as the brightest area in the original image.
Furthermore, the original image comprises each frame of original image collected when the video is shot, the target image comprises each frame of corresponding target image, and the conversion unit is also used for obtaining video data according to each frame of target image; the image processing apparatus further includes: and the compression unit is used for compressing the video data to obtain target video data.
Further, the obtaining unit is further configured to obtain target video data according to the video viewing instruction if the video viewing instruction is received; and the compression unit is also used for decompressing the target video data to obtain video data, and the video data comprises a plurality of frames of target images.
The embodiment of the application also provides a computer readable storage medium, wherein a plurality of instructions are stored in the computer readable storage medium, and the instructions are suitable for being loaded by a processor to execute any one of the image processing methods.
The embodiment of the application further provides a mobile terminal, which comprises a processor and a memory, wherein the processor is electrically connected with the memory, the memory is used for storing instructions and data, and the processor is used for executing any step in the image processing method.
According to the image processing method, device, storage medium and mobile terminal above, an original image with at least 10-bit color depth captured by the mobile terminal is obtained, exposure processing is performed on the brightest area of the original image to obtain intermediate image data, and the intermediate image data is logarithmically converted to obtain the target image. Understandably, the intermediate image data undergoes logarithmic conversion instead of gamma correction, and the logarithmic conversion preserves more detail in the bright and dark parts of the image; this realizes log shooting on the mobile terminal. On the other hand, after the original image is obtained, the brightest area in the original image is determined and exposure processing is performed on it, so overexposure of the target image obtained by the mobile terminal, and the loss of detail it would cause, is avoided.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 2a is a diagram illustrating a processing flow of a camera according to an embodiment of the present application.
Fig. 2b is a flowchart illustrating a preset image processing procedure according to an embodiment of the present disclosure.
Fig. 2c is a diagram illustrating a simple flow of image processing in the prior art according to an embodiment of the present application.
Fig. 2d is a diagram illustrating a simple flow of image processing according to an embodiment of the present application.
Fig. 2e is a comparison graph of a general image and a logarithmically converted image provided in the embodiment of the present application.
Fig. 2f is a flowchart illustrating an image processing according to an embodiment of the present application.
Fig. 3 is another schematic flow chart of an image processing method according to an embodiment of the present application.
Fig. 4a is a flowchart of a video processing method according to an embodiment of the present application.
Fig. 4b is a diagram illustrating a process of obtaining target video data according to an embodiment of the present application.
Fig. 4c is a diagram illustrating an example of a process for viewing target video data according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an image transmission device according to an embodiment of the present application.
Fig. 6 is another schematic structural diagram of the image transmission device according to the embodiment of the present application.
Fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Fig. 8 is another schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an image processing method, an image processing device, a storage medium and a mobile terminal. Any image processing device provided by the embodiment of the application can be integrated in a mobile terminal, and the mobile terminal may include a smartphone, a tablet, a wearable device, a robot, and the like. The terminal device is provided with a camera, and the camera comprises a photosensitive element, which can also be called an image sensor, such as a Complementary Metal-Oxide-Semiconductor sensor (CMOS sensor); the color depth of the photosensitive element is at least 10 bits.
In computer graphics, color depth is the number of bits used to store the color of one pixel in a bitmap or video frame buffer, also expressed in bits per pixel (bpp). The higher the color depth, the more colors are available. A color depth of n bits ("n-bit color") provides 2^n colors, and n bits are used to store each pixel. For example, with 8-bit color depth, 8 bits store the color of each pixel, giving 2^8 = 256 colors, so the color of one pixel can simply be represented by a value from 0 to 255; with 10-bit color depth, 10 bits store the color of each pixel, giving 2^10 = 1024 colors, so the color of one pixel can be represented by a value from 0 to 1023; and so on.
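The relationship between color depth and the number of representable values can be sketched as follows (the helper name is illustrative, not from the patent):

```python
# An n-bit color depth gives 2**n representable values per pixel,
# numbered 0 to 2**n - 1.
def color_levels(bits: int) -> int:
    return 2 ** bits

for bits in (8, 10, 12):
    top = color_levels(bits) - 1
    print(f"{bits}-bit color depth: {color_levels(bits)} values, 0..{top}")
```

So moving from an 8-bit to a 10-bit sensor quadruples the number of distinguishable levels per channel.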
Referring to fig. 1, fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application, where the image processing method includes the following steps:
101, acquiring an original image collected by a mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel.
When the mobile terminal shoots a video or a picture, it does so with its camera. As shown in fig. 2a, the camera processing steps are as follows: the scene forms an optical signal through a lens, the lens projects the optical signal onto the photosensitive area of an image sensor, and the image sensor outputs an original image after photosensitive processing (photoelectric conversion); this original image is the original image collected by the mobile terminal. An image information processor (Image Signal Processor, ISP) then processes the original image and outputs an image in the RGB space domain. Throughout this process, the image information processor controls its own logic, the lens and the image sensor through firmware running on it, completing functions such as automatic exposure and automatic white balance. The original image may be an image in Bayer format. It should be noted that steps 101 to 103 in the embodiment of the present application are executed in the image information processor of the mobile terminal.
It should be noted that the image sensor of a typical current mobile terminal has a color depth of 8 bits, and the original image exposed by an 8-bit image sensor does not have enough color latitude to achieve the desired effect. Thus, step 101 comprises: replacing the current image sensor of the mobile terminal with an image sensor of at least 10-bit color depth, performing photosensitive processing with that sensor, and obtaining the original image after its photosensitive processing; this original image is the original image collected by the mobile terminal. In this way the required color latitude is obtained and more detail can be preserved. After photosensitive processing by an image sensor with at least 10-bit color depth, the obtained original image is image data of the corresponding color depth, i.e. each pixel in the original image is represented with that color depth. If the color depth of the image sensor is 10 bits, each pixel in the original image is represented with 10-bit color depth; if it is 12 bits, each pixel is represented with 12-bit color depth.
In some cases, the mobile terminal has multiple cameras, such as three or four. Each camera has one image sensor; suppose the image sensors in some cameras have 8-bit color depth and those in others have 10-bit color depth. The mobile terminal can provide a corresponding control, such as a "log shooting" control: when the user wants to perform log shooting, triggering the "log shooting" control triggers an adjustment instruction, and according to the adjustment instruction the mobile terminal switches the current camera to a camera whose image sensor has 10-bit color depth. In this case, step 101 includes: replacing the image sensor currently used by the mobile terminal with an image sensor of at least 10-bit color depth, performing photosensitive processing with that sensor, and obtaining the original image after its photosensitive processing. Specifically, replacing the current image sensor with an image sensor of at least 10-bit color depth comprises: when the adjustment instruction is received, switching the camera currently used by the mobile terminal to a camera that includes an image sensor of at least 10-bit color depth. Photosensitive processing is then performed with the image sensor of the selected camera, and the original image after its photosensitive processing is obtained; this is the original image collected by the mobile terminal, in which each pixel is represented with the corresponding color depth.
And 102, determining the brightest area in the original image, and performing exposure processing on the brightest area to obtain intermediate image data.
Specifically, step 102 includes: carrying out preset image processing on the original image to obtain processed image data; and determining the brightest area in the processed image data, and carrying out exposure processing on the brightest area to obtain intermediate image data.
The preset image processing performed by the image information processor comprises steps such as: linear correction, lens shading correction, denoising, dead pixel removal, demosaicing, edge enhancement, color correction and automatic white balance, as shown in fig. 2b. It should be noted that the preset image processing may also comprise more steps and may be performed in a different order; the steps shown in fig. 2b are only an example. The steps in fig. 2b will now be briefly described.
Linear correction. The original image requires linear correction because the response of the image sensor is not linear over the whole range of 0 to 1023 (taking 10-bit color depth as an example): in high-brightness areas, the higher the brightness value, the more easily the response saturates, and oversaturation also occurs more easily in dim light. Therefore, a piecewise linear function can be used to correct the input values of the original image.
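A piecewise linear input correction of this kind can be sketched as follows for 10-bit data; the breakpoints below are invented for illustration, since the patent does not give the actual curve:

```python
import numpy as np

# Hypothetical correction curve for 10-bit raw values: the mid-range is
# kept linear while the saturating top of the range is compressed.
KNOTS = [0.0, 512.0, 896.0, 1023.0]
VALUES = [0.0, 512.0, 820.0, 900.0]

def piecewise_linear_correct(raw):
    """Map raw sensor values through a piecewise linear curve.

    np.interp performs the segment-wise linear interpolation between
    the breakpoints, so the mapping is exactly a piecewise linear function.
    """
    return np.interp(raw, KNOTS, VALUES)
```

With these breakpoints, values up to 512 pass through unchanged while higher, near-saturated values are pulled down.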
Lens shading correction (LSC). At longer imaging distances, the oblique light beams passing through the camera lens are gradually attenuated as the field angle increases, so the obtained image is brighter in the middle and darker at the edges; this is the vignetting of the optical system, which lens shading correction compensates for.
Denoising. When a raw image is acquired with a CMOS sensor, the illumination level and the sensor itself are the main sources of noise in the image, and some additional noise is introduced during analog-to-digital conversion. Noise blurs the image as a whole and destroys many details, so the image needs to be denoised; a non-linear denoising algorithm, such as a bilateral filter, is generally adopted.
Dead pixel correction (DPC). A dead pixel is a pixel in the pixel array whose response differs significantly from the surrounding pixels; because an image sensor consists of many thousands of elements working together, the probability of dead pixels is high. Dead pixels generally fall into three categories: the first is dead points, i.e. points that always show the darkest value; the second is bright points, i.e. points that always show the brightest value; the third is drift points, i.e. pixels whose behavior differs significantly from the surrounding pixels. Because of the CFA (color filter array) used in image sensors, only one color value is available per pixel, and the two missing color values must be obtained from surrounding pixels. If a dead pixel exists in the image, it will spread outward during color interpolation until it affects the whole image, so dead pixels must be eliminated before color interpolation; they are processed by the dead pixel removal module in the image information processor.
Demosaicing (Demosaic). Because each pixel senses only one color, the information for the pixel's other two channels must be recovered; the process of finding the values of those two channels is color interpolation. Since image content varies continuously, the R, G and B values of a pixel are correlated with those of the surrounding pixels, so the surrounding values can be used to obtain the pixel's other two channel values. The most common interpolation algorithm at present computes the missing value of a pixel as the average of the surrounding pixels.
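The neighbor-averaging interpolation described above can be sketched like this (a simplified stand-in for a real demosaicing algorithm; here NaN marks positions where the channel was not sampled):

```python
import numpy as np

def neighbor_average(channel, r, c):
    """Estimate a missing channel value at (r, c) as the mean of the
    available 4-connected neighbors, the simple interpolation the text
    describes. `channel` is one color plane with NaN at unsampled sites."""
    h, w = channel.shape
    vals = [channel[i, j]
            for i, j in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
            if 0 <= i < h and 0 <= j < w and not np.isnan(channel[i, j])]
    return sum(vals) / len(vals)
```

A production demosaicer (e.g. edge-aware bilinear or gradient-corrected interpolation) refines this idea, but the averaging core is the same.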
Color correction matrix (CCM). The response of the image sensor to the spectrum usually deviates, in its RGB components, from the response of the human eye, so it needs correction. Not only the crosstalk between channels but also the response strength of each color component needs to be corrected; common practice is to correct the color with a color correction matrix.
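Applying a CCM is a per-pixel 3x3 matrix multiply; a minimal sketch (the matrix values would come from sensor calibration, not from the patent):

```python
import numpy as np

def apply_ccm(rgb, ccm):
    """Apply a 3x3 color correction matrix: each output pixel is
    ccm @ [R, G, B]. `rgb` has shape (..., 3)."""
    return rgb @ np.asarray(ccm).T

IDENTITY_CCM = np.eye(3)  # leaves colors unchanged; a calibrated CCM would not
```

The off-diagonal entries of a real CCM are what cancel the channel crosstalk mentioned above.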
Automatic white balance (AWB). The human visual system has color constancy, so human observation of objects is unaffected by the color of the light source. The image sensor, however, has no such property, so images captured under different light are tinted by the color of the light source: an image taken under a clear sky may look blue, while an object shot in candlelight may look red. To eliminate the influence of the light-source color on the imaging of the image sensor, the automatic white balance function simulates the color constancy of the human visual system.
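One classic way to realize such automatic white balance is the gray-world heuristic; the patent names no specific algorithm, so this is an illustrative choice:

```python
import numpy as np

def gray_world_awb(rgb):
    """Scale each channel so its mean equals the overall mean, removing a
    global color cast under the gray-world assumption that the average
    scene color is neutral. `rgb` has shape (H, W, 3), float values."""
    rgb = np.asarray(rgb, dtype=np.float64)
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return rgb * gains
```

After this step an image with, say, a red candlelight cast has its R channel gained down until the three channel means match.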
And after the original image is subjected to preset image processing, obtaining processed image data, determining the brightest area in the processed image data, and performing exposure processing on the brightest area.
It should be noted that, to meet users' needs, the cameras in ordinary mobile terminals use a few fixed exposure modes when shooting: for example, center-weighted exposure, i.e. metering on the central brightness of the picture, or average exposure, i.e. metering on the average brightness of the whole picture, to satisfy general demand. But center-weighted or average exposure can lose a great deal of detail in the dark or bright areas, contrary to the purpose of this application. On a professional camera, exposure depends on adjustment by a professional photographer, without a fixed exposure mode.
In the present application, it is desirable that the maximum color dynamic range (color latitude) be obtained and the brightest region be used for exposure processing. It can be understood that the brightest area in the whole frame is exposed, so that it can be ensured that other parts in the frame are not over-exposed, and the darkest part is combined to obtain a larger color dynamic range.
The brightest area in the processed image data may be determined in various ways. For example: divide the processed image data into a plurality of image blocks; calculate the pixel average of each image block; and take the area corresponding to the image block with the highest pixel average as the brightest area. Alternatively: divide the processed image data into a plurality of data blocks; calculate the pixel average of each data block and the difference between its pixel extremes; and among the blocks whose extreme difference is within a preset pixel range, take the area corresponding to the block with the highest pixel average as the brightest area. Here, the difference of the pixel extremes of a data block is the difference between the highest and lowest pixel values in the block, and the preset pixel range may be, for example, 0 to 30. Combining the extreme difference with the pixel average improves the accuracy of determining the brightest area.
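Both block-based methods above can be sketched as follows; the block size is illustrative, and passing `max_spread=None` gives the plain pixel-average variant:

```python
import numpy as np

def brightest_block(image, block=64, max_spread=None):
    """Return the (row, col) top-left corner of the brightest block.

    Tiles the luminance image into block x block regions and picks the
    tile with the highest pixel average. If max_spread is given, tiles
    whose max-min pixel difference exceeds it are skipped, as in the
    second variant described in the text."""
    h, w = image.shape
    best, best_mean = None, -1.0
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            tile = image[r:r + block, c:c + block]
            if max_spread is not None and tile.max() - tile.min() > max_spread:
                continue
            m = tile.mean()
            if m > best_mean:
                best_mean, best = m, (r, c)
    return best
```

Exposure is then metered on the returned block rather than on the picture center or the whole-frame average.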
After the brightest area is determined, the brightest area is subjected to exposure processing, and the image data subjected to exposure processing is used as intermediate image data. It should be noted that if the exposure process is not performed using the brightest area, overexposure may occur, and after overexposure, on the one hand, restoration is not possible, and on the other hand, some detail features are lost.
The above steps of the preset image processing and the exposure processing are the processes of the image processing of the original image.
And 103, carrying out logarithmic conversion on the intermediate image data to obtain a target image.
Fig. 2c is a simplified schematic diagram of a general image information processor in a mobile terminal processing an original image. The method comprises the steps of obtaining an original image collected by a mobile terminal, namely obtaining the original image subjected to photosensitive processing by a camera, firstly carrying out image processing, and then carrying out gamma correction to output an image of an RGB space domain.
Fig. 2d is a simplified schematic diagram of the image information processor in the embodiment of the present application processing the original image. The method includes the steps of acquiring an original image acquired by the mobile terminal, namely acquiring the original image subjected to photosensitive processing by the camera, performing image processing' (including the preset image processing and the brightest area exposure in the step 102), and performing logarithmic conversion to output an image in an RGB space domain.
It is noted that the image processing' in fig. 2d differs from the image processing in fig. 2c. The image processing in fig. 2c processes the original image at 8-bit color depth, whereas the image processing' in fig. 2d processes the original image at a color depth of at least 10 bits, so the image data at each processing step differs, as does its precision. The exposure in the image processing' of fig. 2d uses brightest-area exposure, while the exposure in fig. 2c uses an existing exposure method, such as center-weighted or average exposure, which runs contrary to the purpose of the embodiment of the present application.
In addition, the biggest difference between the image information processor in fig. 2d and fig. 2c in processing the original image is that: in fig. 2d logarithmic conversion is used, while in fig. 2c gamma correction is used. Gamma correction can cause the characteristics of the bright and dark portions of the image to be lost, which is not the purpose of the embodiments of the present application.
Therefore, in the embodiment of the present application, the gamma correction section in the image information processor is deleted, and the log conversion section is newly added. The logarithmic conversion expands the low gray value part of the intermediate image data, compresses the high gray value part of the intermediate image data, and can well compress the dynamic range of the image with larger pixel value change and highlight the required details.
Specifically, step 103 includes: normalizing the pixel values of the intermediate image data; performing logarithmic conversion on the normalized pixel values with a logarithmic function to obtain intermediate pixel values; and converting the intermediate pixel values into target pixel values corresponding to the color depth, taking the image formed by the target pixel values as the target image.
Here, 10-bit color depth is taken as an example. Normalizing the pixel values of the intermediate image data comprises: obtaining the pixel values of the three RGB channels and normalizing them; for example, when a pixel value of the R channel is 256, normalization gives 256/1023 ≈ 0.25. The normalized pixel value is then logarithmically converted with a logarithmic function, shown in formula (1), to obtain the intermediate pixel value. The intermediate pixel value is then converted into the target pixel value corresponding to the color depth; understandably, the intermediate pixel value is mapped back to a value from 0 to 1023. For example, when the intermediate pixel value is 0.48, it is converted into the target pixel value 0.48 × 1023 ≈ 491. In this way the target pixel values of all pixels are obtained, and the image they form is taken as the target image. The target image is the logarithmically converted image, which can also be understood as an image in logarithmic space; it is saved for further subsequent processing.
s = c·log(1 + γ)    (1)

where s is the intermediate pixel value, γ is the normalized pixel value, and c is a constant.
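The three steps above can be sketched for 10-bit data as follows. The log base and c = 1 are illustrative choices (base 2 keeps s within [0, 1] so the rescaled value stays in range); the patent leaves both unspecified:

```python
import math

def log_convert(pixel, bits=10, c=1.0):
    """Normalize a raw pixel value, apply s = c * log(1 + r), and rescale
    the result to the 0..(2**bits - 1) target range."""
    max_val = (1 << bits) - 1      # 1023 for 10-bit color depth
    r = pixel / max_val            # normalized pixel value (gamma in formula (1))
    s = c * math.log(1 + r, 2)     # intermediate pixel value
    return round(s * max_val)      # target pixel value
```

For example, a raw value of 256 maps to round(log2(1 + 256/1023) × 1023) = 330, expanding the low grays, while 0 and 1023 map to themselves.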
The logarithmically converted target image is shown in the second picture of fig. 2e. The first picture in fig. 2e is an image obtained by ordinary shooting, for example after gamma correction, and the second picture is the target image after logarithmic conversion. As can be seen from fig. 2e, many details of the dark and bright parts of the ordinarily shot image are lost, while the logarithmically converted target image retains many details of both the bright and dark parts.
In this method, the intermediate image data does not undergo gamma correction but logarithmic transformation instead, and the logarithmic transformation retains more detail features in the bright and dark portions of the image; that is, log shooting is realized on the mobile terminal, and log shooting preserves more detail features in the bright and dark portions of the image. On the other hand, after the original image is subjected to the preset image processing, the brightest area in the processed image data is determined and exposure processing is applied to that area, so that the target image obtained by the mobile terminal is not overexposed and no detail features are lost.
Although the logarithmically transformed target image in fig. 2e retains many details in the bright and dark portions, the whole image looks gray, so it needs to be processed into an image that matches human visual perception; this is handled as post-processing. Accordingly, the image processing method further includes: acquiring a preset color index table corresponding to the color depth; and performing color gamut conversion on the target image according to the preset color index table to obtain a converted image.
The color index table, or Look-Up Table (LUT), takes the color (pixel value) of each pixel as an "index" and remaps the color value of each indexed color to a new color value according to the table. A LUT is essentially a mathematical conversion model that converts a color input value into a specific output value through sampling and interpolation of colors.
Taking a 10-bit color depth as an example, a preset color index table corresponding to the color depth means that the color values of the index colors cover the pixel values 0-1023. The preset color index table may be a color index table that converts logarithmic-space image data (after logarithmic conversion) into image data for the display space of an ordinary display, which may also be called a Camera LUT; for example, the logarithmic-space image data is converted into image data of the Rec.709 standard, and the converted image is displayed.
If the preset color index table is a color index table (Camera LUT) that converts logarithmic-space image data into image data of the Rec.709 standard, the preset color index table corresponding to the color depth is obtained directly, color gamut conversion is performed on the target image according to the preset color index table to obtain a converted image, and the converted image is displayed.
The preset color index table may also be a color index table (other than the Camera LUT) that converts logarithmic-space image data into image data of any other space, for example, a color index table corresponding to image data in a canvas style (mode), a sketch style (mode), a nostalgic style (mode), a cool-tone style (mode), or the like.
Preset color index tables (Creative LUTs) corresponding to different display modes are stored in the mobile terminal. When shooting, if the user selects a certain display mode, the preset color index table corresponding to that display mode is obtained accordingly; after the target image is obtained, color gamut conversion can be performed on the target image according to that preset color index table, so as to obtain and display the image in the corresponding display mode.
In some cases, the color index table (Camera LUT) that converts logarithmic-space image data into image data of the Rec.709 standard is referred to as a first preset color index table, and the preset color index tables (Creative LUTs) corresponding to different display modes are referred to as second preset color index tables. Color gamut conversion is first performed on the target image using the first preset color index table to obtain a converted image, and color gamut conversion is then performed on the converted image according to the second preset color index table corresponding to the display mode determined by the user, to obtain an image in that display mode. Specifically, the image processing method further includes: acquiring the first preset color index table corresponding to the color depth; converting the target image according to the first preset color index table to obtain a converted image; acquiring the display mode determined by the user; determining the second preset color index table corresponding to the color depth according to the display mode; and converting the converted image according to the second preset color index table to obtain the image in that display mode. Understandably, the Camera LUT and the Creative LUT are combined so that the user obtains an image in the desired display mode.
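A 1D per-channel table is the simplest LUT form; production color-gamut LUTs are usually 3D with trilinear or tetrahedral interpolation. The sketch below uses toy tables (an identity "camera" table and a tone-inverting "creative" table, both invented here for illustration) purely to show the two-stage chaining described above:

```python
MAX_CODE = 1023  # 10-bit color depth

def apply_lut(pixel, lut):
    """Remap one 10-bit code value through a 1D look-up table (1024 entries)."""
    return lut[pixel]

# Toy tables: the "camera" LUT is the identity; the "creative" LUT inverts
# the tone scale as a crude stand-in for a stylized display mode.
camera_lut = list(range(MAX_CODE + 1))
creative_lut = [MAX_CODE - v for v in range(MAX_CODE + 1)]

def two_stage_convert(pixel):
    displayed = apply_lut(pixel, camera_lut)    # stage 1: log space -> display space
    return apply_lut(displayed, creative_lut)   # stage 2: display space -> user style
```

Because each stage is a plain table lookup, the two stages can also be pre-composed into a single table when both are fixed, trading memory for one lookup per pixel.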
In some cases, when the image data needs to be viewed, the image processing method further includes: if an image viewing instruction is received, acquiring a preset color index table corresponding to the color depth; and performing color gamut conversion on the target image according to the preset color index table to obtain a converted image. Specifically, the image data may be converted into image data of the Rec.709 standard, or into image data in a display mode determined by the user, or first into Rec.709-standard image data and then into image data in the display mode determined by the user.
In the above embodiment, it can be understood that when the captured target image is stored, the target image in logarithmic space is stored, and the stored target image retains more detail features of the bright and dark portions of the image; color gamut conversion is then performed according to the Camera LUT (and/or Creative LUT), and the converted image is displayed as image data of the Rec.709 standard and/or image data in the display mode determined by the user, which does not affect the user experience.
Fig. 3 is a schematic flowchart of an image processing method provided in an embodiment of the present application, which is applied in a mobile terminal, and the image processing method includes the following steps:
201. Acquiring each frame of original image collected by the mobile terminal when shooting a video, wherein the color depth of the original image is at least 10 bits per pixel.
It can be understood that when a video shooting instruction is received, the scene generates an optical signal through the lens of the mobile terminal, the lens projects the optical signal onto the photosensitive area of the image sensor, and the image sensor outputs each frame of original image after photosensitive processing (photoelectric conversion). It should be noted that the image sensor of the mobile terminal has a color depth of at least 10 bits, so each frame of original image output after the photosensitive processing, that is, the image data collected by the mobile terminal, also corresponds to the 10-bit color depth. Please refer to the description of step 101.
202. Performing preset image processing on each frame of original image to obtain processed image data.
203. Determining the brightest area in the processed image data, and performing exposure processing on the brightest area to obtain intermediate image data.
204. Performing logarithmic conversion on the intermediate image data to obtain each corresponding frame of target image.
For steps 202 to 204, please refer to the corresponding descriptions of steps 102 to 103; the preset image processing in step 202 corresponds to the preset-image-processing part of step 102, and the exposure processing in step 203 corresponds to the exposure-processing part of step 102, which are not repeated here. After step 204, steps 205 and 207 are then executed, which can be understood with reference to fig. 4a. The LUT conversion includes multiple cases: performing color gamut conversion using the Camera LUT; performing color gamut conversion using the Creative LUT; or performing color gamut conversion using the Camera LUT first and then the Creative LUT.
205. Obtaining video data according to each frame of target image.
That is, the corresponding video data is finally obtained according to the sequence of each frame of target image. As will be understood, the video data is composed of a plurality of frames of target images.
206. Compressing the video data to obtain target video data, and storing the target video data.
For example, the video data is compressed according to the H.264 video compression standard to obtain the target video data, and the target video data is stored; see fig. 4b for details. It can be understood that the stored target video data is logarithmically converted, logarithmic-space video data; that is, the mobile terminal realizes log shooting, so that more detail features of the bright and dark portions of the image are retained in the target video data for further subsequent processing. On the other hand, because the video data is compressed, the resulting target video data occupies less space when stored, saving storage space on the mobile terminal.
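The frame pipeline of steps 201–206 can be sketched end to end. Everything here is illustrative: the flat-list frame layout and helper names are invented, and `zlib` is only a convenient stand-in for a real video codec such as H.264, which the mobile terminal would actually use:

```python
import math
import zlib

MAX_CODE = 1023  # 10-bit color depth

def log_encode_frame(frame):
    """Step 204: per-pixel logarithmic conversion of one frame (flat list of 10-bit values)."""
    return [round(math.log2(1.0 + p / MAX_CODE) * MAX_CODE) for p in frame]

def encode_video(frames):
    log_frames = [log_encode_frame(f) for f in frames]          # step 204, frame by frame
    video_data = b"".join(v.to_bytes(2, "big")                  # step 205: assemble video data
                          for f in log_frames for v in f)
    return zlib.compress(video_data)                            # step 206: compression (H.264 in practice)

def decode_video(target_video_data):
    """Playback path: decompress back to log-space values (LUT conversion would follow)."""
    raw = zlib.decompress(target_video_data)
    return [int.from_bytes(raw[i:i + 2], "big") for i in range(0, len(raw), 2)]
```

Note that compression happens after the log transform, so what is stored (and later decompressed for viewing) is log-space data, matching the "store log, convert on display" flow described above.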
207. Acquiring a preset color index table corresponding to the color depth.
208. Performing color gamut conversion on each frame of target image according to the preset color index table to obtain each converted frame, and displaying each converted frame.
The preset color index table may be a color index table (Camera LUT) that converts logarithmic-space image data into image data of the Rec.709 standard, a color index table (Creative LUT) that converts logarithmic-space image data into image data of any other space, or both applied in sequence (Camera LUT followed by Creative LUT).
Correspondingly, if the preset color index table is the Camera LUT, the log video in logarithmic space is quickly converted into a Rec.709-standard image with a normal gray-scale range, normal contrast, and normal saturation, so that it is convenient for the user to watch; by default, the preset color index table is set to the color index table that converts logarithmic-space image data into image data of the Rec.709 standard, so that the user can view the logarithmically converted image normally. If the preset color index table is a Creative LUT, the log video in logarithmic space is converted into image data corresponding to the display mode determined by the user, so as to realize creative shooting. If the preset color index table includes a first preset color index table (Camera LUT) and a second preset color index table (Creative LUT), the log video is first converted into image data of the Rec.709 standard, and then converted according to the second preset color index table corresponding to the display mode determined by the user, so as to obtain image data in that display mode for creative shooting; this is not described in further detail here.
This embodiment realizes log-video shooting on the mobile terminal; the log video can retain more detail features of the bright and dark portions, and the captured log video is stored for convenient further processing. Meanwhile, the preset color index table is used to convert the video data into a normally displayed form or into the display mode determined by the user, so that the user can watch it or the user's creative intent can be realized.
In some cases, when the saved target video data needs to be viewed, the image processing method further includes: if a video viewing instruction is received, acquiring target video data according to the video viewing instruction, and decompressing the target video data to obtain decompressed video data, wherein the video data comprises a plurality of frames of target images; acquiring a preset color index table corresponding to the color depth; and performing color gamut conversion on the multi-frame target image according to a preset color index table to obtain converted video data. For specific details, please refer to the description above. It will be appreciated that log video data may also be converted to video data that displays the normal rec.709 standard, and/or video data in a user-determined display mode, at a subsequent review. As shown in fig. 4 c.
It should be noted that the Creative LUT conversion may or may not be performed in the image information processor; for example, the CPU/GPU of the mobile terminal may be used to implement it.
According to the method described in the foregoing embodiments, this embodiment will be further described from the perspective of an image processing device, which may be implemented as an independent entity or integrated in a mobile terminal; the mobile terminal may include a smart phone, a Pad, a wearable device, a robot, and the like. The mobile terminal includes a camera, the camera includes a photosensitive element, and the color depth of the photosensitive element is at least 10 bits.
Referring to fig. 5, fig. 5 specifically describes an image processing device provided in an embodiment of the present application, applied in a mobile terminal; the device may include: an acquisition unit 301, a processing unit 302, and a conversion unit 303. Wherein:
the acquiring unit 301 is configured to acquire an original image acquired by the mobile terminal, where a color depth of the original image is at least 10 bits per pixel.
A processing unit 302, configured to determine the brightest area in the original image and perform exposure processing on the brightest area to obtain intermediate image data.
The processing unit 302 includes a preprocessing unit and an exposure unit. The preprocessing unit is used for carrying out preset image processing on the original image to obtain processed image data. The exposure unit is used for determining the brightest area in the processed image data and carrying out exposure processing on the brightest area to obtain intermediate image data.
When determining the brightest area in the processed image data, the exposure unit specifically performs: dividing the original image into a plurality of image blocks; calculating the pixel average value of each of the plurality of image blocks; and taking the area corresponding to the image block with the highest pixel average value as the brightest area in the original image.
A conversion unit 303, configured to perform logarithmic conversion on the intermediate image data to obtain a target image.
The conversion unit 303 is specifically configured to normalize a pixel value of the intermediate image data; carrying out logarithmic conversion on the normalized pixel value by using a logarithmic function to obtain an intermediate pixel value; and converting the intermediate pixel value into a target pixel value corresponding to the color depth, and taking an image corresponding to the target pixel value as a target image.
Further, the obtaining unit 301 is further configured to obtain a preset color index table corresponding to the color depth; the converting unit 303 is further configured to perform color gamut conversion on the target image according to a preset color index table to obtain a converted image.
Further, when the obtaining unit 301 obtains the preset color index table corresponding to the color depth, it specifically performs: acquiring a display mode determined by a user; and determining a preset color index table corresponding to the color depth according to the display mode.
Further, the obtaining unit 301 is further configured to obtain a preset color index table corresponding to the color depth if an image viewing instruction is received; the converting unit 303 is further configured to perform color gamut conversion on the target image according to a preset color index table to obtain a converted image.
Fig. 6 shows an image processing device provided in an embodiment of the present application, applied in a mobile terminal; the device may include: an acquisition unit 401, a preprocessing unit 402, an exposure unit 403, a conversion unit 404, and a compression unit 405. Wherein:
the acquiring unit 401 is configured to acquire each frame of original image acquired by the mobile terminal when the video is captured, where a color depth of the original image is at least 10 bits per pixel.
The preprocessing unit 402 is configured to perform preset image processing on each frame of original image to obtain processed image data.
An exposure unit 403, configured to determine a brightest area in the processed image data, and perform exposure processing on the brightest area to obtain intermediate image data.
A conversion unit 404, configured to perform logarithmic conversion on the intermediate image data to obtain each corresponding frame of target image, and obtain video data according to each frame of target image.
The compressing unit 405 is configured to compress the video data to obtain target video data, and store the target video data.
The obtaining unit 401 is further configured to obtain a preset color index table corresponding to the color depth. The conversion unit 404 is further configured to perform color gamut conversion on each frame of target image according to a preset color index table to obtain a converted each frame of image, and display the converted each frame of image.
Further, the obtaining unit 401 is further configured to, if a video viewing instruction is received, obtain target video data according to the video viewing instruction; the compression unit 405 is further configured to decompress the target video data to obtain video data, where the video data includes multiple frames of target images.
In a specific implementation, each of the modules and/or units may be implemented as an independent entity, or may be implemented as one or several entities by any combination, where the specific implementation of each of the modules and/or units may refer to the foregoing method embodiment, and specific achievable beneficial effects also refer to the beneficial effects in the foregoing method embodiment, which are not described herein again.
In addition, the embodiment of the application further provides a mobile terminal, and the mobile terminal can be a smart phone, a Pad, wearable equipment, a robot and other equipment comprising a camera. The camera comprises a photosensitive element, and the color depth of the photosensitive element is at least 10 bits of color depth. As shown in fig. 7, the mobile terminal 500 includes a processor 501, a memory 502. The processor 501 is electrically connected to the memory 502.
The processor 501 is a control center of the mobile terminal 500, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or loading an application stored in the memory 502 and calling data stored in the memory 502, thereby integrally monitoring the mobile terminal.
In this embodiment, the processor 501 in the mobile terminal 500 loads instructions corresponding to processes of one or more application programs into the memory 502 according to the following steps, and the processor 501 runs the application programs stored in the memory 502, so as to implement various functions:
acquiring an original image acquired by a mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel;
determining the brightest area in the original image, and carrying out exposure processing on the brightest area to obtain intermediate image data;
and carrying out logarithmic conversion on the intermediate image data to obtain a target image.
The mobile terminal may implement the steps in any embodiment of the image processing method provided in the embodiment of the present application, and therefore, beneficial effects that can be achieved by any image processing method provided in the embodiment of the present invention can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
Fig. 8 is a block diagram showing a specific structure of a mobile terminal according to an embodiment of the present invention, which can be used to implement the image processing method provided in the above-mentioned embodiment. The mobile terminal 600 may be a mobile phone, Pad, wearable device, robot, etc. The mobile terminal comprises a camera, wherein the camera comprises a photosensitive element, and the color depth of the photosensitive element is at least 10 bits of color depth.
The RF circuit 610 is used for receiving and transmitting electromagnetic waves, performing interconversion between electromagnetic waves and electrical signals, and thereby communicating with a communication network or other devices. RF circuit 610 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuit 610 may communicate with various networks such as the internet, an intranet, or a wireless network, or communicate with other devices over a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data rates for GSM Evolution (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), other protocols for mail, instant messaging and short messages, any other suitable communication protocol, and even protocols that have not yet been developed.
The memory 620 may be used to store software programs and modules, such as the corresponding program instructions/modules in the above-described embodiments, and the processor 680 may execute various functional applications and data processing by operating the software programs and modules stored in the memory 620. The memory 620 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 620 can further include memory located remotely from the processor 680, which can be connected to the mobile terminal 600 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 630 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 630 may include a touch sensitive surface 631 as well as other input devices 632. The touch-sensitive surface 631, also referred to as a touch display screen (touch screen) or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface 631 using any suitable object or attachment such as a finger, a stylus, etc.) and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 631 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 680, and can receive and execute commands sent by the processor 680. In addition, the touch sensitive surface 631 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 630 may include other input devices 632 in addition to the touch-sensitive surface 631. In particular, other input devices 632 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 640 may be used to display information input by or provided to a user, as well as various graphical user interfaces of the mobile terminal 600, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 640 may include a display panel 641; optionally, the display panel 641 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or the like. Further, the touch-sensitive surface 631 may overlay the display panel 641; when the touch-sensitive surface 631 detects a touch operation on or near it, the operation is communicated to the processor 680 to determine the type of touch event, and the processor 680 then provides a corresponding visual output on the display panel 641 based on the type of touch event.
The mobile terminal 600 may also include at least one sensor 650, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 641 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 641 and/or the backlight when the flip cover is closed. As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications that recognize the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration), vibration-recognition related functions (such as a pedometer and tapping), and the like; other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be configured in the mobile terminal 600 and are not described here.
Audio circuit 660, speaker 661, and microphone 662 may provide an audio interface between a user and the mobile terminal 600. The audio circuit 660 may transmit the electrical signal converted from the received audio data to the speaker 661, and convert the electrical signal into an audio signal through the speaker 661 for output; on the other hand, the microphone 662 converts the collected sound signal into an electrical signal, which is received by the audio circuit 660 and converted into audio data, which is then processed by the audio data output processor 680 and then passed through the RF circuit 610 to be transmitted to, for example, another terminal, or output to the memory 620 for further processing. The audio circuit 660 may also include an earbud jack to provide communication of a peripheral headset with the mobile terminal 600.
Through a transmission module 670 (e.g., a Wi-Fi module), the mobile terminal 600 provides the user with wireless broadband internet access, which can assist the user in receiving and sending information. Although the transmission module 670 is illustrated, it can be understood that it is not an essential part of the mobile terminal 600 and may be omitted as needed within a scope that does not change the essence of the invention.
The processor 680 is a control center of the mobile terminal 600, connects various parts of the entire handset using various interfaces and lines, and performs various functions of the mobile terminal 600 and processes data by operating or executing software programs and/or modules stored in the memory 620 and calling data stored in the memory 620, thereby integrally monitoring the mobile terminal. Optionally, processor 680 may include one or more processing cores; in some embodiments, processor 680 may integrate an application processor, which handles primarily the operating system, user interface, applications, etc., and a modem processor, which handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 680.
The mobile terminal 600 also includes a power supply 690 (e.g., a battery) that provides power to the various components and, in some embodiments, may be logically coupled to the processor 680 through a power management system that may enable management of charging, discharging, and power consumption. The power supply 690 may also include any component including one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the mobile terminal 600 further includes a camera (e.g., a front camera, a rear camera), a bluetooth module, and the like, which are not described in detail herein. Specifically, in this embodiment, the display unit of the mobile terminal is a touch screen display, the mobile terminal further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
acquiring an original image acquired by a mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel;
determining the brightest area in the original image, and carrying out exposure processing on the brightest area to obtain intermediate image data;
and carrying out logarithmic conversion on the intermediate image data to obtain a target image.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor. To this end, the present invention provides a storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps of any embodiment of the image processing method provided by the present invention.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can perform the steps of any embodiment of the image processing method provided in the embodiments of the present invention, they can achieve the beneficial effects achievable by any of these image processing methods; for details, refer to the foregoing embodiments, which are not described herein again.
The foregoing has described in detail an image processing method, apparatus, storage medium, and mobile terminal provided by the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An image processing method applied to a mobile terminal, characterized by comprising the following steps:
acquiring an original image acquired by a mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel;
determining the brightest area in the original image, and carrying out exposure processing on the brightest area to obtain intermediate image data;
and carrying out logarithmic conversion on the intermediate image data to obtain a target image.
2. The image processing method according to claim 1, further comprising:
acquiring a preset color index table corresponding to the color depth;
and performing color gamut conversion on the target image according to the preset color index table to obtain a converted image.
3. The image processing method according to claim 2, wherein the obtaining of the preset color index table corresponding to the color depth comprises:
acquiring a display mode determined by a user;
and determining a preset color index table corresponding to the color depth according to the display mode.
4. The image processing method according to claim 1, wherein the logarithmically converting the intermediate image data to obtain the target image comprises:
normalizing the pixel values of the intermediate image data;
carrying out logarithmic conversion on the normalized pixel value by using a logarithmic function to obtain an intermediate pixel value;
and converting the intermediate pixel value into a target pixel value corresponding to the color depth, and taking an image corresponding to the target pixel value as a target image.
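The three sub-steps of claim 4 (normalize, apply a logarithmic function, requantize to the color depth) can be sketched as follows. The claim does not fix a particular logarithmic curve, so the log1p-based curve and all names below are assumptions for illustration:

```python
import math

BIT_DEPTH = 10                      # color depth of the original image
MAX_VALUE = (1 << BIT_DEPTH) - 1    # 1023 for 10-bit pixel values

def log_convert(pixel):
    """Apply claim 4's sub-steps to one pixel of the intermediate image data."""
    normalized = pixel / MAX_VALUE                                             # normalize to [0, 1]
    intermediate = math.log1p(normalized * MAX_VALUE) / math.log1p(MAX_VALUE)  # logarithmic function
    return round(intermediate * MAX_VALUE)                                     # target pixel value

converted = [log_convert(p) for p in (0, 64, 512, 1023)]
```

With this choice of curve, log1p keeps the black level at zero and the division by log1p(MAX_VALUE) maps full scale back to full scale, so target pixel values stay within the 10-bit range.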
5. The image processing method according to claim 1, wherein the determining the brightest area in the original image comprises:
dividing the original image into a plurality of image blocks;
calculating the average value of pixels of a plurality of image blocks;
and taking the area corresponding to the image block with the highest pixel average value as the brightest area in the original image.
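Claim 5's block-averaging search can be sketched as follows; the block size and the row-major tiling are assumptions, since the claim only requires dividing the original image into a plurality of image blocks:

```python
# Hypothetical sketch of claim 5: tile the original image, average the
# pixels of each block, and return the block with the highest average.
def brightest_region(image, block=2):
    """Return the (row, col) origin of the brightest block and its pixel average."""
    h, w = len(image), len(image[0])
    best_mean, best_origin = -1.0, (0, 0)
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            pixels = [image[y][x]
                      for y in range(y0, min(y0 + block, h))
                      for x in range(x0, min(x0 + block, w))]
            mean = sum(pixels) / len(pixels)
            if mean > best_mean:
                best_mean, best_origin = mean, (y0, x0)
    return best_origin, best_mean

frame = [[10, 10, 900, 950],
         [10, 10, 880, 990],
         [ 5,  5,  20,  20],
         [ 5,  5,  20,  20]]
origin, mean = brightest_region(frame, block=2)   # brightest 2x2 block is top-right
```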
6. The image processing method according to claim 1, wherein the original image comprises each frame of original image acquired when a video is captured, and the target image comprises a corresponding target image for each frame, the image processing method further comprising:
obtaining video data according to each frame of target image;
and compressing the video data to obtain target video data.
7. The image processing method according to claim 6, further comprising:
if a video viewing instruction is received, acquiring target video data according to the video viewing instruction;
and decompressing the target video data to obtain video data, wherein the video data comprises a plurality of frames of target images.
8. An image processing apparatus applied to a mobile terminal, comprising:
an acquisition unit, configured to acquire an original image acquired by the mobile terminal, wherein the color depth of the original image is at least 10 bits per pixel;
a processing unit, configured to determine the brightest area in the original image and perform exposure processing on the brightest area to obtain intermediate image data; and
a conversion unit, configured to perform logarithmic conversion on the intermediate image data to obtain a target image.
9. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the image processing method of any of claims 1 to 7.
10. A mobile terminal, comprising a processor and a memory, wherein the processor is electrically connected to the memory, the memory is configured to store instructions and data, and the processor is configured to perform the steps of the image processing method according to any one of claims 1 to 7.
CN202010327508.8A 2020-04-23 2020-04-23 Image processing method, device, storage medium and mobile terminal Pending CN111510698A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010327508.8A CN111510698A (en) 2020-04-23 2020-04-23 Image processing method, device, storage medium and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010327508.8A CN111510698A (en) 2020-04-23 2020-04-23 Image processing method, device, storage medium and mobile terminal

Publications (1)

Publication Number Publication Date
CN111510698A true CN111510698A (en) 2020-08-07

Family

ID=71876340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010327508.8A Pending CN111510698A (en) 2020-04-23 2020-04-23 Image processing method, device, storage medium and mobile terminal

Country Status (1)

Country Link
CN (1) CN111510698A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344775A (en) * 2021-06-18 2021-09-03 北京澎思科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN113810641A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN113810642A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN114449199A (en) * 2021-08-12 2022-05-06 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN115242992A (en) * 2021-08-12 2022-10-25 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
WO2023015997A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video editing method and video editing apparatus
WO2023016038A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016040A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016042A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016041A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016044A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
EP4171014A4 (en) * 2021-06-16 2024-03-06 Honor Device Co., Ltd. Photographing method, graphical interface and related apparatus

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030215133A1 (en) * 2002-05-20 2003-11-20 Eastman Kodak Company Color transformation for processing digital images
CN1663697A (en) * 2005-03-23 2005-09-07 江苏大学 Visual data processing system for fruit external appearance quality online detection technology
CN1685709A (en) * 2002-07-24 2005-10-19 松下电器产业株式会社 Image pickup system
CN1713690A (en) * 2004-06-15 2005-12-28 微软公司 System and method for automated correction of digital images
US7612813B2 (en) * 2006-02-03 2009-11-03 Aptina Imaging Corporation Auto exposure for digital imagers
US20120206470A1 (en) * 2011-02-16 2012-08-16 Apple Inc. Devices and methods for obtaining high-local-contrast image data
CN103379289A (en) * 2012-04-18 2013-10-30 歌乐牌株式会社 Imaging apparatus
US20140168505A1 (en) * 2012-12-19 2014-06-19 International Business Machines Corporation Digital imaging exposure metering system
CN104247398A (en) * 2012-04-11 2014-12-24 佳能株式会社 Imaging device and method for controlling same
WO2015128603A1 (en) * 2014-02-27 2015-09-03 British Broadcasting Corporation Method and apparatus for signal processing for high dynamic range video
US20150358524A1 (en) * 2013-02-21 2015-12-10 Kyocera Corporation Imaging apparatus and storage medium, and exposure amount control method
US20160105656A1 (en) * 2014-10-13 2016-04-14 Quanta Computer Inc. White balance method in multi-exposure imaging system
CN105530496A (en) * 2014-09-28 2016-04-27 联想(北京)有限公司 Method and device for information processing
CN105827965A (en) * 2016-03-25 2016-08-03 维沃移动通信有限公司 Image processing method based on mobile terminal and mobile terminal
CN105915816A (en) * 2016-07-06 2016-08-31 穆德远 Method and equipment for determining brightness of given scene
CN107018353A (en) * 2016-01-27 2017-08-04 江西云晖生物芯片技术有限公司 A kind of NEXT series of products EM770W picture systems
CN107079105A (en) * 2016-11-14 2017-08-18 深圳市大疆创新科技有限公司 Image processing method, device, equipment and video image transmission system
US20170289571A1 (en) * 2016-04-01 2017-10-05 Intel Corporation Temporal control for spatially adaptive tone mapping of high dynamic range video
US20190052790A1 (en) * 2017-08-10 2019-02-14 Lg Electronics Inc. Mobile terminal
WO2019036522A1 (en) * 2017-08-15 2019-02-21 Dolby Laboratories Licensing Corporation Bit-depth efficient image processing
US20190220961A1 (en) * 2016-09-07 2019-07-18 Gvbb Holdings S.A.R.L. High dynamic range processing
JP2020048139A (en) * 2018-09-21 2020-03-26 キヤノン株式会社 Image processing system
US20200228736A1 (en) * 2017-08-15 2020-07-16 Dolby Laboratories Licensing Corporation Bit-depth efficient image processing

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
HOMEBOY电影洗印厂 (HOMEBOY Film Lab): "So phones can shoot Log now!? Someone ran a test", Zhihu *
吴蔚琦 (Wu Weiqi) et al.: "4K documentary shooting, production, color grading and down-conversion broadcast: workflow and technical details of 'The Legend of Pudong'", Film and TV Production *
火华不肝: "Could an expert explain the technical details of the Log shooting mode?", Zhihu *
而后网 (Erhou Net): "DJI Mavic 2 Pro keyword primer interview: HDR, 10-bit and DLog-M explained", Erhou Net *
轻舟风行: "Photographers always say Log, Log, Log; what exactly is Log?!", Baidu *
飞哥 (Fei Ge): "Log mode fully explained: make your videos stand out", Zhihu *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4171014A4 (en) * 2021-06-16 2024-03-06 Honor Device Co., Ltd. Photographing method, graphical interface and related apparatus
CN113344775A (en) * 2021-06-18 2021-09-03 北京澎思科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
WO2023016044A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
CN115706767B (en) * 2021-08-12 2023-10-31 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN115242992A (en) * 2021-08-12 2022-10-25 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
WO2023015997A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video editing method and video editing apparatus
WO2023016038A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016039A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016040A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016035A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016042A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
WO2023016041A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
CN113810641A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN114449199A (en) * 2021-08-12 2022-05-06 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN115706764A (en) * 2021-08-12 2023-02-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN115706765A (en) * 2021-08-12 2023-02-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN115706767A (en) * 2021-08-12 2023-02-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN115706863A (en) * 2021-08-12 2023-02-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN113810642B (en) * 2021-08-12 2023-02-28 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN113810641B (en) * 2021-08-12 2023-02-28 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
CN115242992B (en) * 2021-08-12 2023-08-18 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN115706764B (en) * 2021-08-12 2023-09-19 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN115706765B (en) * 2021-08-12 2023-10-27 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
WO2023016037A1 (en) * 2021-08-12 2023-02-16 荣耀终端有限公司 Video processing method and apparatus, electronic device, and storage medium
CN115706863B (en) * 2021-08-12 2023-11-21 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN113810642A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111510698A (en) Image processing method, device, storage medium and mobile terminal
JP6803982B2 (en) Optical imaging method and equipment
CN111418201B (en) Shooting method and equipment
CN107707827B (en) High-dynamic image shooting method and mobile terminal
CN107438163B (en) Photographing method, terminal and computer readable storage medium
US7847830B2 (en) System and method for camera metering based on flesh tone detection
CN108307125B (en) Image acquisition method, device and storage medium
CN107038715B (en) Image processing method and device
US9536479B2 (en) Image display device and method
KR20150099302A (en) Electronic device and control method of the same
CN108200352B (en) Method, terminal and storage medium for adjusting picture brightness
CN111696039B (en) Image processing method and device, storage medium and electronic equipment
CN113810600A (en) Terminal image processing method and device and terminal equipment
CN112118388A (en) Image processing method, image processing device, computer equipment and storage medium
CN110047060B (en) Image processing method, image processing device, storage medium and electronic equipment
US10051252B1 (en) Method of decaying chrominance in images
CN113472997B (en) Image processing method and device, mobile terminal and storage medium
KR101337667B1 (en) Lens roll-off correction operation using values corrected based on brightness information
CN114222072B (en) Image processing method, device, electronic equipment and storage medium
CN115550575B (en) Image processing method and related device
CN117519555A (en) Image processing method, electronic equipment and system
CN115527474A (en) Image display method, image display device, projection device, and storage medium
CN113810622A (en) Image processing method and device
CN117711300B (en) Image display method, electronic device, readable storage medium and chip
JP2004259177A (en) Image processing device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200807