
CN112614083A - Image fusion method and device and electronic system - Google Patents

Image fusion method and device and electronic system

Info

Publication number
CN112614083A
Authority
CN
China
Prior art keywords
image
color temperature
highlight
highlight area
standard color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011513700.2A
Other languages
Chinese (zh)
Inventor
梁钢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Megvii Technology Co Ltd
Original Assignee
Beijing Megvii Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Megvii Technology Co Ltd filed Critical Beijing Megvii Technology Co Ltd
Priority to CN202011513700.2A priority Critical patent/CN112614083A/en
Publication of CN112614083A publication Critical patent/CN112614083A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10024: Color image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract



The present invention provides an image fusion method, device, and electronic system. The method includes: registering and aligning multiple images of the same scene to obtain a spatially aligned reference luminance image and at least one underexposed image, where the exposure value (EV) of the reference luminance image differs from the EV of the underexposed image; determining the effective pixel points of the highlight area of the underexposed image based on the color channel parameters of each pixel point in that area and a predefined correspondence; determining calibration parameters for the highlight area from the color channel parameters of its effective pixel points, and applying those calibration parameters to calibrate the highlight area, yielding an underexposure-optimized image; and fusing the reference luminance image with the underexposure-optimized image to obtain a fused image of the scene. This alleviates the color-cast problem in the highlight areas of fused images produced by existing methods, preserves the realistic appearance of the fused image, and improves its practical value.


Description

Image fusion method and device and electronic system
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image fusion method, an image fusion device, and an electronic system.
Background
With the continuous development of digital image technology, high-definition and vivid images are increasingly popular. Currently, HDR (High Dynamic Range) imaging technology is widely applied in fields such as consumer electronics, remote sensing, security monitoring, and digital television.
An HDR image is obtained by fusing images captured at different EVs (Exposure Values) using an HDR algorithm. Although this reproduces the rich brightness levels of real scenes and produces a realistic effect, directly fusing images of different EVs causes abnormal colors in the highlight areas of the fused image, which fails to meet users' practical requirements.
Disclosure of Invention
In view of the above, the present invention provides an image fusion method, an image fusion device and an electronic system to alleviate the above problems, so as to avoid color cast in the highlight region of the fused image and improve the practical value of the fused image.
In a first aspect, an embodiment of the present invention provides an image fusion method, where the method includes: registering and aligning a plurality of images of the same scene to obtain a reference brightness image and at least one under-exposed image which are spatially aligned; wherein, the exposure value EV corresponding to the reference brightness image is different from the EV of the underexposed image; acquiring corresponding relations between a plurality of standard color temperature intervals and color channel parameters respectively, and determining effective pixel points of a highlight area based on the color channel parameters corresponding to each pixel point in the highlight area of an underexposed image and the corresponding relations; determining a calibration parameter of the highlight area based on a color channel parameter corresponding to an effective pixel point in the highlight area, and calibrating the highlight area by using the calibration parameter of the highlight area to obtain an under-exposure optimization image; and performing fusion processing on the reference brightness image and the underexposure optimization image to obtain a fusion image corresponding to the same scene.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where the step of determining effective pixel points of a highlight area based on color channel parameters and a corresponding relationship corresponding to each pixel point in the highlight area of an underexposed image includes: determining a highlight region of an underexposed image based on an overexposed region of a reference luminance image; checking the color channel value of each pixel point in the highlight area; marking the pixel points with the color channel value larger than or equal to the preset maximum value in the highlight area as invalid pixel points; calculating color channel parameters of pixel points with color channel values smaller than a preset maximum value; and determining the pixel point of which the color channel parameter is between the minimum color channel parameter and the maximum color channel parameter in the corresponding relation as an effective pixel point of the highlight area.
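A minimal sketch of the effective-pixel filtering steps described above, assuming 8-bit channel values; the R/G and B/G bounds are placeholders, since the actual minimum and maximum color channel parameters come from the correspondence table, which the text does not reproduce:

```python
import numpy as np

def valid_highlight_pixels(region_rgb, max_value=255,
                           rg_range=(0.4, 2.5), bg_range=(0.4, 2.5)):
    """Flag effective pixels in a highlight region.

    Pixels with any color channel value at or above `max_value` are
    marked invalid; for the remaining pixels, the R/G and B/G ratios
    must fall between the minimum and maximum color channel parameters
    of the correspondence (placeholder ranges here).
    """
    region = np.asarray(region_rgb, dtype=float)
    not_clipped = np.all(region < max_value, axis=-1)
    eps = 1e-6  # guard against division by zero in the green channel
    rg = region[..., 0] / (region[..., 1] + eps)
    bg = region[..., 2] / (region[..., 1] + eps)
    in_range = ((rg >= rg_range[0]) & (rg <= rg_range[1]) &
                (bg >= bg_range[0]) & (bg <= bg_range[1]))
    return not_clipped & in_range
```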
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation manner of the first aspect, where before the step of determining the calibration parameter of the highlight region based on the color channel parameter corresponding to the effective pixel point in the highlight region, the method further includes: determining a first target standard color temperature interval corresponding to the reference brightness image and a second target standard color temperature interval corresponding to a high light region of the under-exposed image according to the corresponding relation; comparing whether the first target standard color temperature interval is the same as the second target standard color temperature interval; if not, a step of determining calibration parameters for the highlight region based on the color channel parameters corresponding to the active pixel points in the highlight region is performed.
With reference to the second possible implementation manner of the first aspect, an embodiment of the present invention provides a third possible implementation manner of the first aspect, where the determining, according to the correspondence, a first target standard color temperature interval corresponding to the reference luminance image and a second target standard color temperature interval corresponding to a highlight region of the underexposed image includes: determining a first target standard color temperature interval corresponding to the reference brightness image based on the color channel parameters and the corresponding relation of each pixel point in the reference brightness image; and determining a second target standard color temperature interval corresponding to the highlight region based on the color channel parameters and the corresponding relation of the effective pixel points in the highlight region of the underexposed image.
With reference to the third possible implementation manner of the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, where the step of determining the second target standard color temperature interval corresponding to the highlight area based on the color channel parameters and the corresponding relations of the effective pixel points in the highlight area of the underexposed image includes: determining a second target standard color temperature interval corresponding to the highlight region according to the proportion of the effective pixel points in the highlight region falling into each of the plurality of standard color temperature intervals.
With reference to the fourth possible implementation manner of the first aspect, an embodiment of the present invention provides a fifth possible implementation manner of the first aspect, where there are a plurality of highlight areas; the step of determining the second target standard color temperature interval corresponding to the highlight region according to the proportion of the effective pixel points in the highlight region corresponding to the plurality of standard color temperature intervals includes one of the following: taking the area with the most pixels among the multiple highlight areas as an effective area, and determining a second target standard color temperature interval corresponding to the highlight areas based on the proportion of the effective pixel points in the effective area corresponding to the plurality of standard color temperature intervals; determining a second target standard color temperature interval corresponding to each highlight region according to the ratio of effective pixel points in each highlight region to the plurality of standard color temperature intervals; and counting the occupation ratios of the effective pixel points in all the highlight regions corresponding to the plurality of standard color temperature intervals, and determining a second target standard color temperature interval corresponding to all the highlight regions according to the counted occupation ratios.
With reference to the fourth possible implementation manner of the first aspect, an embodiment of the present invention provides a sixth possible implementation manner of the first aspect, where the plurality of standard color temperature intervals are a high color temperature interval, a medium color temperature interval, and a low color temperature interval, respectively; the step of determining the second target standard color temperature interval corresponding to the highlight region according to the proportion of the effective pixel points in the highlight region corresponding to the plurality of standard color temperature intervals includes: counting a first ratio of first-type effective pixel points belonging to the high color temperature interval, a second ratio of second-type effective pixel points belonging to the medium color temperature interval, and a third ratio of third-type effective pixel points belonging to the low color temperature interval in the highlight region; comparing the first ratio, the second ratio, and the third ratio; if the first ratio is the largest and is greater than a preset ratio threshold, determining the high color temperature interval as the second target standard color temperature interval corresponding to the highlight area; and if the first ratio is smaller than or equal to the preset ratio threshold, marking the effective pixel points corresponding to the smallest of the first, second, and third ratios as ineffective pixel points, and determining the second target standard color temperature interval of the highlight area based on the color channel parameters corresponding to the remaining effective pixel points in the highlight area.
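The ratio-comparison logic above can be sketched as a voting loop over per-pixel interval labels. The threshold value and the handling of ties are illustrative assumptions; the patent only specifies a "preset ratio threshold" without giving its value:

```python
from collections import Counter

def dominant_interval(labels, ratio_threshold=0.5):
    """Pick a highlight region's target color-temperature interval.

    `labels` assigns each effective pixel to 'high', 'medium', or 'low'.
    If the leading interval's share exceeds `ratio_threshold`, it wins;
    otherwise the smallest class is dropped (its pixels become invalid)
    and the vote repeats, mirroring the fallback described in the text.
    The threshold default is a placeholder, not the patent's value.
    """
    labels = list(labels)
    while labels:
        counts = Counter(labels)
        best, best_n = counts.most_common(1)[0]
        if best_n / len(labels) > ratio_threshold:
            return best
        worst = min(counts, key=counts.get)
        if worst == best:  # every class tied: accept the current leader
            return best
        labels = [label for label in labels if label != worst]
    return None
```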
With reference to the first aspect, an embodiment of the present invention provides a seventh possible implementation manner of the first aspect, where the step of determining the calibration parameter of the highlight region based on the color channel parameter corresponding to the effective pixel point in the highlight region includes: calculating the mean value of color channel parameters corresponding to all effective pixel points in the highlight area; the calibration parameters for the highlight region are determined based on the calculated mean.
With reference to the second possible implementation manner of the first aspect, the embodiment of the present invention provides an eighth possible implementation manner of the first aspect, wherein the step of determining the calibration parameter of the highlight region based on the color channel parameter corresponding to the effective pixel point in the highlight region includes: calculating the mean value of the color channel parameters corresponding to the effective pixel points belonging to the second target standard color temperature interval in the highlight region; the calibration parameters for the highlight region are determined based on the calculated mean.
With reference to the first aspect, an embodiment of the present invention provides a ninth possible implementation manner of the first aspect, where the number of the underexposed images is two, an EV corresponding to the first underexposed image is smaller than an EV corresponding to the second underexposed image, and both the EVs corresponding to the first underexposed image and the second underexposed image are greater than 0; the step of performing fusion processing on the reference luminance image and the underexposure optimized image to obtain a fusion image corresponding to the same scene includes: and performing high dynamic range HDR fusion processing on the reference brightness image, the underexposed optimized image corresponding to the first underexposed image and the underexposed optimized image corresponding to the second underexposed image to obtain a fusion image corresponding to the same scene.
With reference to the first aspect, an embodiment of the present invention provides a tenth possible implementation manner of the first aspect, where the EV corresponding to the reference luminance image is equal to 0.
With reference to the first aspect, an embodiment of the present invention provides an eleventh possible implementation manner of the first aspect, where an EV corresponding to the reference luminance image is not equal to 0; the step of performing fusion processing on the reference luminance image and the underexposure optimized image to obtain a fusion image corresponding to the same scene includes: according to the EV transformation rule corresponding to the reference brightness image, performing brightness transformation on the reference brightness image to obtain a reference optimized image, and fusing the reference optimized image and the underexposed optimized image to obtain a fused image corresponding to the same scene; or fusing the reference brightness image and the underexposed optimized image to obtain a primary fused image corresponding to the same scene, and performing brightness transformation on the primary fused image according to the EV transformation rule corresponding to the reference brightness image to obtain a fused image corresponding to the same scene.
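The "EV transformation rule" is not given explicitly in the text; a natural reading is the standard photographic relationship in which one EV step doubles luminance. The power-of-two gain below is therefore an assumption, not the patent's stated rule:

```python
import numpy as np

def ev_shift(image, delta_ev, max_value=255.0):
    """Scale image brightness by `delta_ev` stops (gain = 2 ** delta_ev).

    Assumed form of the EV transformation rule; results are clipped
    to the valid pixel range.
    """
    return np.clip(np.asarray(image, dtype=float) * (2.0 ** delta_ev),
                   0.0, max_value)
```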
With reference to the first aspect, an embodiment of the present invention provides a twelfth possible implementation manner of the first aspect, wherein the obtaining of the correspondence between the plurality of standard color temperature intervals and the color channel parameters includes: acquiring shot images under a plurality of preset standard color temperature light sources; counting and determining color channel parameters of the photographed image corresponding to each preset standard color temperature light source; and generating corresponding relations between a plurality of standard color temperature intervals and the color channel parameters respectively based on the color channel parameters corresponding to each preset standard color temperature light source.
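A sketch of building such a correspondence from calibration shots taken under preset standard color temperature light sources. The table layout (per-interval min/max of the mean R/G and B/G values) is a guess at the structure the text describes, not a reproduction of the patent's table:

```python
import numpy as np

def build_correspondence(calibration_shots):
    """Build a color-temperature-interval -> channel-ratio table.

    `calibration_shots` maps an interval name ('high'/'medium'/'low')
    to a list of RGB images shot under reference illuminants in that
    interval; for each interval we record the min and max of the
    observed mean R/G and B/G values.
    """
    eps = 1e-6
    table = {}
    for interval, images in calibration_shots.items():
        rgs, bgs = [], []
        for img in images:
            img = np.asarray(img, dtype=float)
            rgs.append(np.mean(img[..., 0] / (img[..., 1] + eps)))
            bgs.append(np.mean(img[..., 2] / (img[..., 1] + eps)))
        table[interval] = {"rg": (min(rgs), max(rgs)),
                           "bg": (min(bgs), max(bgs))}
    return table
```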
In a second aspect, an embodiment of the present invention further provides an image fusion apparatus, where the apparatus includes: the registration alignment module is used for performing registration alignment on a plurality of images of the same scene to obtain a reference brightness image and at least one under-exposed image which are spatially aligned; wherein, the exposure value EV corresponding to the reference brightness image is different from the EV of the underexposed image; the effective pixel point determining module is used for acquiring corresponding relations between a plurality of standard color temperature intervals and color channel parameters respectively, and determining effective pixel points of a high-light area based on the color channel parameters corresponding to each pixel point in the high-light area of the underexposed image and the corresponding relations; the under-exposure optimization image acquisition module is used for determining a calibration parameter of the highlight area based on a color channel parameter corresponding to an effective pixel point in the highlight area, and calibrating the highlight area by using the calibration parameter of the highlight area to obtain an under-exposure optimization image; and the fusion processing module is used for carrying out fusion processing on the reference brightness image and the under-exposure optimization image to obtain a fusion image corresponding to the same scene.
In a third aspect, an embodiment of the present invention further provides an electronic system, including an image acquisition device, a processing device, and a storage device; the image acquisition device is used for acquiring a plurality of images of the same scene, and the storage device stores a computer program which, when run by the processing device, performs the image fusion method of the first aspect.
In a fourth aspect, the embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the image fusion method in the first aspect.
The embodiment of the invention has the following beneficial effects:
the embodiment of the invention provides an image fusion method, an image fusion device and an electronic system, wherein calibration parameters of a highlight area are determined from the effective pixel points in the highlight area of an underexposed image, the highlight area is calibrated with these parameters to obtain an underexposure-optimized image, and the reference brightness image and the underexposure-optimized image are then fused to obtain a fused image of the same scene, thereby alleviating the color cast that existing methods produce in the highlight area of the fused image, ensuring a realistic effect, and improving the practical value of the fused image.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic structural diagram of an electronic system according to an embodiment of the present invention;
fig. 2 is a flowchart of an image fusion method according to an embodiment of the present invention;
FIG. 3 is a flowchart of another image fusion method according to an embodiment of the present invention;
FIG. 4 is a flowchart of another image fusion method according to an embodiment of the present invention;
fig. 5 is a flowchart of a method for generating a correspondence between a standard color temperature interval and a color channel parameter according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an image fusion method according to an embodiment of the present invention;
fig. 7 is a schematic diagram of an image fusion apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Aiming at the problem of color anomaly of a highlight region of a fused image in the existing HDR algorithm, the embodiment of the invention provides an image fusion method, an image fusion device and an electronic system, which relieve the problem of color cast in the highlight region of the fused image in the existing method, ensure the vivid effect of the fused image and improve the practical value of the fused image.
To facilitate understanding of the present embodiment, a detailed description will be given below of an image fusion method provided in an embodiment of the present invention.
First, refer to the schematic structural diagram of the electronic system shown in fig. 1. This electronic system can be used to implement the image fusion method and the image fusion apparatus of the embodiments of the present invention.
As shown in FIG. 1, an electronic system 100 includes one or more processing devices 102, one or more memory devices 104, an input device 106, an output device 108, and one or more image capture devices 110, which are interconnected via a bus system 112 and/or other type of connection mechanism (not shown). It should be noted that the components and structure of the electronic system 100 shown in fig. 1 are exemplary only, and not limiting, and that the electronic system may have other components and structures as desired.
The processing device 102 may be a server, a smart terminal, or a device containing a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, may process data for other components in the electronic system 100, and may control other components in the electronic system 100 to perform image fusion functions.
Storage 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory can include, for example, Random Access Memory (RAM), cache memory, or the like. Non-volatile memory may include, for example, Read Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on a computer-readable storage medium and executed by processing device 102 to implement the image fusion functions (implemented by the processing device) of the embodiments of the invention described below. Various applications and various data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like.
The output device 108 may output various information (e.g., images or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
Image capture device 110 may capture multiple images of the same scene and store the captured multiple images in storage 104 for use by other components.
For example, the devices used for implementing the image fusion method, apparatus and electronic system according to the embodiments of the present invention may be integrally disposed, or may be disposed in a decentralized manner, such as integrally disposing the processing device 102, the storage device 104, the input device 106 and the output device 108, and disposing the image capturing device 110 at a designated position where an image can be captured. When the above-described devices in the electronic system are integrally provided, the electronic system may be implemented as an intelligent terminal such as a camera, a smart phone, a tablet computer, a vehicle-mounted terminal, and the like.
The embodiment of the invention provides an image fusion method whose execution subject is an electronic device. The electronic device may pre-store the correspondences between a plurality of standard color temperature intervals and color channel parameters, or may acquire these correspondences from another storage device; the correspondences can be set according to actual conditions. The color channels comprise an R channel (i.e., the red channel), a B channel (i.e., the blue channel), and a G channel (i.e., the green channel). In practical application, the G channel is generally used as the denominator, and R/G and B/G values are counted; other channel ratios can also be used, for example taking the B channel as the denominator and counting R/B and G/B values, as set according to the actual situation. For ease of understanding, color channel parameters consisting of the red-green channel ratio (i.e., the R/G value) and the blue-green channel ratio (i.e., the B/G value) are used as the example herein. Table 1 shows a correspondence between standard color temperature intervals and color channel parameters, where adjacent color temperature intervals may or may not overlap:
TABLE 1
[Table 1 is reproduced as an image in the original publication; it lists, for each standard color temperature interval, the corresponding range of R/G and B/G values.]
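For illustration, the R/G and B/G color channel parameters discussed above can be computed per pixel as follows; this is a sketch, since the patent does not prescribe an implementation:

```python
import numpy as np

def channel_ratios(image_rgb):
    """Compute per-pixel R/G and B/G ratios for an RGB image.

    `image_rgb` is an array of shape (H, W, 3); a small epsilon guards
    against division by zero in the green channel.
    """
    eps = 1e-6
    image = np.asarray(image_rgb, dtype=float)
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    return r / (g + eps), b / (g + eps)
```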
Based on the above correspondence, a flowchart of the image fusion method provided in the embodiment of the present invention is shown in fig. 2, and the method includes the following steps:
step S202, carrying out registration alignment on a plurality of images of the same scene to obtain a reference brightness image and at least one under-exposed image which are spatially aligned; the EV corresponding to the reference luminance image is different from the EV of the underexposed image.
Specifically, a plurality of images of the same scene are acquired by a camera, and the images are registered and aligned so that the frames of all images are spatially aligned with one another; the registration alignment can be performed through homography transformation, the Mesh Flow algorithm, the Optical Flow algorithm, and the like. A reference luminance image and the underexposed images are then determined from the aligned images. The EV of the reference luminance image may or may not be 0; for example, if the luminance of an EV-2 image is converted to the luminance of an EV0 image through gain adjustment, the EV-2 image can serve as the reference luminance image. The numbers of reference luminance images and underexposed images can be set according to the actual situation; this embodiment does not limit the number of underexposed images.
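Registration via homography, one of the methods named above, ultimately maps each source image's coordinates into the reference frame through a 3x3 projective transform. A sketch of applying such a transform to points follows; estimating H itself (e.g., from matched features) is outside this snippet:

```python
import numpy as np

def apply_homography(H, points):
    """Map 2-D points through a 3x3 homography matrix H.

    Points are lifted to homogeneous coordinates, multiplied by H,
    and projected back to Cartesian coordinates.
    """
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian
```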
Step S204, acquiring corresponding relations between a plurality of standard color temperature intervals and color channel parameters, and determining effective pixel points of the highlight area based on the color channel parameters corresponding to each pixel point in the highlight area of the underexposed image and the corresponding relations.
In practical application, the highlight area of each underexposed image contains many pixel points, among which are both overexposed pixel points and effective pixel points. Overexposed pixel points cause the highlight area to be blown out and seriously distort the underexposed image, so the effective pixel points in the highlight area need to be identified and the overexposed pixel points discarded.
Because each pixel point corresponds to an R/G value and a B/G value, whether the pixel point is in a standard color temperature interval or not can be determined through the color channel parameters and the corresponding relation of each pixel point, if yes, the pixel point is determined to be an effective pixel point of a highlight area, so that an underexposed image can be optimized according to the effective pixel point, and the vivid effect of a fused image is further ensured.
Step S206, determining calibration parameters of the highlight area based on the color channel parameters corresponding to the effective pixel points in the highlight area, and calibrating the highlight area by using the calibration parameters of the highlight area to obtain the under-exposure optimized image.
In practical application, a common practice for an under-exposed image is to perform color restoration through an AWB (Automatic White Balance) algorithm, that is, to compute an AWB correction parameter and use it to correct the pixel points of the under-exposed image, obtaining a corrected image. Accordingly, based on the determined effective pixel points in the highlight area, the calibration parameter corresponding to the highlight area, namely the AWB Gain, is determined. Since the color channel parameters of each pixel point include an R/G value and a B/G value, the AWB Gain of the highlight area includes an R Gain corresponding to the R/G value and a B Gain corresponding to the B/G value. Each effective pixel point in the highlight area is then corrected based on the calibration parameters, obtaining a corrected under-exposed image, namely the under-exposure optimized image.
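A minimal sketch of the per-pixel correction step, assuming the R Gain and B Gain have already been determined and the G channel serves as the reference:

```python
def apply_awb(pixel, r_gain, b_gain):
    """Correct one (R, G, B) pixel with the calibration parameters; G is the reference channel."""
    r, g, b = pixel
    return (r * r_gain, g, b * b_gain)

def calibrate_region(pixels, r_gain, b_gain):
    """Apply the highlight-area AWB Gain to every effective pixel of the region."""
    return [apply_awb(p, r_gain, b_gain) for p in pixels]
```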
In addition, because the calibration parameters of the highlight area are determined from the effective pixel points of the highlight area, rather than, as in existing methods, directly from all pixel points of the highlight area, the precision of the calibration parameters is improved, which in turn improves the calibration precision of the highlight area and ensures the color restoration degree of the under-exposure optimized image.
And S208, fusing the reference brightness image and the under-exposure optimization image to obtain a fused image corresponding to the same scene.
Specifically, if only one under-exposed image is included, HDR fusion processing is directly performed on the reference luminance image and the under-exposure optimized image corresponding to that under-exposed image, obtaining a fused image corresponding to the same scene. If there are two under-exposed images, where the EV corresponding to the first under-exposed image is smaller than that of the second and both EVs are larger than 0 (for example, the EV corresponding to the first under-exposed image is 2 and that of the second is 4), HDR fusion processing is performed on the reference luminance image, the under-exposure optimized image corresponding to the first under-exposed image, and the under-exposure optimized image corresponding to the second under-exposed image, obtaining a fused image corresponding to the same scene.
If the EV corresponding to the reference luminance image is not equal to 0, luminance transformation is performed on the reference luminance image according to the EV transformation rule corresponding to the reference luminance image to obtain a reference optimized image, and the reference optimized image and the under-exposure optimized images are fused to obtain a fused image corresponding to the same scene; or, the reference luminance image and the under-exposure optimized images are first fused to obtain a preliminary fused image corresponding to the same scene, and luminance transformation is then performed on the preliminary fused image according to the EV transformation rule corresponding to the reference luminance image to obtain the fused image. For example, suppose an EV-2 image is used as the reference luminance image, and the under-exposed images include a first under-exposed image corresponding to EV-4 and a second under-exposed image corresponding to EV-6. In this case, the luminance of the reference luminance image may be transformed according to its EV transformation rule (for EV-2, multiplying the luminance by 4 yields EV0), so that the EV of the obtained reference optimized image is 0, and HDR fusion processing may then be performed on the reference optimized image and the under-exposure optimized images to obtain a fused image corresponding to the same scene. Alternatively, HDR fusion processing is first performed on the reference luminance image corresponding to EV-2, the under-exposure optimized image corresponding to EV-4, and the under-exposure optimized image corresponding to EV-6 to obtain a preliminary fused image, and luminance transformation is performed on the preliminary fused image according to the EV transformation rule corresponding to the reference luminance image to obtain the fused image corresponding to the same scene.
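The EV transformation rule can be sketched numerically: each exposure stop doubles the luminance, so bringing an EV-2 frame to EV0 brightness multiplies its luminance by 2 to the power 2, i.e. 4:

```python
def ev_gain(delta_ev):
    """Digital gain that shifts luminance across delta_ev exposure stops (2**delta_ev)."""
    return 2.0 ** delta_ev

# converting an EV-2 frame to EV0 brightness: 2 stops up, gain = 4
gain_ev2_to_ev0 = ev_gain(2)
```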
According to the image fusion method provided by the embodiment of the invention, the calibration parameters of the highlight area are determined from the effective pixel points in the highlight area of the under-exposed image, the highlight area is calibrated using these calibration parameters to obtain the under-exposure optimized image, and the reference luminance image and the under-exposure optimized image are then fused to obtain a fused image corresponding to the same scene.
On the basis of fig. 2, another image fusion method is further provided in an embodiment of the present invention, where the execution subject is the electronic device, which prestores correspondences between a plurality of standard color temperature intervals and color channel parameters, the color channel parameters including a red-green channel ratio and a blue-green channel ratio. This method mainly describes the process of determining the effective pixel points of the highlight area based on the color channel parameters corresponding to each pixel point in the highlight area of the under-exposed image and the correspondences. As shown in fig. 3, the method includes the following steps:
step S302, carrying out registration alignment on a plurality of images of the same scene to obtain a reference brightness image and at least one under-exposed image which are aligned in space; the EV corresponding to the reference luminance image is different from the EV of the underexposed image.
In step S304, a highlight region of the underexposed image is determined based on the overexposed region of the reference luminance image.
Specifically, after the reference luminance image and the under-exposed image are registered and aligned, the overexposed area of the reference luminance image is generally slightly larger than the highlight area of the under-exposed image. Therefore, for the overexposed area of the reference luminance image, the three channel values of the R channel, B channel and G channel of each pixel point in the overexposed area are counted; if all three channel values are at the saturation value, the area where the pixel point is located is considered a highlight area. In this way, the highlight area of the under-exposed image is determined by examining the channel values of the pixel points in the overexposed area.
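The saturation test described above can be sketched as follows, assuming an 8-bit saturation value of 255:

```python
SATURATION = 255  # 8-bit saturation value

def is_highlight(pixel):
    """True when the R, G and B channels are all at the saturation value."""
    r, g, b = pixel
    return r >= SATURATION and g >= SATURATION and b >= SATURATION

def highlight_mask(image):
    """Binary mask (1 = highlight) for a nested-list RGB image."""
    return [[1 if is_highlight(p) else 0 for p in row] for row in image]
```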
Step S306, the color channel value of each pixel point in the highlight area is checked.
Step S308, marking the pixel points with the color channel value larger than or equal to the preset maximum value in the highlight area as invalid pixel points.
Specifically, for the three channel values (i.e., color channel values) of the R channel, B channel and G channel of each pixel point in the highlight area, it is determined whether each color channel value is greater than or equal to a preset maximum value, for example 250 (8-bit); if so, the pixel point corresponding to that channel value is considered an overexposed pixel point, i.e., an invalid pixel point. In addition, the R/G value and B/G value of each pixel point can be calculated from the three color channel values of its R channel, B channel and G channel; if either the R/G value or the B/G value involves a saturated channel value of 255 (8-bit) or 1024 (10-bit), the corresponding pixel point is likewise judged to be an invalid pixel point. Invalid pixel points cause the highlight area to be overexposed and the under-exposed image to be seriously distorted, so when counting the pixel points in the highlight area, the invalid pixel points need to be discarded.
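The invalid-pixel check with the preset maximum of 250 (8-bit) can be sketched as:

```python
PRESET_MAX = 250  # preset maximum for an 8-bit channel, per the text

def is_invalid(pixel):
    """Overexposed (invalid) if any color channel reaches the preset maximum."""
    return any(c >= PRESET_MAX for c in pixel)
```

Pixels that pass this check go on to the R/G and B/G computation of step S310; pixels that fail are dropped before any statistics are taken.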
Step S310, calculating color channel parameters of the pixel points whose color channel values are smaller than the preset maximum value.
In step S312, the pixel point where the color channel parameter falls between the minimum color channel parameter and the maximum color channel parameter in the corresponding relationship is determined as an effective pixel point of the highlight region.
Specifically, after obtaining the color channel parameters of the pixel points whose color channel values are smaller than the preset maximum value, it is judged, based on the correspondences between the standard color temperature intervals and the color channel parameters, whether the color channel parameters of each pixel point fall between the minimum and maximum color channel parameters in the correspondence list. If so, that is, if the color channel parameters fall within the color channel parameter range corresponding to the high, medium or low color temperature interval, the pixel point corresponding to those color channel parameters is an effective pixel point of the highlight area; if the color channel parameters fall outside the color channel parameter ranges corresponding to the high, medium and low color temperature intervals, the corresponding pixel point is still judged to be an invalid pixel point. Meanwhile, for each counted effective pixel point, the color temperature interval to which it belongs can be determined from its color channel parameters.
Step S314, determining calibration parameters of the highlight area based on the color channel parameters corresponding to the effective pixel points in the highlight area, and calibrating the highlight area by using the calibration parameters of the highlight area to obtain the under-exposure optimized image.
And step S316, fusing the reference brightness image and the under-exposure optimization image to obtain a fused image corresponding to the same scene.
The above steps S314 to S316 are detailed in the foregoing embodiments and are not described again here. Through the determination process of the effective pixel points, the invalid pixel points in the highlight area can be discarded, so that the under-exposed image is optimized according to the effective pixel points, ensuring the vivid effect of the fused image.
On the basis of fig. 2, another image fusion method is further provided in an embodiment of the present invention, where the execution subject is an electronic device that prestores correspondences between a plurality of standard color temperature intervals and color channel parameters, the color channel parameters including a red-green channel ratio and a blue-green channel ratio. This method focuses on the process of determining the calibration parameters of the highlight area based on the color channel parameters corresponding to the effective pixel points in the highlight area. As shown in fig. 4, the method includes the following steps:
step S402, carrying out registration alignment on a plurality of images of the same scene to obtain a reference brightness image and at least one under-exposed image which are spatially aligned; the EV corresponding to the reference luminance image is different from the EV of the underexposed image.
Step S404, determining effective pixel points of the highlight area based on the color channel parameters and the corresponding relation corresponding to each pixel point in the highlight area of the underexposed image.
The above steps can refer to the foregoing embodiments, and the embodiments of the present invention are not described in detail herein.
Step S406, determining a first target standard color temperature section corresponding to the reference brightness image and a second target standard color temperature section corresponding to the high light region of the underexposed image according to the corresponding relation.
Specifically, according to the color channel parameters of the pixel points in the reference luminance image and the correspondences, the color temperature interval of each pixel point in the reference luminance image can be determined, and from these the first target standard color temperature interval corresponding to the reference luminance image is determined. In practice, the image corresponding to EV0 is mostly used as the reference luminance image, and since the dominant light source in the shooting scene is mostly ambient light with a low color temperature, the first target standard color temperature interval corresponding to the reference luminance image is mostly a low color temperature interval; other color temperature intervals can be set according to the actual situation, which is not limited in the embodiment of the present invention.
Meanwhile, based on the color channel parameters and the corresponding relation of the effective pixel points in the high-light region of the under-exposed image, a second target standard color temperature interval corresponding to the high-light region can be determined. Since the effective pixel points in the high light region may be in different color temperature ranges, in practical application, the second target standard color temperature range corresponding to the high light region needs to be determined according to the occupation ratio of the effective pixel points in the high light region corresponding to the plurality of standard color temperature ranges.
Specifically, a plurality of highlight areas may exist in an under-exposed image. In practical application, the effective pixel points in the under-exposed image can be extracted and the under-exposed image binarized, with effective pixel points set to 1 and invalid pixel points set to 0, forming a mask of the highlight areas; after a certain dilation-erosion operation is performed on the mask, the plurality of regions of 1s in the resulting image, i.e., the highlight areas, can be found.
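The binarization-and-morphology step can be sketched with a single 4-neighbourhood dilation on a nested-list mask; a real implementation would combine dilation and erosion (e.g. via OpenCV's `dilate`/`erode`) and then label connected regions:

```python
def dilate(mask):
    """One 4-neighbourhood dilation step on a binary mask (nested lists of 0/1)."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                # set the pixel itself and its 4 neighbours
                for dy, dx in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        out[ny][nx] = 1
    return out
```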
Optionally, the region with the most pixel points among the multiple highlight areas may be used as the effective region, and the second target standard color temperature interval corresponding to the highlight area is determined based on the proportions of the effective pixel points of the effective region falling in the plurality of standard color temperature intervals. For example, if a certain under-exposed image has 3 highlight areas, and after counting the effective pixel points of each highlight area, the number of effective pixel points in highlight area 1 is A1, in highlight area 2 is A2, and in highlight area 3 is A3, with A1 > A2 > A3, then highlight area 1 can be used as the effective region. The second target standard color temperature interval corresponding to highlight area 1 is determined according to the proportions of its effective pixel points in the plurality of standard color temperature intervals, and this interval is taken as the second target standard color temperature interval corresponding to the highlight area of the under-exposed image.
Optionally, the second target standard color temperature interval corresponding to each highlight area may be determined according to the proportions of the effective pixel points of that highlight area in the plurality of standard color temperature intervals; for example, the second target standard color temperature intervals corresponding to highlight area 1, highlight area 2 and highlight area 3 of the under-exposed image are determined from the interval proportions of the effective pixel points of each area respectively. Alternatively, the proportions of the plurality of standard color temperature intervals among the effective pixel points of all highlight areas may be counted together, and a single second target standard color temperature interval corresponding to all highlight areas determined from the counted proportions; for example, the interval proportions among all effective pixel points of highlight areas 1, 2 and 3 are counted, and the second target standard color temperature interval for highlight areas 1, 2 and 3 is determined from these proportions. The specific manner of determining the second target standard color temperature interval corresponding to the highlight area can be set according to the actual situation, which is not limited in the embodiments of the present invention.
In addition, the standard color temperature intervals include a high color temperature interval, a medium color temperature interval and a low color temperature interval. The process of determining the second target standard color temperature interval corresponding to the highlight area according to the proportions of the effective pixel points of the highlight area in the plurality of standard color temperature intervals is specifically as follows:
(1) counting a first ratio of first type effective pixel points belonging to a high color temperature interval, a second ratio of second type effective pixel points belonging to a medium color temperature interval and a third ratio of third type effective pixel points belonging to a low color temperature interval in a high light region;
(2) comparing the first ratio value, the second ratio value and the third ratio value;
(3) if the first ratio value is maximum and is larger than a preset ratio threshold, determining that the high light area corresponds to a high color temperature interval;
(4) if the first ratio is smaller than or equal to the preset ratio threshold, setting the effective pixel points corresponding to the smallest of the first, second and third ratios as invalid pixel points, and determining the second target standard color temperature interval of the highlight area based on the color channel parameters corresponding to the remaining effective pixel points in the highlight area.
For example, for a certain highlight area, suppose the preset ratio threshold is 50%. If the first ratio of the first-type effective pixel points belonging to the high color temperature interval is the largest and exceeds 50%, the highlight area is determined to correspond to the high color temperature interval, and the pixel points of the other color temperature intervals in the highlight area are discarded. If neither the first ratio of the first-type effective pixel points in the high color temperature interval nor the second ratio of the second-type effective pixel points in the medium color temperature interval exceeds 50%, and both are greater than the third ratio of the third-type effective pixel points in the low color temperature interval, the third-type effective pixel points are regarded as invalid pixel points, and the color channel values of the first-type and second-type effective pixel points are calculated to determine the second target standard color temperature interval of the highlight area.
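The ratio-comparison logic of steps (1) to (4) can be sketched as follows; the interval names and the 50% threshold follow the example above, and the recount after discarding the smallest class is left to the caller:

```python
def decide_interval(counts, threshold=0.5):
    """counts: {'high': n1, 'medium': n2, 'low': n3} of effective pixel points.
    Returns (interval, None) if the largest share exceeds the threshold,
    otherwise (None, smallest_class): drop that class and recompute."""
    total = sum(counts.values())
    ratios = {k: v / total for k, v in counts.items()}
    best = max(ratios, key=ratios.get)
    if ratios[best] > threshold:
        return best, None
    worst = min(ratios, key=ratios.get)
    return None, worst  # discard `worst` pixels, then recompute on the rest
```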
Step S408, comparing whether the first target standard color temperature interval is the same as the second target standard color temperature interval; if not, step S410 is performed, and if the same, step S412 is performed.
In practical applications, the highlight area will be color-shifted only when the calibration parameter AWB Gain of the highlight area differs greatly from the calibration parameter AWB Gain of the reference luminance image. The AWB Gain of the reference luminance image corresponding to EV0 is mainly affected by the dominant light source. Suppose the dominant light source in the current shooting scene is ambient light with a low color temperature, while the highlight area is light with a high color temperature. The AWB Gain calculated for the reference luminance image is determined by the main ambient light, so it is close to the AWB Gain of that ambient light and is suited to the low color temperature scene. The highlight area of the reference luminance image, however, is high color temperature light and is overexposed, so even if the low color temperature AWB Gain of the dominant light source is applied, the highlight area of the reference luminance image corresponding to EV0 is not affected, i.e., it does not need AWB processing.
In contrast, the highlight area of the under-exposed image corresponding to EV-2 is in an under-exposed state. When the AWB Gain calculated from the low color temperature ambient light is applied to the high color temperature interval, the larger Blue Gain in that AWB Gain causes the high color temperature interval of the EV-2 under-exposed image to appear bluish. Therefore, it is necessary to judge which standard color temperature interval the AWB Gain of the reference luminance image corresponding to EV0 belongs to. Only when the color temperature interval of the highlight area of the under-exposed image is inconsistent with that of the reference luminance image are the calibration parameters of the highlight area determined separately; otherwise, the AWB Gain of the reference luminance image is used directly to calibrate the highlight area. Accordingly, it is necessary to compare whether the first target standard color temperature interval is the same as the second target standard color temperature interval: if they are different, step S410 is executed; if they are the same, step S412 is executed.
Step S410, determining calibration parameters of the highlight area based on the color channel parameters corresponding to the effective pixel points in the highlight area, and calibrating the highlight area by using the calibration parameters of the highlight area to obtain an under-exposure optimization image.
Specifically, the mean values of the color channel parameters corresponding to all effective pixel points in the highlight area can be calculated, and the calibration parameters of the highlight area determined from the calculated means, i.e., in the calibration parameters of the highlight area, R Gain = 1/(R/G_mean) and B Gain = 1/(B/G_mean). Alternatively, the means of the color channel parameters corresponding to only those effective pixel points belonging to the second target standard color temperature interval of the highlight area are calculated, and the calibration parameters determined from these means; for example, if the second target standard color temperature interval of the highlight area is the high color temperature interval, then R Gain = 1/(R/G high-color-temperature mean) and B Gain = 1/(B/G high-color-temperature mean). This eliminates the influence of the small regions corresponding to invalid pixel points in the highlight area and retains only the regions corresponding to the heavily weighted effective pixel points, improving the precision of the calibration parameters, which further improves the calibration precision of the highlight area and ensures the color restoration degree of the under-exposure optimized image.
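The mean-based calibration parameters follow directly from the formulas above; a minimal sketch over the (R/G, B/G) pairs of the effective pixel points kept for calibration:

```python
def highlight_awb_gain(pixels):
    """pixels: list of (R/G, B/G) pairs for the effective pixel points.
    R Gain = 1/mean(R/G), B Gain = 1/mean(B/G)."""
    rg_mean = sum(p[0] for p in pixels) / len(pixels)
    bg_mean = sum(p[1] for p in pixels) / len(pixels)
    return 1.0 / rg_mean, 1.0 / bg_mean
```

When the second target standard color temperature interval is known, the same function is simply called on the subset of pixels belonging to that interval.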
Step S412, calibrating the highlight area based on the calibration parameters of the reference brightness image to obtain an under-exposure optimization image.
Specifically, the calibration parameters of the reference luminance image are used directly to calibrate the highlight area, obtaining the under-exposure optimized image. For example, in a dim yellow street lamp scene, the overall tone of the whole environment is yellow, and the AWB Gain of the ambient light, i.e., the AWB Gain of the reference luminance image, mainly compensates for the yellow environment. The calibration parameter calculated for the yellow street lamp, i.e., the AWB Gain of the under-exposed image, likewise corresponds to a color temperature environment that needs yellow compensation. Therefore, to ensure the uniformity of the picture style, the AWB Gain of the reference luminance image can be adopted as the calibration parameter of the under-exposed image, and the correction performed to obtain the under-exposure optimized image.
And step S414, fusing the reference brightness image and the under-exposure optimization image to obtain a fused image corresponding to the same scene.
In this way, by comparing whether the first target standard color temperature interval of the reference luminance image is the same as the second target standard color temperature interval of the under-exposed image, the calibration parameters of the highlight area of the under-exposed image are selected adaptively: when the two intervals are the same, the calibration parameters of the reference luminance image are adopted as the calibration parameters of the highlight area, without determining them from the color channel parameters of the effective pixel points in the highlight area; when they differ, the calibration parameters are determined from the effective pixel points. This improves the precision of the calibration parameters and ensures the color restoration degree of the under-exposure optimized image.
In another possible embodiment, an embodiment of the present invention further provides a method for generating a correspondence between a standard color temperature interval and a color channel parameter, as shown in fig. 5, where the method includes the following steps:
step S502, acquiring a photographed image of the electronic device under a plurality of preset standard color temperature light sources.
Specifically, photographing is performed by the camera under standard color temperature light sources of 7500K, 6500K, 5000K, 4000K, 3000K and 2300K, and the corresponding photographed images are obtained respectively. It should be noted that the standard color temperature light sources may be set according to the actual situation, which is not limited in the embodiment of the present invention.
Step S504, counting and determining color channel parameters of the photographed image corresponding to each preset standard color temperature light source;
For the photographed images under the plurality of preset standard color temperature light sources, the red-green channel ratio (R/G) and the blue-green channel ratio (B/G) of each pixel point under each standard color temperature light source are counted respectively, so that the color temperature intervals of the preset standard color temperature light sources can be divided according to the corresponding color channel parameters.
Step S506, generating a corresponding relationship between each of the standard color temperature intervals and the color channel parameter based on the color channel parameter corresponding to each of the preset standard color temperature light sources.
Specifically, according to the R/G value and the B/G value of each pixel point under each standard color temperature light source, the color temperature range is divided into 3 color temperature intervals, namely a high color temperature interval of 5000K-7500K, a medium color temperature interval of 3000K-5000K and a low color temperature interval of 2300K-3000K, and the correspondences between the plurality of standard color temperature intervals and the color channel parameters are established, so that the color temperature intervals can be distinguished according to the color channel parameters.
In addition, the color temperature ranges can be expanded by setting a reasonable error range. If the error range is set to ±15%, then when the R/G value and the B/G value of a certain pixel point in the image are within ±15% of a color temperature range, the pixel point is also considered an effective pixel point. For example, the R/G value corresponding to D75 (7500K) is 0.3 and the B/G value is 0.6; after applying the error range, the color temperature range of D75 is R/G in [0.255, 0.345] and B/G in [0.51, 0.69]. When the R/G value and the B/G value of a certain pixel point fall within this range, the pixel point is considered an effective pixel point, and because D75 belongs to the high color temperature interval, it is an effective pixel point of the high color temperature interval.
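The ±15% expansion can be sketched directly; the D75 values below reproduce the example in the text:

```python
def tolerance_range(value, tol=0.15):
    """Expand a calibrated channel ratio into [value*(1-tol), value*(1+tol)]."""
    return (value * (1 - tol), value * (1 + tol))

# D75 (7500 K) example: R/G = 0.3, B/G = 0.6
d75_rg = tolerance_range(0.3)  # ~ (0.255, 0.345)
d75_bg = tolerance_range(0.6)  # ~ (0.51, 0.69)
```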
It should be noted that when a pixel point in a highlight area falls within the color temperature range of D75, it is considered a D75 pixel point. When a pixel point lies in a transitional color temperature interval, for example belonging to both the high and medium color temperature intervals, it is counted in both the high color temperature interval and the medium color temperature interval; and when a pixel point lies at the boundary between D75 and D65 (6500K), it is considered to belong to the high color temperature interval.
For ease of understanding, assume the EV value corresponding to the reference luminance image is 0, that of the first under-exposed image is 2, and that of the second under-exposed image is 4. In the existing method, the highlight area of the reference luminance image EV0 is fused with the highlight area of the first under-exposed image EV2, and then with the highlight area of the second under-exposed image EV4, with the highlight areas of EV2 and EV4 both using the calibration parameters of the reference luminance image EV0. Under a low color temperature light source, the R channel component is larger and the B channel component smaller, and more B Gain must be compensated to achieve normal white balance, bringing the values of the R, G and B channels closer together. A high color temperature highlight area is just the opposite: the B channel component is larger and the R channel component smaller, so theoretically more R Gain needs to be compensated. When the low color temperature AWB Gain of the reference luminance image EV0 is applied to the high color temperature highlight areas of the first under-exposed image EV2 and the second under-exposed image EV4, the color of those high color temperature areas turns bluish, so the highlight area in the fused image is bluish, i.e., its color is abnormal and cannot meet practical application requirements.
As shown in fig. 6, the image fusion method according to the embodiment of the present invention includes the following steps:
(61) First, calibrate the camera. Specifically, photograph the same scene under light sources from 2300K to 7500K with the camera, and compute the color channel parameters of each image, namely the R/G value and the B/G value, to establish the correspondences between the plurality of standard color temperature intervals and the color channel parameters. A reasonable error range of ±15% is set: the calibration data range of a standard color temperature interval is the calibrated R/G and B/G values of that interval plus or minus 15%, and a pixel point in the highlight region whose R/G and B/G values fall within this range is treated as an effective pixel point;
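Step (61) can be sketched as follows. The interval names and calibrated channel values are placeholder assumptions; in practice they come from the statistics gathered under the 2300K to 7500K light sources.

```python
# A minimal sketch of step (61): a correspondence table from standard color
# temperature intervals to calibrated (R/G, B/G) values, expanded into a
# +/-15% calibration data range. All numeric values are placeholders.

TOLERANCE = 0.15  # the +/-15% error range from step (61)

# correspondence: standard color temperature interval -> calibrated (R/G, B/G)
CALIBRATION = {
    "low":    (1.60, 0.50),   # e.g. roughly 2300K-4000K (assumed)
    "medium": (1.00, 1.00),   # e.g. roughly 4000K-6000K (assumed)
    "high":   (0.55, 1.50),   # e.g. roughly 6000K-7500K (assumed)
}

def calibration_range(interval):
    """Expand calibrated values into the +/-15% calibration data range."""
    rg, bg = CALIBRATION[interval]
    return ((rg * (1 - TOLERANCE), rg * (1 + TOLERANCE)),
            (bg * (1 - TOLERANCE), bg * (1 + TOLERANCE)))

def in_calibration_range(rg, bg):
    """Return the matching interval if (R/G, B/G) falls inside some
    interval's range (an effective pixel point), else None."""
    for interval in CALIBRATION:
        (rg_lo, rg_hi), (bg_lo, bg_hi) = calibration_range(interval)
        if rg_lo <= rg <= rg_hi and bg_lo <= bg <= bg_hi:
            return interval
    return None
```

For example, a highlight pixel with R/G = 0.56 and B/G = 1.45 would be classified as an effective pixel point of the assumed high color temperature interval.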
(62) carrying out registration alignment on the reference brightness image EV0, the first under-exposed image EV2 and the second under-exposed image EV 4;
(63) performing highlight overexposure area detection on the reference luminance image EV0 to determine a highlight area of the reference luminance image EV 0;
(64) determining a highlight region of the first under-exposed image EV2 based on the highlight region of the reference luminance image EV0, specifically, extracting a highlight region boundary of the reference luminance image EV0, and determining the highlight region of the first under-exposed image EV2 according to the highlight region boundary; and performing steps (641) - (647);
(641) calculating the R/G value and the B/G value of each pixel point in the highlight area of the first underexposed image EV 2;
(642) judging whether each pixel point is an overexposure pixel point (namely an invalid pixel point), if so, discarding the pixel point, and if not, executing the step (643);
(643) judging whether the R/G value and the B/G value of the pixel point are in the calibration data range, if not, discarding the pixel point, if so, determining the pixel point as an effective pixel point of a highlight area, and executing the step (644);
(644) calculating the R/G value and the B/G value of each effective pixel point in the highlight area, and judging a standard color temperature interval corresponding to the effective pixel point;
(645) the standard color temperature interval corresponding to the highlight region is determined based on the standard color temperature interval corresponding to each effective pixel point, which may specifically refer to the foregoing embodiment, and details are not repeated herein in the embodiment of the present invention;
(646) determining calibration parameters of the highlight area based on the color channel parameters corresponding to the effective pixel points in the highlight area, namely calculating R Gain = 1/(R/G_mean) and B Gain = 1/(B/G_mean);
(647) calibrating the highlight area by using the calibration parameters of the highlight area to obtain an underexposure optimization image corresponding to the first underexposure image EV 2; specifically, multiplying each effective pixel point in a highlight area of the first under-exposed image EV2 by R Gain and B Gain to perform color restoration on the first under-exposed image EV2 to obtain an under-exposed optimized image corresponding to the first under-exposed image EV 2;
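Steps (641) to (647) can be condensed into a single routine. This is a minimal sketch assuming 8-bit RGB pixels with G > 0 and a caller-supplied validity predicate standing in for the calibration data range check of step (643); the gain formulas follow step (646).

```python
# Condensed sketch of steps (641)-(647): filter out overexposed and
# out-of-range pixels, compute mean R/G and B/G over the effective pixels,
# then apply R Gain = 1/(R/G_mean) and B Gain = 1/(B/G_mean).

OVEREXPOSED = 255  # assumed overexposure threshold for 8-bit channels

def calibrate_highlight(pixels, is_valid):
    """pixels: list of (R, G, B); is_valid(rg, bg) stands in for the
    calibration data range check. Returns gain-corrected effective pixels."""
    effective = []
    for r, g, b in pixels:
        if r >= OVEREXPOSED or g >= OVEREXPOSED or b >= OVEREXPOSED:
            continue                      # (642): drop overexposed pixels
        rg, bg = r / g, b / g             # (641)/(644): channel parameters
        if not is_valid(rg, bg):
            continue                      # (643): outside calibration range
        effective.append((r, g, b, rg, bg))
    rg_mean = sum(p[3] for p in effective) / len(effective)
    bg_mean = sum(p[4] for p in effective) / len(effective)
    r_gain, b_gain = 1.0 / rg_mean, 1.0 / bg_mean       # (646)
    # (647): multiply each effective pixel by the gains to restore color
    return [(r * r_gain, g, b * b_gain) for r, g, b, _, _ in effective]
```

The same routine covers steps (651) to (657), since they repeat the procedure on the second under-exposed image EV4.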
(65) determining a highlight region of the second underexposed image EV4 based on the highlight region of the first underexposed image EV2, and performing steps (651) - (657), where the specific method may refer to step (64), and may also determine the highlight region of the second underexposed image EV4 based on the highlight region of the reference luminance image EV0, which is not described in detail herein in the embodiment of the present invention;
(651) calculating the R/G value and the B/G value of each pixel point in the highlight area of the second underexposed image EV 4;
(652) judging whether each pixel point is an overexposure pixel point (namely an invalid pixel point), if so, discarding the pixel point, and if not, executing the step (653);
(653) judging whether the R/G value and the B/G value of the pixel point are in the calibration data range, if not, discarding the pixel point, if so, determining the pixel point as an effective pixel point of a highlight area, and executing the step (654);
(654) calculating the R/G value and the B/G value of each effective pixel point in the highlight area, and judging a standard color temperature interval corresponding to the effective pixel point;
(655) judging a standard color temperature interval corresponding to the highlight area based on the standard color temperature interval corresponding to each effective pixel point;
(656) determining calibration parameters of the highlight area based on the color channel parameters corresponding to the effective pixel points in the highlight area, namely calculating R Gain = 1/(R/G_mean) and B Gain = 1/(B/G_mean);
(657) calibrating the highlight area by using the calibration parameters of the highlight area to obtain an underexposure optimization image corresponding to the second underexposure image EV 4; specifically, multiplying each effective pixel point in a highlight area of the second underexposed image EV4 by R Gain and B Gain to perform color restoration on the second underexposed image EV4 to obtain an underexposed optimized image corresponding to the second underexposed image EV 4;
(66) and performing fusion processing on the reference brightness image EV0, the underexposed optimized image corresponding to the first underexposed image EV2 and the underexposed optimized image corresponding to the second underexposed image EV4 to obtain a fused image corresponding to the same scene.
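The patent does not specify the HDR blend used in step (66) itself; the sketch below is one common weight-based exposure fusion scheme, shown on grayscale values only, and should not be read as the patented method.

```python
# A minimal weight-based exposure fusion sketch (an assumed scheme, not the
# patent's): blend EV0 with the two under-exposure optimized images, trusting
# EV0 less wherever it approaches saturation.

def fuse(ev0, ev2_opt, ev4_opt, max_val=255.0):
    """Per-pixel blend of grayscale images given as equal-length lists."""
    fused = []
    for p0, p2, p4 in zip(ev0, ev2_opt, ev4_opt):
        w0 = 1.0 - p0 / max_val          # weight EV0 less as it saturates
        fused.append(w0 * p0 + (1.0 - w0) * 0.5 * (p2 + p4))
    return fused
```

In a saturated highlight (EV0 at 255), the result is taken entirely from the under-exposure optimized images, which is where the color restoration of steps (647) and (657) matters.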
Therefore, in the image fusion method described above, the calibration parameters of the highlight region are determined from the effective pixel points in the highlight region of the under-exposed image, the highlight region is calibrated with these parameters to obtain an under-exposure optimized image, and the reference luminance image and the under-exposure optimized image are fused to obtain the fused image corresponding to the same scene. Compared with the existing method, which directly fuses the reference luminance image with under-exposed images calibrated by the calibration parameters of the reference luminance image, this alleviates the color cast in the highlight region of the fused image, ensures a vivid fused result, and improves the practical value of the fused image.
On the basis of the foregoing embodiment, an embodiment of the present invention further provides an image fusion apparatus, which is applied to an electronic device, and as shown in fig. 7, the apparatus includes a registration alignment module 71, an effective pixel point determining module 72, an underexposure optimization image obtaining module 73, and a fusion processing module 74, which are connected in sequence, where functions of each module are as follows:
a registration alignment module 71, configured to perform registration alignment on multiple images of the same scene to obtain a spatially aligned reference luminance image and at least one under-exposed image; wherein, the exposure value EV corresponding to the reference brightness image is different from the EV of the underexposed image;
an effective pixel point determining module 72, configured to obtain corresponding relationships between a plurality of standard color temperature intervals and color channel parameters, and determine an effective pixel point in a highlight region based on the color channel parameter and the corresponding relationship corresponding to each pixel point in the highlight region of the underexposed image;
the under-exposure optimization image acquisition module 73 is configured to determine a calibration parameter of the highlight region based on a color channel parameter corresponding to an effective pixel point in the highlight region, and calibrate the highlight region by using the calibration parameter of the highlight region to obtain an under-exposure optimization image;
and the fusion processing module 74 is configured to perform fusion processing on the reference luminance image and the under-exposure optimized image to obtain a fusion image corresponding to the same scene.
The image fusion device provided by the embodiment of the invention determines the calibration parameter of the highlight area through the effective pixel points in the highlight area in the underexposed image, and calibrates the highlight area by using the calibration parameter to obtain the underexposed optimized image, and then performs fusion processing on the reference brightness image and the underexposed optimized image to obtain the fusion image corresponding to the same scene.
In one possible embodiment, the effective pixel point determining module 72 is further configured to: determining a highlight region of an underexposed image based on an overexposed region of a reference luminance image; checking the color channel value of each pixel point in the highlight area; marking the pixel points with the color channel value larger than or equal to the preset maximum value in the highlight area as invalid pixel points; calculating color channel parameters of pixel points with color channel values smaller than a preset maximum value; and determining the pixel point of which the color channel parameter is between the minimum color channel parameter and the maximum color channel parameter in the corresponding relation as an effective pixel point of the highlight area.
In another possible embodiment, before the under-exposure optimization image obtaining module 73, the apparatus is further configured to: determining a first target standard color temperature interval corresponding to the reference brightness image and a second target standard color temperature interval corresponding to a high light region of the under-exposed image according to the corresponding relation; comparing whether the first target standard color temperature interval is the same as the second target standard color temperature interval; if not, performing calibration parameter determination for the highlight region based on the color channel parameters corresponding to the active pixel points in the highlight region.
In another possible embodiment, the determining, according to the correspondence, a first target standard color temperature section corresponding to the reference luminance image and a second target standard color temperature section corresponding to a high light region of the underexposed image includes: determining a first target standard color temperature interval corresponding to the reference brightness image based on the color channel parameters and the corresponding relation of each pixel point in the reference brightness image; and determining a second target standard color temperature interval corresponding to the high light region based on the color channel parameters and the corresponding relation of the effective pixel points in the high light region of the underexposed image.
In another possible embodiment, the determining a second target standard color temperature section corresponding to a high light region based on the color channel parameters and the corresponding relationship of the effective pixel points in the high light region of the underexposed image includes: and determining a second target standard color temperature interval corresponding to the high light region according to the proportion of the effective pixel points in the high light region to the plurality of standard color temperature intervals.
In another possible embodiment, the high light area is plural; the determining of the second target standard color temperature section corresponding to the highlight region according to the proportion of the effective pixel points in the highlight region corresponding to the plurality of standard color temperature sections includes one of the following: taking the area with the most pixels in the multiple highlight areas as an effective area, and determining a second target standard color temperature interval corresponding to the highlight area based on the proportion of the effective pixel points in the effective area to the multiple standard color temperature intervals; determining a second target standard color temperature interval corresponding to each high light region according to the ratio of effective pixel points in each high light region to a plurality of standard color temperature intervals; and counting the occupation ratios of the effective pixel points in all the high-light regions corresponding to the plurality of standard color temperature intervals, and determining second target standard color temperature intervals corresponding to all the high-light regions according to the counted occupation ratios.
In another possible embodiment, the plurality of standard color temperature intervals are a high color temperature interval, a middle color temperature interval and a low color temperature interval respectively; the determining a second target standard color temperature section corresponding to the highlight region according to the proportion of the effective pixel points in the highlight region to the plurality of standard color temperature sections includes: counting a first ratio of first type effective pixel points belonging to a high color temperature interval, a second ratio of second type effective pixel points belonging to a medium color temperature interval and a third ratio of third type effective pixel points belonging to a low color temperature interval in a high light region; comparing the first ratio value, the second ratio value and the third ratio value; if the first ratio value is maximum and is larger than a preset ratio threshold, determining a high color temperature interval corresponding to the high light area; and if the first ratio is smaller than or equal to the preset ratio threshold, setting the effective pixel point corresponding to the minimum ratio among the first ratio, the second ratio and the third ratio as an ineffective pixel point, and determining a second target standard color temperature interval of the highlight area based on the color channel parameter corresponding to the current effective pixel point in the highlight area.
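The proportion-voting rule above can be sketched as follows, assuming each effective pixel point has already been labeled with its standard color temperature interval and taking 0.5 as a placeholder for the preset proportion threshold. The fallback branch simplifies the embodiment by re-voting with a plain majority after discarding the minority class.

```python
# A sketch of the interval-voting rule: if one interval's proportion is the
# maximum and exceeds the threshold, it wins; otherwise the minority class is
# marked invalid and the decision is re-made on the remaining pixels
# (simplified here to a majority vote).

INTERVALS = ("high", "medium", "low")

def vote_interval(labels, threshold=0.5):
    """labels: per-effective-pixel interval labels; threshold is assumed."""
    total = len(labels)
    ratios = {k: labels.count(k) / total for k in INTERVALS}
    winner = max(ratios, key=ratios.get)
    if ratios[winner] > threshold:
        return winner
    # No interval clears the threshold: discard the minority class and
    # decide among the remaining effective pixels.
    loser = min(ratios, key=ratios.get)
    remaining = [x for x in labels if x != loser]
    return max(INTERVALS, key=remaining.count)
```

For instance, a highlight region with 70% high-color-temperature effective pixels is assigned the high color temperature interval directly.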
In another possible embodiment, the under-exposure optimization image obtaining module 73 is further configured to: calculating the mean value of color channel parameters corresponding to all effective pixel points in the highlight area; the calibration parameters for the highlight region are determined based on the calculated mean.
In another possible embodiment, the under-exposure optimization image obtaining module 73 is further configured to: calculating the mean value of the color channel parameters corresponding to the effective pixel points belonging to the second target standard color temperature interval in the highlight region; the calibration parameters for the highlight region are determined based on the calculated mean.
In another possible embodiment, the number of the underexposed images is two, the EV corresponding to the first underexposed image is smaller than the EV corresponding to the second underexposed image, and the EVs corresponding to the first underexposed image and the second underexposed image are both larger than 0; the fusion processing module 74 is further configured to: and performing high dynamic range HDR fusion processing on the reference brightness image, the underexposed optimized image corresponding to the first underexposed image and the underexposed optimized image corresponding to the second underexposed image to obtain a fusion image corresponding to the same scene.
In another possible embodiment, the EV corresponding to the reference luminance image is equal to 0.
In another possible embodiment, the EV for the reference luminance image is not equal to 0; the fusion processing module 74 is further configured to: according to the EV transformation rule corresponding to the reference brightness image, performing brightness transformation on the reference brightness image to obtain a reference optimized image, and fusing the reference optimized image and the underexposed optimized image to obtain a fused image corresponding to the same scene; or fusing the reference brightness image and the underexposed optimized image to obtain a primary fused image corresponding to the same scene, and performing brightness transformation on the primary fused image according to the EV transformation rule corresponding to the reference brightness image to obtain a fused image corresponding to the same scene.
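The EV transformation rule is left abstract in the embodiment; the sketch below assumes the conventional photographic relation that one EV step doubles linear luminance.

```python
# Assumed EV transform: scale linear pixel values by 2**delta_ev, e.g.
# delta_ev = -EV to map a reference image with nonzero EV back toward EV0.

def apply_ev_shift(pixels, delta_ev):
    """Brightness-transform a list of linear pixel values by delta_ev stops."""
    return [p * (2.0 ** delta_ev) for p in pixels]
```

Whether this scaling is applied before fusion (to the reference image) or after fusion (to the preliminary fused image) corresponds to the two orderings described above.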
In another possible embodiment, the apparatus is further configured to: acquiring shot images under a plurality of preset standard color temperature light sources; counting and determining color channel parameters of the photographed image corresponding to each preset standard color temperature light source; and generating corresponding relations between a plurality of standard color temperature intervals and the color channel parameters respectively based on the color channel parameters corresponding to each preset standard color temperature light source.
The image fusion device provided by the embodiment of the invention has the same technical characteristics as the image fusion method provided by the embodiment, so that the same technical problems can be solved, and the same technical effects can be achieved.
The embodiment of the invention also provides electronic equipment which comprises a processor and a memory, wherein the memory stores machine executable instructions capable of being executed by the processor, and the processor executes the machine executable instructions to realize the image fusion method.
Referring to fig. 8, the electronic device includes a processor 80 and a memory 81, the memory 81 stores machine executable instructions capable of being executed by the processor 80, and the processor 80 executes the machine executable instructions to implement the image fusion method.
Further, the electronic device shown in fig. 8 further includes a bus 82 and a communication interface 83, and the processor 80, the communication interface 83, and the memory 81 are connected through the bus 82.
The Memory 81 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 83 (which may be wired or wireless), and the internet, a wide area network, a local network, a metropolitan area network, etc. may be used. The bus 82 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Enhanced Industry Standard Architecture) bus, or the like. The above-mentioned bus may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one double-headed arrow is shown in FIG. 8, but that does not indicate only one bus or one type of bus.
The processor 80 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 80. The Processor 80 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the device can also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in a memory 81, and the processor 80 reads information in the memory 81 and performs the steps of the method of the previous embodiment in combination with hardware thereof.
The present embodiments also provide a machine-readable storage medium having stored thereon machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the image fusion method described above.
The computer program product of the image fusion method, the image fusion apparatus, and the electronic system provided by the embodiments of the present invention includes a computer-readable storage medium storing program code, and the instructions included in the program code may be used to execute the method described in the foregoing method embodiments. For specific implementation, reference may be made to the method embodiments, which will not be repeated here.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (16)

1. An image fusion method, characterized in that the method comprises: performing registration alignment on a plurality of images of the same scene to obtain a spatially aligned reference luminance image and at least one under-exposed image, wherein the exposure value EV corresponding to the reference luminance image is different from the EV of the under-exposed image; acquiring correspondences between a plurality of standard color temperature intervals and color channel parameters, and determining effective pixel points of a highlight region of the under-exposed image based on the color channel parameter corresponding to each pixel point in the highlight region and the correspondences; determining calibration parameters of the highlight region based on the color channel parameters corresponding to the effective pixel points in the highlight region, and calibrating the highlight region by applying the calibration parameters of the highlight region to obtain an under-exposure optimized image; and performing fusion processing on the reference luminance image and the under-exposure optimized image to obtain a fused image corresponding to the same scene.
2. The image fusion method according to claim 1, characterized in that the step of determining the effective pixel points of the highlight region based on the color channel parameter corresponding to each pixel point in the highlight region of the under-exposed image and the correspondences comprises: determining the highlight region of the under-exposed image based on an over-exposed region of the reference luminance image; checking the color channel value of each pixel point in the highlight region; marking pixel points in the highlight region whose color channel values are greater than or equal to a preset maximum value as invalid pixel points; calculating the color channel parameters of pixel points whose color channel values are less than the preset maximum value; and determining pixel points whose color channel parameters fall between the minimum color channel parameter and the maximum color channel parameter in the correspondences as effective pixel points of the highlight region.
3. The image fusion method according to claim 1, characterized in that, before the step of determining the calibration parameters of the highlight region based on the color channel parameters corresponding to the effective pixel points in the highlight region, the method further comprises: determining, according to the correspondences, a first target standard color temperature interval corresponding to the reference luminance image and a second target standard color temperature interval corresponding to the highlight region of the under-exposed image; comparing whether the first target standard color temperature interval and the second target standard color temperature interval are the same; and if they are different, performing the step of determining the calibration parameters of the highlight region based on the color channel parameters corresponding to the effective pixel points in the highlight region.
4. The image fusion method according to claim 3, characterized in that the step of determining, according to the correspondences, the first target standard color temperature interval corresponding to the reference luminance image and the second target standard color temperature interval corresponding to the highlight region of the under-exposed image comprises: determining the first target standard color temperature interval corresponding to the reference luminance image based on the color channel parameters of each pixel point in the reference luminance image and the correspondences; and determining the second target standard color temperature interval corresponding to the highlight region based on the color channel parameters of the effective pixel points in the highlight region of the under-exposed image and the correspondences.
5. The image fusion method according to claim 4, characterized in that the step of determining the second target standard color temperature interval corresponding to the highlight region based on the color channel parameters of the effective pixel points in the highlight region of the under-exposed image and the correspondences comprises: determining the second target standard color temperature interval corresponding to the highlight region according to the proportions of the effective pixel points in the highlight region corresponding to the plurality of standard color temperature intervals.
6. The image fusion method according to claim 5, characterized in that there are a plurality of highlight regions, and the step of determining the second target standard color temperature interval corresponding to the highlight region according to the proportions of the effective pixel points in the highlight region corresponding to the plurality of standard color temperature intervals comprises one of the following: taking the region with the most pixel points among the plurality of highlight regions as an effective region, and determining the second target standard color temperature interval corresponding to the highlight regions based on the proportions of the effective pixel points in the effective region corresponding to the plurality of standard color temperature intervals; determining the second target standard color temperature interval corresponding to each highlight region according to the proportions of the effective pixel points in that highlight region corresponding to the plurality of standard color temperature intervals; and counting the proportions of the effective pixel points in all the highlight regions corresponding to the plurality of standard color temperature intervals, and determining the second target standard color temperature interval corresponding to all the highlight regions according to the counted proportions.
7. The image fusion method according to claim 5, characterized in that the plurality of standard color temperature intervals are a high color temperature interval, a medium color temperature interval and a low color temperature interval respectively, and the step of determining the second target standard color temperature interval corresponding to the highlight region according to the proportions of the effective pixel points in the highlight region corresponding to the plurality of standard color temperature intervals comprises: counting a first proportion of first-type effective pixel points belonging to the high color temperature interval, a second proportion of second-type effective pixel points belonging to the medium color temperature interval, and a third proportion of third-type effective pixel points belonging to the low color temperature interval in the highlight region; comparing the first proportion, the second proportion and the third proportion; if the first proportion is the largest and greater than a preset proportion threshold, determining that the highlight region corresponds to the high color temperature interval; and if the first proportion is less than or equal to the preset proportion threshold, setting the effective pixel points corresponding to the smallest of the first proportion, the second proportion and the third proportion as invalid pixel points, and determining the second target standard color temperature interval of the highlight region based on the color channel parameters corresponding to the current effective pixel points in the highlight region.
8. The image fusion method according to any one of claims 1-7, characterized in that the step of determining the calibration parameters of the highlight region based on the color channel parameters corresponding to the effective pixel points in the highlight region comprises: calculating the mean of the color channel parameters corresponding to all the effective pixel points in the highlight region; and determining the calibration parameters of the highlight region based on the calculated mean.
9. The image fusion method according to claim 3, characterized in that the step of determining the calibration parameters of the highlight region based on the color channel parameters corresponding to the effective pixel points in the highlight region comprises: calculating the mean of the color channel parameters corresponding to the effective pixel points in the highlight region that belong to the second target standard color temperature interval; and determining the calibration parameters of the highlight region based on the calculated mean.
10. The image fusion method according to any one of claims 1-9, characterized in that there are two under-exposed images, the EV corresponding to the first under-exposed image is less than the EV corresponding to the second under-exposed image, and the EVs corresponding to the first under-exposed image and the second under-exposed image are both greater than 0;
The image fusion method according to any one of claims 1-9, wherein there are two underexposed images, the EV corresponding to the first underexposed image is smaller than the EV corresponding to the second underexposed image, and EVs corresponding to the first underexposure image and the second underexposure image are both greater than 0; 对所述基准亮度图像和所述欠曝优化图像进行融合处理,得到所述同一场景对应的融合图像的步骤,包括:The steps of performing fusion processing on the reference brightness image and the under-exposure optimized image to obtain the fusion image corresponding to the same scene include: 对所述基准亮度图像、所述第一欠曝图像对应的欠曝优化图像和所述第二欠曝图像对应的欠曝优化图像进行高动态范围HDR融合处理,得到所述同一场景对应的融合图像。Performing high dynamic range HDR fusion processing on the reference brightness image, the underexposure optimized image corresponding to the first underexposure image, and the underexposure optimized image corresponding to the second underexposure image, to obtain the fusion corresponding to the same scene image. 11.根据权利要求1-10任一项所述的图像融合方法,其特征在于,所述基准亮度图像对应的EV等于0。11 . The image fusion method according to claim 1 , wherein the EV corresponding to the reference luminance image is equal to 0. 12 . 12.根据权利要求1-10任一项所述的图像融合方法,其特征在于,所述基准亮度图像对应的EV不等于0;12. 
The image fusion method according to any one of claims 1-10, wherein the EV corresponding to the reference luminance image is not equal to 0; 对所述基准亮度图像和所述欠曝优化图像进行融合处理,得到所述同一场景对应的融合图像的步骤,包括:The steps of performing fusion processing on the reference brightness image and the under-exposure optimized image to obtain the fusion image corresponding to the same scene include: 根据所述基准亮度图像对应的EV变换规则,对所述基准亮度图像进行亮度变换,得到基准优化图像,融合所述基准优化图像和所述欠曝优化图像,得到所述同一场景对应的融合图像;或者,According to the EV transformation rule corresponding to the benchmark brightness image, perform brightness transformation on the benchmark brightness image to obtain a benchmark optimized image, and fuse the benchmark optimized image and the underexposed optimized image to obtain a fusion image corresponding to the same scene ;or, 融合所述基准亮度图像和所述欠曝优化图像,得到所述同一场景对应的初步融合图像,根据所述基准亮度图像对应的EV变换规则,对所述初步融合图像进行亮度变换,得到所述同一场景对应的融合图像。Fusing the reference brightness image and the underexposure optimized image to obtain a preliminary fusion image corresponding to the same scene, and performing brightness transformation on the preliminary fusion image according to the EV transformation rule corresponding to the reference brightness image to obtain the The fused images corresponding to the same scene. 13.根据权利要求1-12任一项所述的图像融合方法,其特征在于,所述获取多个标准色温区间分别与颜色通道参数的对应关系包括:13. 
The image fusion method according to any one of claims 1-12, wherein the obtaining the corresponding relationship between a plurality of standard color temperature intervals and color channel parameters comprises: 获取多个预设标准色温光源下的拍照图像;Obtain photographed images under multiple preset standard color temperature light sources; 统计确定每个所述预设标准色温光源对应的拍照图像的颜色通道参数;Statistically determining the color channel parameters of the photographed image corresponding to each of the preset standard color temperature light sources; 基于每个所述预设标准色温光源对应的颜色通道参数生成多个标准色温区间分别与颜色通道参数的对应关系。Based on the color channel parameters corresponding to each of the preset standard color temperature light sources, corresponding relationships between a plurality of standard color temperature intervals and the color channel parameters are generated. 14.一种图像融合装置,其特征在于,所述装置包括:14. An image fusion device, wherein the device comprises: 配准对齐模块,用于对同一场景的多张图像进行配准对齐,得到空间对齐的基准亮度图像和至少一张欠曝图像;其中,所述基准亮度图像对应的曝光值EV和所述欠曝图像的EV不同;The registration and alignment module is used for registering and aligning multiple images of the same scene to obtain a spatially aligned reference brightness image and at least one underexposure image; wherein, the exposure value EV corresponding to the reference brightness image and the underexposure image The EV of the exposed image is different; 有效像素点确定模块,用于获取多个标准色温区间分别与颜色通道参数的对应关系,并基于所述欠曝图像的高光区域中每个像素点对应的颜色通道参数和所述对应关系确定所述高光区域的有效像素点;The effective pixel point determination module is used to obtain the corresponding relationship between a plurality of standard color temperature intervals and the color channel parameters, and determine the corresponding color channel parameters and the corresponding relationship based on the color channel parameters of each pixel point in the highlight area of the underexposed image. 
The effective pixel points of the highlighted area; 欠曝优化图像获取模块,用于基于所述高光区域中的有效像素点对应的颜色通道参数确定所述高光区域的校准参数,并应用所述高光区域的校准参数对所述高光区域进行校准,得到欠曝优化图像;an underexposure optimized image acquisition module, configured to determine calibration parameters of the highlight region based on color channel parameters corresponding to valid pixels in the highlight region, and apply the calibration parameters of the highlight region to calibrate the highlight region, Get underexposed optimized images; 融合处理模块,用于对所述基准亮度图像和所述欠曝优化图像进行融合处理,得到所述同一场景对应的融合图像。A fusion processing module, configured to perform fusion processing on the reference brightness image and the under-exposure optimized image to obtain a fusion image corresponding to the same scene. 15.一种电子系统,其特征在于,所述电子系统包括:图像采集设备、处理设备和存储装置;15. An electronic system, characterized in that the electronic system comprises: an image acquisition device, a processing device, and a storage device; 所述图像采集设备,用于获取同一场景的多张图像;The image acquisition device is used to acquire multiple images of the same scene; 所述存储装置上存储有计算机程序,所述计算机程序在被所述处理设备运行时执行如权利要求1-13任一项所述的图像融合方法。A computer program is stored on the storage device, and when the computer program is executed by the processing device, the image fusion method according to any one of claims 1-13 is executed. 16.一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机程序,所述计算机程序被处理器运行时执行上述权利要求1-13任一项所述的图像融合方法的步骤。16. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program executes the image according to any one of the preceding claims 1-13 when the computer program is run by a processor The steps of the fusion method.
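Claims 5-7 decide a highlight area's target standard color temperature interval from the proportions of its valid pixels falling into each interval. The sketch below illustrates that decision under assumptions the claims do not fix: the color channel parameters are taken to be R/G and B/G ratios, the interval table is a toy one, and the 0.5 threshold and all function names are invented for illustration.

```python
from collections import Counter

def interval_of(pixel, table):
    """Map a pixel's color channel parameters (here assumed to be R/G and
    B/G ratios) to a standard color temperature interval via the
    precomputed correspondence table; None means the pixel is invalid."""
    r_g, b_g = pixel
    for label, (rg_lo, rg_hi, bg_lo, bg_hi) in table.items():
        if rg_lo <= r_g < rg_hi and bg_lo <= b_g < bg_hi:
            return label
    return None

def target_interval(pixels, table, threshold=0.5):
    """Claim-7-style decision: if the dominant interval's share of valid
    pixels exceeds the preset proportion threshold, pick it; otherwise
    demote the least-represented interval's pixels to invalid and retry."""
    labels = [l for l in (interval_of(p, table) for p in pixels) if l is not None]
    while labels:
        ranked = Counter(labels).most_common()
        best, count = ranked[0]
        if count / len(labels) > threshold or len(ranked) == 1:
            return best
        weakest = ranked[-1][0]
        labels = [l for l in labels if l != weakest]
    return None
```

Used with a correspondence table built as in claim 13, `target_interval(pixels, table)` yields the second target standard color temperature interval for one highlight area.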
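Claims 8-9 reduce the calibration parameters to a mean of the color channel parameters over the valid pixels. One plausible reading (an assumption, not stated by the claims) is a white-balance-style correction in which the mean highlight color is pulled back to neutral:

```python
def calibration_gains(valid_pixels):
    """Mean of the color channel parameters (assumed R/G and B/G ratios)
    over the valid pixels, turned into per-channel gains."""
    n = len(valid_pixels)
    mean_rg = sum(p[0] for p in valid_pixels) / n
    mean_bg = sum(p[1] for p in valid_pixels) / n
    # Gains that pull the average highlight color back to neutral (R = G = B).
    return 1.0 / mean_rg, 1.0 / mean_bg

def calibrate_highlight(region_rgb, gains):
    """Apply the gains to every pixel of the highlight region."""
    gain_r, gain_b = gains
    return [(r * gain_r, g, b * gain_b) for (r, g, b) in region_rgb]
```

Under claim 9, `valid_pixels` would be restricted beforehand to those belonging to the second target standard color temperature interval.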
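Claim 12 permits two orderings when the reference image's EV is nonzero: brightness-transform the reference and then fuse, or fuse first and then transform the preliminary result. A minimal linear-domain sketch follows; the `2**delta_ev` gain and the naive averaging fusion are standard simplifications assumed here, not the patented rule.

```python
def ev_transform(image, delta_ev):
    """Brightness transform for an EV change: one stop halves or doubles
    the captured light, so in a linear color space the transform is a
    gain of 2**delta_ev, clipped to the valid range [0, 1]."""
    gain = 2.0 ** delta_ev
    return [min(max(v * gain, 0.0), 1.0) for v in image]

def fuse(a, b, w=0.5):
    """Toy per-pixel fusion; a real HDR merge uses exposure-aware weights."""
    return [w * x + (1 - w) * y for x, y in zip(a, b)]

# Claim 12 allows either order:
#   fuse(ev_transform(reference, d), optimized)    # transform, then fuse
#   ev_transform(fuse(reference, optimized), d)    # fuse, then transform
```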
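Claim 13 builds the interval-to-parameter correspondence offline: photograph a target under each preset standard color temperature light source, compute per-shot channel statistics, and derive an interval per source. The sketch below uses the same assumed R/G, B/G parameterization; widening each mean by a fixed margin into an interval is an invented detail for illustration.

```python
def build_correspondence(calibration_shots, margin=0.1):
    """For each preset standard color temperature light source, average the
    color channel parameters of its calibration shot and widen the mean
    into an interval (rg_lo, rg_hi, bg_lo, bg_hi)."""
    table = {}
    for label, pixels in calibration_shots.items():
        n = len(pixels)
        mean_rg = sum(p[0] for p in pixels) / n
        mean_bg = sum(p[1] for p in pixels) / n
        table[label] = (mean_rg - margin, mean_rg + margin,
                        mean_bg - margin, mean_bg + margin)
    return table
```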
CN202011513700.2A 2020-12-18 2020-12-18 Image fusion method and device and electronic system Pending CN112614083A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011513700.2A CN112614083A (en) 2020-12-18 2020-12-18 Image fusion method and device and electronic system


Publications (1)

Publication Number Publication Date
CN112614083A true CN112614083A (en) 2021-04-06

Family

ID=75244325


Country Status (1)

Country Link
CN (1) CN112614083A (en)



Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105933617A (en) * 2016-05-19 2016-09-07 中国人民解放军装备学院 High dynamic range image fusion method used for overcoming influence of dynamic problem
CN107545556A (en) * 2016-06-28 2018-01-05 杭州海康威视数字技术股份有限公司 A kind of processing method and system of signal lamp image
WO2018018771A1 (en) * 2016-07-29 2018-02-01 宇龙计算机通信科技(深圳)有限公司 Dual camera-based photography method and system
US20180260941A1 (en) * 2017-03-07 2018-09-13 Adobe Systems Incorporated Preserving color in image brightness adjustment for exposure fusion
CN107392859A (en) * 2017-06-16 2017-11-24 广东欧珀移动通信有限公司 Method, device and terminal for removing highlight area
WO2019019695A1 (en) * 2017-07-27 2019-01-31 北京大学深圳研究生院 Underwater image enhancement method based on retinex model
WO2019072190A1 (en) * 2017-10-12 2019-04-18 Oppo广东移动通信有限公司 Image processing method, electronic apparatus, and computer readable storage medium
CN107809582A (en) * 2017-10-12 2018-03-16 广东欧珀移动通信有限公司 Image processing method, electronic device and computer readable storage medium
US20200228696A1 (en) * 2017-10-30 2020-07-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for obtaining exposure compensation values of high dynamic range image, terminal device and non-transitory computer-readable storage medium
CN108093233A (en) * 2017-12-28 2018-05-29 努比亚技术有限公司 A kind of image processing method, terminal and computer readable storage medium
WO2019183813A1 (en) * 2018-03-27 2019-10-03 华为技术有限公司 Image capture method and device
CN110177221A (en) * 2019-06-25 2019-08-27 维沃移动通信有限公司 The image pickup method and device of high dynamic range images
CN110751608A (en) * 2019-10-23 2020-02-04 北京迈格威科技有限公司 Night scene high dynamic range image fusion method and device and electronic equipment
CN110611750A (en) * 2019-10-31 2019-12-24 北京迈格威科技有限公司 A night scene high dynamic range image generation method, device and electronic equipment
CN111028189A (en) * 2019-12-09 2020-04-17 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN111586308A (en) * 2020-04-10 2020-08-25 北京迈格威科技有限公司 Image processing method, device and electronic device
CN111526299A (en) * 2020-04-28 2020-08-11 华为技术有限公司 High dynamic range image synthesis method and electronic equipment
CN111986129A (en) * 2020-06-30 2020-11-24 普联技术有限公司 HDR image generation method and device based on multi-shot image fusion and storage medium

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
FU Zhengfang; ZHU Hong; XUE Shan; YU Shunyuan; SHI Jing: "Direct fusion algorithm for multi-exposure images based on Sigmoid function fitting", Chinese Journal of Scientific Instrument, no. 10, 15 October 2015 (2015-10-15) *
CHANG Meng; FENG Huajun; XU Zhihai; LI Qi: "Exposure correction and detail enhancement for single LDR images", Acta Photonica Sinica, no. 04, 10 February 2018 (2018-02-10) *
WANG Xiang; GUO Yanwen; DU Zhenlong; WU Gangshan; ZHANG Fuyan; PENG Qunsheng: "Automatic adjustment of image and video brightness", Acta Electronica Sinica, no. 1, 15 April 2009 (2009-04-15) *
HE Li; CHEN Guo; GUO Hong; JIN Weiqi: "An HDR fusion method for low-illuminance dynamic scenes based on a dual-channel CMOS camera", Infrared Technology, no. 04, 20 April 2020 (2020-04-20) *
HAN Siqi: "Research on acceleration of high dynamic range imaging based on guided upsampling", Master's thesis, no. 02, 15 February 2020 (2020-02-15) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115209120A (en) * 2021-04-09 2022-10-18 爱思开海力士有限公司 Image sensing device, operation method thereof and image processing device
CN115209120B (en) * 2021-04-09 2024-04-26 爱思开海力士有限公司 Image sensing device, operation method thereof and image processing device
CN114466477A (en) * 2021-12-31 2022-05-10 珠海雷特科技股份有限公司 Constant-current dimming method of multichannel light source, computer device and computer readable storage medium
CN116112651A (en) * 2023-02-02 2023-05-12 北京易航远智科技有限公司 White balance processing method, device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112614083A (en) Image fusion method and device and electronic system
CN103973958B (en) Image processing method and equipment
CN104796683B (en) A kind of method and system of calibration image color
US9516290B2 (en) White balance method in multi-exposure imaging system
CN111641819B (en) Method, device, system and computer device for white balance gain correction
CN100553301C (en) Brightness Correction Method
US20140078247A1 (en) Image adjuster and image adjusting method and program
US8704911B2 (en) Image processing apparatus, image processing method, and recording medium
US8614751B2 (en) Image processing apparatus and image processing method
JP2018503326A (en) System and method for generating a high dynamic range (HDR) pixel stream
US8810681B2 (en) Image processing apparatus and image processing method
US9036046B2 (en) Image processing apparatus and method with white balance correction
CN114222105A (en) White balance adjusting method, white balance adjusting system, white balance terminal and storage medium
US11574390B2 (en) Apparatus and method for image processing
JP2008124928A (en) Auto white balance system
TWI523542B (en) White balance compensation method and electronic apparatus using the same
CN114697628A (en) Image acquisition method, apparatus, device, and medium
CN110677558A (en) Image processing method and electronic device
CN116634279A (en) Image processing method, device, electronic equipment and storage medium
TWI778476B (en) Dual sensor imaging system and imaging method thereof
JP5911340B2 (en) Imaging apparatus and control method thereof
CN108668122A (en) Color rendition method and equipment for spectral response curve
TWI768282B (en) Method and system for establishing light source information prediction model
CN112532872B (en) Method and device for adjusting camera parameters, storage medium and electronic equipment
CN113271450B (en) White balance adjustment method, image processing device and image processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination