
US12041358B2 - High dynamic range image synthesis method and apparatus, image processing chip and aerial camera - Google Patents

High dynamic range image synthesis method and apparatus, image processing chip and aerial camera

Info

Publication number
US12041358B2
Authority
US
United States
Prior art keywords
image
exposure
brightness
pixel point
medium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/938,517
Other versions
US20230038844A1 (en)
Inventor
Zhaozao Li
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Autel Robotics Co Ltd
Original Assignee
Autel Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autel Robotics Co Ltd filed Critical Autel Robotics Co Ltd
Assigned to AUTEL ROBOTICS CO., LTD. reassignment AUTEL ROBOTICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LI, ZHAOZAO
Publication of US20230038844A1
Application granted
Publication of US12041358B2
Legal status: Active (expiration adjusted)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/90 - Dynamic range modification of images or parts thereof
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 - Constructional details
    • H04N 23/54 - Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/57 - Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/68 - Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 - Motion detection
    • H04N 23/6811 - Motion detection based on the image signal
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/71 - Circuitry for evaluating the brightness variation
    • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 23/80 - Camera processing pipelines; Components thereof
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265 - Mixing
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10032 - Satellite or aerial image; Remote sensing
    • G06T 2207/10141 - Special mode during image acquisition
    • G06T 2207/10144 - Varying exposure
    • G06T 2207/10152 - Varying illumination
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20172 - Image enhancement details
    • G06T 2207/20208 - High dynamic range [HDR] image processing
    • G06T 2207/20212 - Image combination
    • G06T 2207/20221 - Image fusion; Image merging

Definitions

  • In some embodiments, existing image visual processing technology may further be fused between the UAV 10 (equipped with an aerial camera) and the smart terminal 20 to provide more intelligent services.
  • the UAV 10 may capture an image through the aerial camera, and then the smart terminal 20 parses an operation gesture in the image, so as to realize gesture control for the UAV 10 by the user.
  • the wireless network 30 may be a wireless communication network configured to establish a data transmission channel between two nodes based on any type of data transmission principle, for example, a Bluetooth network, a Wi-Fi network, a wireless cellular network, or a combination thereof in specific signal frequency bands.
  • FIG. 2 is a structural block diagram of an aerial camera 11 according to an embodiment of the disclosure.
  • the aerial camera 11 may include an image sensor 111 , a controller 112 , and an image processor 113 .
  • the image sensor 111 is a functional module configured to capture an image with set shooting parameters. An optical signal corresponding to a visual picture is projected onto a photosensitive element through a lens and a related optical component, and the photosensitive element converts the optical signal to a corresponding electrical signal.
  • the shooting parameters are adjustable parameter variables such as an aperture, a focal length or an exposure time that are related to a structure of the lens and the related optical component (such as a shutter) during image acquisition of the image sensor 111 .
  • the image sensor 111 may capture one image through each exposure.
  • the controller 112 is a control core of the image sensor 111 .
  • the controller is connected to the image sensor, and may accordingly control a shooting behavior of the image sensor 111 according to the received instruction. For example, one or more shooting parameters of the image sensor 111 are set.
  • the controller 112 may trigger the image sensor to continuously capture a plurality of images with different exposure time.
  • the quantity of captured images is a preset value, which may be a default value preset by a technician, or a value set by the user according to the synthesis requirements of the HDR image.
  • three images having different exposure time may be continuously captured.
  • the images are respectively referred to as a short-exposure image, a medium-exposure image and a long-exposure image based on the exposure time.
  • the image processor 113 is a functional module configured to synthesize the HDR image.
  • the image processor may receive the plurality of images continuously captured by the image sensor and synthesize the images into a corresponding HDR image.
  • the aerial camera may further include a storage device 114 configured to store data information generated by the aerial camera 11 during use, for example, store the to-be-synthesized image, the synthesized HDR image, and the like.
  • the storage device may specifically adopt any type of non-volatile memory having a suitable capacity, such as an SD card, a flash memory, or a solid-state hard disk.
  • the storage device 114 may further be a detachable structure or a structure in a distributed arrangement.
  • the aerial camera may be provided with only a data interface, and the data of the to-be-synthesized image or the HDR image is transmitted to the corresponding device for storage through the data interface.
  • one or more functional modules (such as the controller, the image processor and the storage device) of the aerial camera 11 shown in FIG. 2 may also be integrated into the UAV 10 as a part of the UAV 10 .
  • the functional module of the aerial camera 11 is exemplarily described only based on the image capture process, which is not intended to limit the functional module of the aerial camera 11 .
  • FIG. 3 is a structural block diagram of an HDR image synthesis apparatus according to an embodiment of the disclosure.
  • the HDR image synthesis apparatus may be implemented by the above image processor. Here, the composition of the HDR image synthesis apparatus is described in terms of functional modules. The functional modules shown in FIG. 3 may be implemented through software, hardware or a combination of software and hardware according to the actual situation. For example, a functional module may be implemented by the processor invoking a relevant software application stored in a memory.
  • the HDR image synthesis apparatus 300 includes an image acquisition module 310 , a brightness detection module 320 , a secondary difference calculation module 330 , a motion detection module 340 and a synthesis module 350 .
  • the image acquisition module 310 is configured to acquire a plurality of to-be-synthesized images. Each of the to-be-synthesized images has a different exposure time.
  • each to-be-synthesized image is the image data captured by the image sensor through one exposure.
  • the to-be-synthesized images that are continuously captured may be assembled into an image set configured to synthesize a final HDR image.
  • the brightness detection module 320 is configured to calculate a mean brightness of the to-be-synthesized images and determine an image brightness type of the to-be-synthesized images according to the mean brightness.
  • the to-be-synthesized images may have significantly different image brightness depending on the environment in which they are shot.
  • each to-be-synthesized image may be roughly divided into different image brightness types according to the difference in image brightness.
  • for example, the to-be-synthesized image may be divided into two image brightness types, a high-light image and a low-light image, depending on whether it is captured in the daytime or at night.
  • the secondary difference calculation module 330 is configured to calculate a brightness difference between adjacent pixel points in one to-be-synthesized image, and calculate an inter-frame difference of the to-be-synthesized image at a same pixel point position according to the brightness difference.
  • the inter-frame difference indicates a change situation between different to-be-synthesized images within a specific area.
  • the motion detection module 340 is configured to determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference.
  • the inter-frame difference calculated by the secondary difference calculation module 330 indicates a dynamic change situation of a certain area in time. Therefore, it may be accordingly determined whether different positions of the images are changed, so as to determine the motion state at the pixel point position. A specific motion state may be determined according to the actual situation.
  • the motion state may be simply divided into a moving pixel and a stationary pixel.
  • the moving pixel indicates that the image at the pixel point position is moved.
  • the stationary pixel indicates that the image at that pixel point position is not moved.
  • the synthesis module 350 is configured to weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
  • Weighting and synthesis mean assigning a corresponding weight to each to-be-synthesized image, so as to obtain the required HDR image through synthesis. To-be-synthesized images with poor image quality may be given less influence by lowering their weights, so as to reduce their impact on the quality of the HDR image.
  • In this embodiment, the weights of the to-be-synthesized images are adaptively adjusted according to the image brightness type and the motion state, thereby effectively avoiding the interference of low-quality to-be-synthesized images, which is beneficial to improving the quality of the HDR image.
  • the aerial camera mounted to the UAV is used as an example.
  • the HDR image synthesis method may further be used in other types of scenes and devices, so as to improve the quality of the output HDR image.
  • the HDR image synthesis method disclosed in the embodiment of the disclosure is not limited to application on the UAV shown in FIG. 1 .
  • FIG. 4 is a method flowchart of an HDR image synthesis method according to an embodiment of the disclosure. As shown in FIG. 4, the method includes the following steps.
  • Each of the to-be-synthesized images has a different exposure time.
  • a specific exposure time may be set according to an actual situation, which is an empirical value, and the details are not described herein.
  • These to-be-synthesized images are a series of continuously shot images, which serve as the data basis for synthesizing an HDR image.
  • the “mean brightness” is an overall image brightness in the to-be-synthesized image, which reflects the light intensity of a surrounding environment during the image capturing. A higher mean brightness indicates a higher light intensity of the surrounding environment during capturing of the to-be-synthesized image.
  • the mean brightness may be calculated in any suitable manner.
  • the mean brightness may be calculated in the following manner.
  • the mean brightness value is calculated according to the total brightness value, the quantity of the to-be-synthesized images and the sizes of the to-be-synthesized images. In this way, the mean brightness over all pixel points of the plurality of to-be-synthesized images may be calculated and used as the "mean brightness".
  • the image brightness type is a type that is determined or divided in advance according to a difference in brightness. Under different use conditions, an appropriate quantity of image brightness types may be divided according to use requirements, so that the to-be-synthesized images having a similar mean brightness are used as the same image brightness type for further processing.
  • the image brightness type to which the to-be-synthesized image belongs may be determined by setting an appropriate brightness detection threshold.
  • a brightness detection threshold can be preset when the image brightness type includes a high-light image and a low-light image.
  • When the mean brightness is greater than or equal to the preset brightness detection threshold, it is determined that the to-be-synthesized image is the high-light image.
  • When the mean brightness is less than the brightness detection threshold, it is determined that the to-be-synthesized image is the low-light image.
  • the brightness detection threshold is an empirical value, which may be set according to the actual situation.
  • the high-light image corresponds to a scene with enough light or enough brightness in the daytime.
  • the low-light image indicates that a shooting scene of the to-be-synthesized image is a scene with severely insufficient light at night.
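  • As an illustration of this step, the following is a minimal Python/NumPy sketch, assuming each to-be-synthesized image arrives as a 2-D array of pixel brightness values of the same size; the function names and the placeholder threshold value are illustrative and not taken from the patent.

```python
import numpy as np

def mean_brightness(images):
    """Equation (1): accumulate the brightness values of all pixel points of
    all to-be-synthesized images, then divide the total brightness value by
    (quantity of images * width * height)."""
    n = len(images)                       # quantity of to-be-synthesized images
    h, w = images[0].shape                # all images are assumed the same size
    total = sum(float(img.sum()) for img in images)
    return total / (n * w * h)

def image_brightness_type(images, threshold=60.0):
    """Compare the mean brightness against the preset brightness detection
    threshold T: at or above T the set counts as a high-light image, below T
    as a low-light image. T is empirical; 60.0 is only a placeholder."""
    return "high-light" if mean_brightness(images) >= threshold else "low-light"
```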
  • the to-be-synthesized image is actually composed of a plurality of different pixel points.
  • the pixel point is a smallest basic unit in an image. In one image, the difference between adjacent pixel points roughly reflects the texture of a shot object.
  • a specific calculation method for the brightness difference may include the following steps.
  • a first brightness difference between a target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point are calculated. Then, a difference between the first brightness difference and the second brightness difference is acquired as the brightness difference of the target pixel point.
  • The target pixel point is the currently selected pixel point for which the motion state is to be determined. As shown in FIG. 6, any pixel point is surrounded by eight adjacent pixel points along its edge. The first pixel point and the second pixel point are two of the eight pixel points around the target pixel point.
  • the inter-frame difference is calculated from the difference between the brightness differences of the different to-be-synthesized images at the same position. It may be understood by those skilled in the art that, when a shot object does not move significantly, the texture of the plurality of continuously shot images at the same position does not change significantly.
  • the inter-frame difference obtained based on a secondary difference can reflect the movement situation of the shot object.
  • When the inter-frame difference is large, it indicates that the shot object moves violently. When the inter-frame difference is relatively small, it indicates that the position of the shot object is basically unchanged.
  • the secondary difference provided in this embodiment measures the difference between the to-be-synthesized images based on the brightness difference between adjacent pixel points, which effectively avoids the impact of the overall brightness gap between the differently exposed to-be-synthesized images and serves as an accurate basis for motion detection.
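  • Continuing the sketch above, the brightness difference and the secondary (inter-frame) difference might be computed as follows; taking the vertically adjacent pixel point as the neighbour and using an absolute value for the inter-frame comparison are assumptions of this sketch.

```python
def brightness_difference(img):
    """Brightness difference between each target pixel point and its adjacent
    pixel point in the next row: Delta(i,j) = I(i,j) - I(i+1,j). The last row
    has no lower neighbour, so this sketch leaves its difference at zero."""
    d = np.zeros(img.shape, dtype=np.float64)
    d[:-1, :] = img[:-1, :].astype(np.float64) - img[1:, :].astype(np.float64)
    return d

def inter_frame_difference(img_a, img_b):
    """Secondary difference: compare the spatial brightness differences of two
    exposures at the same pixel point position. Comparing textures instead of
    raw brightness cancels the overall brightness gap between exposures."""
    return np.abs(brightness_difference(img_a) - brightness_difference(img_b))

# K1 compares the short and medium exposures, K2 the medium and long exposures:
# k1 = inter_frame_difference(short_img, medium_img)
# k2 = inter_frame_difference(medium_img, long_img)
```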
  • the motion state means whether a shot object moves.
  • the pixel point position where the shot object has moved may be referred to as a moving pixel.
  • the pixel point position where the shot object has not moved is referred to as a stationary pixel.
  • the to-be-synthesized image including a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot is used as an example (an exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image).
  • a specific determination process of the motion state is described in detail. As shown in FIG. 5 , a method for determining the motion state includes the following steps.
  • Step 520: Determine whether K1 and K2 are both less than or equal to a preset motion detection threshold. When K1 and K2 are both less than or equal to the motion detection threshold, step 530 is performed to determine that the motion state at the pixel point position is the stationary pixel; when the inter-frame difference between the short-exposure image and the medium-exposure image and the inter-frame difference between the medium-exposure image and the long-exposure image are both more than the motion detection threshold, step 540 is performed to determine that the motion state at the pixel point position is the moving pixel.
  • the motion detection threshold is an empirical value, which may be set according to the actual situation.
  • the “moving pixel” means that the shot object at the pixel point position moves.
  • the pixels of all to-be-synthesized images at this position are referred to as the “moving pixel”.
  • the “stationary pixel” means that the shot object at the pixel point position does not move.
  • the pixels of all to-be-synthesized images at this position are referred to as the “stationary pixel”.
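  • In code, the per-pixel motion state then follows by thresholding K1 and K2 against the motion detection threshold A. Treating the mixed case (one difference above A, the other below) as moving is a choice made by this sketch, since the text leaves that case open.

```python
def motion_state(k1, k2, threshold=8.0):
    """Per-pixel motion map: True marks a moving pixel. A pixel point is a
    stationary pixel when both inter-frame differences are at or below the
    motion detection threshold A, and a moving pixel otherwise. The value
    8.0 is only a placeholder for the empirical threshold A."""
    stationary = (k1 <= threshold) & (k2 <= threshold)
    return ~stationary
```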
  • the two indicators including the image brightness type and the motion state can well reflect the scene situation of the to-be-synthesized image during shooting. Therefore, the weight of each to-be-synthesized image in the synthesis process can be adaptively adjusted, so that the synthesized HDR image has better image quality.
  • each pixel point of the HDR image is calculated and determined by weighting and synthesizing the pixel points of the to-be-synthesized images at the same pixel point position.
  • the specific weighting and synthesis process is as follows.
  • a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient are respectively preset for the short-exposure image, the medium-exposure image and the long-exposure image.
  • the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient are all preset weight values, which can be adjusted or set accordingly according to the actual situation, indicating the weight ratio of the to-be-synthesized images during synthesizing of the HDR image in general.
  • the motion state at the pixel point position and the image brightness type of the to-be-synthesized image are determined, and the following situations are distinguished for processing.
  • Pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position are weighted and synthesized into a pixel point of the HDR image at a same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
  • when the motion state at the pixel point position is the moving pixel and the image brightness type is the high-light image, the short-exposure image and the long-exposure image are discarded, and the pixel points of the medium-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient.
  • the weight coefficients of the short-exposure image and the long-exposure image are required to be adjusted to zero to avoid causing an adverse effect on the finally synthesized HDR image.
  • when the motion state is the stationary pixel and the image brightness type is the low-light image, the short-exposure image is discarded, and the pixel points of the medium-exposure image and the long-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient.
  • the weight coefficient of the short-exposure image can be adjusted to zero in this case, so as to avoid causing an adverse effect on the finally synthesized HDR image.
  • when the motion state is the moving pixel and the image brightness type is the low-light image, the medium-exposure image and the long-exposure image are discarded, and the pixel points of the short-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient.
  • the weight coefficients of the medium-exposure image and the long-exposure image may be adjusted to zero in this case to avoid causing an adverse effect on the finally synthesized HDR image.
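  • The four situations above amount to a per-pixel coefficient lookup, sketched below with a, b and c as the preset short-, medium- and long-exposure weight coefficients of equation (3); the default values are placeholders, a zero coefficient stands for a discarded image, and whether the remaining coefficients are renormalized is not specified in the text.

```python
def weight_coefficients(high_light, moving, a=0.25, b=0.5, c=0.25):
    """Return (short, medium, long) weight coefficients for one pixel point
    position; discarding an image is represented by a zero coefficient."""
    if high_light:
        # stationary + high light: all three; moving + high light: medium only
        return (a, b, c) if not moving else (0.0, b, 0.0)
    # stationary + low light: medium and long; moving + low light: short only
    return (0.0, b, c) if not moving else (a, 0.0, 0.0)
```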
  • In this way, the finally output HDR image has a high dynamic range and high definition in a stationary daytime scene, and achieves low picture noise and no smearing for moving objects in a night scene.
  • the image sensor continuously captures a short-exposure image with an exposure time of x/2, a medium-exposure image with an exposure time of x, and a long-exposure image with an exposure time of 2x as the to-be-synthesized images each time.
  • a length of the to-be-synthesized image is w pixels, and a width thereof is h pixels.
  • the short-exposure image is first transmitted to the image processor after being shot, and the medium-exposure image and the long-exposure image are transmitted in sequence, so that the overall image synthesis process has a minimum delay.
  • the mean brightness of the to-be-synthesized images is calculated by the brightness detection module 320 through the following equation (1):

$$\bar{L} = \frac{1}{3wh}\sum_{i=1}^{h}\sum_{j=1}^{w}\left(S_{(i,j)} + M_{(i,j)} + L_{(i,j)}\right)\tag{1}$$

  • where S(i,j) is a brightness value of the pixel point in an i th row and a j th column of the short-exposure image, M(i,j) is a brightness value of the pixel point in an i th row and a j th column of the medium-exposure image, L(i,j) is a brightness value of the pixel point in an i th row and a j th column of the long-exposure image, and L̄ is the mean brightness.
  • the brightness detection module 320 is configured to determine whether the mean brightness L̄ is greater than or equal to a preset brightness detection threshold T. When L̄ ≥ T, it is determined that the image brightness type of the to-be-synthesized image is the high-light image. When L̄ < T, it is determined that the image brightness type of the to-be-synthesized image is the low-light image.
  • the secondary difference calculation module 330 calculates the brightness difference between adjacent pixel points of each to-be-synthesized image through the following equation (2):

$$\Delta S_{(i,j)} = S_{(i,j)} - S_{(i+1,j)},\qquad \Delta M_{(i,j)} = M_{(i,j)} - M_{(i+1,j)},\qquad \Delta L_{(i,j)} = L_{(i,j)} - L_{(i+1,j)}\tag{2}$$

  • where ΔS(i,j) is a brightness difference of pixel points in an i th row and a j th column of the short-exposure image, ΔM(i,j) is a brightness difference of pixel points in an i th row and a j th column of the medium-exposure image, and ΔL(i,j) is a brightness difference of pixel points in an i th row and a j th column of the long-exposure image (as shown in FIG. 6, the adjacent pixel points are the eight pixel points around the target pixel point; equation (2) takes the vertically adjacent pixel point as an example).
  • an inter-frame difference between the short-exposure image and the medium-exposure image can then be calculated as K1(i,j) = |ΔS(i,j) − ΔM(i,j)|, and an inter-frame difference between the medium-exposure image and the long-exposure image as K2(i,j) = |ΔM(i,j) − ΔL(i,j)|.
  • the motion detection module 340 determines, based on the inter-frame differences calculated by the secondary difference calculation module 330, whether the two inter-frame differences K1 and K2 are both less than or equal to a preset motion detection threshold A. When both are less than or equal to A, the motion state at the pixel point position (i,j) is determined as the stationary pixel; when both are more than A, the motion state at the pixel point position (i,j) is determined as the moving pixel.
  • the synthesis module 350 is connected to the brightness detection module 320 and the motion detection module 340 , and adjusts and determines specific weight coefficients according to the image brightness type and the motion state provided by the brightness detection module and the motion detection module, so as to complete the synthesis of the HDR image.
  • the weighted synthesis follows the equation (3):

$$H_{(i,j)} = a \cdot S_{(i,j)} + b \cdot M_{(i,j)} + c \cdot L_{(i,j)}\tag{3}$$

  • where a is the short-exposure weight coefficient, b is the medium-exposure weight coefficient, and c is the long-exposure weight coefficient; S(i,j) is a pixel point in an i th row and a j th column of the short-exposure image, M(i,j) is a pixel point in an i th row and a j th column of the medium-exposure image, L(i,j) is a pixel point in an i th row and a j th column of the long-exposure image, and H(i,j) is a pixel point in an i th row and a j th column of the synthesized HDR image.
  • the synthesis module 350 performs weighting and synthesis according to the equation (3).
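  • Putting equation (3) together with the per-pixel coefficient selection gives the weighted synthesis sketched below, under the same assumptions as the earlier snippets; dividing by the coefficient sum is an added normalization so that zeroed coefficients do not darken the result.

```python
def synthesize_hdr(short_img, medium_img, long_img, moving_mask, high_light):
    """Equation (3): H(i,j) = a*S(i,j) + b*M(i,j) + c*L(i,j), with the
    coefficients chosen per pixel point according to the motion state and
    the image brightness type."""
    h, w = medium_img.shape
    hdr = np.zeros((h, w), dtype=np.float64)
    for i in range(h):
        for j in range(w):
            a, b, c = weight_coefficients(high_light, bool(moving_mask[i, j]))
            norm = a + b + c   # never zero: each case keeps at least one image
            hdr[i, j] = (a * short_img[i, j] + b * medium_img[i, j]
                         + c * long_img[i, j]) / norm
    return hdr
```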
  • In this way, the to-be-synthesized images shot through a plurality of consecutive exposures may be integrated into an HDR image of higher image quality in a targeted manner, so as to avoid problems such as smearing of a moving object, a decrease in picture definition, and even errors in brightness that easily occur when synthesizing an HDR image in a high-speed moving scene such as aerial photography.
  • An embodiment of the disclosure further provides a non-volatile computer storage medium.
  • the computer storage medium stores at least one executable instruction, and the computer-executable instruction can be used for performing the HDR image synthesis method in any of the above method embodiments.
  • FIG. 7 is a schematic structural diagram of an image processing chip according to an embodiment of the disclosure.
  • the specific embodiments of the disclosure do not limit the specific implementation of the image processing chip.
  • the image processing chip may include a processor 702 , a communication interface 704 , a memory 706 , and a communication bus 708 .
  • the processor 702 , the communication interface 704 , and the memory 706 communicate with each other through the communication bus 708 .
  • the communication interface 704 is configured to communicate with a network element of other devices such as a client or other servers.
  • the processor 702 is configured to execute a program 710 , and specifically may execute the relevant steps in the above embodiments of the HDR image synthesis method.
  • the program 710 may include program code, and the program code includes a computer operation instruction.
  • the processor 702 may be a central processing unit (CPU), or an application specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the disclosure.
  • The one or more processors included in the image processing chip may be processors of a same type, such as one or more CPUs, or may be processors of different types, such as one or more CPUs and one or more ASICs.
  • the memory 706 is configured to store the program 710 .
  • the memory 706 may include a high-speed RAM memory, or may further include a non-volatile memory, for example, at least one magnetic disk memory.
  • the program 710 can specifically be configured to cause the processor 702 to execute the HDR image synthesis method in any of the above method embodiments.
  • When implemented as computer software, the software may be stored in a computer-readable storage medium. When executed, the program may include the processes of the embodiments of the foregoing methods.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present invention provide a high dynamic range (HDR) image synthesis method and apparatus, an image processing chip and an aerial camera. The method includes: acquiring a plurality of to-be-synthesized images having different exposure time; calculating a mean brightness of the to-be-synthesized images; determining an image brightness type of the to-be-synthesized images according to the mean brightness; calculating a brightness difference between adjacent pixel points in one to-be-synthesized image; calculating an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference; determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and weighting and synthesizing the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
The present application is a continuation of International Application No. PCT/CN2021/083350, filed on Mar. 26, 2021, which claims priority to Chinese Patent Application No. 2020102915710, filed on Apr. 14, 2020, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The disclosure relates to the field of image processing technologies, and in particular, to a high dynamic range (HDR) image synthesis method and apparatus, an image processing chip and an aerial camera.
BACKGROUND
A high dynamic range (HDR) image is image data that may provide more details of bright and dark images and can reflect the visual effect in the real environment more desirably relative to a common image. Generally, the HDR image is synthesized by a plurality of common images having different exposure time (that is, a low dynamic range) by using the optimal detail corresponding to each exposure time.
However, the obtained HDR image is synthesized after the plurality of common images are generated through a plurality of exposures. Problems such as smearing of a moving object, a decrease in the picture definition, and sometimes even an error in brightness easily occur in a specific application scene such as aerial photography with a high moving rate.
Therefore, how to avoid the defects such as the smearing of a moving object and the decrease in the picture definition caused by synthesizing the HDR image through the plurality of images is an urgent problem to be solved.
SUMMARY
Embodiments of the disclosure are intended to provide a high dynamic range (HDR) image synthesis method and apparatus, an image processing chip and an aerial camera, which can solve the defects existing in the HDR image synthesis method.
To resolve the foregoing technical problems, the embodiments of the disclosure provide the following technical solutions. An HDR image synthesis method is provided, including:
    • acquiring a plurality of to-be-synthesized images, each having different exposure time; calculating a mean brightness of the to-be-synthesized images; determining an image brightness type of the to-be-synthesized images according to the mean brightness; calculating a brightness difference between adjacent pixel points in one to-be-synthesized image; calculating an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference; determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and weighting and synthesizing the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
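Read end to end, the claimed steps compose into a single pipeline. The sketch below strings together the illustrative helper functions from the embodiment examples above; the names and threshold values are assumptions of those sketches rather than the patent's.

```python
def hdr_pipeline(short_img, medium_img, long_img,
                 brightness_threshold=60.0, motion_threshold=8.0):
    """End-to-end flow of the claimed method: determine the image brightness
    type, compute the secondary differences, determine the per-pixel motion
    state, then weight and synthesize the HDR image."""
    images = [short_img, medium_img, long_img]
    high_light = image_brightness_type(images, brightness_threshold) == "high-light"
    k1 = inter_frame_difference(short_img, medium_img)
    k2 = inter_frame_difference(medium_img, long_img)
    moving = motion_state(k1, k2, motion_threshold)
    return synthesize_hdr(short_img, medium_img, long_img, moving, high_light)
```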
Optionally, the image brightness type includes a high-light image and a low-light image. The determining an image brightness type of the to-be-synthesized images according to the mean brightness specifically includes: determining that the image brightness type of the to-be-synthesized image is the high-light image when the mean brightness is greater than or equal to a preset brightness detection threshold; and determining that the image brightness type of the to-be-synthesized image is the low-light image when the mean brightness is less than the brightness detection threshold.
Optionally, the calculating a mean brightness of the to-be-synthesized images specifically includes: superimposing brightness values of all pixel points in the to-be-synthesized image to obtain an accumulated brightness value; summing the accumulated brightness values of all of the to-be-synthesized images to obtain a total brightness value; and calculating the mean brightness value according to the total brightness value, a quantity of the to-be-synthesized images and sizes of the to-be-synthesized images.
Optionally, the to-be-synthesized images include a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot. An exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image.
Optionally, the motion state includes a moving pixel and a stationary pixel. The determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference specifically includes: determining whether an inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both greater than or equal to a preset motion detection threshold; when the two inter-frame differences are both greater than or equal to the preset motion detection threshold, determining that the motion state at the pixel point position is the moving pixel; and when the two inter-frame differences are both less than the preset motion detection threshold, determining that the motion state at the pixel point position is the stationary pixel.
Optionally, the calculating a brightness difference between adjacent pixel points in one to-be-synthesized image specifically includes: calculating a first brightness difference between a target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point; and acquiring a difference between the first brightness difference and the second brightness difference as a brightness difference of the target pixel point.
Optionally, the weighting and synthesizing the plurality of to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state specifically includes:
    • respectively presetting a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient for the short-exposure image, the medium-exposure image and the long-exposure image; and
    • weighting and synthesizing pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position into a pixel point of the HDR image at a same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
Optionally, the weighting and synthesizing the plurality of to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state specifically includes:
    • discarding the short-exposure image and the long-exposure image when the motion state at the pixel point position is the moving pixel and the image brightness type is the high-light image;
    • weighting and synthesizing the pixel points of the medium-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient;
    • discarding the short-exposure image when the motion state is the stationary pixel and the image brightness type is a low-light image;
    • weighting and synthesizing the pixel points of the medium-exposure image and the long-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient;
    • discarding the medium-exposure image and the long-exposure image when the motion state is the moving pixel and the image brightness type is the low-light image; and
    • weighting and synthesizing the pixel points of the short-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient.
In order to resolve the above technical problem, the embodiments of the disclosure further provide the following technical solution. An HDR image synthesis apparatus is provided, including:
    • an image acquisition module, configured to acquire a plurality of to-be-synthesized images, each having different exposure time; a scene detection module, configured to calculate a mean brightness of the to-be-synthesized images, and determine an image brightness type of the to-be-synthesized images according to the mean brightness; a secondary difference calculation module, configured to calculate a brightness difference between adjacent pixel points in one to-be-synthesized image, and calculate an inter-frame difference of the to-be-synthesized image at a same pixel point position according to the brightness difference; a motion detection module, configured to determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and a synthesis module, configured to weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
In order to resolve the above technical problem, the embodiments of the disclosure further provide the following technical solution: an image processing chip, including a processor and a memory communicatively connected to the processor. The memory stores a computer program instruction, and the computer program instruction, when invoked by the processor, causes the processor to perform the HDR image synthesis method as described above.
In order to resolve the above technical problem, the embodiments of the disclosure further provide the following technical solution: an aerial camera. The aerial camera includes:
    • an image sensor, configured to capture a plurality of images with set shooting parameters; a controller, connected to the image sensor and configured to trigger the image sensor to capture the plurality of images with different exposure time; an image processor, configured to receive the plurality of images captured by the image sensor through continuous exposure and perform the above HDR image synthesis method on the received plurality of images to obtain an HDR image; and a storage device, connected to the image processor and configured to store the HDR image.
Compared with the prior art, according to the HDR image synthesis method in the embodiment of the disclosure, a weight ratio of different common images is adaptively adjusted during synthesizing of the HDR image according to different motion states and different image brightness types of the to-be-synthesized images, thereby effectively avoiding the problems of the smearing of a moving object and the decreased picture definition during the synthesis into the HDR image from the plurality of images.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments are exemplarily described with reference to the corresponding figures in the accompanying drawings, and the descriptions are not to be construed as limiting the embodiments. Elements in the accompanying drawings that have same reference numerals are represented as similar elements, and unless otherwise particularly stated, the figures in the accompanying drawings are not drawn to scale.
FIG. 1 is a schematic diagram of an application scene of a high dynamic range (HDR) image synthesis method according to an embodiment of the disclosure.
FIG. 2 is a structural block diagram of an aerial camera according to an embodiment of the disclosure.
FIG. 3 is a schematic diagram of an HDR image synthesis apparatus according to an embodiment of the disclosure.
FIG. 4 is a method flowchart of an image processing method according to an embodiment of the disclosure.
FIG. 5 is a method flowchart of a motion state determination method according to an embodiment of the disclosure.
FIG. 6 is a schematic diagram of pixel point positions according to an embodiment of the disclosure.
FIG. 7 is a schematic structural diagram of an image processing chip according to an embodiment of the disclosure.
DETAILED DESCRIPTION
For ease of understanding the disclosure, the disclosure is described in more detail below with reference to the accompanying drawings and specific embodiments. It should be noted that, when a component is expressed as “being fixed to” another component, the component may be directly on the another component, or one or more intermediate components may exist between the component and the another component. When one component is expressed as “being connected to” another component, the component may be directly connected to the another component, or one or more intermediate components may exist between the component and the another component. In the description of this specification, orientation or position relationships indicated by the terms such as “up”, “down”, “inside”, “outside” and “bottom” are based on orientation or position relationships shown in the accompanying drawings, and are used only for ease and brevity of illustration and description of the disclosure, rather than indicating or implying that the mentioned apparatus or component needs to have a particular orientation or needs to be constructed and operated in a particular orientation. Therefore, such terms should not be construed as limiting of the disclosure. In addition, terms “first”, “second” and “third” are only used to describe the objective and cannot be understood as indicating or implying relative importance.
Unless otherwise defined, meanings of all technical and scientific terms used in this specification are the same as those usually understood by a person skilled in the art of the disclosure. Terms used in the specification of the disclosure are merely intended to describe the specific embodiments and are not intended to limit the disclosure. The term "and/or" used in this specification includes any and all combinations of one or more related listed items.
In addition, technical features involved in different embodiments of the disclosure described below may be combined together if there is no conflict.
When a camera captures a picture, exposure times of different lengths change the amount of transmitted light that reaches the photosensitive element, thereby producing images that retain different details. A high dynamic range (HDR) image is synthesized from a plurality of common images having different exposure times, so as to present brightness details more faithfully.
In some shooting environments, such as high-speed movement, an image having an inappropriate exposure time shows obvious quality problems. Such images therefore need to be screened and adjusted, so as to more effectively improve the quality of the HDR image obtained by synthesis and avoid problems such as decreased picture definition and brightness errors.
FIG. 1 shows an application scene of an HDR image synthesis method according to an embodiment of the disclosure. As shown in FIG. 1 , in the application scene, an unmanned aerial vehicle (UAV) 10 equipped with an aerial camera, a smart terminal 20, and a wireless network 30 are included.
The UAV 10 may be any type of power-driven UAV. The UAV includes but is not limited to a four-axis UAV, a fixed-wing aircraft, a helicopter model, and the like. The UAV may have a corresponding volume or power according to actual conditions, so as to provide a load capacity, a flight speed and a flight range that can meet use requirements.
The aerial camera may be any type of image acquisition device, including a sports camera, a high-definition camera or a wide-angle camera. As a functional module mounted to the UAV, the aerial camera may be mounted and fixed to the UAV by a mounting and fixing bracket such as a gimbal, and is controlled by the UAV 10 to execute a task of image acquisition.
Certainly, one or more functional modules may further be arranged on the UAV so that the UAV can realize corresponding functions. For example, a built-in main control chip serves as the control core for UAV flight and data transmission, or an image transmission apparatus uploads captured image information to a device that has established a connection to the UAV.
The smart terminal 20 may be any type of smart device configured to establish a communication connection to the UAV, for example, a mobile phone, a tablet computer, a smart remote control or the like. The smart terminal 20 may be equipped with one or more different user interactive apparatuses for collecting instructions from a user or displaying and feeding back information to the user.
The interactive apparatuses include but are not limited to a button, a display screen, a touch screen, a speaker and a remote control joystick. For example, the smart terminal 20 may be equipped with a touch display screen. Through the touch display screen, a remote control instruction for the UAV is received from a user, and image information obtained by the aerial camera is presented to the user. The user may further switch the image information currently displayed on the display screen through a remote touch screen.
In some embodiments, the existing image visual processing technology may further be fused between the UAV 10 and the smart terminal 20 to further provide more intelligent services. For example, the UAV 10 may capture an image through the aerial camera, and then the smart terminal 20 parses an operation gesture in the image, so as to realize gesture control for the UAV 10 by the user.
The wireless network 30 may be a wireless communication network configured to establish a data transmission channel between two nodes based on any type of data transmission principle, for example, a Bluetooth network, a Wi-Fi network, a wireless cellular network, or a combination thereof in specific signal frequency bands.
FIG. 2 is a structural block diagram of an aerial camera 11 according to an embodiment of the disclosure. As shown in FIG. 2 , the aerial camera 11 may include an image sensor 111, a controller 112, and an image processor 113.
The image sensor 111 is a functional module configured to capture an image with set shooting parameters. An optical signal corresponding to a visual picture is projected onto a photosensitive element through a lens and a related optical component, and the photosensitive element converts the optical signal to a corresponding electrical signal.
The shooting parameters are adjustable parameter variables such as an aperture, a focal length or an exposure time that are related to a structure of the lens and the related optical component (such as a shutter) during image acquisition of the image sensor 111. The image sensor 111 may capture one image through each exposure.
The controller 112 is a control core of the image sensor 111. The controller is connected to the image sensor, and may accordingly control a shooting behavior of the image sensor 111 according to the received instruction. For example, one or more shooting parameters of the image sensor 111 are set.
Under an appropriate trigger condition, the controller 112 may trigger the image sensor to continuously capture a plurality of images with different exposure time. The quantity of captured images is a manually set constant, which may be a default value preset by a technician, or a value set by a user according to the synthesis requirements of the HDR image during use.
For example, three images having different exposure times may be continuously captured. Based on their exposure times, the images are respectively referred to as a short-exposure image, a medium-exposure image and a long-exposure image.
The image processor 113 is a functional module configured to synthesize the HDR image. The image processor may receive the plurality of images continuously captured by the image sensor and synthesize the images into a corresponding HDR image.
In some embodiments, the aerial camera may further include a storage device 114 configured to store data information generated by the aerial camera 11 during use, for example, store the to-be-synthesized image, the synthesized HDR image, and the like. The storage device may specifically adopt any type of non-volatile memory having a suitable capacity, such as an SD card, a flash memory, or a solid-state hard disk.
In some embodiments, the storage device 114 may further be a detachable structure or a structure in a distributed arrangement. The aerial camera may be provided with only a data interface, and the data of the to-be-synthesized image or the HDR image is transmitted to the corresponding device for storage through the data interface.
It should be noted that one or more functional modules (such as the controller, the image processor and the storage device) of the aerial camera 11 shown in FIG. 2 may also be integrated into the UAV 10 as a part of the UAV 10. In FIG. 2 , the functional module of the aerial camera 11 is exemplarily described only based on the image capture process, which is not intended to limit the functional module of the aerial camera 11.
FIG. 3 is a structural block diagram of an HDR image synthesis apparatus according to an embodiment of the disclosure. The HDR image synthesis apparatus may be executed by the above image processor. In this embodiment, the composition of the HDR image synthesis apparatus is described by using the functional module.
Those skilled in the art can understand that the functional module shown in FIG. 3 may be implemented through software, hardware or a combination of the software and the hardware according to the actual situation. For example, the functional module may be implemented by the processor invoking a relevant software application stored in a memory.
As shown in FIG. 3 , the HDR image synthesis apparatus 300 includes an image acquisition module 310, a brightness detection module 320, a secondary difference calculation module 330, a motion detection module 340 and a synthesis module 350.
The image acquisition module 310 is configured to acquire a plurality of to-be-synthesized images. Each of the to-be-synthesized images has a different exposure time and is image data captured by the image sensor through one exposure. The continuously captured to-be-synthesized images may be assembled into an image set used to synthesize the final HDR image.
The brightness detection module 320 is configured to calculate a mean brightness of the to-be-synthesized images and determine an image brightness type of the to-be-synthesized images according to the mean brightness.
The to-be-synthesized images may have significantly different image brightness depending on the environment in which they are shot. In this embodiment, the to-be-synthesized images may be roughly divided into different image brightness types according to the difference in image brightness.
For example, based on whether it is captured in the daytime or at night, the to-be-synthesized image may be divided into two image brightness types: a high-light image and a low-light image.
The secondary difference calculation module 330 is configured to calculate a brightness difference between adjacent pixel points in one to-be-synthesized image, and to calculate an inter-frame difference of the to-be-synthesized images at a same pixel point position according to the brightness difference. The inter-frame difference indicates how a specific area changes between different to-be-synthesized images.
The motion detection module 340 is configured to determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference.
The inter-frame difference calculated by the secondary difference calculation module 330 indicates the dynamic change of a certain area over time. Accordingly, whether different positions of the images have changed may be determined, so as to determine the motion state at each pixel point position. The specific motion states may be defined according to the actual situation.
For example, the motion state may be simply divided into a moving pixel and a stationary pixel. The moving pixel indicates that the image content at the pixel point position has moved; the stationary pixel indicates that it has not.
The synthesis module 350 is configured to weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
"Weighting and synthesis" means assigning a corresponding weight to each to-be-synthesized image, so as to obtain the required HDR image through synthesis. To-be-synthesized images with poor image quality may be given less consideration by lowering their weights, so as to reduce their impact on the quality of the HDR image.
In this embodiment, the weights of the to-be-synthesized images are adaptively adjusted according to the image brightness type and the motion state, thereby effectively avoiding the interference of low-quality to-be-synthesized images, which helps improve the quality of the HDR image.
It should be noted that the application scene shown in FIG. 1, in which the aerial camera is mounted on the UAV, is merely an example. Those skilled in the art may understand that the HDR image synthesis method may also be used in other types of scenes and devices to improve the quality of the output HDR image; the method disclosed in the embodiments of the disclosure is not limited to the UAV application shown in FIG. 1.
FIG. 4 is a method flowchart of an HDR image synthesis method according to an embodiment of the disclosure. As shown in FIG. 4, the HDR image synthesis method includes the following steps.
410. Acquire a plurality of to-be-synthesized images.
Each of the to-be-synthesized images has a different exposure time. The specific exposure time may be set according to the actual situation and is an empirical value, and the details are not described herein. These to-be-synthesized images are continuously shot and serve as the data basis for synthesis into an HDR image.
420. Calculate a mean brightness of the to-be-synthesized images.
The “mean brightness” is an overall image brightness in the to-be-synthesized image, which reflects the light intensity of a surrounding environment during the image capturing. A higher mean brightness indicates a higher light intensity of the surrounding environment during capturing of the to-be-synthesized image.
Specifically, the mean brightness may be calculated in any suitable manner. In some embodiments, the mean brightness may be calculated in the following manner.
First, brightness values of all pixel points in each to-be-synthesized image are superimposed to obtain an accumulated brightness value. Then, the accumulated brightness values of all of the to-be-synthesized images are summed to obtain a total brightness value. Finally, the mean brightness value is calculated according to the total brightness value, the quantity of the to-be-synthesized images and the sizes of the to-be-synthesized images. In this way, the mean brightness per pixel point across the plurality of to-be-synthesized images is obtained and used as the "mean brightness".
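For illustration only, the following is a minimal Python sketch of this calculation, assuming each to-be-synthesized image is represented as a 2-D NumPy array of pixel brightness values; the function name and data layout are illustrative and not part of the embodiment.

```python
import numpy as np

def mean_brightness(images):
    """Mean brightness of a set of equally sized to-be-synthesized images,
    each given as a 2-D array of pixel brightness values."""
    # Accumulated brightness value of each image, summed into a total value.
    total = sum(float(img.sum()) for img in images)
    # Divide by the quantity of images multiplied by the image size.
    h, w = images[0].shape
    return total / (len(images) * h * w)
```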
430. Determine an image brightness type of the to-be-synthesized image according to the mean brightness.
The image brightness type is a category determined or divided in advance according to differences in brightness. Under different use conditions, an appropriate number of image brightness types may be defined according to use requirements, so that to-be-synthesized images having a similar mean brightness are treated as the same image brightness type for further processing.
In some embodiments, the image brightness type to which the to-be-synthesized image belongs may be determined by setting an appropriate brightness detection threshold. For example, when the image brightness types include a high-light image and a low-light image, a brightness detection threshold may be preset.
When the mean brightness is greater than or equal to a preset brightness detection threshold, it is determined that the to-be-synthesized image is the high-light image. When the mean brightness is less than the brightness detection threshold, it is determined that the to-be-synthesized image is the low-light image.
The brightness detection threshold is an empirical value, which may be set according to the actual situation. The high-light image corresponds to a scene with enough light or enough brightness in the daytime. The low-light image indicates that a shooting scene of the to-be-synthesized image is a scene with severely insufficient light at night.
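This classification step can be sketched as follows, using the mean brightness computed in step 420; the threshold value shown is a hypothetical placeholder, since the actual value is an empirical setting.

```python
# Hypothetical empirical value; the actual threshold is set per application.
BRIGHTNESS_DETECTION_THRESHOLD = 80.0

def image_brightness_type(mean):
    """Classify the to-be-synthesized images as high-light or low-light."""
    return "high-light" if mean >= BRIGHTNESS_DETECTION_THRESHOLD else "low-light"
```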
440. Calculate a brightness difference between adjacent pixel points in one to-be-synthesized image.
The to-be-synthesized image is actually composed of a plurality of different pixel points. The pixel point is a smallest basic unit in an image. In one image, the difference between adjacent pixel points roughly reflects the texture of a shot object.
Specifically, a specific calculation method for the brightness difference may include the following steps.
First, a first brightness difference between a target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point are calculated. Then, the difference between the first brightness difference and the second brightness difference is taken as the brightness difference of the target pixel point.
The target pixel point is the currently selected pixel point whose motion state is to be determined. As shown in FIG. 6, any pixel point is surrounded by eight adjacent pixel points. The first pixel point and the second pixel point are two of these eight pixel points around the target pixel point.
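As a minimal sketch of this step, the following assumes the first and second pixel points are the lower and right-hand neighbours, which is the choice used in the worked example later in this description; per the text above, any two of the eight neighbours could be substituted.

```python
def brightness_difference(img, i, j):
    """Brightness difference of the target pixel point in row i, column j
    (assumes i, j are interior coordinates of the image array)."""
    first = abs(float(img[i, j]) - float(img[i + 1, j]))   # first brightness difference
    second = abs(float(img[i, j]) - float(img[i, j + 1]))  # second brightness difference
    return abs(first - second)
```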
450. Calculate an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference.
The inter-frame difference is calculated from the difference between the brightness differences of different to-be-synthesized images at the same position. It may be understood by those skilled in the art that, when a shot object does not move significantly, the texture of a plurality of continuously shot images at the same position does not change significantly.
Therefore, the inter-frame difference obtained through this secondary difference can reflect the movement of the shot object. An excessively large inter-frame difference indicates that the shot object moves violently; a relatively small inter-frame difference indicates that the position of the shot object is basically unchanged.
Because the exposure times of the to-be-synthesized images differ significantly, a relatively large brightness difference exists between them. The conventional method of detecting motion by comparing brightness at the same pixel point position cannot exclude this inherent brightness difference between the to-be-synthesized images.
The secondary difference provided in this embodiment measures the difference between to-be-synthesized images based on the brightness difference between adjacent pixel points, which effectively avoids the impact of the exposure-induced brightness difference and serves as an accurate basis for motion detection.
460. Determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference.
The motion state indicates whether a shot object moves. Specifically, a pixel point position where the shot object has moved may be referred to as a moving pixel, and a pixel point position where the shot object has not moved is referred to as a stationary pixel.
470. Weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
The determination of the motion state is described in detail by using to-be-synthesized images including a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot as an example (an exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image). As shown in FIG. 5, a method for determining the motion state includes the following steps.
510. Calculate an inter-frame difference K1 between the short-exposure image and the medium-exposure image and an inter-frame difference K2 between the medium-exposure image and the long-exposure image.
520. Determine whether K1 and K2 are both less than a preset motion detection threshold. When K1 and K2 are both less than the motion detection threshold, step 540 is performed; otherwise, that is, when at least one of K1 and K2 is greater than or equal to the motion detection threshold, step 530 is performed.
The motion detection threshold is an empirical value, which may be set according to the actual situation.
530. Determine that a motion state at the pixel point position is the moving pixel.
The “moving pixel” means that the shot object at the pixel point position moves. In this embodiment, the pixels of all to-be-synthesized images at this position are referred to as the “moving pixel”.
540. Determine that a motion state at the pixel point position is the stationary pixel.
The “stationary pixel” means that the shot object at the pixel point position does not move. In this embodiment, the pixels of all to-be-synthesized images at this position are referred to as the “stationary pixel”.
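The decision of FIG. 5 can be sketched as follows, assuming the brightness differences of the three exposures at the pixel point position have already been computed; the threshold value is a hypothetical placeholder for the empirical setting mentioned above.

```python
# Hypothetical empirical motion detection threshold.
MOTION_DETECTION_THRESHOLD = 10.0

def motion_state(delta_s, delta_m, delta_l):
    """Classify one pixel point position as moving or stationary."""
    k1 = abs(delta_s - delta_m)  # inter-frame difference, short vs. medium exposure
    k2 = abs(delta_m - delta_l)  # inter-frame difference, medium vs. long exposure
    if k1 < MOTION_DETECTION_THRESHOLD and k2 < MOTION_DETECTION_THRESHOLD:
        return "stationary"
    return "moving"
```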
As described above, the two indicators, the image brightness type and the motion state, reflect the shooting scene of the to-be-synthesized images well. Therefore, the weight of each to-be-synthesized image in the synthesis process can be adaptively adjusted, so that the synthesized HDR image has better image quality.
Similarly, a specific weighting and synthesis process is described in detail by using the to-be-synthesized image including a short-exposure image, a medium-exposure image and a long-exposure image continuously shot as an example.
In this embodiment, each pixel point of the HDR image is obtained by weighting and synthesizing the pixel points of the to-be-synthesized images at the same pixel point position. The specific weighting and synthesis process is as follows.
First, a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient are respectively preset for the short-exposure image, the medium-exposure image and the long-exposure image.
The short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient are all preset weight values, which can be adjusted or set accordingly according to the actual situation, indicating the weight ratio of the to-be-synthesized images during synthesizing of the HDR image in general.
Then, the motion state at the pixel point position and the image brightness type of the to-be-synthesized images are determined, and the processing is divided into the following situations.
1) Pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position are weighted and synthesized into a pixel point of the HDR image at a same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
A stationary scene with sufficient illumination may be considered the ideal condition. Under the ideal condition, the weight coefficients do not need to be adjusted, and the preset weight ratio can be used directly.
2) When the motion state at the pixel point position is the moving pixel and the image brightness type is the high-light image, the short-exposure image and the long-exposure image are discarded, and the pixel points of the medium-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient.
Under conditions of motion and sufficient illumination, an excessively long exposure time and an excessively short exposure time cannot achieve good shooting quality (problems of blur and dimness easily occur). Therefore, the weight coefficients of the short-exposure image and the long-exposure image are required to be adjusted to zero to avoid causing an adverse effect on the finally synthesized HDR image.
3) When the motion state is the stationary pixel and the image brightness type is the low-light image, the short-exposure image is discarded, and the pixel points of the medium-exposure image and the long-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient.
In the case of low light, a to-be-synthesized image having a relatively short exposure time contains more noise, and its image quality is relatively poor. The weight coefficient of the short-exposure image can therefore be adjusted to zero in this case, so as to avoid causing an adverse effect on the finally synthesized HDR image.
4) When the motion state is the moving pixel and the image brightness type is the low-light image, the short-exposure image and the medium-exposure image are discarded, and the pixel points of the long-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the long-exposure weight coefficient.
In a state of low light and a moving shot object, a relatively long exposure time is required to ensure a sufficient amount of transmitted light so that a clear image of the shot object can be obtained. Therefore, the weight coefficients of the short-exposure image and the medium-exposure image may be adjusted to zero to avoid causing an adverse effect on the finally synthesized HDR image.
Through the adaptive adjustment of the above weight coefficients, the interference of poor-quality to-be-synthesized images is avoided, so that the finally outputted HDR image has a high dynamic range and high definition in a stationary daytime scene, and exhibits low picture noise and no smearing for moving objects in a night scene.
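The four situations above can be summarized in a small dispatch sketch; the function and argument names are illustrative only, with a, b and c denoting the preset short-, medium- and long-exposure weight coefficients.

```python
def select_weights(brightness_type, state, a, b, c):
    """Weight coefficients (short, medium, long) used at one pixel position."""
    if brightness_type == "high-light" and state == "stationary":
        return a, b, c       # ideal case: keep the preset weight ratio
    if brightness_type == "high-light" and state == "moving":
        return 0.0, b, 0.0   # discard the short- and long-exposure pixels
    if brightness_type == "low-light" and state == "stationary":
        return 0.0, b, c     # discard the noisy short-exposure pixel
    return 0.0, 0.0, c       # low light and motion: keep only the long exposure
```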
In order to fully describe the disclosure, an execution process of the HDR image synthesis method disclosed in the embodiments of the disclosure in the image processor is described in detail below with reference to specific examples.
It is assumed that the image sensor continuously captures a short-exposure image with an exposure time of x/2, a medium-exposure image with an exposure time of x, and a long-exposure image with an exposure time of 2x as the to-be-synthesized images each time. A length of the to-be-synthesized image is w pixels, and a width thereof is h pixels.
Preferably, the short-exposure image is first transmitted to the image processor after being shot, and the medium-exposure image and the long-exposure image are transmitted in sequence, so that the overall image synthesis process has a minimum delay.
During the processing, the mean brightness of the to-be-synthesized image is calculated by a brightness detection module 320 through the following equation (1):
$$L = \frac{\sum_{i=1}^{w}\sum_{j=1}^{h} S(i,j) + \sum_{i=1}^{w}\sum_{j=1}^{h} M(i,j) + \sum_{i=1}^{w}\sum_{j=1}^{h} L(i,j)}{w \times h \times 3} \qquad (1)$$
S(i,j) is a brightness value of a pixel point in an ith row and a jth column of the short-exposure image, M(i,j) is a brightness value of a pixel point in an ith row and a jth column of the medium-exposure image, L(i,j) is a brightness value of a pixel point in an ith row and a jth column of the long-exposure image, and L is the mean brightness.
The brightness detection module 320 is configured to determine whether the mean brightness L is greater than or equal to a preset brightness detection threshold T. When the mean brightness L is greater than or equal to the brightness detection threshold T, it is determined that the image brightness type of the to-be-synthesized image is the high-light image. When the mean brightness L is less than the brightness detection threshold T, it is determined that the image brightness type of the to-be-synthesized image is the low-light image.
In addition, the brightness difference between the adjacent pixel points is calculated by the secondary difference calculation module 330 through the following equations (2-1) to (2-3):
$$\Delta S(i,j) = \Big|\,\big|S(i,j) - S(i{+}1,j)\big| - \big|S(i,j) - S(i,j{+}1)\big|\,\Big| \qquad (2\text{-}1)$$
$$\Delta M(i,j) = \Big|\,\big|M(i,j) - M(i{+}1,j)\big| - \big|M(i,j) - M(i,j{+}1)\big|\,\Big| \qquad (2\text{-}2)$$
$$\Delta L(i,j) = \Big|\,\big|L(i,j) - L(i{+}1,j)\big| - \big|L(i,j) - L(i,j{+}1)\big|\,\Big| \qquad (2\text{-}3)$$
ΔS(i,j) is a brightness difference of pixel points in an ith row and a jth column of the short-exposure image, ΔM(i,j) is a brightness difference of pixel points in an ith row and a jth column of the medium-exposure image, and ΔL(i,j) is a brightness difference of pixel points in an ith row and a jth column of the long-exposure image (as shown in FIG. 6 , adjacent pixel points are eight pixel points around the target pixel point).
Based on the brightness difference, an inter-frame difference between the short-exposure image and the medium-exposure image can be calculated as |ΔS(i,j)−ΔM(i,j)|, and an inter-frame difference between the medium-exposure image and the long-exposure image is |ΔM(i,j)−ΔL(i,j)|.
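Under the assumption that the three exposures are NumPy arrays, equations (2-1) to (2-3) and the two inter-frame difference maps can be sketched in vectorized form; the helper name is illustrative.

```python
import numpy as np

def secondary_differences(img):
    """Brightness-difference map over the valid region (excluding the last
    row and column), following equations (2-1) to (2-3)."""
    x = img.astype(np.float64)                  # avoid unsigned-integer wraparound
    d_row = np.abs(x[:-1, :-1] - x[1:, :-1])    # |X(i,j) - X(i+1,j)|
    d_col = np.abs(x[:-1, :-1] - x[:-1, 1:])    # |X(i,j) - X(i,j+1)|
    return np.abs(d_row - d_col)

# Inter-frame difference maps used for motion detection:
# K1 = np.abs(secondary_differences(S) - secondary_differences(M))
# K2 = np.abs(secondary_differences(M) - secondary_differences(L))
```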
A motion detection module 340 determines, based on the inter-frame difference calculated by the secondary difference calculation module 330, whether the two inter-frame differences are both less than a preset motion detection threshold A.
When |ΔS(i,j)−ΔM(i,j)|<A and |ΔM(i,j)−ΔL(i,j)|<A, the motion state at the pixel point position (i,j) is determined as the stationary pixel.
When either inter-frame difference is greater than or equal to the motion detection threshold A, the motion state at the pixel point position (i,j) is determined as the moving pixel.
The synthesis module 350 is connected to the brightness detection module 320 and the motion detection module 340, and adjusts and determines specific weight coefficients according to the image brightness type and the motion state provided by the brightness detection module and the motion detection module, so as to complete the synthesis of the HDR image.
An ideal weighting and synthesis process shown in the following equation (3) is preset in the synthesis module 350:
$$H(i,j) = a \times S(i,j) + b \times M(i,j) + c \times L(i,j) \qquad (3)$$
a is a short-exposure weight coefficient, b is a medium-exposure weight coefficient, and c is a long-exposure weight coefficient. S(i,j) is a pixel point in an ith row and a jth column of the short-exposure image, M(i,j) is a pixel point in an ith row and a jth column of the medium-exposure image, L(i,j) is a pixel point in an ith row and a jth column of the long-exposure image, and H(i,j) is a pixel point in an ith row and a jth column of the synthesized HDR image.
When the image brightness type is the high-light image and the motion state is the stationary pixel, an ideal state is achieved. The synthesis module 350 performs weighting and synthesis according to the equation (3).
When the image brightness type is the high-light image and the motion state is the moving pixel, the synthesis module 350 adjusts coefficients a and c to zero, and performs the weighting and synthesis in the manner shown in the following equation (3-1):
$$H(i,j) = b \times M(i,j) \qquad (3\text{-}1)$$
When the image brightness type is the low-light image and the motion state is the stationary pixel, the synthesis module 350 discards the short-exposure image, which contains more noise, and performs the weighting and synthesis in the manner shown in the following equation (3-2):
$$H(i,j) = b \times M(i,j) + c \times L(i,j) \qquad (3\text{-}2)$$
When the image brightness type is the low-light image and the motion state is the moving pixel, the synthesis module 350 only uses the long-exposure image with a long enough exposure time to perform the weighting and synthesis in the manner shown in the following equation (3-3):
$$H(i,j) = c \times L(i,j) \qquad (3\text{-}3)$$
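Putting the pieces together, the following is a hedged end-to-end sketch of equations (3) to (3-3), reusing the secondary_differences helper sketched above; padding the motion mask back to full image size by edge replication is an illustrative simplification, not part of the embodiment.

```python
import numpy as np

def synthesize_hdr(S, M, L, a, b, c, A, high_light):
    """Per-pixel weighting and synthesis of the HDR image H. S, M, L are the
    three exposure images; a, b, c the preset weight coefficients; A the
    motion detection threshold; high_light the brightness detection result."""
    dS, dM, dL = (secondary_differences(x) for x in (S, M, L))
    stationary = (np.abs(dS - dM) < A) & (np.abs(dM - dL) < A)
    # Pad the reduced-size mask back to the full image size by edge replication.
    stationary = np.pad(stationary, ((0, 1), (0, 1)), mode="edge")

    Sf, Mf, Lf = (x.astype(np.float64) for x in (S, M, L))
    if high_light:
        # Equation (3) at stationary pixels, equation (3-1) at moving pixels.
        return np.where(stationary, a * Sf + b * Mf + c * Lf, b * Mf)
    # Equation (3-2) at stationary pixels, equation (3-3) at moving pixels.
    return np.where(stationary, b * Mf + c * Lf, c * Lf)
```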
Through the above method, the to-be-synthesized images shot through a plurality of consecutive exposures may be integrated into an HDR image having higher image quality in a targeted manner, so as to avoid problems such as smearing of a moving object, a decrease in picture definition, and even brightness errors that easily occur when an HDR image is synthesized in a high-speed moving scene such as aerial photography.
An embodiment of the disclosure further provides a non-volatile computer storage medium. The computer storage medium stores at least one executable instruction, and the computer-executable instruction can be used for performing the HDR image synthesis method in any of the above method embodiments.
FIG. 7 is a schematic structural diagram of an image processing chip according to an embodiment of the disclosure. The specific embodiments of the disclosure do not limit the specific implementation of the image processing chip.
As shown in FIG. 7 , the image processing chip may include a processor 702, a communication interface 704, a memory 706, and a communication bus 708.
The processor 702, the communication interface 704, and the memory 706 communicate with each other through the communication bus 708. The communication interface 704 is configured to communicate with a network element of other devices such as a client or other servers. The processor 702 is configured to execute a program 710, and specifically may execute the relevant steps in the above embodiments of the HDR image synthesis method.
Specifically, the program 710 may include program code, and the program code includes a computer operation instruction.
The processor 702 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the disclosure. The one or more processors included in the image processing chip may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 706 is configured to store the program 710. The memory 706 may include a high-speed RAM memory, or may further include a non-volatile memory, for example, at least one magnetic disk memory.
The program 710 can specifically be configured to cause the processor 702 to execute the HDR image synthesis method in any of the above method embodiments.
A person of ordinary skill in the art may further be aware that, in combination with examples of each step of the HDR image synthesis method described in the embodiments disclosed in this specification, the present application may be implemented by using electronic hardware, computer software, or a combination thereof. To clearly describe interchangeability between the hardware and the software, compositions and steps of each example have been generally described according to functions in the foregoing descriptions. Whether the functions are executed by hardware or software depends on particular applications and design constraint conditions of the technical solutions.
Persons skilled in the art can use different methods to implement the described functions for each particular application, but it should not be considered that the implementation goes beyond the scope of the embodiments of the disclosure. The computer software may be stored in a computer-readable storage medium, and the program, when executed, may include the processes of the embodiments of the foregoing methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
Finally, it should be noted that the foregoing embodiments are merely used for describing the technical solutions of the disclosure, but are not intended to limit the disclosure. Under the concept of the disclosure, the technical features in the foregoing embodiments or different embodiments may be combined, the steps may be implemented in any sequence, and there may be many other changes in different aspects of the disclosure as described above. For brevity, those are not provided in detail. Although the disclosure is described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that they may still make modifications to the technical solutions described in the foregoing embodiments or make equivalent replacements to some technical features thereof, without departing from the scope of the technical solutions of the embodiments of the disclosure.

Claims (20)

What is claimed is:
1. A high dynamic range (HDR) image synthesis method, comprising:
acquiring a plurality of to-be-synthesized images, wherein each of the to-be-synthesized images has a different exposure time;
calculating a mean brightness of the to-be-synthesized images;
determining an image brightness type of the to-be-synthesized images according to the mean brightness;
calculating a brightness difference between adjacent pixel points in one to-be-synthesized image;
calculating an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference;
determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and
weighting and synthesizing the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
2. The method according to claim 1, wherein the image brightness type comprises a high-light image and a low-light image; and
the determining an image brightness type of the to-be-synthesized images according to the mean brightness specifically comprises:
determining that the to-be-synthesized image is the high-light image when the mean brightness is greater than or equal to a preset brightness detection threshold; and
determining that the to-be-synthesized image is the low-light image when the mean brightness is less than the brightness detection threshold.
3. The method according to claim 1, wherein the calculating a mean brightness of the to-be-synthesized image specifically comprises:
superimposing brightness values of all pixel points in the to-be-synthesized image to obtain an accumulated brightness value;
summing the accumulated brightness values of all of the to-be-synthesized images to obtain a total brightness value; and
calculating the mean brightness value according to the total brightness value, a quantity of the to-be-synthesized images and sizes of the to-be-synthesized images.
4. The method according to claim 1, wherein the to-be-synthesized images comprise a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot;
an exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image.
5. The method according to claim 4, wherein the motion state comprises a moving pixel and a stationary pixel; and
the determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference specifically comprises:
determining whether an inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold;
determining that the motion state at the pixel point position is the stationary pixel when the inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold; and
determining that the motion state at the pixel point position is the moving pixel when the inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both more than a preset motion detection threshold.
6. The method according to claim 1, wherein the calculating a brightness difference between adjacent pixel points in one to-be-synthesized image specifically comprises:
calculating a first brightness difference between a target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point; and
acquiring a difference between the first brightness difference and the second brightness difference as a brightness difference of the target pixel point.
7. The method according to claim 5, wherein the weighting and synthesizing the plurality of to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state specifically comprises:
respectively presetting a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient for the short-exposure image, the medium-exposure image and the long-exposure image; and
weighting and synthesizing pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position into a pixel point of the HDR image at a same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
8. The method according to claim 7, wherein the weighting and synthesizing the plurality of to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state specifically comprises:
discarding the short-exposure image and the long-exposure image when the motion state at the pixel point position is the stationary pixel and the image brightness type is the high-light image;
weighting and synthesizing the pixel points of the medium-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient;
discarding the short-exposure image when the motion state is the stationary pixel and the image brightness type is a low-light image;
weighting and synthesizing the pixel points of the medium-exposure image and the long-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient;
discarding the medium-exposure image and the long-exposure image when the motion state is the moving pixel and the image brightness type is the low-light image; and
weighting and synthesizing the pixel points of the short-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient.
9. An image processing chip, comprising a processor and a memory communicatively connected to the processor,
the memory storing a computer program instruction, the computer program instruction, when invoked by the processor, causing the processor to:
acquire a plurality of to-be-synthesized images, wherein each of the to-be-synthesized images has a different exposure time;
calculate a mean brightness of the to-be-synthesized images;
determine an image brightness type of the to-be-synthesized images according to the mean brightness;
calculate a brightness difference between adjacent pixel points in one to-be-synthesized image;
calculate an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference;
determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and
weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
10. The image processing chip according to claim 9, wherein the image brightness type comprises a high-light image and a low-light image; and
the processor is further configured to:
determine that the to-be-synthesized image is the high-light image when the mean brightness is greater than or equal to a preset brightness detection threshold; and
determine that the to-be-synthesized image is the low-light image when the mean brightness is less than the brightness detection threshold.
11. The image processing chip according to claim 9, wherein the processor is further configured to:
superimpose brightness values of all pixel points in the to-be-synthesized image to obtain an accumulated brightness value;
sum the accumulated brightness values of all of the to-be-synthesized images to obtain a total brightness value; and
calculate the mean brightness value according to the total brightness value, a quantity of the to-be-synthesized images and sizes of the to-be-synthesized images.
12. The image processing chip according to claim 9, wherein the to-be-synthesized images comprise a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot;
an exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image.
13. The image processing chip according to claim 12, wherein the motion state comprises a moving pixel and a stationary pixel; and
the processor is further configured to:
determine whether an inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold;
determine that the motion state at the pixel point position is the stationary pixel when the inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold; and
determine that the motion state at the pixel point position is the moving pixel when the inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both more than a preset motion detection threshold.
14. The image processing chip according to claim 9, wherein the calculation of the brightness difference between the adjacent pixel points in one to-be-synthesized image specifically comprises:
calculating a first brightness difference between a target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point; and
acquiring a difference between the first brightness difference and the second brightness difference as a brightness difference of the target pixel point.
15. The image processing chip according to claim 13, wherein the processor is further configured to:
respectively preset a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient for the short-exposure image, the medium-exposure image and the long-exposure image; and
weight and synthesize pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position into a pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
16. The image processing chip according to claim 15, wherein the processor is further configured to:
discard the short-exposure image and the long-exposure image when the motion state at the pixel point position is the stationary pixel and the image brightness type is the high-light image;
weight and synthesize the pixel points of the medium-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient;
discard the short-exposure image when the motion state is the stationary pixel and the image brightness type is a low-light image;
weight and synthesize the pixel points of the medium-exposure image and the long-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient;
discard the medium-exposure image and the long-exposure image when the motion state is the moving pixel and the image brightness type is the low-light image; and
weight and synthesize the pixel points of the short-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient.
17. An aerial camera, comprising:
an image sensor, configured to capture a plurality of images with set shooting parameters;
a controller, connected to the image sensor and configured to trigger the image sensor to capture the plurality of images with different exposure time; and
an image processor, configured to receive the plurality of images captured by the image sensor through continuous exposure and perform the following operations for the received plurality of images:
acquiring a plurality of to-be-synthesized images, wherein each of the to-be-synthesized images has a different exposure time;
calculating a mean brightness of the to-be-synthesized images;
determining an image brightness type of the to-be-synthesized images according to the mean brightness;
calculating a brightness difference between adjacent pixel points in one to-be-synthesized image;
calculating an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference;
determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and
weighting and synthesizing the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
18. The aerial camera according to claim 17, wherein the image brightness type comprises a high-light image and a low-light image; and
the image processor is further configured to:
determine that the to-be-synthesized image is the high-light image when the mean brightness is greater than or equal to a preset brightness detection threshold; and
determine that the to-be-synthesized image is the low-light image when the mean brightness is less than the brightness detection threshold.
19. The aerial camera according to claim 17, wherein the image processor is further configured to:
superimpose brightness values of all pixel points in the to-be-synthesized image to obtain an accumulated brightness value;
sum the accumulated brightness values of all of the to-be-synthesized images to obtain a total brightness value; and
calculate the mean brightness value according to the total brightness value, a quantity of the to-be-synthesized images and sizes of the to-be-synthesized images.
20. The aerial camera according to claim 17, wherein the motion state comprises a moving pixel and a stationary pixel; and
the image processor is further configured to:
determine whether an inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold;
determine that the motion state at the pixel point position is the stationary pixel when the inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold; and
determine that the motion state at the pixel point position is the moving pixel when the inter-frame difference between the short-exposure image and the medium-exposure image and an inter-frame difference between the medium-exposure image and the long-exposure image are more than a preset motion detection threshold.
US17/938,517 2020-04-14 2022-10-06 High dynamic range image synthesis method and apparatus, image processing chip and aerial camera Active 2041-07-03 US12041358B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202010291571.0 2020-04-14
CN202010291571.0A CN111479072B (en) 2020-04-14 2020-04-14 High dynamic range image synthesis method and device, image processing chip and aerial camera
PCT/CN2021/083350 WO2021208706A1 (en) 2020-04-14 2021-03-26 High dynamic range image composition method and device, image processing chip and aerial camera

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/083350 Continuation WO2021208706A1 (en) 2020-04-14 2021-03-26 High dynamic range image composition method and device, image processing chip and aerial camera

Publications (2)

Publication Number Publication Date
US20230038844A1 US20230038844A1 (en) 2023-02-09
US12041358B2 true US12041358B2 (en) 2024-07-16

Family

ID=71751968

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/938,517 Active 2041-07-03 US12041358B2 (en) 2020-04-14 2022-10-06 High dynamic range image synthesis method and apparatus, image processing chip and aerial camera

Country Status (3)

Country Link
US (1) US12041358B2 (en)
CN (1) CN111479072B (en)
WO (1) WO2021208706A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT201900005536A1 (en) * 2019-04-10 2020-10-10 Doss Visual Solution S R L IMAGE ACQUISITION METHOD FOR AN OPTICAL INSPECTION MACHINE
CN111292282B (en) * 2020-01-21 2023-02-17 展讯通信(上海)有限公司 Method and device for generating low-bit-width HDR image, storage medium and terminal
CN111479072B (en) 2020-04-14 2021-12-17 深圳市道通智能航空技术股份有限公司 High dynamic range image synthesis method and device, image processing chip and aerial camera
CN111770243B (en) * 2020-08-04 2021-09-03 深圳市精锋医疗科技有限公司 Image processing method, device and storage medium for endoscope
CN111787183B (en) * 2020-08-04 2021-09-03 深圳市精锋医疗科技有限公司 Image processing method, device and storage medium for endoscope
WO2022041287A1 (en) * 2020-08-31 2022-03-03 华为技术有限公司 Image acquisition method and apparatus, device, and computer-readable storage medium
CN114513610A (en) * 2020-11-17 2022-05-17 浙江大华技术股份有限公司 Image processing method, image processing apparatus, and storage apparatus
CN114630053B (en) * 2020-12-11 2023-12-12 青岛海信移动通信技术有限公司 HDR image display method and display device
CN114650361B (en) * 2020-12-17 2023-06-06 北京字节跳动网络技术有限公司 Shooting mode determining method, shooting mode determining device, electronic equipment and storage medium
CN114820404B (en) * 2021-01-29 2024-08-20 抖音视界有限公司 Image processing method, device, electronic equipment and medium
CN115037915B (en) * 2021-03-05 2023-11-14 华为技术有限公司 Video processing method and processing device
KR20220156242A (en) * 2021-05-18 2022-11-25 에스케이하이닉스 주식회사 Image Processing Device
CN113822926A (en) * 2021-07-23 2021-12-21 昆山丘钛光电科技有限公司 RAW image size determination method, apparatus and medium
US11863880B2 (en) * 2022-05-31 2024-01-02 Microsoft Technology Licensing, Llc Image frame selection for multi-frame fusion
CN117082355B (en) * 2023-09-19 2024-04-12 荣耀终端有限公司 Image processing method and electronic device

Patent Citations (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7382931B2 (en) * 2003-04-29 2008-06-03 Microsoft Corporation System and process for generating high dynamic range video
US20060177150A1 (en) 2005-02-01 2006-08-10 Microsoft Corporation Method and system for combining multiple exposure images having scene and camera motion
US9307212B2 (en) * 2007-03-05 2016-04-05 Fotonation Limited Tone mapping for low-light video frame enhancement
CN101262564A (en) 2007-03-09 2008-09-10 Sony Corporation Image processing apparatus, image forming apparatus, image processing method, and computer program
EP2175635A1 (en) 2008-10-10 2010-04-14 Samsung Electronics Co., Ltd. Method and apparatus for creating high dynamic range image
US20100328482A1 (en) * 2009-06-26 2010-12-30 Samsung Electronics Co., Ltd. Digital photographing apparatus, method of controlling the digital photographing apparatus, and recording medium storing program to implement the method
US20130136364A1 (en) 2011-11-28 2013-05-30 Fujitsu Limited Image combining device and method and storage medium storing image combining program
US9437171B2 (en) * 2012-12-05 2016-09-06 Texas Instruments Incorporated Local tone mapping for high dynamic range images
US20150116525A1 (en) * 2013-10-31 2015-04-30 Himax Imaging Limited Method for generating high dynamic range images
CN103973989A (en) 2014-04-15 2014-08-06 Beijing Institute of Technology Method and system for obtaining high-dynamic images
US20150348242A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Scene Motion Correction In Fused Image Systems
US11653088B2 (en) * 2016-05-25 2023-05-16 Gopro, Inc. Three-dimensional noise reduction
US10757344B2 (en) * 2016-07-01 2020-08-25 Maxell, Ltd. Imaging apparatus, imaging method and imaging program
US10616499B2 (en) * 2016-07-29 2020-04-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and device for capturing high dynamic range image, and electronic device
US10750098B2 (en) * 2016-10-13 2020-08-18 Panasonic Intellectual Property Management Co., Ltd. Image processing device, image processing method, and image processing circuit
US10264193B2 (en) * 2016-12-06 2019-04-16 Polycom, Inc. System and method for providing images and video having high dynamic range
US10165194B1 (en) * 2016-12-16 2018-12-25 Amazon Technologies, Inc. Multi-sensor camera system
US11017509B2 (en) * 2016-12-22 2021-05-25 Huawei Technologies Co., Ltd. Method and apparatus for generating high dynamic range image
WO2018190649A1 (en) 2017-04-12 2018-10-18 Samsung Electronics Co., Ltd. Method and apparatus for generating hdr images
US10638052B2 (en) * 2017-04-12 2020-04-28 Samsung Electronics Co., Ltd. Method and apparatus for generating HDR images
CN107231530A (en) 2017-06-22 2017-10-03 Vivo Mobile Communication Co., Ltd. Photographing method and mobile terminal
US20210377457A1 (en) * 2017-10-31 2021-12-02 Morpho, Inc. Image compositing device, image compositing method, and storage medium
CN108419023A (en) 2018-03-26 2018-08-17 Huawei Technologies Co., Ltd. Method and related device for generating high dynamic range images
US11190707B2 (en) * 2018-05-22 2021-11-30 Arashi Vision Inc. Motion ghost resistant HDR image generation method and portable terminal
CN109005361A (en) 2018-08-06 2018-12-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, device, imaging device, electronic equipment and readable storage medium
CN108881731A (en) 2018-08-06 2018-11-23 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Panorama shooting method, device and imaging device
CN108989700A (en) 2018-08-13 2018-12-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging control method, device, electronic equipment and computer-readable storage medium
CN109005346A (en) 2018-08-13 2018-12-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, device, electronic equipment and computer-readable storage medium
US10701279B2 (en) * 2018-10-02 2020-06-30 Adobe Inc. Utilizing alignment models and motion vector path blending to generate a long exposure digital image from a sequence of short exposure digital images
CN109120862A (en) 2018-10-15 2019-01-01 Guangdong Oppo Mobile Telecommunications Corp., Ltd. High-dynamic-range image acquisition method, device and mobile terminal
CN109286758A (en) 2018-10-15 2019-01-29 Guangdong Oppo Mobile Telecommunications Corp., Ltd. High dynamic range image generation method, mobile terminal and storage medium
US20210278836A1 (en) * 2018-11-07 2021-09-09 Autel Robotics Co., Ltd. Method and device for dual-light image integration, and unmanned aerial vehicle
US20200211166A1 (en) * 2018-12-28 2020-07-02 Qualcomm Incorporated Methods and apparatus for motion compensation in high dynamic range processing
US10916036B2 (en) * 2018-12-28 2021-02-09 Intel Corporation Method and system of generating multi-exposure camera statistics for image processing
US11457157B2 (en) * 2019-01-04 2022-09-27 Gopro, Inc. High dynamic range processing based on angular rate measurements
US20200236273A1 (en) * 2019-01-18 2020-07-23 Samsung Electronics Co., Ltd. Imaging systems for generating hdr images and operating methods thereof
US11128809B2 (en) * 2019-02-15 2021-09-21 Samsung Electronics Co., Ltd. System and method for compositing high dynamic range images
US10742892B1 (en) * 2019-02-18 2020-08-11 Samsung Electronics Co., Ltd. Apparatus and method for capturing and blending multiple images for high-quality flash photography using mobile electronic device
US20210400172A1 (en) * 2019-03-06 2021-12-23 Autel Robotics Co., Ltd. Imaging processing method and apparatus for a camera module in a night scene, an electronic device, and a storage medium
US20220198625A1 (en) * 2019-04-11 2022-06-23 Dolby Laboratories Licensing Corporation High-dynamic-range image generation with pre-combination denoising
US11095829B2 (en) * 2019-06-11 2021-08-17 Samsung Electronics Co., Ltd. Apparatus and method for high dynamic range (HDR) image creation of dynamic scenes using graph cut-based labeling
CN110381263A (en) 2019-08-20 2019-10-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method, device, storage medium and electronic equipment
CN110572585A (en) 2019-08-26 2019-12-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method, image processing device, storage medium and electronic equipment
US20220236056A1 (en) * 2019-08-28 2022-07-28 Autel Robotics Co., Ltd. Metering adjustment method, apparatus and device and storage medium
US11113802B1 (en) * 2019-09-09 2021-09-07 Apple Inc. Progressive image fusion
US11379997B2 (en) * 2019-11-01 2022-07-05 Samsung Electronics Co., Ltd. Image devices including image sensors and image signal processors, and operation methods of image sensors
US20220043117A1 (en) * 2019-11-13 2022-02-10 Lumotive, LLC Lidar systems based on tunable optical metasurfaces
US10944914B1 (en) * 2019-12-02 2021-03-09 Samsung Electronics Co., Ltd. System and method for generating multi-exposure frames from single input
US20220345607A1 (en) * 2019-12-30 2022-10-27 Autel Robotics Co., Ltd. Image exposure method and device, unmanned aerial vehicle
US11710223B2 (en) * 2020-02-14 2023-07-25 Pixelworks, Inc. Methods and systems for image processing with multiple image sources
US11356604B2 (en) * 2020-02-14 2022-06-07 Pixelworks, Inc. Methods and systems for image processing with multiple image sources
CN111479072A (en) 2020-04-14 2020-07-31 Shenzhen Autel Intelligent Aviation Technology Co., Ltd. High dynamic range image synthesis method and device, image processing chip and aerial camera
US11276154B2 (en) * 2020-07-17 2022-03-15 Samsung Electronics Co., Ltd. Multi-frame depth-based multi-camera relighting of images
US11570374B1 (en) * 2020-09-23 2023-01-31 Apple Inc. Subject-aware low light photography
US20220138964A1 (en) * 2020-10-30 2022-05-05 Qualcomm Incorporated Frame processing and/or capture instruction systems and techniques
US20220230283A1 (en) * 2021-01-21 2022-07-21 Beijing Xiaomi Pinecone Electronics Co., Ltd. Method and device for processing image, and storage medium
US11373281B1 (en) * 2021-02-23 2022-06-28 Qualcomm Incorporated Techniques for anchor frame switching
US11363213B1 (en) * 2021-02-26 2022-06-14 Qualcomm Incorporated Minimizing ghosting in high dynamic range image processing
US11539895B1 (en) * 2021-09-27 2022-12-27 Wisconsin Alumni Research Foundation Systems, methods, and media for motion adaptive imaging using single-photon image sensor data
US11671714B1 (en) * 2022-01-24 2023-06-06 Qualcomm Incorporated Motion based exposure control
US20230269489A1 (en) * 2022-02-23 2023-08-24 Gopro, Inc. Method and apparatus for multi-image multi-exposure processing
US11825207B1 (en) * 2022-05-02 2023-11-21 Qualcomm Incorporated Methods and systems for shift estimation for one or more output frames
US20230388668A1 (en) * 2022-05-26 2023-11-30 Guangzhou Tyrafos Semiconductor Technologies Co., Ltd. Image sensor circuit and image sensor device
US11863880B2 (en) * 2022-05-31 2024-01-02 Microsoft Technology Licensing, LLC Image frame selection for multi-frame fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report mailed Jun. 15, 2021 for PCT/CN2021/083350.

Also Published As

Publication number Publication date
US20230038844A1 (en) 2023-02-09
WO2021208706A1 (en) 2021-10-21
CN111479072A (en) 2020-07-31
CN111479072B (en) 2021-12-17

Similar Documents

Publication Publication Date Title
US12041358B2 (en) High dynamic range image synthesis method and apparatus, image processing chip and aerial camera
AU2019326496B2 (en) Method for capturing images at night, apparatus, electronic device, and storage medium
US12052514B2 (en) Imaging processing method and apparatus for a camera module in a night scene, an electronic device, and a storage medium
US11532076B2 (en) Image processing method, electronic device and storage medium
CN108322646B (en) Image processing method, image processing device, storage medium and electronic equipment
CN109218627B (en) Image processing method, image processing device, electronic equipment and storage medium
CN104184958A (en) Automatic exposure control method and device based on FPGA (field-programmable gate array) and suitable for space detection imaging
CN110493524B (en) Photometric adjustment method, device, equipment and storage medium
CN108337447A (en) High dynamic range image exposure compensation value acquisition method, device, equipment and medium
EP3481051A1 (en) Combining optical and digital zoom under varying image capturing conditions
US9961269B2 (en) Imaging device, imaging device body, and lens barrel that can prevent an image diaphragm value from frequently changing
US10609265B2 (en) Methods and apparatus for synchronizing camera flash and sensor blanking
CN105867047A (en) Flashlight adjustment method and shooting device
CN112335224A (en) Image acquisition method and device for movable platform and storage medium
EP3454547A1 (en) Imaging apparatus, image processing apparatus, imaging method, image processing method, and storage medium
US11032483B2 (en) Imaging apparatus, imaging method, and program
US20090002544A1 (en) Methods of adding additional parameters during automatic exposure for a digital camera and related electronic devices and computer program products
US10944899B2 (en) Image processing device and image processing method
CN117135293A (en) Image processing method and electronic device
US9807311B2 (en) Imaging apparatus, video data transmitting apparatus, video data transmitting and receiving system, image processing method, and program
WO2021253167A1 (en) Digital zoom imaging method and apparatus, camera, and unmanned aerial vehicle system
JP2019033470A (en) Image processing system, imaging apparatus, image processing apparatus, control method, and program
JP7353864B2 (en) Information processing device, control method and program therefor, and imaging system
KR20080057345A (en) Imaging system with adjustable optics
CN114928694A (en) Image acquisition method and apparatus, device, and medium

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: AUTEL ROBOTICS CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LI, ZHAOZAO;REEL/FRAME:061358/0947

Effective date: 20220624

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE