US12041358B2 - High dynamic range image synthesis method and apparatus, image processing chip and aerial camera - Google Patents
- Publication number: US12041358B2
- Application number: US17/938,517 (US202217938517A)
- Authority
- US
- United States
- Prior art keywords
- image
- exposure
- brightness
- pixel point
- medium
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10144—Varying exposure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Definitions
- the disclosure relates to the field of image processing technologies, and in particular, to a high dynamic range (HDR) image synthesis method and apparatus, an image processing chip and an aerial camera.
- a high dynamic range (HDR) image is image data that can provide more detail in bright and dark regions and reflect the visual effect of the real environment more faithfully than a common image.
- the HDR image is synthesized from a plurality of common (that is, low dynamic range) images having different exposure times, using the optimal detail corresponding to each exposure time.
- the HDR image is obtained by synthesis after the plurality of common images are generated through a plurality of exposures. Problems such as smearing of a moving object, a decrease in picture definition, and sometimes even brightness errors easily occur in specific application scenes such as aerial photography with a high moving rate.
- Embodiments of the disclosure are intended to provide a high dynamic range (HDR) image synthesis method and apparatus, an image processing chip and an aerial camera, which can overcome the defects of existing HDR image synthesis methods.
- An HDR image synthesis method including:
- the image brightness type includes a high-light image and a low-light image.
- the determining an image brightness type of the to-be-synthesized images according to the mean brightness specifically includes: determining that the image brightness type of the to-be-synthesized image is the high-light image when the mean brightness is greater than or equal to a preset brightness detection threshold; and determining that the image brightness type of the to-be-synthesized image is the low-light scene when the mean brightness is less than the brightness detection threshold.
- the calculating a mean brightness of the to-be-synthesized images specifically includes: superimposing brightness values of all pixel points in the to-be-synthesized image to obtain an accumulated brightness value; summing the accumulated brightness values of all of the to-be-synthesized images to obtain a total brightness value; and calculating the mean brightness value according to the total brightness value, a quantity of the to-be-synthesized images and sizes of the to-be-synthesized images.
- the to-be-synthesized images include a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot.
- An exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image.
- the motion state includes a moving pixel and a stationary pixel.
- the determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference specifically includes: determining whether the inter-frame difference between the short-exposure image and the medium-exposure image and the inter-frame difference between the medium-exposure image and the long-exposure image are both greater than or equal to a preset motion detection threshold; when both inter-frame differences are greater than or equal to the motion detection threshold, determining that the motion state at the pixel point position is the moving pixel; and when both inter-frame differences are less than the motion detection threshold, determining that the motion state at the pixel point position is the stationary pixel.
- the calculating a brightness difference between adjacent pixel points in one to-be-synthesized image specifically includes: calculating a first brightness difference between a target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point; and acquiring a difference between the first brightness difference and the second brightness difference as a brightness difference of the target pixel point.
- the weighting and synthesizing the plurality of to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state specifically includes:
- An HDR image synthesis apparatus including:
- an image processing chip including a processor and a memory communicatively connected to the processor.
- the memory stores a computer program instruction, and the computer program instruction, when invoked by the processor, causes the processor to perform the HDR image synthesis method as described above.
- an aerial camera includes:
- a weight ratio of the different common images is adaptively adjusted during synthesis of the HDR image according to the different motion states and image brightness types of the to-be-synthesized images, thereby effectively avoiding the problems of smearing of a moving object and decreased picture definition when synthesizing the HDR image from a plurality of images.
- FIG. 1 is a schematic diagram of an application scene of a high dynamic range (HDR) image synthesis method according to an embodiment of the disclosure.
- FIG. 2 is a structural block diagram of an aerial camera according to an embodiment of the disclosure.
- FIG. 3 is a schematic diagram of an HDR image synthesis apparatus according to an embodiment of the disclosure.
- FIG. 4 is a method flowchart of an image processing method according to an embodiment of the disclosure.
- FIG. 5 is a method flowchart of a motion state determination method according to an embodiment of the disclosure.
- FIG. 6 is a schematic diagram of pixel point positions according to an embodiment of the disclosure.
- FIG. 7 is a schematic structural diagram of an image processing chip according to an embodiment of the disclosure.
- the disclosure is described in more detail below with reference to the accompanying drawings and specific embodiments. It should be noted that, when a component is expressed as “being fixed to” another component, the component may be directly on the other component, or one or more intermediate components may exist between them. When one component is expressed as “being connected to” another component, the component may be directly connected to the other component, or one or more intermediate components may exist between them.
- orientation or position relationships indicated by the terms such as “up”, “down”, “inside”, “outside” and “bottom” are based on orientation or position relationships shown in the accompanying drawings, and are used only for ease and brevity of illustration and description of the disclosure, rather than indicating or implying that the mentioned apparatus or component needs to have a particular orientation or needs to be constructed and operated in a particular orientation. Therefore, such terms should not be construed as limiting of the disclosure.
- terms “first”, “second” and “third” are only used to describe the objective and cannot be understood as indicating or implying relative importance.
- a high dynamic range (HDR) image is synthesized by a plurality of common images having different exposure time, so as to show details of brightness more desirably.
- FIG. 1 shows an application scene of an HDR image synthesis method according to an embodiment of the disclosure.
- an unmanned aerial vehicle (UAV) 10 equipped with an aerial camera, a smart terminal 20 , and a wireless network 30 are included.
- the UAV 10 may be any type of power-driven UAV.
- the UAV includes but is not limited to a four-axis UAV, a fixed-wing aircraft, a helicopter model, and the like.
- the UAV may have a corresponding volume or power according to actual conditions, so as to provide a load capacity, a flight speed and a flight range that can meet use requirements.
- the aerial camera may be any type of image acquisition device, including a sports camera, a high-definition camera or a wide-angle camera.
- the aerial camera may be mounted and fixed to the UAV by a mounting and fixing bracket such as a gimbal, and is controlled by the UAV 10 to execute a task of image acquisition.
- one or more functional modules may further be arranged on the UAV, so that the UAV can realize a corresponding function.
- a built-in main control chip uploads captured image information to a device that establishes a connection to the UAV.
- the smart terminal 20 may be any type of smart device configured to establish a communication connection to the UAV, for example, a mobile phone, a tablet computer, a smart remote control or the like.
- the smart terminal 20 may be equipped with one or more different user interactive apparatuses for collecting instructions from a user or displaying and feeding back information to the user.
- the interactive apparatuses include but are not limited to a button, a display screen, a touch screen, a speaker and a remote control joystick.
- the smart terminal 20 may be equipped with a touch display screen. Through the touch display screen, a remote control instruction for the UAV is received from the user, and image information obtained by the aerial camera is presented to the user. The user may further switch the image information currently displayed through the touch screen.
- the existing image visual processing technology may further be fused between the UAV 10 and the smart terminal 20 to further provide more intelligent services.
- the UAV 10 may capture an image through the aerial camera, and then the smart terminal 20 parses an operation gesture in the image, so as to realize gesture control for the UAV 10 by the user.
- the wireless network 30 may be a wireless communication network configured to establish a data transmission channel between two nodes based on any type of data transmission principle, for example, a Bluetooth network, a Wi-Fi network, a wireless cellular network, or a combination thereof in specific signal frequency bands.
- FIG. 2 is a structural block diagram of an aerial camera 11 according to an embodiment of the disclosure.
- the aerial camera 11 may include an image sensor 111 , a controller 112 , and an image processor 113 .
- the image sensor 111 is a functional module configured to capture an image with set shooting parameters. An optical signal corresponding to a visual picture is projected onto a photosensitive element through a lens and a related optical component, and the photosensitive element converts the optical signal to a corresponding electrical signal.
- the shooting parameters are adjustable parameter variables such as an aperture, a focal length or an exposure time that are related to a structure of the lens and the related optical component (such as a shutter) during image acquisition of the image sensor 111 .
- the image sensor 111 may capture one image through each exposure.
- the controller 112 is a control core of the image sensor 111 .
- the controller is connected to the image sensor, and may accordingly control a shooting behavior of the image sensor 111 according to the received instruction. For example, one or more shooting parameters of the image sensor 111 are set.
- the controller 112 may trigger the image sensor to continuously capture a plurality of images with different exposure time.
- a quantity of the captured images is a preset value, which may be a default preset by a technician, or a value set by a user according to the synthesis requirements of the HDR image.
- three images having different exposure time may be continuously captured.
- the images are respectively referred to as a short-exposure image, a medium-exposure image and a long-exposure image based on the exposure time.
- the image processor 113 is a functional module configured to synthesize the HDR image.
- the image processor may receive the plurality of images continuously captured by the image sensor and synthesize the images into a corresponding HDR image.
- the aerial camera may further include a storage device 114 configured to store data information generated by the aerial camera 11 during use, for example, store the to-be-synthesized image, the synthesized HDR image, and the like.
- the storage device may specifically adopt any type of non-volatile memory having a suitable capacity, such as an SD card, a flash memory, or a solid-state hard disk.
- the storage device 114 may further be a detachable structure or a structure in a distributed arrangement.
- the aerial camera may be provided with only a data interface, and the data of the to-be-synthesized image or the HDR image is transmitted to the corresponding device for storage through the data interface.
- one or more functional modules (such as the controller, the image processor and the storage device) of the aerial camera 11 shown in FIG. 2 may also be integrated into the UAV 10 as a part of the UAV 10 .
- the functional module of the aerial camera 11 is exemplarily described only based on the image capture process, which is not intended to limit the functional module of the aerial camera 11 .
- FIG. 3 is a structural block diagram of an HDR image synthesis apparatus according to an embodiment of the disclosure.
- the HDR image synthesis apparatus may be implemented by the above image processor.
- the composition of the HDR image synthesis apparatus is described by using the functional module.
- the functional module shown in FIG. 3 may be implemented through software, hardware or a combination of the software and the hardware according to the actual situation.
- the functional module may be implemented by the processor invoking a relevant software application stored in a memory.
- the HDR image synthesis apparatus 300 includes an image acquisition module 310 , a brightness detection module 320 , a secondary difference calculation module 330 , a motion detection module 340 and a synthesis module 350 .
- the image acquisition module 310 is configured to acquire a plurality of to-be-synthesized images. Each of the to-be-synthesized images has a different exposure time.
- each to-be-synthesized image is the image data captured by the image sensor through one exposure.
- the to-be-synthesized images that are continuously captured may be assembled into an image set used to synthesize the final HDR image.
- the brightness detection module 320 is configured to calculate a mean brightness of the to-be-synthesized images and determine an image brightness type of the to-be-synthesized images according to the mean brightness.
- the to-be-synthesized images may have significantly different image brightness depending on the environment in which each image is shot.
- each to-be-synthesized image may be roughly divided into different image brightness types according to a difference in the image brightness.
- the to-be-synthesized image may be divided into two different image brightness types, a high-light image and a low-light image, based on whether it is captured in the daytime or at night.
- the secondary difference calculation module 330 is configured to calculate a brightness difference between adjacent pixel points in one to-be-synthesized image, and calculate an inter-frame difference of the to-be-synthesized image at a same pixel point position according to the brightness difference.
- the inter-frame difference indicates how a specific area changes between the different to-be-synthesized images.
- the motion detection module 340 is configured to determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference.
- the inter-frame difference calculated by the secondary difference calculation module 330 indicates how a certain area changes over time. Therefore, it may accordingly be determined whether different positions of the images have changed, so as to determine the motion state at each pixel point position. The specific motion states may be defined according to the actual situation.
- the motion state may be simply divided into a moving pixel and a stationary pixel.
- the moving pixel indicates that the image at the pixel point position is moved.
- the stationary pixel indicates that the image at that pixel point position is not moved.
- the synthesis module 350 is configured to weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
- Weighting and synthesis mean assigning a corresponding weight to each to-be-synthesized image, so as to obtain the required HDR image through synthesis. To-be-synthesized images with poor image quality may be given less weight, so as to reduce their impact on the quality of the HDR image.
- the weights of the to-be-synthesized images are adaptively adjusted according to the image brightness type and the motion state, thereby effectively avoiding interference from low-quality to-be-synthesized images, which helps improve the quality of the HDR image.
- the aerial camera mounted to the UAV is used as an example.
- the HDR image synthesis method may further be used in other types of scenes and devices, so as to improve the quality of the output HDR image.
- the HDR image synthesis method disclosed in the embodiment of the disclosure is not limited to application on the UAV shown in FIG. 1 .
- FIG. 4 is a method flowchart of an HDR synthesis method according to an embodiment of the disclosure. As shown in FIG. 4 , the image processing method includes the following steps.
- Each of the to-be-synthesized images has a different exposure time.
- a specific exposure time may be set according to an actual situation, which is an empirical value, and the details are not described herein.
- These to-be-synthesized images are some images continuously shot.
- the images serve as the data basis for synthesis into an HDR image.
- the “mean brightness” is the overall image brightness of the to-be-synthesized image, which reflects the light intensity of the surrounding environment during image capture. A higher mean brightness indicates a higher light intensity of the surrounding environment when the to-be-synthesized image is captured.
- the mean brightness may be calculated in any suitable manner.
- the mean brightness may be calculated in the following manner.
- the mean brightness value is calculated according to the total brightness value, the quantity of the to-be-synthesized images and the sizes of the to-be-synthesized images. In this way, the average brightness per pixel point over the plurality of to-be-synthesized images may be calculated and used as the “mean brightness”.
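- The calculation described above can be sketched as follows; the function name and the list-of-arrays representation are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def mean_brightness(images):
    """Mean per-pixel brightness over a set of equally sized images.

    Follows the described steps: accumulate the brightness of each image,
    sum the accumulated values, then normalize by the quantity of images
    and the image size.
    """
    # Accumulated brightness value of each image, summed into a total.
    total = sum(float(img.sum()) for img in images)
    n = len(images)          # quantity of to-be-synthesized images
    h, w = images[0].shape   # image size (all images assumed equal-sized)
    return total / (n * h * w)
```

For example, two 2x2 images with uniform brightness 10 and 30 yield a mean brightness of 20.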
- the image brightness type is a type that is determined or divided in advance according to a difference in brightness. Under different use conditions, an appropriate quantity of image brightness types may be divided according to use requirements, so that the to-be-synthesized images having a similar mean brightness are used as the same image brightness type for further processing.
- the image brightness type to which the to-be-synthesized image belongs may be determined by setting an appropriate brightness detection threshold.
- a brightness detection threshold can be preset when the image brightness type includes a high-light image and a low-light image.
- the mean brightness is greater than or equal to a preset brightness detection threshold, it is determined that the to-be-synthesized image is the high-light image.
- the mean brightness is less than the brightness detection threshold, it is determined that the to-be-synthesized image is the low-light image.
- the brightness detection threshold is an empirical value, which may be set according to the actual situation.
- the high-light image corresponds to a scene with enough light or enough brightness in the daytime.
- the low-light image indicates that a shooting scene of the to-be-synthesized image is a scene with severely insufficient light at night.
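- The classification by the brightness detection threshold reduces to a single comparison; the string labels in this sketch are illustrative, not the patent's terminology.

```python
def image_brightness_type(mean_brightness, threshold):
    """Return the image brightness type: 'high-light' when the mean
    brightness is greater than or equal to the preset brightness
    detection threshold, 'low-light' otherwise."""
    return "high-light" if mean_brightness >= threshold else "low-light"
```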
- the to-be-synthesized image is actually composed of a plurality of different pixel points.
- the pixel point is a smallest basic unit in an image. In one image, the difference between adjacent pixel points roughly reflects the texture of a shot object.
- a specific calculation method for the brightness difference may include the following steps.
- a first brightness difference between the target pixel point and an adjacent first pixel point and a second brightness difference between the target pixel point and an adjacent second pixel point are calculated. Then, the difference between the first brightness difference and the second brightness difference is acquired as the brightness difference of the target pixel point.
- the target pixel point is the currently selected pixel point whose motion state is to be determined. As shown in FIG. 6, any pixel point is surrounded by eight adjacent pixel points. The first pixel point and the second pixel point are two of the eight pixel points around the target pixel point.
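- The per-pixel brightness difference can be sketched as below. The patent does not fix which two of the eight neighbours are used, so the default offsets (top and left neighbours) are an assumption of this sketch.

```python
import numpy as np

def target_brightness_difference(img, y, x, n1=(-1, 0), n2=(0, -1)):
    """Brightness difference of the target pixel at (y, x).

    n1 and n2 are (dy, dx) offsets selecting two of the eight adjacent
    pixel points (an illustrative choice, not specified by the patent).
    """
    first = float(img[y, x]) - float(img[y + n1[0], x + n1[1]])   # first brightness difference
    second = float(img[y, x]) - float(img[y + n2[0], x + n2[1]])  # second brightness difference
    return first - second  # brightness difference of the target pixel point
```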
- the inter-frame difference is calculated from the difference between the brightness differences of the different to-be-synthesized images at the same position. It may be understood by those skilled in the art that, when a shot object does not move significantly, the texture of the plurality of continuously shot images at the same position does not change significantly.
- the inter-frame difference obtained based on a secondary difference can reflect the movement situation of the shot object.
- when the inter-frame difference is excessively large, it indicates that the shot object moves violently.
- when the inter-frame difference is relatively small, it indicates that the position of the shot object is basically unchanged.
- the secondary difference provided in this embodiment measures the difference between the different to-be-synthesized images based on the brightness difference between adjacent pixel points, which effectively avoids the impact of the overall brightness differences between the to-be-synthesized images (caused by their different exposure times) and serves as an accurate basis for motion detection.
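- The exposure-invariance property of the secondary difference can be sketched as follows. The neighbour choice (left and top) and the whole-array formulation are illustrative assumptions; the point is that a constant exposure offset between two images cancels out, so only genuine texture changes remain.

```python
import numpy as np

def texture_map(img):
    """Brightness difference of each interior pixel: (pixel - left neighbour)
    minus (pixel - top neighbour).  The neighbour choice is an assumption."""
    img = img.astype(float)
    first = img[1:, 1:] - img[1:, :-1]   # difference w.r.t. left neighbour
    second = img[1:, 1:] - img[:-1, 1:]  # difference w.r.t. top neighbour
    return first - second

def inter_frame_difference(img_a, img_b):
    """Secondary difference: compare the texture maps of two exposures at
    the same pixel positions rather than their raw brightness, so the
    global brightness gap between exposures does not register as motion."""
    return np.abs(texture_map(img_a) - texture_map(img_b))
```

As a sanity check, a uniformly brighter copy of the same scene produces an inter-frame difference of zero everywhere.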
- the motion state means whether a shot object moves.
- the pixel point position where the shot object has moved may be referred to as a moving pixel.
- the pixel point position where the shot object has not moved is referred to as a stationary pixel.
- the to-be-synthesized image including a short-exposure image, a medium-exposure image and a long-exposure image that are continuously shot is used as an example (an exposure time of the short-exposure image is less than an exposure time of the medium-exposure image, and the exposure time of the medium-exposure image is less than an exposure time of the long-exposure image).
- a specific determination process of the motion state is described in detail. As shown in FIG. 5 , a method for determining the motion state includes the following steps.
- step 520: Determine whether K 1 and K 2 are both less than or equal to a preset motion detection threshold.
- when K 1 and K 2 are both less than or equal to the motion detection threshold, step 530 is performed; and when K 1 and K 2 are both greater than the motion detection threshold, step 540 is performed.
- the motion detection threshold is an empirical value, which may be set according to the actual situation.
- the “moving pixel” means that the shot object at the pixel point position moves.
- the pixels of all to-be-synthesized images at this position are referred to as the “moving pixel”.
- the “stationary pixel” means that the shot object at the pixel point position does not move.
- the pixels of all to-be-synthesized images at this position are referred to as the “stationary pixel”.
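The determination in steps 520-540 can be sketched as a small classifier. This is a hypothetical helper under the assumption (from the steps above) that a position is stationary only when both inter-frame differences stay at or below the empirical threshold:

```python
def classify_motion(k1, k2, threshold):
    # A pixel position is a stationary pixel only when BOTH inter-frame
    # differences (short vs. medium, and medium vs. long) stay at or below
    # the motion detection threshold; otherwise it is a moving pixel.
    if k1 <= threshold and k2 <= threshold:
        return "stationary"
    return "moving"

# The threshold is an empirical value set according to the actual situation.
MOTION_THRESHOLD = 5
state = classify_motion(2, 3, MOTION_THRESHOLD)  # both small -> "stationary"
```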
- the two indicators including the image brightness type and the motion state can well reflect the scene situation of the to-be-synthesized image during shooting. Therefore, the weight of each to-be-synthesized image in the synthesis process can be adaptively adjusted, so that the synthesized HDR image has better image quality.
- each pixel point of the HDR image is calculated by weighting and synthesizing the pixel points of the to-be-synthesized images at the same pixel point position.
- the specific weighting and synthesis process is as follows.
- a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient are respectively preset for the short-exposure image, the medium-exposure image and the long-exposure image.
- the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient are all preset weight values that may be adjusted according to the actual situation, and that indicate the general weight ratio of the to-be-synthesized images during synthesizing of the HDR image.
- the motion state at the pixel point position and the image brightness type of the to-be-synthesized image are determined, and processing is divided into the following situations accordingly.
- Pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position are weighted and synthesized into a pixel point of the HDR image at a same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
- the motion state at the pixel point position is the moving pixel and the image brightness type is the high-light image
- the short-exposure image and the long-exposure image are discarded, and the pixel points of the medium-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient.
- the weight coefficients of the short-exposure image and the long-exposure image are required to be adjusted to zero to avoid causing an adverse effect on the finally synthesized HDR image.
- the motion state is the stationary pixel and the image brightness type is the low-light image
- the short-exposure image is discarded, and the pixel points of the medium-exposure image and the long-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient.
- the weight coefficient of the short-exposure image can be adjusted to zero in this case, so as to avoid causing an adverse effect on the finally synthesized HDR image.
- the motion state is the moving pixel and the image brightness type is a low-light image
- the medium-exposure image and the long-exposure image are discarded, and the pixel points of the short-exposure image at the pixel point position are weighted and synthesized into the pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient.
- the weight coefficients of the medium-exposure image and the long-exposure image may be adjusted to zero to avoid causing an adverse effect on the finally synthesized HDR image.
- the finally outputted HDR image can have an HDR and high definition in a stationary scene in the daytime, and has a desirable technical effect of low picture noise and no smearing during the motion in the night scene.
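The four weighting cases can be sketched per pixel as below. This is an illustrative sketch, not the patent's reference implementation: a discarded image is modeled by a zero weight, as the text describes, and the case mapping (in particular moving + high-light using the medium exposure only, and moving + low-light using the short exposure only) follows the claim language:

```python
def synthesize_pixel(s, m, l, a, b, c, motion, brightness):
    # Case-by-case weighting; discarding an image is equivalent to
    # giving it a zero weight coefficient.
    if motion == "stationary" and brightness == "high-light":
        return a * s + b * m + c * l   # (3): use all three exposures
    if motion == "moving" and brightness == "high-light":
        return b * m                   # (3-1): medium exposure only, no ghosting
    if motion == "stationary" and brightness == "low-light":
        return b * m + c * l           # (3-2): drop the noisy short exposure
    return a * s                       # (3-3): moving + low light, short exposure only

h = synthesize_pixel(10, 20, 30, a=0.25, b=0.5, c=0.25,
                     motion="stationary", brightness="high-light")
```

The coefficients `a`, `b`, `c` here are arbitrary example values; in practice they are the preset short-, medium- and long-exposure weight coefficients.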
- the image sensor continuously captures a short-exposure image with an exposure time of x/2, a medium-exposure image with an exposure time of x, and a long-exposure image with an exposure time of 2x as a group of to-be-synthesized images each time.
- a length of the to-be-synthesized image is w pixels, and a width thereof is h pixels.
- the short-exposure image is first transmitted to the image processor after being shot, and the medium-exposure image and the long-exposure image are transmitted in sequence, so that the overall image synthesis process has a minimum delay.
- the mean brightness of the to-be-synthesized image is calculated by a brightness detection module 320 through the following equation (1):
L̄=(ΣS (i,j) +ΣM (i,j) +ΣL (i,j) )/(3×w×h) (1)
- where the sums run over all h rows and w columns, and S (i,j) is a brightness value of a pixel point in an i th row and a j th column of the short-exposure image
- M (i,j) is a brightness value of a pixel point in an i th row and a j th column of the medium-exposure image
- L (i,j) is a brightness value of a pixel point in an i th row and a j th column of the long-exposure image
- L̄ is the mean brightness.
- the brightness detection module 320 is configured to determine whether the mean brightness L̄ is greater than or equal to a preset brightness detection threshold T. When the mean brightness L̄ is greater than or equal to the brightness detection threshold T, it is determined that the image brightness type of the to-be-synthesized image is the high-light image. When the mean brightness L̄ is less than the brightness detection threshold T, it is determined that the image brightness type of the to-be-synthesized image is the low-light image.
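The brightness detection step can be sketched as follows. The grand-mean form of equation (1) used here is inferred from the variable definitions (a mean over every pixel of all three w×h exposures); the function name and threshold value are illustrative assumptions:

```python
import numpy as np

def image_brightness_type(short, medium, long_, threshold):
    # Mean brightness over every pixel of all three exposures (w x h each),
    # then a threshold test against the brightness detection threshold T
    # to choose between the high-light and low-light image types.
    mean = (short.sum() + medium.sum() + long_.sum()) / (3 * short.size)
    kind = "high-light" if mean >= threshold else "low-light"
    return kind, mean

s = np.full((2, 2), 40.0)   # short exposure, w = h = 2
m = np.full((2, 2), 80.0)   # medium exposure
l = np.full((2, 2), 120.0)  # long exposure
kind, mean = image_brightness_type(s, m, l, threshold=64)
```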
ΔS (i,j) =∥S (i,j) −S (i+1,j) |−|S (i,j) −S (i,j+1) ∥ (2-1)
ΔM (i,j) =∥M (i,j) −M (i+1,j) |−|M (i,j) −M (i,j+1) ∥ (2-2)
ΔL (i,j) =∥L (i,j) −L (i+1,j) |−|L (i,j) −L (i,j+1) ∥ (2-3)
- ⁇ S (i,j) is a brightness difference of pixel points in an i th row and a j th column of the short-exposure image
- ⁇ M (i,j) is a brightness difference of pixel points in an i th row and a j th column of the medium-exposure image
- ⁇ L (i,j) is a brightness difference of pixel points in an i th row and a j th column of the long-exposure image (as shown in FIG. 6 , adjacent pixel points are eight pixel points around the target pixel point).
- an inter-frame difference K 1 between the short-exposure image and the medium-exposure image and an inter-frame difference K 2 between the medium-exposure image and the long-exposure image can be calculated as the absolute difference between the corresponding brightness differences at the same pixel point position:
K 1 =|ΔS (i,j) −ΔM (i,j) |
K 2 =|ΔM (i,j) −ΔL (i,j) |
- a motion detection module 340 determines, based on the inter-frame differences calculated by the secondary difference calculation module 330, whether the two inter-frame differences K 1 and K 2 are both less than or equal to a preset motion detection threshold A.
- when K 1 and K 2 are both less than or equal to the threshold A, the motion state at the pixel point position (i,j) is determined as the stationary pixel; otherwise, the motion state at the pixel point position (i,j) is determined as the moving pixel.
- the synthesis module 350 is connected to the brightness detection module 320 and the motion detection module 340 , and adjusts and determines specific weight coefficients according to the image brightness type and the motion state provided by the brightness detection module and the motion detection module, so as to complete the synthesis of the HDR image.
H (i,j) =a×S (i,j) +b×M (i,j) +c×L (i,j) (3)
- a is a short-exposure weight coefficient
- b is a medium-exposure weight coefficient
- c is a long-exposure weight coefficient.
- S (i,j) is a pixel point in an i th row and a j th column of the short-exposure image
- M (i,j) is a pixel point in an i th row and a j th column of the medium-exposure image
- L (i,j) is a pixel point in an i th row and a j th column of the long-exposure image
- H (i,j) is a pixel point in an i th row and a j th column of the synthesized HDR image.
- the synthesis module 350 performs weighting and synthesis according to the equation (3).
- the to-be-synthesized images shot through a plurality of consecutive exposures may be integrated into an HDR image having higher image quality in a targeted manner, so as to avoid problems such as smearing of a moving object, a decrease in picture definition, and even brightness errors that easily occur during synthesizing of the HDR image in a high-speed moving scene such as aerial photography.
- An embodiment of the disclosure further provides a non-volatile computer storage medium.
- the computer storage medium stores at least one executable instruction, and the computer-executable instruction can be used for performing the HDR image synthesis method in any of the above method embodiments.
- FIG. 7 is a schematic structural diagram of an image processing chip according to an embodiment of the disclosure.
- the specific embodiments of the disclosure do not limit the specific implementation of the image processing chip.
- the image processing chip may include a processor 702 , a communication interface 704 , a memory 706 , and a communication bus 708 .
- the processor 702 , the communication interface 704 , and the memory 706 communicate with each other through the communication bus 708 .
- the communication interface 704 is configured to communicate with a network element of other devices such as a client or other servers.
- the processor 702 is configured to execute a program 710 , and specifically may execute the relevant steps in the above embodiments of the HDR image synthesis method.
- the program 710 may include program code, and the program code includes a computer operation instruction.
- the processor 702 may be a central processing unit (CPU), or an application specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the disclosure.
- processors included in the image processing chip may be of the same type, such as one or more CPUs, or may be of different types, such as one or more CPUs and one or more ASICs.
- the memory 706 is configured to store the program 710 .
- the memory 706 may include a high-speed RAM memory, or may further include a non-volatile memory, for example, at least one magnetic disk memory.
- the program 710 can specifically be configured to cause the processor 702 to execute the HDR image synthesis method in any of the above method embodiments.
- the computer software may be stored in a computer-readable storage medium.
- the program may include the processes of the embodiments of the foregoing methods.
- the storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
Abstract
Description
-
- acquiring a plurality of to-be-synthesized images, each having different exposure time; calculating a mean brightness of the to-be-synthesized images; determining an image brightness type of the to-be-synthesized images according to the mean brightness; calculating a brightness difference between adjacent pixel points in one to-be-synthesized image; calculating an inter-frame difference of different to-be-synthesized images at a same pixel point position according to the brightness difference; determining a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and weighting and synthesizing the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
-
- respectively presetting a corresponding short-exposure weight coefficient, a medium-exposure weight coefficient and a long-exposure weight coefficient for the short-exposure image, the medium-exposure image and the long-exposure image; and
- weighting and synthesizing pixel points of the short-exposure image, the medium-exposure image and the long-exposure image at the pixel point position into a pixel point of the HDR image at a same pixel point position according to the short-exposure weight coefficient, the medium-exposure weight coefficient and the long-exposure weight coefficient when the motion state at the pixel point position is the stationary pixel and the image brightness type is a high-light image.
-
- discarding the short-exposure image and the long-exposure image when the motion state at the pixel point position is the moving pixel and the image brightness type is the high-light image;
- weighting and synthesizing the pixel points of the medium-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient;
- discarding the short-exposure image when the motion state is the stationary pixel and the image brightness type is a low-light image;
- weighting and synthesizing the pixel points of the medium-exposure image and the long-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the medium-exposure weight coefficient and the long-exposure weight coefficient;
- discarding the medium-exposure image and the long-exposure image when the motion state is the moving pixel and the image brightness type is the low-light image; and
- weighting and synthesizing the pixel points of the short-exposure image at the pixel point position into the pixel point of the HDR image at the same pixel point position according to the short-exposure weight coefficient.
-
- an image acquisition module, configured to acquire a plurality of to-be-synthesized images, each having different exposure time; a scene detection module, configured to calculate a mean brightness of the to-be-synthesized images, and determine an image brightness type of the to-be-synthesized images according to the mean brightness; a secondary difference calculation module, configured to calculate a brightness difference between adjacent pixel points in one to-be-synthesized image, and calculate an inter-frame difference of the to-be-synthesized image at a same pixel point position according to the brightness difference; a motion detection module, configured to determine a motion state of the to-be-synthesized images at the pixel point position according to the inter-frame difference; and a synthesis module, configured to weight and synthesize the to-be-synthesized images into a corresponding HDR image according to the image brightness type and the motion state.
-
- an image sensor, configured to capture a plurality of images with set shooting parameters; a controller, connected to the image sensor and configured to trigger the image sensor to capture the plurality of images with different exposure time; an image processor, configured to receive the plurality of images captured by the image sensor through continuous exposure and perform the above HDR image synthesis method on the received plurality of images to obtain an HDR image; and a storage device, connected to the image processor and configured to store the HDR image.
ΔS (i,j) =∥S (i,j) −S (i+1,j) |−|S (i,j) −S (i,j+1)∥ (2-1)
ΔM (i,j) =∥M (i,j) −M (i+1,j) |−|M (i,j) −M (i,j+1)∥ (2-2)
ΔL (i,j) =∥L (i,j) −L (i+1,j) |−|L (i,j) −L (i,j+1)∥ (2-3)
H (i,j) =a×S (i,j) +b×M (i,j) +c×L (i,j) (3)
H (i,j) =b×M (i,j) (3-1)
H (i,j) =b×M (i,j) +c×L (i,j) (3-2)
H (i,j) =a×S (i,j) (3-3)
Claims (20)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010291571.0 | 2020-04-14 | ||
CN202010291571.0A CN111479072B (en) | 2020-04-14 | 2020-04-14 | High dynamic range image synthesis method and device, image processing chip and aerial camera |
PCT/CN2021/083350 WO2021208706A1 (en) | 2020-04-14 | 2021-03-26 | High dynamic range image composition method and device, image processing chip and aerial camera |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/083350 Continuation WO2021208706A1 (en) | 2020-04-14 | 2021-03-26 | High dynamic range image composition method and device, image processing chip and aerial camera |
Publications (2)
Publication Number | Publication Date |
---|---|
US20230038844A1 US20230038844A1 (en) | 2023-02-09 |
US12041358B2 true US12041358B2 (en) | 2024-07-16 |
Family
ID=71751968
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/938,517 Active 2041-07-03 US12041358B2 (en) | 2020-04-14 | 2022-10-06 | High dynamic range image synthesis method and apparatus, image processing chip and aerial camera |
Country Status (3)
Country | Link |
---|---|
US (1) | US12041358B2 (en) |
CN (1) | CN111479072B (en) |
WO (1) | WO2021208706A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IT201900005536A1 (en) * | 2019-04-10 | 2020-10-10 | Doss Visual Solution S R L | IMAGE ACQUISITION METHOD FOR AN OPTICAL INSPECTION MACHINE |
CN111292282B (en) * | 2020-01-21 | 2023-02-17 | 展讯通信(上海)有限公司 | Method and device for generating low-bit-width HDR image, storage medium and terminal |
CN111479072B (en) | 2020-04-14 | 2021-12-17 | 深圳市道通智能航空技术股份有限公司 | High dynamic range image synthesis method and device, image processing chip and aerial camera |
CN111770243B (en) * | 2020-08-04 | 2021-09-03 | 深圳市精锋医疗科技有限公司 | Image processing method, device and storage medium for endoscope |
CN111787183B (en) * | 2020-08-04 | 2021-09-03 | 深圳市精锋医疗科技有限公司 | Image processing method, device and storage medium for endoscope |
WO2022041287A1 (en) * | 2020-08-31 | 2022-03-03 | 华为技术有限公司 | Image acquisition method and apparatus, device, and computer-readable storage medium |
CN114513610A (en) * | 2020-11-17 | 2022-05-17 | 浙江大华技术股份有限公司 | Image processing method, image processing apparatus, and storage apparatus |
CN114630053B (en) * | 2020-12-11 | 2023-12-12 | 青岛海信移动通信技术有限公司 | HDR image display method and display device |
CN114650361B (en) * | 2020-12-17 | 2023-06-06 | 北京字节跳动网络技术有限公司 | Shooting mode determining method, shooting mode determining device, electronic equipment and storage medium |
CN114820404B (en) * | 2021-01-29 | 2024-08-20 | 抖音视界有限公司 | Image processing method, device, electronic equipment and medium |
CN115037915B (en) * | 2021-03-05 | 2023-11-14 | 华为技术有限公司 | Video processing method and processing device |
KR20220156242A (en) * | 2021-05-18 | 2022-11-25 | 에스케이하이닉스 주식회사 | Image Processing Device |
CN113822926A (en) * | 2021-07-23 | 2021-12-21 | 昆山丘钛光电科技有限公司 | RAW image size determination method, apparatus and medium |
US11863880B2 (en) * | 2022-05-31 | 2024-01-02 | Microsoft Technology Licensing, Llc | Image frame selection for multi-frame fusion |
CN117082355B (en) * | 2023-09-19 | 2024-04-12 | 荣耀终端有限公司 | Image processing method and electronic device |
Citations (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060177150A1 (en) | 2005-02-01 | 2006-08-10 | Microsoft Corporation | Method and system for combining multiple exposure images having scene and camera motion |
US7382931B2 (en) * | 2003-04-29 | 2008-06-03 | Microsoft Corporation | System and process for generating high dynamic range video |
CN101262564A (en) | 2007-03-09 | 2008-09-10 | 索尼株式会社 | Image processing apparatus, image forming apparatus, image processing method, and computer program |
EP2175635A1 (en) | 2008-10-10 | 2010-04-14 | Samsung Electronics Co., Ltd. | Method and apparatus for creating high dynamic range image |
US20100328482A1 (en) * | 2009-06-26 | 2010-12-30 | Samsung Electronics Co., Ltd. | Digital photographing apparatus, method of controlling the digital photographing apparatus, and recording medium storing program to implement the method |
US20130136364A1 (en) | 2011-11-28 | 2013-05-30 | Fujitsu Limited | Image combining device and method and storage medium storing image combining program |
CN103973989A (en) | 2014-04-15 | 2014-08-06 | 北京理工大学 | Method and system for obtaining high-dynamic images |
US20150116525A1 (en) * | 2013-10-31 | 2015-04-30 | Himax Imaging Limited | Method for generating high dynamic range images |
US20150348242A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Scene Motion Correction In Fused Image Systems |
US9307212B2 (en) * | 2007-03-05 | 2016-04-05 | Fotonation Limited | Tone mapping for low-light video frame enhancement |
US9437171B2 (en) * | 2012-12-05 | 2016-09-06 | Texas Instruments Incorporated | Local tone mapping for high dynamic range images |
CN107231530A (en) | 2017-06-22 | 2017-10-03 | 维沃移动通信有限公司 | A kind of photographic method and mobile terminal |
CN108419023A (en) | 2018-03-26 | 2018-08-17 | 华为技术有限公司 | A kind of method and relevant device generating high dynamic range images |
WO2018190649A1 (en) | 2017-04-12 | 2018-10-18 | Samsung Electronics Co., Ltd. | Method and apparatus for generating hdr images |
CN108881731A (en) | 2018-08-06 | 2018-11-23 | Oppo广东移动通信有限公司 | Panorama shooting method, device and imaging device |
CN108989700A (en) | 2018-08-13 | 2018-12-11 | Oppo广东移动通信有限公司 | Image formation control method, device, electronic equipment and computer readable storage medium |
CN109005361A (en) | 2018-08-06 | 2018-12-14 | Oppo广东移动通信有限公司 | Control method, device, imaging device, electronic equipment and readable storage medium storing program for executing |
CN109005346A (en) | 2018-08-13 | 2018-12-14 | Oppo广东移动通信有限公司 | Control method, device, electronic equipment and computer readable storage medium |
US10165194B1 (en) * | 2016-12-16 | 2018-12-25 | Amazon Technologies, Inc. | Multi-sensor camera system |
CN109120862A (en) | 2018-10-15 | 2019-01-01 | Oppo广东移动通信有限公司 | High-dynamic-range image acquisition method, device and mobile terminal |
CN109286758A (en) | 2018-10-15 | 2019-01-29 | Oppo广东移动通信有限公司 | A kind of generation method of high dynamic range images, mobile terminal and storage medium |
US10264193B2 (en) * | 2016-12-06 | 2019-04-16 | Polycom, Inc. | System and method for providing images and video having high dynamic range |
CN110381263A (en) | 2019-08-20 | 2019-10-25 | Oppo广东移动通信有限公司 | Image processing method, device, storage medium and electronic equipment |
CN110572585A (en) | 2019-08-26 | 2019-12-13 | Oppo广东移动通信有限公司 | Image processing method, image processing device, storage medium and electronic equipment |
US10616499B2 (en) * | 2016-07-29 | 2020-04-07 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for capturing high dynamic range image, and electronic device |
US10701279B2 (en) * | 2018-10-02 | 2020-06-30 | Adobe Inc. | Utilizing alignment models and motion vector path blending to generate a long exposure digital image from a sequence of short exposure digital images |
US20200211166A1 (en) * | 2018-12-28 | 2020-07-02 | Qualcomm Incorporated | Methods and apparatus for motion compensation in high dynamic range processing |
US20200236273A1 (en) * | 2019-01-18 | 2020-07-23 | Samsung Electronics Co., Ltd. | Imaging systems for generating hdr images and operating methods thereof |
CN111479072A (en) | 2020-04-14 | 2020-07-31 | 深圳市道通智能航空技术有限公司 | High dynamic range image synthesis method and device, image processing chip and aerial camera |
US10742892B1 (en) * | 2019-02-18 | 2020-08-11 | Samsung Electronics Co., Ltd. | Apparatus and method for capturing and blending multiple images for high-quality flash photography using mobile electronic device |
US10750098B2 (en) * | 2016-10-13 | 2020-08-18 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, image processing method, and image processing circuit |
US10757344B2 (en) * | 2016-07-01 | 2020-08-25 | Maxell, Ltd. | Imaging apparatus, imaging method and imaging program |
US10916036B2 (en) * | 2018-12-28 | 2021-02-09 | Intel Corporation | Method and system of generating multi-exposure camera statistics for image processing |
US10944914B1 (en) * | 2019-12-02 | 2021-03-09 | Samsung Electronics Co., Ltd. | System and method for generating multi-exposure frames from single input |
US11017509B2 (en) * | 2016-12-22 | 2021-05-25 | Huawei Technologies Co., Ltd. | Method and apparatus for generating high dynamic range image |
US11095829B2 (en) * | 2019-06-11 | 2021-08-17 | Samsung Electronics Co., Ltd. | Apparatus and method for high dynamic range (HDR) image creation of dynamic scenes using graph cut-based labeling |
US11113802B1 (en) * | 2019-09-09 | 2021-09-07 | Apple Inc. | Progressive image fusion |
US20210278836A1 (en) * | 2018-11-07 | 2021-09-09 | Autel Robotics Co., Ltd. | Method and device for dual-light image integration, and unmanned aerial vehicle |
US11128809B2 (en) * | 2019-02-15 | 2021-09-21 | Samsung Electronics Co., Ltd. | System and method for compositing high dynamic range images |
US11190707B2 (en) * | 2018-05-22 | 2021-11-30 | Arashi Vision Inc. | Motion ghost resistant HDR image generation method and portable terminal |
US20210377457A1 (en) * | 2017-10-31 | 2021-12-02 | Morpho, Inc. | Image compositing device, image compositing method, and storage medium |
US20210400172A1 (en) * | 2019-03-06 | 2021-12-23 | Autel Robotics Co., Ltd. | Imaging processing method and apparatus for a camera module in a night scene, an electronic device, and a storage medium |
US20220043117A1 (en) * | 2019-11-13 | 2022-02-10 | Lumotive, LLC | Lidar systems based on tunable optical metasurfaces |
US11276154B2 (en) * | 2020-07-17 | 2022-03-15 | Samsung Electronics Co., Ltd. | Multi-frame depth-based multi-camera relighting of images |
US20220138964A1 (en) * | 2020-10-30 | 2022-05-05 | Qualcomm Incorporated | Frame processing and/or capture instruction systems and techniques |
US11356604B2 (en) * | 2020-02-14 | 2022-06-07 | Pixelworks, Inc. | Methods and systems for image processing with multiple image sources |
US11363213B1 (en) * | 2021-02-26 | 2022-06-14 | Qualcomm Incorporated | Minimizing ghosting in high dynamic range image processing |
US20220198625A1 (en) * | 2019-04-11 | 2022-06-23 | Dolby Laboratories Licensing Corporation | High-dynamic-range image generation with pre-combination denoising |
US11373281B1 (en) * | 2021-02-23 | 2022-06-28 | Qualcomm Incorporated | Techniques for anchor frame switching |
US11379997B2 (en) * | 2019-11-01 | 2022-07-05 | Samsung Electronics Co., Ltd. | Image devices including image sensors and image signal processors, and operation methods of image sensors |
US20220230283A1 (en) * | 2021-01-21 | 2022-07-21 | Beijing Xiaomi Pinecone Electronics Co., Ltd. | Method and device for processing image, and storage medium |
US20220236056A1 (en) * | 2019-08-28 | 2022-07-28 | Autel Robotics Co., Ltd. | Metering adjustment method, apparatus and device and storage medium |
US11457157B2 (en) * | 2019-01-04 | 2022-09-27 | Gopro, Inc. | High dynamic range processing based on angular rate measurements |
US20220345607A1 (en) * | 2019-12-30 | 2022-10-27 | Autel Robotics Co., Ltd. | Image exposure method and device, unmanned aerial vehicle |
US11539895B1 (en) * | 2021-09-27 | 2022-12-27 | Wisconsin Alumni Research Foundation | Systems, methods, and media for motion adaptive imaging using single-photon image sensor data |
US11570374B1 (en) * | 2020-09-23 | 2023-01-31 | Apple Inc. | Subject-aware low light photography |
US11653088B2 (en) * | 2016-05-25 | 2023-05-16 | Gopro, Inc. | Three-dimensional noise reduction |
US11671714B1 (en) * | 2022-01-24 | 2023-06-06 | Qualcomm Incorporated | Motion based exposure control |
US11710223B2 (en) * | 2020-02-14 | 2023-07-25 | Pixelworks, Inc. | Methods and systems for image processing with multiple image sources |
US20230269489A1 (en) * | 2022-02-23 | 2023-08-24 | Gopro, Inc. | Method and apparatus for multi-image multi-exposure processing |
US11825207B1 (en) * | 2022-05-02 | 2023-11-21 | Qualcomm Incorporated | Methods and systems for shift estimation for one or more output frames |
US20230388668A1 (en) * | 2022-05-26 | 2023-11-30 | Guangzhou Tyrafos Semiconductor Technologies Co., Ltd. | Image sensor circuit and image sensor device |
US11863880B2 (en) * | 2022-05-31 | 2024-01-02 | Microsoft Technology Licensing, Llc | Image frame selection for multi-frame fusion |
-
2020
- 2020-04-14 CN CN202010291571.0A patent/CN111479072B/en active Active
-
2021
- 2021-03-26 WO PCT/CN2021/083350 patent/WO2021208706A1/en active Application Filing
-
2022
- 2022-10-06 US US17/938,517 patent/US12041358B2/en active Active
Patent Citations (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7382931B2 (en) * | 2003-04-29 | 2008-06-03 | Microsoft Corporation | System and process for generating high dynamic range video |
US20060177150A1 (en) | 2005-02-01 | 2006-08-10 | Microsoft Corporation | Method and system for combining multiple exposure images having scene and camera motion |
US9307212B2 (en) * | 2007-03-05 | 2016-04-05 | Fotonation Limited | Tone mapping for low-light video frame enhancement |
CN101262564A (en) | 2007-03-09 | 2008-09-10 | 索尼株式会社 | Image processing apparatus, image forming apparatus, image processing method, and computer program |
EP2175635A1 (en) | 2008-10-10 | 2010-04-14 | Samsung Electronics Co., Ltd. | Method and apparatus for creating high dynamic range image |
US20100328482A1 (en) * | 2009-06-26 | 2010-12-30 | Samsung Electronics Co., Ltd. | Digital photographing apparatus, method of controlling the digital photographing apparatus, and recording medium storing program to implement the method |
US20130136364A1 (en) | 2011-11-28 | 2013-05-30 | Fujitsu Limited | Image combining device and method and storage medium storing image combining program |
US9437171B2 (en) * | 2012-12-05 | 2016-09-06 | Texas Instruments Incorporated | Local tone mapping for high dynamic range images |
US20150116525A1 (en) * | 2013-10-31 | 2015-04-30 | Himax Imaging Limited | Method for generating high dynamic range images |
CN103973989A (en) | 2014-04-15 | 2014-08-06 | 北京理工大学 | Method and system for obtaining high-dynamic images |
US20150348242A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Scene Motion Correction In Fused Image Systems |
US11653088B2 (en) * | 2016-05-25 | 2023-05-16 | Gopro, Inc. | Three-dimensional noise reduction |
US10757344B2 (en) * | 2016-07-01 | 2020-08-25 | Maxell, Ltd. | Imaging apparatus, imaging method and imaging program |
US10616499B2 (en) * | 2016-07-29 | 2020-04-07 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and device for capturing high dynamic range image, and electronic device |
US10750098B2 (en) * | 2016-10-13 | 2020-08-18 | Panasonic Intellectual Property Management Co., Ltd. | Image processing device, image processing method, and image processing circuit |
US10264193B2 (en) * | 2016-12-06 | 2019-04-16 | Polycom, Inc. | System and method for providing images and video having high dynamic range |
US10165194B1 (en) * | 2016-12-16 | 2018-12-25 | Amazon Technologies, Inc. | Multi-sensor camera system |
US11017509B2 (en) * | 2016-12-22 | 2021-05-25 | Huawei Technologies Co., Ltd. | Method and apparatus for generating high dynamic range image |
WO2018190649A1 (en) | 2017-04-12 | 2018-10-18 | Samsung Electronics Co., Ltd. | Method and apparatus for generating hdr images |
US10638052B2 (en) * | 2017-04-12 | 2020-04-28 | Samsung Electronics Co., Ltd. | Method and apparatus for generating HDR images |
CN107231530A (en) | 2017-06-22 | 2017-10-03 | 维沃移动通信有限公司 | A kind of photographic method and mobile terminal |
US20210377457A1 (en) * | 2017-10-31 | 2021-12-02 | Morpho, Inc. | Image compositing device, image compositing method, and storage medium |
CN108419023A (en) | 2018-03-26 | 2018-08-17 | 华为技术有限公司 | A kind of method and relevant device generating high dynamic range images |
US11190707B2 (en) * | 2018-05-22 | 2021-11-30 | Arashi Vision Inc. | Motion ghost resistant HDR image generation method and portable terminal |
CN109005361A (en) | 2018-08-06 | 2018-12-14 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Control method, apparatus, imaging device, electronic device and readable storage medium |
CN108881731A (en) | 2018-08-06 | 2018-11-23 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Panorama shooting method, apparatus and imaging device |
CN108989700A (en) | 2018-08-13 | 2018-12-11 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Imaging control method, apparatus, electronic device and computer-readable storage medium |
CN109005346A (en) | 2018-08-13 | 2018-12-14 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Control method, apparatus, electronic device and computer-readable storage medium |
US10701279B2 (en) * | 2018-10-02 | 2020-06-30 | Adobe Inc. | Utilizing alignment models and motion vector path blending to generate a long exposure digital image from a sequence of short exposure digital images |
CN109120862A (en) | 2018-10-15 | 2019-01-01 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | High dynamic range image acquisition method, apparatus and mobile terminal |
CN109286758A (en) | 2018-10-15 | 2019-01-29 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | High dynamic range image generation method, mobile terminal and storage medium |
US20210278836A1 (en) * | 2018-11-07 | 2021-09-09 | Autel Robotics Co., Ltd. | Method and device for dual-light image integration, and unmanned aerial vehicle |
US20200211166A1 (en) * | 2018-12-28 | 2020-07-02 | Qualcomm Incorporated | Methods and apparatus for motion compensation in high dynamic range processing |
US10916036B2 (en) * | 2018-12-28 | 2021-02-09 | Intel Corporation | Method and system of generating multi-exposure camera statistics for image processing |
US11457157B2 (en) * | 2019-01-04 | 2022-09-27 | Gopro, Inc. | High dynamic range processing based on angular rate measurements |
US20200236273A1 (en) * | 2019-01-18 | 2020-07-23 | Samsung Electronics Co., Ltd. | Imaging systems for generating hdr images and operating methods thereof |
US11128809B2 (en) * | 2019-02-15 | 2021-09-21 | Samsung Electronics Co., Ltd. | System and method for compositing high dynamic range images |
US10742892B1 (en) * | 2019-02-18 | 2020-08-11 | Samsung Electronics Co., Ltd. | Apparatus and method for capturing and blending multiple images for high-quality flash photography using mobile electronic device |
US20210400172A1 (en) * | 2019-03-06 | 2021-12-23 | Autel Robotics Co., Ltd. | Imaging processing method and apparatus for a camera module in a night scene, an electronic device, and a storage medium |
US20220198625A1 (en) * | 2019-04-11 | 2022-06-23 | Dolby Laboratories Licensing Corporation | High-dynamic-range image generation with pre-combination denoising |
US11095829B2 (en) * | 2019-06-11 | 2021-08-17 | Samsung Electronics Co., Ltd. | Apparatus and method for high dynamic range (HDR) image creation of dynamic scenes using graph cut-based labeling |
CN110381263A (en) | 2019-08-20 | 2019-10-25 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Image processing method, apparatus, storage medium and electronic device |
CN110572585A (en) | 2019-08-26 | 2019-12-13 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Image processing method, image processing apparatus, storage medium and electronic device |
US20220236056A1 (en) * | 2019-08-28 | 2022-07-28 | Autel Robotics Co., Ltd. | Metering adjustment method, apparatus and device and storage medium |
US11113802B1 (en) * | 2019-09-09 | 2021-09-07 | Apple Inc. | Progressive image fusion |
US11379997B2 (en) * | 2019-11-01 | 2022-07-05 | Samsung Electronics Co., Ltd. | Image devices including image sensors and image signal processors, and operation methods of image sensors |
US20220043117A1 (en) * | 2019-11-13 | 2022-02-10 | Lumotive, LLC | Lidar systems based on tunable optical metasurfaces |
US10944914B1 (en) * | 2019-12-02 | 2021-03-09 | Samsung Electronics Co., Ltd. | System and method for generating multi-exposure frames from single input |
US20220345607A1 (en) * | 2019-12-30 | 2022-10-27 | Autel Robotics Co., Ltd. | Image exposure method and device, unmanned aerial vehicle |
US11710223B2 (en) * | 2020-02-14 | 2023-07-25 | Pixelworks, Inc. | Methods and systems for image processing with multiple image sources |
US11356604B2 (en) * | 2020-02-14 | 2022-06-07 | Pixelworks, Inc. | Methods and systems for image processing with multiple image sources |
CN111479072A (en) | 2020-04-14 | 2020-07-31 | Shenzhen Autel Intelligent Aviation Technology Co., Ltd. | High dynamic range image synthesis method and apparatus, image processing chip and aerial camera |
US11276154B2 (en) * | 2020-07-17 | 2022-03-15 | Samsung Electronics Co., Ltd. | Multi-frame depth-based multi-camera relighting of images |
US11570374B1 (en) * | 2020-09-23 | 2023-01-31 | Apple Inc. | Subject-aware low light photography |
US20220138964A1 (en) * | 2020-10-30 | 2022-05-05 | Qualcomm Incorporated | Frame processing and/or capture instruction systems and techniques |
US20220230283A1 (en) * | 2021-01-21 | 2022-07-21 | Beijing Xiaomi Pinecone Electronics Co., Ltd. | Method and device for processing image, and storage medium |
US11373281B1 (en) * | 2021-02-23 | 2022-06-28 | Qualcomm Incorporated | Techniques for anchor frame switching |
US11363213B1 (en) * | 2021-02-26 | 2022-06-14 | Qualcomm Incorporated | Minimizing ghosting in high dynamic range image processing |
US11539895B1 (en) * | 2021-09-27 | 2022-12-27 | Wisconsin Alumni Research Foundation | Systems, methods, and media for motion adaptive imaging using single-photon image sensor data |
US11671714B1 (en) * | 2022-01-24 | 2023-06-06 | Qualcomm Incorporated | Motion based exposure control |
US20230269489A1 (en) * | 2022-02-23 | 2023-08-24 | Gopro, Inc. | Method and apparatus for multi-image multi-exposure processing |
US11825207B1 (en) * | 2022-05-02 | 2023-11-21 | Qualcomm Incorporated | Methods and systems for shift estimation for one or more output frames |
US20230388668A1 (en) * | 2022-05-26 | 2023-11-30 | Guangzhou Tyrafos Semiconductor Technologies Co., Ltd. | Image sensor circuit and image sensor device |
US11863880B2 (en) * | 2022-05-31 | 2024-01-02 | Microsoft Technology Licensing, Llc | Image frame selection for multi-frame fusion |
Non-Patent Citations (1)
Title |
---|
The International Search Report mailed Jun. 15, 2021; PCT/CN2021/083350. |
Also Published As
Publication number | Publication date |
---|---|
US20230038844A1 (en) | 2023-02-09 |
WO2021208706A1 (en) | 2021-10-21 |
CN111479072A (en) | 2020-07-31 |
CN111479072B (en) | 2021-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12041358B2 (en) | High dynamic range image synthesis method and apparatus, image processing chip and aerial camera | |
AU2019326496B2 (en) | Method and apparatus for capturing images at night, electronic device, and storage medium | |
US12052514B2 (en) | Imaging processing method and apparatus for a camera module in a night scene, an electronic device, and a storage medium | |
US11532076B2 (en) | Image processing method, electronic device and storage medium | |
CN108322646B (en) | Image processing method, image processing device, storage medium and electronic equipment | |
CN109218627B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN104184958A (en) | Automatic exposure control method and device based on FPGA (field programmable Gate array) and suitable for space detection imaging | |
CN110493524B (en) | Photometric adjustment method, device and equipment and storage medium | |
CN108337447A (en) | High dynamic range images exposure compensating value-acquiring method, device, equipment and medium | |
EP3481051A1 (en) | Combining optical and digital zoom under varying image capturing conditions | |
US9961269B2 (en) | Imaging device, imaging device body, and lens barrel that can prevent an image diaphragm value from frequently changing | |
US10609265B2 (en) | Methods and apparatus for synchronizing camera flash and sensor blanking | |
CN105867047A (en) | Flashlight adjusting method and shooting device | |
CN112335224A (en) | Image acquisition method and device for movable platform and storage medium | |
EP3454547A1 (en) | Imaging apparatus, image processing apparatus, imaging method, image processing method, and storage medium | |
US11032483B2 (en) | Imaging apparatus, imaging method, and program | |
US20090002544A1 (en) | Methods of adding additional parameters during automatic exposure for a digital camera and related electronic devices and computer program products | |
US10944899B2 (en) | Image processing device and image processing method | |
CN117135293A (en) | Image processing method and electronic device | |
US9807311B2 (en) | Imaging apparatus, video data transmitting apparatus, video data transmitting and receiving system, image processing method, and program | |
WO2021253167A1 (en) | Digital zoom imaging method and apparatus, camera, and unmanned aerial vehicle system | |
JP2019033470A (en) | Image processing system, imaging apparatus, image processing apparatus, control method, and program | |
JP7353864B2 (en) | Information processing device, control method and program for information processing device, imaging system | |
KR20080057345A (en) | Imaging system with adjustable optics | |
CN114928694A (en) | Image acquisition method and apparatus, device, and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: AUTEL ROBOTICS CO., LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LI, ZHAOZAO;REEL/FRAME:061358/0947
Effective date: 20220624
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |