
WO2020034702A1 - Control method, device, electronic device, and computer-readable storage medium - Google Patents

Control method, device, electronic device, and computer-readable storage medium

Info

Publication number
WO2020034702A1
WO2020034702A1 PCT/CN2019/088244 CN2019088244W WO2020034702A1 WO 2020034702 A1 WO2020034702 A1 WO 2020034702A1 CN 2019088244 W CN2019088244 W CN 2019088244W WO 2020034702 A1 WO2020034702 A1 WO 2020034702A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
exposure
pixels
pixel unit
imaging
Prior art date
Application number
PCT/CN2019/088244
Other languages
English (en)
French (fr)
Inventor
张弓
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2020034702A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/10 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Definitions

  • the present disclosure relates to the field of imaging technology, and in particular, to a control method, an apparatus, an electronic device, and a computer-readable storage medium.
  • the imaging device in the prior art uses a pixel unit array of a fixed structure for imaging.
  • The present disclosure provides a control method, device, electronic device, and computer-readable storage medium for automatically adjusting the arrangement positions of short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in a pixel unit array according to the brightness level of the shooting environment, so that imaging can be performed using the pixel values output by at least two adjacently arranged medium-exposure pixels. This retains more effective information in the captured image, increases its brightness, and thereby improves the imaging effect, imaging quality, and the user's shooting experience. It addresses the technical problem in the prior art that, once the hardware structure of the pixels in an imaging device is fixed, it cannot be changed and is difficult to adapt to a variety of shooting scenes, so that the imaging quality of captured images is limited by the imaging device in the electronic device and cannot be improved.
  • An embodiment of one aspect of the present disclosure provides a control method applied to an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel.
  • The method includes the following steps:
  • determining the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
  • if the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and taking the at least two adjacently arranged medium-exposure pixels as a first pixel unit;
  • performing imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
  • In the control method of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently, the at least two adjacently arranged medium-exposure pixels are taken as a first pixel unit, and imaging is then performed according to the pixel values output by those medium-exposure pixels.
  • An embodiment of another aspect of the present disclosure provides a control device applied to an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The device includes:
  • a determining module, configured to determine the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
  • an adjustment module, configured to, when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and to take the at least two adjacently arranged medium-exposure pixels as a first pixel unit;
  • an imaging module, configured to perform imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
  • In the control device of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently, the at least two adjacently arranged medium-exposure pixels are taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. In this way, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array can be adjusted automatically according to the brightness level of the shooting environment, so that imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, improves its brightness, and thereby improves the imaging effect, imaging quality, and the user's shooting experience.
  • An embodiment of another aspect of the present disclosure provides an electronic device including an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The electronic device further includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
  • An embodiment of another aspect of the present disclosure provides a non-transitory computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
  • FIG. 1 is a schematic flowchart of a control method according to the first embodiment of the present disclosure;
  • FIG. 2 is a first schematic diagram of a partial structure of a pixel unit array of an imaging device in an embodiment of the present disclosure;
  • FIG. 3 is a schematic diagram of a grayscale histogram corresponding to a backlit scene in an embodiment of the present disclosure;
  • FIG. 4 is a schematic flowchart of a control method according to the second embodiment of the present disclosure;
  • FIG. 5 is a second schematic diagram of a partial structure of a pixel unit array of an imaging device in an embodiment of the present disclosure;
  • FIG. 6 is a schematic flowchart of a control method according to the third embodiment of the present disclosure;
  • FIG. 7 is a schematic structural diagram of a photosensitive pixel unit in an embodiment of the present disclosure;
  • FIG. 8 is a schematic flowchart of a control method according to the fourth embodiment of the present disclosure;
  • FIG. 9 is a schematic structural diagram of a control device according to the fifth embodiment of the present disclosure;
  • FIG. 10 is a schematic structural diagram of a control device according to the sixth embodiment of the present disclosure;
  • FIG. 11 is a schematic block diagram of an electronic device according to some embodiments of the present disclosure;
  • FIG. 12 is a schematic block diagram of an image processing circuit according to some embodiments of the present disclosure.
  • the present disclosure aims at a technical problem of poor image quality in the prior art, and provides a control method.
  • In the control method of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently, the at least two adjacently arranged medium-exposure pixels are taken as a first pixel unit, and imaging is then performed according to the pixel values output by those medium-exposure pixels.
  • FIG. 1 is a schematic flowchart of a control method according to a first embodiment of the present disclosure.
  • the control method of the embodiment of the present disclosure is applied to an imaging device.
  • the imaging device includes a pixel unit array composed of multiple exposure pixels, and each exposure pixel is a short exposure pixel, a medium exposure pixel, or a long exposure pixel.
  • A long-exposure pixel is a photosensitive pixel whose corresponding exposure time is the long exposure time, a medium-exposure pixel is a photosensitive pixel whose corresponding exposure time is the medium exposure time, and a short-exposure pixel is a photosensitive pixel whose corresponding exposure time is the short exposure time, where long exposure time > medium exposure time > short exposure time. That is, the long exposure time of the long-exposure pixel is greater than the medium exposure time of the medium-exposure pixel, and the medium exposure time of the medium-exposure pixel is greater than the short exposure time of the short-exposure pixel.
  • In the embodiments of the present disclosure, the long-exposure pixels, the medium-exposure pixels, and the short-exposure pixels may be exposed synchronously. Synchronous exposure means that the exposure intervals of the medium-exposure pixels and the short-exposure pixels fall within the exposure interval of the long-exposure pixels.
  • Specifically, the long-exposure pixel may be controlled to start exposure first, and the medium-exposure pixel and the short-exposure pixel may be controlled to be exposed during the exposure of the long-exposure pixel, with the exposure cut-off times of the medium-exposure pixel and the short-exposure pixel being the same as, or earlier than, the exposure cut-off time of the long-exposure pixel.
  • Alternatively, the long-exposure pixel, the medium-exposure pixel, and the short-exposure pixel may be controlled to start exposure at the same time, that is, with the same exposure start time. In this way, there is no need to control the pixel unit array to perform the long, medium, and short exposures sequentially, which reduces the time needed to capture an image (see the timing sketch below).
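  • To make the timing concrete, the following is a minimal Python sketch (not part of the patent; the function name and the example durations are illustrative assumptions) of the two synchronization options described above:

```python
# Minimal sketch: scheduling synchronous exposure so that the medium and short
# exposure intervals fall within the long exposure interval.
# All durations are in milliseconds and are illustrative values only.

def schedule_synchronous_exposure(t_long=32.0, t_mid=8.0, t_short=2.0, align="start"):
    """Return (start, end) times for the long, medium, and short exposures.

    align="start": all three exposures start together (common start time).
    align="end":   all three exposures end together (common cut-off time).
    In both cases the medium/short intervals lie inside the long interval.
    """
    assert t_long > t_mid > t_short, "expected long > medium > short exposure time"
    if align == "start":
        start = {"long": 0.0, "mid": 0.0, "short": 0.0}
    else:  # align == "end"
        start = {"long": 0.0, "mid": t_long - t_mid, "short": t_long - t_short}
    end = {k: start[k] + dur for k, dur in zip(("long", "mid", "short"),
                                               (t_long, t_mid, t_short))}
    # Medium/short cut-off must not be later than the long-exposure cut-off.
    assert end["mid"] <= end["long"] and end["short"] <= end["long"]
    return start, end

start, end = schedule_synchronous_exposure(align="start")
print(start, end)
```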
  • The control method includes the following steps:
  • Step 101: Determine the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness.
  • Specifically, the ambient brightness may be divided in advance into three brightness levels: a low brightness level, a medium brightness level, and a high brightness level. The brightness levels may be preset by a built-in program of the electronic device or set by the user, which is not limited here.
  • When determining the brightness level, an independent light-metering device may be used to measure the ambient brightness, or the ISO value automatically adjusted by the camera may be read and the ambient brightness determined from it, or the pixel unit array may be controlled to measure the ambient brightness value; this is not limited either. After the ambient brightness is determined, the brightness level can be determined from it.
  • The ISO value indicates the sensitivity of the camera; commonly used ISO values are 50, 100, 200, 400, 1000, and so on. The camera can automatically adjust the ISO value according to the ambient brightness, so in this embodiment the ambient brightness can be deduced from the ISO value: in sufficient light the ISO value may be 50 or 100, while in insufficient light it may be 400 or higher.
  • The boundaries of the brightness levels can be set differently for different scenes; for example, in some scenes the ISO value is between 200 and 500, while during the day, when the electronic device is outdoors, the ISO value is generally lower than 200. Therefore, the range of each brightness level can be set according to actual needs and the specific shooting scene (a simple classification sketch follows).
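  • A minimal sketch (not from the patent) of classifying the ambient brightness into the three levels using the ISO value as a proxy; the ISO thresholds are illustrative assumptions that would be tuned per device and scene:

```python
# Minimal sketch: map the camera's auto-adjusted ISO value to a brightness level.
# Higher ISO means the camera compensated for a darker environment.

from enum import Enum

class BrightnessLevel(Enum):
    LOW = 0
    MEDIUM = 1
    HIGH = 2

def brightness_level_from_iso(iso: int,
                              high_iso_limit: int = 200,
                              low_iso_limit: int = 500) -> BrightnessLevel:
    """ISO thresholds are assumptions, not values fixed by the disclosure."""
    if iso <= high_iso_limit:        # bright scene, e.g. outdoors in daylight
        return BrightnessLevel.HIGH
    if iso >= low_iso_limit:         # dark scene, e.g. at night
        return BrightnessLevel.LOW
    return BrightnessLevel.MEDIUM

print(brightness_level_from_iso(100))   # BrightnessLevel.HIGH
print(brightness_level_from_iso(800))   # BrightnessLevel.LOW
```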
  • Step 102: If the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and take the at least two adjacently arranged medium-exposure pixels as a first pixel unit.
  • When the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the shooting environment is very bright or very dark, that is, the ambient brightness is at an extreme. In this case the noise of the output image is high, or the pixel values output by the long-exposure pixels or short-exposure pixels may overflow, so that image details are lost severely. In a red, green, and blue (RGB) three-color histogram, the distribution of detail across tones of different lightness can be seen intuitively; over-exposed and under-exposed parts gather at the two ends of the histogram. Regions beyond the over-exposure tolerance do not become brighter but lose all detail and are displayed as pure white blocks (255, 255, 255), while regions beyond the under-exposure tolerance do not become darker but lose all detail and are displayed as pure black blocks (0, 0, 0).
  • Therefore, in this case the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array can be adjusted so that at least two medium-exposure pixels are arranged adjacently, and the at least two adjacently arranged medium-exposure pixels are used as a first pixel unit. As a possible implementation, the position of each medium-exposure pixel can be exchanged with that of an adjacent short-exposure pixel or long-exposure pixel, so that at least two medium-exposure pixels become adjacent and then serve as a first pixel unit.
  • FIG. 2 is a first schematic structural diagram of a part of a pixel unit array of an imaging device in an embodiment of the present disclosure.
  • Referring to FIG. 2, the original pixel unit array in this example includes 16 exposure pixels: 4 long-exposure pixels (L), 8 medium-exposure pixels (M), and 4 short-exposure pixels (S), where L denotes a long-exposure pixel, M a medium-exposure pixel, and S a short-exposure pixel.
  • A medium-exposure pixel (M) can be exchanged with an adjacent short-exposure pixel (S), or a medium-exposure pixel (M) can be exchanged with an adjacent long-exposure pixel (L), so that the eight adjacently arranged medium-exposure pixels (M) included in region 21 can be used as a first pixel unit; alternatively, two, four, or six adjacently arranged medium-exposure pixels (M) may serve as a first pixel unit.
  • As a possible implementation, the number of medium-exposure pixels (M) contained in each first pixel unit may be adjusted according to the brightness of the shooting environment: the lower the brightness, the greater the number of medium-exposure pixels (M) contained in each first pixel unit, as sketched below. Normally, each first pixel unit can be fixed at 4 medium-exposure pixels (M), which meets the needs of most shooting scenes.
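  • A minimal sketch (not from the patent) of the two operations used here: swapping a medium-exposure pixel (M) with an adjacent long-/short-exposure pixel (L/S) so that several M pixels become adjacent, and choosing how many M pixels form one first pixel unit. The 4x4 layout and the brightness thresholds are illustrative assumptions:

```python
# Minimal sketch: rearrange exposure pixels so that medium-exposure pixels (M)
# become adjacent and can be read out together as one first pixel unit.

import numpy as np

def swap_pixels(labels: np.ndarray, a: tuple, b: tuple) -> None:
    """Exchange the arrangement positions of two exposure pixels in place."""
    labels[a], labels[b] = labels[b], labels[a]

def m_pixels_per_first_unit(brightness: float) -> int:
    """Lower ambient brightness -> more M pixels per first pixel unit (thresholds assumed)."""
    if brightness < 50:
        return 8
    if brightness < 200:
        return 4          # the default the disclosure says suits most scenes
    return 2

# A hypothetical 4x4 portion of the pixel unit array (the layout in FIG. 2 may differ).
labels = np.array([["L", "M", "L", "M"],
                   ["M", "S", "M", "S"],
                   ["L", "M", "L", "M"],
                   ["M", "S", "M", "S"]])

swap_pixels(labels, (1, 1), (1, 0))   # swap the S at (1,1) with the adjacent M at (1,0)
swap_pixels(labels, (2, 2), (2, 3))   # swap the L at (2,2) with the adjacent M at (2,3)
assert (labels[1:3, 1:3] == "M").all()  # four adjacent M pixels: one first pixel unit
print(labels)
print(m_pixels_per_first_unit(30))      # 8 for a very dark scene
```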
  • Step 103: Perform imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
  • In the embodiments of the present disclosure, after the first pixel unit is determined, imaging can be performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. For example, referring to FIG. 2, the pixel values output by the eight adjacently arranged medium-exposure pixels in region 21 can be used for imaging. It can be understood that imaging with the pixel values of the medium-exposure pixels avoids pixel value overflow, thereby retaining more image details and improving the imaging effect and imaging quality.
  • In the control method of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array are automatically adjusted, so that imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image and improves its brightness.
  • As a possible implementation, before the brightness level of the ambient brightness is determined, it may first be determined from the histogram of the captured preview image whether the current shooting environment belongs to a backlit scene.
  • Specifically, a grayscale histogram may be generated from the gray values corresponding to the ambient brightness values measured by the pixel unit array, and whether the current shooting environment is a backlit scene is then determined from the proportion of photosensitive pixels falling within each gray range.
  • For example, when the proportion grayRatio of photosensitive pixels whose gray value falls within a preset gray range, relative to all photosensitive pixels in the pixel unit array, is greater than a first threshold (for example, 0.135) and the proportion of photosensitive pixels whose gray value falls within the gray range [200, 256) is greater than a second threshold (for example, 0.0899), it is determined that the current shooting environment is a backlit scene. Likewise, when the proportion of photosensitive pixels in the corresponding preset gray range is greater than a third threshold (for example, 0.3) and the proportion in the gray range [200, 256) is greater than a fourth threshold (for example, 0.003), or when the proportion in the corresponding preset gray range is greater than a fifth threshold (for example, 0.005) and the proportion in the gray range [200, 256) is greater than a sixth threshold (for example, 0.25), it is also determined that the current shooting environment is a backlit scene.
  • An example grayscale histogram corresponding to a backlit scene is shown in FIG. 3; a sketch of the corresponding check follows.
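  • The following is a minimal Python sketch (not taken from the claims) of such a histogram-ratio check. The thresholds are the example values quoted above; the dark gray range [0, 5) is an assumption, since this excerpt does not state the exact low-gray range used.

```python
# Minimal sketch: detect a backlit scene from per-range pixel proportions.

import numpy as np

def ratio_in_range(gray: np.ndarray, lo: int, hi: int) -> float:
    """Proportion of photosensitive pixels whose gray value lies in [lo, hi)."""
    return float(np.count_nonzero((gray >= lo) & (gray < hi))) / gray.size

def is_backlit(gray: np.ndarray) -> bool:
    dark = ratio_in_range(gray, 0, 5)        # assumed low-gray range
    bright = ratio_in_range(gray, 200, 256)  # range given in the text
    return ((dark > 0.135 and bright > 0.0899) or
            (dark > 0.3 and bright > 0.003) or
            (dark > 0.005 and bright > 0.25))

# Example: a synthetic frame with both very dark and very bright regions.
frame = np.concatenate([np.zeros(3000, dtype=np.uint8),
                        np.full(2000, 255, dtype=np.uint8),
                        np.full(5000, 128, dtype=np.uint8)])
print(is_backlit(frame))  # True for this synthetic example
```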
  • Generally, in a backlit environment there is a large difference among the ambient brightness values measured by the photosensitive pixels in the pixel unit array. Therefore, as another possible implementation, the brightness value of the imaging object and the brightness value of the background may be determined from the ambient brightness values measured by the pixel unit array, and it may then be judged whether the difference between the brightness value of the imaging object and the brightness value of the background is greater than a preset threshold. If the difference is greater than the preset threshold, it is determined that the current shooting environment is a backlit scene; if the difference is less than or equal to the preset threshold, it is determined that the current shooting environment is a non-backlit scene. The preset threshold may be set in advance by a built-in program of the electronic device or set by the user, which is not limited here. The imaging object is the object to be photographed by the electronic device, such as a person (or a human face), an animal, an object, or a scene. A sketch of this check is given below.
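  • A minimal sketch (not from the patent) of the object-versus-background check, assuming the object region is already segmented (for example by face detection) and using an illustrative threshold:

```python
# Minimal sketch: backlight check based on the brightness difference between the
# imaging object and the background.

import numpy as np

def is_backlit_by_subject(brightness: np.ndarray,
                          subject_mask: np.ndarray,
                          threshold: float = 60.0) -> bool:
    """brightness: per-pixel ambient brightness values measured by the pixel unit array.
    subject_mask: boolean mask marking the imaging object (assumed available)."""
    subject = float(brightness[subject_mask].mean())
    background = float(brightness[~subject_mask].mean())
    return abs(subject - background) > threshold

# Usage: a dark subject against a bright background.
b = np.full((4, 4), 220.0)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
b[mask] = 40.0
print(is_backlit_by_subject(b, mask))  # True
```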
  • As a possible implementation, referring to FIG. 4, step 103 may specifically include the following sub-steps:
  • Step 201: Control the at least two medium-exposure pixels in the first pixel unit to output pixel values.
  • Specifically, the medium-exposure pixels in the first pixel unit can be controlled to be exposed synchronously with the same exposure time, that is, with the same exposure cut-off time. After the exposure ends, the medium-exposure pixels in each first pixel unit output corresponding pixel values; for example, referring to FIG. 2, the first pixel unit will output 8 pixel values.
  • Step 202: Generate a first composite pixel value according to the pixel values output by the same first pixel unit.
  • Since the pixel unit array includes a plurality of exposure pixels, after the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels are adjusted so that at least two medium-exposure pixels are adjacent, there may be a plurality of first pixel units in the array.
  • As a possible structure, the pixel unit array may be composed of a plurality of photosensitive pixel units, each photosensitive pixel unit including at least two exposure pixels, of which at least one is a medium-exposure pixel.
  • FIG. 5 is a second schematic diagram of a partial structure of a pixel unit array of an imaging device in an embodiment of the present disclosure.
  • the imaging device 30 includes a pixel unit array 31 and a filter unit array 32 provided on the pixel unit array 31.
  • The pixel unit array 31 includes a plurality of photosensitive pixel units 311, each photosensitive pixel unit 311 including at least two exposure pixels 3111, of which at least one is a medium-exposure pixel.
  • each photosensitive pixel unit 311 includes four exposure pixels 3111.
  • the four exposure pixels may be one long exposure pixel, two medium exposure pixels, and one short exposure pixel.
  • The number of long-exposure pixels, medium-exposure pixels, and short-exposure pixels in each photosensitive pixel unit 311 may also take other values, which is not limited here.
  • the filter unit array 32 includes a plurality of filter units 322 corresponding to the plurality of photosensitive pixel units 311, and each filter unit 322 covers the corresponding photosensitive pixel unit 311.
  • the pixel unit array 31 may be a Bayer array.
  • Each exposure pixel included in the same photosensitive pixel unit is covered by a filter of the same color, and the at least two exposure pixels belonging to the same first pixel unit are covered by a filter of the same color, that is, they belong to the same photosensitive pixel unit. In other words, in the present disclosure the medium-exposure pixels in the same first pixel unit are located in the same photosensitive pixel unit, so each photosensitive pixel unit in the pixel unit array contains one first pixel unit.
  • As a possible implementation, for the same first pixel unit, the sum of the pixel values output by all the medium-exposure pixels included in the first pixel unit may be used as the first composite pixel value. As another possible implementation, the average of the pixel values output by all the medium-exposure pixels included in the first pixel unit may be computed and used as the first composite pixel value.
  • Step 203: Perform imaging according to the first composite pixel values.
  • In the embodiments of the present disclosure, the first composite pixel value corresponding to each first pixel unit in the pixel unit array can be obtained, and interpolation is then performed on the plurality of first composite pixel values to obtain the imaged image, as sketched below.
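  • A minimal sketch (not from the patent) of computing the first composite pixel values by sum or average. The 2x2 grouping of medium-exposure pixels is an illustrative assumption; the disclosure allows two, four, six, or eight medium-exposure pixels per first pixel unit:

```python
# Minimal sketch: first composite pixel value per first pixel unit.

import numpy as np

def first_composite_values(mid_pixels: np.ndarray, mode: str = "average") -> np.ndarray:
    """mid_pixels: HxW array of pixel values output by medium-exposure pixels,
    where each non-overlapping 2x2 block is treated as one first pixel unit."""
    h, w = mid_pixels.shape
    blocks = mid_pixels.reshape(h // 2, 2, w // 2, 2).swapaxes(1, 2)  # (h/2, w/2, 2, 2)
    if mode == "sum":
        return blocks.sum(axis=(2, 3))
    return blocks.mean(axis=(2, 3))

units = np.array([[10, 12, 50, 52],
                  [14, 16, 54, 56]], dtype=float)
print(first_composite_values(units))          # [[13. 53.]]
print(first_composite_values(units, "sum"))   # [[ 52. 212.]]
```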
  • In another possible scenario, when the brightness level of the shooting environment belongs to the medium brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array can be adjusted so that the medium-exposure pixels are arranged at intervals, and imaging is then performed using the medium-exposure pixels together with the short-exposure pixels and/or long-exposure pixels adjacent to them. In this way, the long-exposure pixels can correct the dark areas of the image and the short-exposure pixels can correct the bright areas, improving the imaging effect and imaging quality.
  • FIG. 6 is a schematic flowchart of a control method provided in Embodiment 3 of the present disclosure.
  • Based on the above embodiments, the control method may further include the following steps:
  • Step 301: If the brightness level of the shooting environment belongs to the medium brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that the medium-exposure pixels are arranged at intervals, and take at least two exposure pixels covered by the same-color filter, together with the short-exposure pixels and/or long-exposure pixels adjacent to those at least two exposure pixels, as a second pixel unit.
  • Specifically, when the brightness level of the shooting environment belongs to the medium brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array can be adjusted so that the medium-exposure pixels are arranged at intervals. As a possible implementation, each medium-exposure pixel may be swapped with an adjacent short-exposure pixel and/or long-exposure pixel so that the medium-exposure pixels are arranged at intervals, and at least two exposure pixels covered by the same-color filter, together with a short-exposure pixel or long-exposure pixel adjacent to them, are taken as a second pixel unit.
  • In other words, the at least two exposure pixels covered by the same-color filter and the short-exposure pixels adjacent to them may be used as the second pixel unit; or the at least two exposure pixels covered by the same-color filter and the long-exposure pixels adjacent to them may be used as the second pixel unit; or the at least two exposure pixels covered by the same-color filter, the short-exposure pixels adjacent to them, and the long-exposure pixels adjacent to them may together be used as the second pixel unit. Since all the exposure pixels contained in the same photosensitive pixel unit are covered by the same-color filter, each photosensitive pixel unit in the pixel unit array contains one second pixel unit.
  • Referring to FIG. 7, one photosensitive pixel unit includes 4 exposure pixels: 1 long-exposure pixel (L), 2 medium-exposure pixels (M), and 1 short-exposure pixel (S). A medium-exposure pixel (M) may be exchanged with an adjacent short-exposure pixel (S), or a medium-exposure pixel (M) may be exchanged with an adjacent long-exposure pixel (L), so that the two originally adjacent medium-exposure pixels (M) become arranged at intervals.
  • The two medium-exposure pixels (M) and the short-exposure pixel (S) in the photosensitive pixel unit can then be used as the second pixel unit, or the two medium-exposure pixels (M) and the long-exposure pixel (L) can be used as the second pixel unit, or the two medium-exposure pixels (M), the long-exposure pixel (L), and the short-exposure pixel (S) in the photosensitive pixel unit can together be used as the second pixel unit.
  • Step 302: Perform imaging according to the pixel values output by the exposure pixels in the second pixel unit.
  • In the embodiments of the present disclosure, after the second pixel unit is determined, imaging can be performed according to the pixel values output by the exposure pixels in the second pixel unit. In this way, the long-exposure pixels can correct the dark areas of the image and/or the short-exposure pixels can correct the bright areas, improving the imaging effect and imaging quality.
  • As a possible implementation, referring to FIG. 8, step 302 may specifically include the following sub-steps:
  • Step 401: Control the exposure pixels in the second pixel unit to output pixel values.
  • In the embodiments of the present disclosure, each exposure pixel in the second pixel unit can be controlled to output a pixel value.
  • As a possible implementation, the second pixel unit may be controlled to output a plurality of pixel values at different exposure times. For example, the medium-exposure pixels and the long-exposure pixel in the second pixel unit may be controlled to be exposed synchronously, or the medium-exposure pixels and the short-exposure pixel may be controlled to be exposed synchronously, or the medium-exposure pixels, the long-exposure pixel, and the short-exposure pixel may all be controlled to be exposed synchronously, where the exposure time of the long-exposure pixel is an initial long exposure time, the exposure time of the medium-exposure pixels is an initial medium exposure time, and the exposure time of the short-exposure pixel is an initial short exposure time, all of which are preset. After the exposure ends, the second pixel unit outputs a plurality of pixel values obtained at different exposure times.
  • As another possible implementation, the second pixel unit may be controlled to output a plurality of pixel values obtained with the same exposure time. For example, the medium-exposure pixels and the long-exposure pixel in the second pixel unit may be controlled to be exposed synchronously, or the medium-exposure pixels and the short-exposure pixel may be controlled to be exposed synchronously, or the medium-exposure pixels, the long-exposure pixel, and the short-exposure pixel may all be controlled to be exposed synchronously, with the same exposure time for each exposure pixel, that is, with the same exposure cut-off time. After the exposure ends, the second pixel unit outputs a plurality of pixel values obtained with the same exposure time.
  • Step 402: Generate a second composite pixel value according to the pixel values output by the same second pixel unit.
  • Since each photosensitive pixel unit in the pixel unit array contains one second pixel unit, a second composite pixel value can be generated from the pixel values output by the same second pixel unit.
  • As a possible implementation, after the second pixel unit is controlled to output a plurality of pixel values at different exposure times, for the same second pixel unit, a second composite pixel value can be calculated from the pixel values in that second pixel unit that have the same exposure time. For example, when each second pixel unit includes one long-exposure pixel and two medium-exposure pixels, the pixel value output by the single long-exposure pixel is the long-exposure second composite pixel value, and the sum of the pixel values output by the two medium-exposure pixels is the medium-exposure second composite pixel value; when each second pixel unit includes two medium-exposure pixels and one short-exposure pixel, the sum of the pixel values output by the two medium-exposure pixels is the medium-exposure second composite pixel value, and the pixel value output by the single short-exposure pixel is the short-exposure second composite pixel value. In this way, a plurality of medium-exposure second composite pixel values, a plurality of long-exposure second composite pixel values, and a plurality of short-exposure second composite pixel values can be obtained over the whole pixel unit array, as sketched below.
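  • A minimal sketch (not from the patent) of the per-exposure second composite pixel values for one second pixel unit; the dictionary layout used to group pixel values by exposure type is an assumption:

```python
# Minimal sketch: per-exposure second composite pixel values for a second pixel unit
# containing one long-exposure pixel (L), two medium-exposure pixels (M), and one
# short-exposure pixel (S).

def second_composite_values(unit: dict) -> dict:
    """unit maps exposure type to the list of pixel values output by that type,
    e.g. {"L": [880.0], "M": [300.0, 320.0], "S": [40.0]}."""
    out = {}
    if unit.get("L"):
        out["long"] = sum(unit["L"])      # single long pixel -> its own value
    out["medium"] = sum(unit["M"])        # sum of the medium-exposure pixel values
    if unit.get("S"):
        out["short"] = sum(unit["S"])     # single short pixel -> its own value
    return out

print(second_composite_values({"L": [880.0], "M": [300.0, 320.0], "S": [40.0]}))
# {'long': 880.0, 'medium': 620.0, 'short': 40.0}
```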
  • As another possible implementation, after the second pixel unit is controlled to output a plurality of pixel values obtained with the same exposure time, an average of the multiple pixel values output by the second pixel unit may be calculated to obtain the second composite pixel value, where each second pixel unit corresponds to one second composite pixel value. For example, when each second pixel unit includes one long-exposure pixel and two medium-exposure pixels, and the pixel value output by the long-exposure pixel and the two pixel values output by the two medium-exposure pixels are R1, R2, and R3 respectively, the second composite pixel value of the second pixel unit is (R1 + R2 + R3) / 3.
  • Step 403: Perform imaging according to the second composite pixel values.
  • In the embodiments of the present disclosure, after the second composite pixel values are obtained, imaging can be performed according to them.
  • As a possible implementation, when the second pixel unit is controlled to output a plurality of pixel values at different exposure times and, for the same second pixel unit, second composite pixel values are calculated from the pixel values having the same exposure time, a plurality of medium-exposure second composite pixel values together with a plurality of long-exposure second composite pixel values and/or a plurality of short-exposure second composite pixel values are obtained over the whole pixel unit array. A medium-exposure sub-image is then calculated by interpolating the medium-exposure second composite pixel values, a long-exposure sub-image by interpolating the long-exposure second composite pixel values, and/or a short-exposure sub-image by interpolating the short-exposure second composite pixel values. The medium-exposure sub-image and the long-exposure sub-image and/or the short-exposure sub-image are then fused to obtain a high-dynamic-range image. It should be noted that the long-exposure, medium-exposure, and short-exposure sub-images are not three frames in the traditional sense; they are image parts formed by the regions of the long-, medium-, and short-exposure pixels within the same frame.
  • As a possible implementation, during fusion the pixel value of the long-exposure pixel and/or the pixel value of the short-exposure pixel may be superimposed onto the pixel value of the medium-exposure pixel, taking the medium-exposure pixel value as the basis. Specifically, different weights may be assigned to the pixel values at the different exposure times; after the pixel values corresponding to each exposure time are multiplied by their weights, the weighted pixel values are summed to obtain the second composite pixel value of the second pixel unit. Since the gray range of a second composite pixel value calculated from pixel values at different exposure times changes, each second composite pixel value then needs to be compressed back to the original gray range. An imaged image can be obtained by interpolation over the plurality of compressed second composite pixel values, as sketched below. In the imaged image, the dark parts have been compensated by the pixel values output by the long-exposure pixels and/or the bright parts have been suppressed by the pixel values output by the short-exposure pixels, so there are no over-exposed or under-exposed areas, and the image has a higher dynamic range and a better imaging effect.
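  • A minimal sketch (not from the patent) of the weighted fusion and gray-level compression described above. The weights, the relative exposure times, and the simple rescaling used for compression are illustrative assumptions:

```python
# Minimal sketch: weighted fusion of long/medium/short exposure values into one
# second composite pixel value, followed by compression back to the 8-bit gray range.

import numpy as np

def fuse_second_composite(long_v: np.ndarray, mid_v: np.ndarray, short_v: np.ndarray,
                          weights=(0.25, 0.5, 0.25),
                          exposure_ratio=(4.0, 1.0, 0.25)) -> np.ndarray:
    """Normalize each exposure to the medium exposure, weight, sum, then compress."""
    w_l, w_m, w_s = weights
    r_l, r_m, r_s = exposure_ratio          # exposure time relative to medium
    fused = (w_l * long_v / r_l +           # long exposure scaled down
             w_m * mid_v / r_m +
             w_s * short_v / r_s)           # short exposure scaled up
    # Gray-level compression: the weighted sum no longer spans [0, 255], so rescale.
    fused = fused / fused.max() * 255.0
    return np.clip(fused, 0, 255).astype(np.uint8)

long_v = np.array([1020.0, 400.0])   # long exposure: bright areas tend to saturate
mid_v = np.array([255.0, 100.0])
short_v = np.array([64.0, 25.0])
print(fuse_second_composite(long_v, mid_v, short_v))
```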
  • As another possible implementation, when the second composite pixel values are obtained by averaging the pixel values of the same exposure time, the plurality of second composite pixel values over the entire pixel unit array may be determined, and interpolation may then be performed on them to obtain the imaged image.
  • the present disclosure also proposes a control device.
  • FIG. 9 is a schematic structural diagram of a control device according to a fifth embodiment of the present disclosure.
  • The control device 100 is applied to an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The control device 100 includes a determining module 101, an adjustment module 102, and an imaging module 103, wherein:
  • the determining module 101 is configured to determine the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
  • the adjustment module 102 is configured to, when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and to take the at least two adjacently arranged medium-exposure pixels as a first pixel unit;
  • the imaging module 103 is configured to perform imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
  • Further, as a possible implementation, the imaging module 103 is specifically configured to: control the at least two medium-exposure pixels in the first pixel unit to output pixel values; generate a first composite pixel value according to the pixel values output by the same first pixel unit; and perform imaging according to the first composite pixel values.
  • Further, as a possible implementation, the adjustment module 102 is further configured to: after the brightness level of the ambient brightness is determined, if the brightness level of the shooting environment belongs to the medium brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that the medium-exposure pixels are arranged at intervals, and take at least two exposure pixels covered by the same-color filter, together with the short-exposure pixels and/or long-exposure pixels adjacent to them, as a second pixel unit.
  • the imaging module 103 is further configured to perform imaging according to a pixel value output by each exposed pixel in the second pixel unit.
  • Further, as a possible implementation, the imaging module 103 is specifically configured to: control the exposure pixels in the second pixel unit to output pixel values; generate a second composite pixel value according to the pixel values output by the same second pixel unit; and perform imaging according to the second composite pixel values.
  • Further, as a possible implementation, the control device 100 may further include:
  • a first backlit-scene determination module 104, configured to determine, before the brightness level of the ambient brightness is determined, that the current shooting environment belongs to a backlit scene according to the histogram of the captured preview image;
  • a second backlit-scene determination module 105, configured to determine, before the brightness level of the ambient brightness is determined, the brightness value of the imaging object and the brightness value of the background according to the ambient brightness values measured by the pixel unit array, and to determine that the current shooting environment belongs to a backlit scene according to the brightness value of the imaging object and the brightness value of the background.
  • At least two exposure pixels belonging to the same first pixel unit are covered with the same color filter.
  • Further, the adjustment module 102 is specifically configured to exchange, in the pixel unit array, the position of each medium-exposure pixel with that of an adjacent short-exposure pixel or long-exposure pixel.
  • In the control device of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently, the at least two adjacently arranged medium-exposure pixels are taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. In this way, the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array can be adjusted automatically according to the brightness level of the shooting environment, so that imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, improves its brightness, and thereby improves the imaging effect, imaging quality, and the user's shooting experience.
  • In order to implement the above embodiments, the present disclosure further proposes an electronic device including an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The electronic device further includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
  • In order to implement the above embodiments, the present disclosure further proposes a non-transitory computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
  • As shown in FIG. 11, the present disclosure further provides an electronic device 200. The electronic device 200 includes a memory 50 and a processor 60. The memory 50 stores computer-readable instructions which, when executed by the processor 60, cause the processor 60 to execute the control method of any one of the foregoing embodiments.
  • FIG. 11 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method of the embodiment of the present disclosure.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
  • The internal memory 82 in the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50. The display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, a key, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse.
  • the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • FIG. 11 is only a schematic diagram of a part of the structure related to the solution of the present disclosure, and does not constitute a limitation on the electronic device 200 to which the solution of the present disclosure is applied.
  • the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
  • the electronic device 200 includes an image processing circuit 90.
  • The image processing circuit 90 may be implemented using hardware and/or software components, and may include various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 12 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 12, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present disclosure are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • The sensor 94 (for example, a gyroscope) may provide parameters for the processing of the acquired image (such as image stabilization parameters) to the ISP processor 91 based on the sensor 94 interface type.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • the image memory 95 may be an independent dedicated memory in the memory 50, a part of the memory 50, a storage device, or an electronic device, and may include a DMA (Direct Memory Access) feature.
  • the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • the ISP processor 91 receives processing data from the image memory 95, and performs processing on the image data in the original domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 97 device.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • The control logic 92 may include a processor and/or a microcontroller that executes one or more routines (such as firmware), and the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistical data.
  • control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 932 control parameters (such as focus distance for focusing or zooming), or these parameters The combination.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
  • The following are the steps of implementing the control method using the processor 60 in FIG. 11 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 12:
  • determining the brightness level of the ambient brightness, where the brightness levels include a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
  • if the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and taking the at least two adjacently arranged medium-exposure pixels as a first pixel unit;
  • performing imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit;
  • performing imaging according to the first composite pixel values.
  • In the description of the present disclosure, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, "a plurality of" means at least two, for example two or three, unless specifically defined otherwise.
  • Any process or method description in a flowchart, or otherwise described herein, can be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process, and the scope of the preferred embodiments of the present disclosure includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, which should be understood by those skilled in the art to which the embodiments of the present disclosure belong.
  • Logic and / or steps represented in a flowchart or otherwise described herein, for example, a sequenced list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium, For use by, or in combination with, an instruction execution system, device, or device (such as a computer-based system, a system that includes a processor, or another system that can fetch and execute instructions from an instruction execution system, device, or device) Or equipment.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Examples of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wirings, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
  • the computer-readable medium may even be paper or other suitable medium on which the program can be printed, because, for example, by optically scanning the paper or other medium, followed by editing, interpretation, or other suitable Processing to obtain the program electronically and then store it in computer memory.
  • portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof.
  • For example, multiple steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, they may be implemented by any one of the following techniques known in the art, or a combination thereof: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit (ASIC) having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
  • A person of ordinary skill in the art can understand that all or part of the steps carried by the methods of the foregoing embodiments can be implemented by a program instructing the relevant hardware. The program can be stored in a computer-readable storage medium and, when executed, includes one of the steps of the method embodiments or a combination thereof.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure discloses a control method and apparatus, an electronic device, and a computer-readable storage medium. The method is applied to an imaging device that includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel. The method includes: determining a brightness level of the ambient brightness; when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and taking the at least two adjacently arranged medium-exposure pixels as a first pixel unit; and performing imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. The method can retain more effective information in the captured image, increase the brightness of the captured image, and thereby improve the imaging effect and imaging quality.

Description

Control method and apparatus, electronic device, and computer-readable storage medium
Cross-reference to related applications
The present disclosure claims priority to Chinese Patent Application No. 201810915440.8, entitled "Control method, apparatus, electronic device, and computer-readable storage medium", filed by OPPO广东移动通信有限公司 on August 13, 2018.
Technical field
The present disclosure relates to the field of imaging technology, and in particular to a control method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the continuous development of terminal technology, more and more users capture images with electronic devices. Imaging devices in the prior art use a pixel unit array of a fixed structure for imaging.
Summary
The present disclosure provides a control method and apparatus, an electronic device, and a computer-readable storage medium, which automatically adjust the arrangement positions of short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in a pixel unit array according to the brightness level of the shooting environment, so that imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels. More effective information can thus be retained in the captured image and its brightness increased, which improves the imaging effect, the imaging quality, and the user's shooting experience, and solves the technical problem in the prior art that the pixel hardware structure of an imaging device, once fixed, cannot be changed and adapts poorly to different shooting scenes, so that the imaging quality of captured images is limited by the imaging device in the electronic device and cannot be improved.
An embodiment of one aspect of the present disclosure provides a control method applied to an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The method includes the following steps:
determining a brightness level of the ambient brightness, the brightness levels including a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and taking the at least two adjacently arranged medium-exposure pixels as a first pixel unit; and
performing imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
With the control method of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high or low brightness level, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently and taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. The arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array can thus be adjusted automatically according to the brightness level of the shooting environment, and imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, increases its brightness, and thereby improves the imaging effect, the imaging quality, and the user's shooting experience.
An embodiment of another aspect of the present disclosure provides a control apparatus applied to an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The apparatus includes:
a determining module, configured to determine a brightness level of the ambient brightness, the brightness levels including a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
an adjusting module, configured to, when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and take the at least two adjacently arranged medium-exposure pixels as a first pixel unit; and
an imaging module, configured to perform imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
With the control apparatus of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high or low brightness level, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently and taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. The arrangement positions of the exposure pixels can thus be adjusted automatically according to the brightness level of the shooting environment, and imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, increases its brightness, and thereby improves the imaging effect, the imaging quality, and the user's shooting experience.
An embodiment of another aspect of the present disclosure provides an electronic device including an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The electronic device further includes a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
An embodiment of another aspect of the present disclosure provides a non-transitory computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
Additional aspects and advantages of the present disclosure will be given in part in the following description, and in part will become apparent from the following description or be learned through practice of the present disclosure.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present disclosure more clearly, the drawings required in the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present disclosure; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure;
FIG. 2 is a first partial structural diagram of a pixel unit array of an imaging device according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a grayscale histogram corresponding to a backlit scene according to an embodiment of the present disclosure;
FIG. 4 is a schematic flowchart of a control method according to Embodiment 2 of the present disclosure;
FIG. 5 is a second partial structural diagram of a pixel unit array of an imaging device according to an embodiment of the present disclosure;
FIG. 6 is a schematic flowchart of a control method according to Embodiment 3 of the present disclosure;
FIG. 7 is a schematic structural diagram of a photosensitive pixel unit according to an embodiment of the present disclosure;
FIG. 8 is a schematic flowchart of a control method according to Embodiment 4 of the present disclosure;
FIG. 9 is a schematic structural diagram of a control apparatus according to Embodiment 5 of the present disclosure;
FIG. 10 is a schematic structural diagram of a control apparatus according to Embodiment 6 of the present disclosure;
FIG. 11 is a module diagram of an electronic device according to some implementations of the present disclosure;
FIG. 12 is a module diagram of an image processing circuit according to some implementations of the present disclosure.
Detailed description
Embodiments of the present disclosure are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended to explain the present disclosure, and should not be construed as limiting it.
The present disclosure mainly addresses the technical problem of poor image quality in the prior art and provides a control method.
With the control method of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high or low brightness level, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently and taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. The arrangement positions of the exposure pixels can thus be adjusted automatically according to the brightness level of the shooting environment, and imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, increases its brightness, and thereby improves the imaging effect, the imaging quality, and the user's shooting experience.
The control method and apparatus, electronic device, and computer-readable storage medium of the embodiments of the present disclosure are described below with reference to the accompanying drawings.
FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure.
The control method of the embodiments of the present disclosure is applied to an imaging device that includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel.
A long-exposure pixel is a photosensitive pixel whose exposure time is the long exposure time, a medium-exposure pixel is one whose exposure time is the medium exposure time, and a short-exposure pixel is one whose exposure time is the short exposure time, where long exposure time > medium exposure time > short exposure time; that is, the long exposure time of the long-exposure pixel is greater than the medium exposure time of the medium-exposure pixel, which in turn is greater than the short exposure time of the short-exposure pixel.
When the imaging device operates, the long-exposure, medium-exposure, and short-exposure pixels are exposed synchronously. Synchronous exposure means that the exposure times of the medium-exposure and short-exposure pixels fall within the exposure time of the long-exposure pixels. Specifically, the long-exposure pixels may be controlled to start exposure first, and the medium-exposure and short-exposure pixels are then controlled to expose within the exposure period of the long-exposure pixels, where the exposure cut-off times of the medium-exposure and short-exposure pixels should be the same as, or earlier than, the exposure cut-off time of the long-exposure pixels.
Alternatively, the long-exposure, medium-exposure, and short-exposure pixels may be controlled to start exposure at the same time, that is, with identical exposure start times. In this way there is no need to control the pixel unit array to perform long, medium, and short exposures in sequence, which reduces the image capture time. A small sketch of one way to schedule such start times is given below.
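For illustration, the following is a minimal Python sketch of the synchronized-exposure idea described above, in which the medium and short exposures are started late enough that all three exposures end together inside the long-exposure window. The concrete duration values and the "common end time" scheduling policy are assumptions made for this example, not values fixed by the disclosure.

```python
def exposure_start_offsets(t_long, t_mid, t_short):
    """Return start offsets (relative to the long exposure starting at t = 0)
    so that the medium and short exposures end no later than the long one."""
    assert t_long > t_mid > t_short, "long > medium > short exposure is required"
    # Start the medium and short exposures so that all three end at t = t_long,
    # which keeps them inside the long-exposure window.
    return {"long": 0.0, "mid": t_long - t_mid, "short": t_long - t_short}


# Example: 1/30 s long, 1/120 s medium, 1/480 s short exposure.
print(exposure_start_offsets(1 / 30, 1 / 120, 1 / 480))
```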
As shown in FIG. 1, the control method includes the following steps.
Step 101: determine a brightness level of the ambient brightness, the brightness levels including a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness.
In the embodiments of the present disclosure, the ambient brightness may be divided in advance into three brightness levels: a low brightness level, a medium brightness level, and a high brightness level. For example, the brightness levels may be preset by a built-in program of the electronic device, or may be set by the user; this is not limited here.
In the embodiments of the present disclosure, an independent light-metering component may be used to measure the ambient brightness; alternatively, the ISO value automatically adjusted by the camera may be read and the ambient brightness determined from the read ISO value; alternatively, the pixel unit array may be controlled to measure the ambient brightness value and thereby determine the ambient brightness; this is not limited here. After the ambient brightness is determined, the brightness level can be determined from it.
It should be noted that the above ISO value indicates the sensitivity of the camera; commonly used ISO values are 50, 100, 200, 400, 1000, and so on. The camera can automatically adjust the ISO value according to the ambient brightness, so in this embodiment the ambient brightness can be deduced in reverse from the ISO value. Generally, in sufficient light the ISO value can be 50 or 100, and in insufficient light it can be 400 or higher.
It should be understood that the brightness levels may be set differently when the electronic device is in different scenes. For example, in the daytime with the electronic device indoors, the ISO value lies between 200 and 500, while in the daytime with the electronic device outdoors, the ISO value is generally below 200. Therefore, the size and range of each brightness level can be set according to actual needs and the specific shooting scene.
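As a hedged sketch of the ISO-based option described above, the following Python snippet maps an auto-adjusted ISO value back to a brightness level, following the rough figures quoted in the text (ISO around 50 to 100 in bright light, around 400 or higher in dim light). The exact cut-off values are illustrative assumptions and would be tuned per device and per shooting scene.

```python
def brightness_level_from_iso(iso):
    """Deduce a coarse brightness level from the camera's auto-adjusted ISO."""
    if iso <= 100:      # plenty of light, e.g. outdoors in daytime
        return "high"
    if iso < 400:       # ordinary indoor light
        return "medium"
    return "low"        # dim light


for iso in (50, 200, 800):
    print(iso, brightness_level_from_iso(iso))
```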
Step 102: when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and take the at least two adjacently arranged medium-exposure pixels as a first pixel unit.
In the embodiments of the present disclosure, when the brightness level of the shooting environment belongs to the high or low brightness level, the shooting environment is rather bright or rather dark; the ambient brightness is extreme. If long-exposure or short-exposure pixels are used for shooting in this case, the output image has high noise, or the pixel values output by the long-exposure or short-exposure pixels may overflow, with serious loss of image detail.
For example, an RGB (Red Green Blue) three-color histogram can intuitively show the distribution of detail at different levels of lightness for the three colors. Overexposed parts pile up at the two ends of the histogram: regions beyond the tolerance do not become brighter but lose all detail and show only pure white blocks (255, 255, 255); conversely, underexposed parts beyond the tolerance do not become darker but lose all detail and show only pure black blocks (0, 0, 0).
Therefore, in the present disclosure, in order to retain more image detail, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array may be adjusted so that at least two medium-exposure pixels are arranged adjacently, and the at least two adjacently arranged medium-exposure pixels are taken as the first pixel unit.
As a possible implementation, in the pixel unit array, each medium-exposure pixel may be swapped with an adjacent short-exposure or long-exposure pixel so that at least two medium-exposure pixels become adjacent, and the at least two adjacently arranged medium-exposure pixels are then taken as the first pixel unit.
For example, see FIG. 2, which is a first partial structural diagram of the pixel unit array of the imaging device according to an embodiment of the present disclosure. Assume the original pixel unit array includes 16 exposure pixels: 4 long-exposure pixels (L), 8 medium-exposure pixels (M), and 4 short-exposure pixels (S). When the brightness level of the shooting environment belongs to the high or low brightness level, a medium-exposure pixel (M) may be swapped with an adjacent short-exposure pixel (S), and a medium-exposure pixel (M) may be swapped with an adjacent long-exposure pixel (L), so that the 8 adjacently arranged medium-exposure pixels (M) contained in region 21 can be taken as a first pixel unit, or 2, 4, or 6 adjacently arranged medium-exposure pixels (M) can be taken as a first pixel unit. As a possible implementation, the number of medium-exposure pixels (M) contained in each first pixel unit may be adjusted according to the brightness of the shooting environment: the lower the brightness, the more medium-exposure pixels (M) each first pixel unit contains. Usually, fixing each first pixel unit to contain 4 medium-exposure pixels (M) satisfies the needs of most shooting scenes. A small sketch of such a swap is given below.
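The following Python sketch illustrates the swap described above on a single 2x2 group of exposure pixels. The starting layout (L, M / M, S) and the choice of which neighbors to swap are assumptions made for illustration only; the real layout is determined by the pixel unit array of the sensor, as in FIG. 2.

```python
import numpy as np

# One 2x2 group of exposure pixels: L = long, M = medium, S = short exposure.
unit = np.array([["L", "M"],
                 ["M", "S"]])

# Swap the top-right medium-exposure pixel with the adjacent short-exposure
# pixel so the two medium-exposure pixels become adjacent in the bottom row.
unit[0, 1], unit[1, 1] = unit[1, 1], unit[0, 1]
print(unit)                      # [['L' 'S'] ['M' 'M']]

# The adjacently arranged medium-exposure pixels form the first pixel unit;
# their coordinates are the positions whose outputs are combined for imaging.
first_pixel_unit = np.argwhere(unit == "M")
print(first_pixel_unit)
```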
Step 103: perform imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
In the embodiments of the present disclosure, after the arrangement positions of the exposure pixels in the pixel unit array are adjusted to obtain the first pixel unit, imaging may be performed according to the at least two medium-exposure pixels in the first pixel unit. For example, referring to FIG. 2, the pixel values output by the 8 adjacently arranged medium-exposure pixels in region 21 may be used for imaging. It can be understood that imaging from the pixel values output by the medium-exposure pixels avoids pixel-value overflow, thereby retaining more image detail and improving the imaging effect and imaging quality.
With the control method of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high or low brightness level, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently and taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. The arrangement positions of the exposure pixels can thus be adjusted automatically according to the brightness level of the shooting environment, and imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, increases its brightness, and thereby improves the imaging effect, the imaging quality, and the user's shooting experience.
It should be noted that in a backlit scene, when a user takes a selfie with the front camera of the electronic device, the user is located between the light source and the electronic device, which easily causes the face to be insufficiently exposed. Therefore, in a backlit scene, the captured image contains less effective information and its brightness is low. In the embodiments of the present disclosure, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are automatically adjusted according to the brightness level of the shooting environment, so that imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image and increases its brightness.
As a possible implementation, it may be determined from the histogram of the captured preview image that the current shooting environment is a backlit scene.
Specifically, when the preview image is captured, a grayscale histogram may be generated from the gray values corresponding to the ambient brightness values measured by the pixel unit array, and whether the current shooting environment is a backlit scene is then judged from the proportion of photosensitive pixels falling in each gray range.
For example, when, according to the grayscale histogram, the ratio grayRatio of the photosensitive pixels whose measured ambient-brightness gray values lie in the gray range [0, 20] to all photosensitive pixels in the pixel unit array is greater than a first threshold (for example, the first threshold may be 0.135), and the ratio grayRatio of the photosensitive pixels whose measured ambient-brightness gray values lie in the gray range [200, 256) to all photosensitive pixels in the pixel unit array is greater than a second threshold (for example, the second threshold may be 0.0899), the current shooting environment is determined to be a backlit scene.
Alternatively, when, according to the grayscale histogram, the ratio grayRatio of the photosensitive pixels whose measured ambient-brightness gray values lie in the gray range [0, 50] to all photosensitive pixels in the pixel unit array is greater than a third threshold (for example, the third threshold may be 0.3), and the ratio grayRatio of the photosensitive pixels whose measured ambient-brightness gray values lie in the gray range [200, 256) to all photosensitive pixels in the pixel unit array is greater than a fourth threshold (for example, the fourth threshold may be 0.003), the current shooting environment is determined to be a backlit scene.
Alternatively, when, according to the grayscale histogram, the ratio grayRatio of the photosensitive pixels whose measured ambient-brightness gray values lie in the gray range [0, 50] to all photosensitive pixels in the pixel unit array is greater than a fifth threshold (for example, the fifth threshold may be 0.005), and the ratio grayRatio of the photosensitive pixels whose measured ambient-brightness gray values lie in the gray range [200, 256) to all photosensitive pixels in the pixel unit array is greater than a sixth threshold (for example, the sixth threshold may be 0.25), the current shooting environment is determined to be a backlit scene.
As an example, the grayscale histogram corresponding to a backlit scene may be as shown in FIG. 3. A small sketch of the threshold test is given below.
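The following Python sketch implements the histogram-based backlight test described above. The three threshold pairs (0.135/0.0899, 0.3/0.003, 0.005/0.25) are taken from the text; treating the three pairs as alternative conditions, any one of which is sufficient, follows the "alternatively" wording but is otherwise an assumption for illustration.

```python
import numpy as np

def is_backlit(gray_values):
    """Decide whether the preview frame looks backlit from its gray values."""
    gray = np.asarray(gray_values).ravel()
    total = gray.size

    def ratio(lo, hi):
        # Fraction of photosensitive pixels whose gray value lies in [lo, hi].
        return np.count_nonzero((gray >= lo) & (gray <= hi)) / total

    dark_0_20 = ratio(0, 20)
    dark_0_50 = ratio(0, 50)
    bright = ratio(200, 255)          # gray values in [200, 256)

    return ((dark_0_20 > 0.135 and bright > 0.0899) or
            (dark_0_50 > 0.3   and bright > 0.003) or
            (dark_0_50 > 0.005 and bright > 0.25))


# Synthetic frame: mostly dark pixels with a bright band, typical of backlight.
frame = np.concatenate([np.full(8000, 10), np.full(2000, 230)])
print(is_backlit(frame))  # True
```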
It can be understood that, in general, when the current shooting environment is a backlit scene, the ambient brightness values measured by the photosensitive pixels of the pixel unit array will differ considerably. Therefore, as another possible implementation, the brightness value of the imaging subject and the brightness value of the background may be determined from the ambient brightness values measured by the pixel unit array, and it is then judged whether the difference between the brightness value of the imaging subject and the brightness value of the background is greater than a preset threshold. When the difference between the brightness value of the imaging subject and the brightness value of the background is greater than the preset threshold, the current shooting environment is determined to be a backlit scene; when the difference is less than or equal to the preset threshold, the current shooting environment is determined to be a non-backlit scene.
The preset threshold may be preset in a built-in program of the electronic device, or may be set by the user; this is not limited here. The imaging subject is the object to be photographed by the electronic device, for example a person (or a face), an animal, an object, or scenery. A small sketch of this test is given below.
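As a hedged illustration of the second backlight test described above, the following Python snippet compares the mean brightness of the imaging-subject region with that of the background. The subject mask, the use of mean values per region, the concrete threshold of 60, and the use of an absolute difference are all assumptions made for this sketch.

```python
import numpy as np

def is_backlit_by_regions(luma, subject_mask, threshold=60):
    """Backlit when subject and background brightness differ by more than
    the preset threshold (an assumed value here)."""
    subject_brightness = float(np.mean(luma[subject_mask]))
    background_brightness = float(np.mean(luma[~subject_mask]))
    return abs(background_brightness - subject_brightness) > threshold


luma = np.full((4, 6), 220.0)      # bright background
luma[1:3, 2:4] = 40.0              # darker imaging subject, e.g. a face
mask = np.zeros_like(luma, dtype=bool)
mask[1:3, 2:4] = True
print(is_backlit_by_regions(luma, mask))  # True
```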
As a possible implementation, referring to FIG. 4, on the basis of the embodiment shown in FIG. 1, step 103 may specifically include the following sub-steps.
Step 201: control the at least two medium-exposure pixels in the first pixel unit to output pixel values.
Specifically, each medium-exposure pixel in the first pixel unit may be controlled to expose synchronously, with the same exposure time for each medium-exposure pixel, that is, with the same exposure cut-off time for each medium-exposure pixel. After the exposure ends, each medium-exposure pixel in the first pixel unit outputs a corresponding pixel value. For example, referring to FIG. 2, the first pixel unit outputs 8 pixel values.
Step 202: generate a first composite pixel value according to the pixel values output by the same first pixel unit.
It should be noted that the pixel unit array includes a plurality of exposure pixels; after the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are adjacent, there may be a plurality of first pixel units.
Specifically, the pixel unit array may be composed of a plurality of photosensitive pixel units, each photosensitive pixel unit including at least two exposure pixels, of which at least one is a medium-exposure pixel.
As an example, see FIG. 5, which is a second partial structural diagram of the pixel unit array of the imaging device according to an embodiment of the present disclosure. The imaging device 30 includes a pixel unit array 31 and a filter unit array 32 arranged on the pixel unit array 31. The pixel unit array 31 includes a plurality of photosensitive pixel units 311, each photosensitive pixel unit 311 including at least two exposure pixels 3111, of which at least one is a medium-exposure pixel. For example, FIG. 5 takes as an example each photosensitive pixel unit 311 including 4 exposure pixels 3111; the 4 exposure pixels may be 1 long-exposure pixel, 2 medium-exposure pixels, and 1 short-exposure pixel. Of course, the numbers of long-exposure, medium-exposure, and short-exposure pixels in each photosensitive pixel unit 311 may also take other values; this is not limited here.
The filter unit array 32 includes a plurality of filter units 322 corresponding to the plurality of photosensitive pixel units 311, each filter unit 322 covering the corresponding photosensitive pixel unit 311. The pixel unit array 31 may be a Bayer array.
It should be noted that the exposure pixels contained in the same photosensitive pixel unit are covered by filters of the same color, and the at least two medium-exposure pixels belonging to the same first pixel unit are covered by filters of the same color, that is, the at least two medium-exposure pixels belong to the same photosensitive pixel unit. In other words, in the present disclosure, the medium-exposure pixels of the same first pixel unit are located in the same photosensitive pixel unit, so every photosensitive pixel unit in the pixel unit array has a first pixel unit.
As a possible implementation, for a given first pixel unit, the sum of the pixel values output by all medium-exposure pixels contained in that first pixel unit may be taken as the first composite pixel value.
As another possible implementation, for a given first pixel unit, the mean of the pixel values output by all medium-exposure pixels contained in that first pixel unit may be computed, and the mean is taken as the first composite pixel value. A small sketch of both options is given below.
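The following Python sketch shows the two combining options for step 202 described above: taking either the sum or the mean of the pixel values output by the medium-exposure pixels of one first pixel unit. The example pixel values are illustrative.

```python
def first_composite_value(mid_exposure_values, mode="sum"):
    """Combine the medium-exposure outputs of one first pixel unit."""
    if mode == "sum":
        return sum(mid_exposure_values)
    if mode == "mean":
        return sum(mid_exposure_values) / len(mid_exposure_values)
    raise ValueError("mode must be 'sum' or 'mean'")


# Example: a first pixel unit made of four adjacent medium-exposure pixels.
values = [120, 118, 121, 119]
print(first_composite_value(values, mode="sum"))   # 478
print(first_composite_value(values, mode="mean"))  # 119.5
```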
Step 203: perform imaging according to the first composite pixel value.
In the embodiments of the present disclosure, after the first composite pixel value is generated from the pixel values output by the same first pixel unit, the first composite pixel value corresponding to each first pixel unit in the pixel array can be obtained, and the imaging image can then be obtained by interpolation from the plurality of first composite pixel values.
As a possible implementation, when the brightness level of the shooting environment belongs to the medium brightness level, more detail is already retained in the captured image, and the imaging effect of the image can be further improved. Specifically, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array may be adjusted so that the medium-exposure pixels are arranged at intervals, and imaging is then performed by the medium-exposure pixels together with the adjacent short-exposure and/or long-exposure pixels; in this way, the long-exposure pixels can correct the dark regions of the image and the short-exposure pixels can correct the bright regions, improving the imaging effect and imaging quality. The above imaging process is described in detail below with reference to FIG. 6.
FIG. 6 is a schematic flowchart of a control method according to Embodiment 3 of the present disclosure.
As shown in FIG. 6, on the basis of the embodiment shown in FIG. 1, the control method may further include the following steps.
Step 301: when the brightness level of the shooting environment belongs to the medium brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that the medium-exposure pixels are arranged at intervals, and take, as a second pixel unit, at least two medium-exposure pixels covered by a same-color filter together with the short-exposure pixels and/or long-exposure pixels adjacent to the at least two medium-exposure pixels covered by the same-color filter.
In the embodiments of the present disclosure, when the brightness level of the shooting environment belongs to the medium brightness level, more detail is retained in the captured image, and the imaging effect of the image can be further improved. Specifically, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array may be adjusted so that the medium-exposure pixels are arranged at intervals. For example, in the pixel unit array, each medium-exposure pixel may be swapped with an adjacent short-exposure and/or long-exposure pixel so that the medium-exposure pixels are arranged at intervals, and at least two medium-exposure pixels covered by a same-color filter, together with the short-exposure or long-exposure pixels adjacent to them, are taken as a second pixel unit.
In other words, in the present disclosure, the second pixel unit may be formed by at least two medium-exposure pixels covered by a same-color filter together with the adjacent short-exposure pixels; or by the at least two medium-exposure pixels covered by the same-color filter together with the adjacent long-exposure pixels; or by the at least two medium-exposure pixels covered by the same-color filter together with both the adjacent short-exposure pixels and the adjacent long-exposure pixels. Since the exposure pixels contained in the same photosensitive pixel unit are covered by filters of the same color, every photosensitive pixel unit in the pixel unit array has a second pixel unit.
As an example, referring to FIG. 7, assume a photosensitive pixel unit includes 4 exposure pixels: 1 long-exposure pixel (L), 2 medium-exposure pixels (M), and 1 short-exposure pixel (S). When the brightness level of the shooting environment belongs to the medium brightness level, a medium-exposure pixel (M) may be swapped with the adjacent short-exposure pixel (S), or a medium-exposure pixel (M) may be swapped with the adjacent long-exposure pixel (L), so that the two adjacent medium-exposure pixels (M) become arranged at an interval. The two medium-exposure pixels (M) and one short-exposure pixel (S) of the photosensitive pixel unit may then be taken as a second pixel unit; or the two medium-exposure pixels (M) and one long-exposure pixel (L) may be taken as a second pixel unit; or the two medium-exposure pixels (M), one long-exposure pixel (L), and one short-exposure pixel (S) may be taken as a second pixel unit.
Step 302: perform imaging according to the pixel values output by the exposure pixels in the second pixel unit.
In the embodiments of the present disclosure, after the arrangement positions of the exposure pixels in the pixel unit array are adjusted to obtain the second pixel unit, imaging may be performed according to the pixel values output by the exposure pixels in the second pixel unit. In this way, the long-exposure pixels can correct the dark regions of the image, or the short-exposure pixels can correct the bright regions, improving the imaging effect and imaging quality.
As a possible implementation, referring to FIG. 8, on the basis of the embodiment shown in FIG. 6, step 302 may specifically include the following sub-steps.
Step 401: control the exposure pixels in the second pixel unit to output pixel values.
In the embodiments of the present disclosure, after the second pixel unit is obtained, the exposure pixels in the second pixel unit may be controlled to output pixel values.
As a possible implementation, the second pixel unit may be controlled to output a plurality of pixel values obtained under different exposure times. For example, the medium-exposure and long-exposure pixels in the second pixel unit may be controlled to expose synchronously; or the medium-exposure and short-exposure pixels in the second pixel unit may be controlled to expose synchronously; or the medium-exposure, long-exposure, and short-exposure pixels in the second pixel unit may be controlled to expose synchronously. Here the exposure time of the long-exposure pixel is the initial long exposure time, the exposure time of the medium-exposure pixel is the initial medium exposure time, and the exposure time of the short-exposure pixel is the initial short exposure time, the initial long, medium, and short exposure times all being preset. After the exposure ends, the second pixel unit outputs a plurality of pixel values obtained under the different exposure times.
As another possible implementation, the second pixel unit may be controlled to output a plurality of pixel values obtained with the same exposure time. For example, the medium-exposure and long-exposure pixels in the second pixel unit may be controlled to expose synchronously; or the medium-exposure and short-exposure pixels; or the medium-exposure, long-exposure, and short-exposure pixels; with the same exposure time for each exposure pixel, that is, the same exposure cut-off time for the medium-exposure, long-exposure, and short-exposure pixels. After the exposure ends, the second pixel unit outputs a plurality of pixel values obtained with the same exposure time.
Step 402: generate a second composite pixel value according to the pixel values output by the same second pixel unit.
In the present disclosure, since every photosensitive pixel unit in the pixel unit array has a second pixel unit, for each second pixel unit a second composite pixel value may be generated from the pixel values output by that second pixel unit.
As a possible implementation, after the second pixel unit is controlled to output a plurality of pixel values under different exposure times, for a given second pixel unit, the second composite pixel value may be calculated from the pixel values with the same exposure time in that second pixel unit.
For example, referring to FIG. 7, when each second pixel unit includes 1 long-exposure pixel and 2 medium-exposure pixels, the pixel value output by the single long-exposure pixel is the long-exposure second composite pixel value, and the sum of the pixel values output by the 2 medium-exposure pixels is the medium-exposure second composite pixel value; when each second pixel unit includes 2 medium-exposure pixels and 1 short-exposure pixel, the sum of the pixel values output by the 2 medium-exposure pixels is the medium-exposure second composite pixel value, and the pixel value output by the 1 short-exposure pixel is the short-exposure second composite pixel value. In this way, a plurality of medium-exposure second composite pixel values, a plurality of long-exposure second composite pixel values, and a plurality of short-exposure second composite pixel values can be obtained for the whole pixel unit array.
As another possible implementation, after the second pixel unit is controlled to output a plurality of pixel values obtained with the same exposure time, for a given second pixel unit, the average of the plurality of pixel values output by that second pixel unit may be computed to obtain the second composite pixel value, each second pixel unit corresponding to one second composite pixel value.
For example, when each second pixel unit includes 1 long-exposure pixel and 2 medium-exposure pixels, and the 1 pixel value output by the long-exposure pixel and the 2 pixel values output by the 2 medium-exposure pixels are denoted R1, R2, and R3 respectively, the second composite pixel value of that second pixel unit is (R1 + R2 + R3) / 3. A small sketch of both readout policies is given below.
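The following Python sketch illustrates step 402 under the two policies described above: per-exposure composite values (sum within each exposure kind) when the exposure times differ, and a simple average when all pixels of the unit used the same exposure time. The dictionary-based grouping and the field names are assumptions made for this example.

```python
def composite_by_exposure(values_by_kind):
    """values_by_kind: e.g. {"long": [R1], "mid": [R2, R3]}.
    The composite of each exposure kind is the sum of that kind's values."""
    return {kind: sum(vals) for kind, vals in values_by_kind.items()}


def composite_same_exposure(values):
    """All pixels of the second pixel unit were exposed for the same time."""
    return sum(values) / len(values)


# 1 long-exposure pixel (R1) and 2 medium-exposure pixels (R2, R3).
print(composite_by_exposure({"long": [200], "mid": [90, 95]}))
# {'long': 200, 'mid': 185}
print(composite_same_exposure([200, 90, 95]))   # (R1 + R2 + R3) / 3
```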
Step 403: perform imaging according to the second composite pixel value.
In the embodiments of the present disclosure, after the second composite pixel value corresponding to each second pixel unit is obtained, imaging may be performed according to the second composite pixel values.
As a possible implementation, when the second pixel unit is controlled to output a plurality of pixel values under different exposure times, and, for a given second pixel unit, the second composite pixel value is calculated from the pixel values with the same exposure time in that second pixel unit, a plurality of medium-exposure second composite pixel values and a plurality of long-exposure and/or short-exposure second composite pixel values can be obtained for the whole pixel unit array. A medium-exposure sub-image is then obtained by interpolation from the plurality of medium-exposure second composite pixel values, a long-exposure sub-image is obtained by interpolation from the plurality of long-exposure second composite pixel values, and/or a short-exposure sub-image is obtained by interpolation from the plurality of short-exposure second composite pixel values. Finally, the medium-exposure sub-image and the long-exposure sub-image and/or short-exposure sub-image are fused to obtain a high-dynamic-range imaging image. The long-exposure, medium-exposure, and short-exposure sub-images are not three frames in the traditional sense, but image portions formed by the regions corresponding to the long-, short-, and medium-exposure pixels within the same frame.
Alternatively, after the exposure of the second pixel unit ends, the pixel values of the medium-exposure pixels and/or the pixel values output by the short-exposure pixels may be superimposed onto the pixel value of the long-exposure pixel, taking the pixel value output by the long-exposure pixel as the reference. Specifically, for a given second pixel unit, different weights may be assigned to the pixel values of different exposure times; after each exposure time's pixel value is multiplied by its weight, the weighted pixel values are added up as the second composite pixel value of that second pixel unit. Since the gray level of each second composite pixel value calculated from pixel values of different exposure times will change, after the second composite pixel values are obtained, gray-level compression needs to be performed on each of them. After the compression is finished, interpolation can be performed on the plurality of compressed second composite pixel values to obtain the imaging image. In this way, the dark parts of the imaging image have been compensated by the pixel values output by the long-exposure pixels, or the bright parts have been suppressed by the pixel values output by the short-exposure pixels, so the imaging image has no overexposed or underexposed regions and has a high dynamic range and a good imaging effect. A small sketch of this weighted-fusion path is given below.
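As a hedged sketch of the weighted-fusion path described above, the following Python snippet weights and sums per-exposure values and then compresses the result back into the 8-bit gray-level range before interpolation. The weights and the max-based normalization used for the compression step are illustrative assumptions, not parameters specified by the disclosure.

```python
import numpy as np

def fuse_and_compress(long_vals, mid_vals, short_vals,
                      w_long=0.5, w_mid=0.3, w_short=0.2, out_max=255.0):
    """Weighted fusion of per-exposure composite values followed by a simple
    gray-level compression into [0, out_max]."""
    long_vals, mid_vals, short_vals = map(np.asarray, (long_vals, mid_vals, short_vals))
    fused = w_long * long_vals + w_mid * mid_vals + w_short * short_vals
    # Gray-level compression: rescale the fused values before interpolation.
    return fused / fused.max() * out_max


# Two second pixel units, each contributing a long, medium, and short value.
print(fuse_and_compress([180, 250], [120, 140], [60, 80]))
```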
As another possible implementation, after the second pixel units of the pixel unit array are controlled to output a plurality of pixel values obtained with the same exposure time, and the average of the pixel values of the same second pixel unit is computed to obtain the second composite pixel value, the plurality of second composite pixel values of the whole pixel unit array can be determined, and the imaging image can then be obtained by interpolation from the plurality of second composite pixel values.
To implement the above embodiments, the present disclosure further provides a control apparatus.
FIG. 9 is a schematic structural diagram of a control apparatus according to Embodiment 5 of the present disclosure.
As shown in FIG. 9, the control apparatus 100 is applied to an imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The control apparatus 100 includes a determining module 101, an adjusting module 102, and an imaging module 103.
The determining module 101 is configured to determine a brightness level of the ambient brightness, the brightness levels including a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness.
The adjusting module 102 is configured to, when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and take the at least two adjacently arranged medium-exposure pixels as a first pixel unit.
The imaging module 103 is configured to perform imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
As a possible implementation, the imaging module 103 is specifically configured to: control the at least two medium-exposure pixels in the first pixel unit to output pixel values; generate a first composite pixel value according to the pixel values output by the same first pixel unit; and perform imaging according to the first composite pixel value.
As a possible implementation, the adjusting module 102 is further configured to: after the brightness level of the ambient brightness is determined, when the brightness level of the shooting environment belongs to the medium brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that the medium-exposure pixels are arranged at intervals, and take, as a second pixel unit, at least two medium-exposure pixels covered by a same-color filter together with the short-exposure pixels and/or long-exposure pixels adjacent to the at least two medium-exposure pixels covered by the same-color filter.
The imaging module 103 is further configured to perform imaging according to the pixel values output by the exposure pixels in the second pixel unit.
As a possible implementation, the imaging module 103 is specifically configured to: control the exposure pixels in the second pixel unit to output pixel values; generate a second composite pixel value according to the pixel values output by the same second pixel unit; and perform imaging according to the second composite pixel value.
Further, in a possible implementation of the embodiments of the present disclosure, referring to FIG. 10, on the basis of the embodiment shown in FIG. 9, the control apparatus 100 may further include:
a first backlit-scene determining module 104, configured to determine, before the brightness level of the ambient brightness is determined, that the current shooting environment is a backlit scene according to the histogram of the captured preview image;
a second backlit-scene determining module 105, configured to determine, before the brightness level of the ambient brightness is determined, the brightness value of the imaging subject and the brightness value of the background according to the ambient brightness values measured by the pixel unit array, and determine that the current shooting environment is a backlit scene according to the brightness value of the imaging subject and the brightness value of the background.
As a possible implementation, the at least two medium-exposure pixels belonging to the same first pixel unit are covered by a same-color filter.
As a possible implementation, the adjusting module 102 is specifically configured to swap, in the pixel unit array, each medium-exposure pixel with an adjacent short-exposure or long-exposure pixel.
It should be noted that the foregoing explanations of the control method embodiments also apply to the control apparatus 100 of this embodiment, and are not repeated here.
With the control apparatus of the embodiments of the present disclosure, the brightness level of the ambient brightness is determined; when the brightness level of the shooting environment belongs to the high or low brightness level, the arrangement positions of the short-exposure, medium-exposure, and/or long-exposure pixels in the pixel unit array are adjusted so that at least two medium-exposure pixels are arranged adjacently and taken as a first pixel unit, and imaging is then performed according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit. The arrangement positions of the exposure pixels can thus be adjusted automatically according to the brightness level of the shooting environment, and imaging is performed using the pixel values output by at least two adjacently arranged medium-exposure pixels, which retains more effective information in the captured image, increases its brightness, and thereby improves the imaging effect, the imaging quality, and the user's shooting experience.
To implement the above embodiments, the present disclosure further provides an electronic device including the imaging device. The imaging device includes a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, where the exposure duration of the long-exposure pixel is greater than that of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than that of the short-exposure pixel. The electronic device further includes a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
To implement the above embodiments, the present disclosure further provides a non-transitory computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
Referring to FIG. 11, the present disclosure further provides an electronic device 200. The electronic device 200 includes a memory 50 and a processor 60. Computer-readable instructions are stored in the memory 50. When the computer-readable instructions are executed by the processor 60, the processor 60 is caused to perform the control method of any of the above implementations.
FIG. 11 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment. The electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84, which are connected via a system bus 81. The memory 50 of the electronic device 200 stores an operating system and computer-readable instructions. The computer-readable instructions can be executed by the processor 60 to implement the control method of the implementations of the present disclosure. The processor 60 is configured to provide computing and control capabilities and support the operation of the entire electronic device 200. The internal memory 82 of the electronic device 200 provides an environment for running the computer-readable instructions in the memory 50. The display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, or a key, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, touchpad, or mouse. The electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses). Those skilled in the art can understand that the structure shown in FIG. 11 is only a schematic diagram of the part of the structure related to the solution of the present disclosure and does not constitute a limitation on the electronic device 200 to which the solution of the present disclosure is applied; a specific electronic device 200 may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
Referring to FIG. 12, the electronic device 200 of the embodiments of the present disclosure includes an image processing circuit 90. The image processing circuit 90 may be implemented using hardware and/or software components and includes various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 12 is a schematic diagram of the image processing circuit 90 in one embodiment. As shown in FIG. 12, for ease of explanation, only the aspects of the image processing technology related to the embodiments of the present disclosure are shown.
As shown in FIG. 12, the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92. Image data captured by the camera 93 is first processed by the ISP processor 91, which analyses the image data to capture image statistics that can be used to determine one or more control parameters of the camera 93. The camera 93 may include one or more lenses 932 and an image sensor 934. The image sensor 934 may include a color filter array (for example, a Bayer filter); the image sensor 934 can acquire the light intensity and wavelength information captured by each imaging pixel and provide a set of raw image data that can be processed by the ISP processor 91. A sensor 94 (for example, a gyroscope) may provide parameters for processing the acquired image (for example, anti-shake parameters) to the ISP processor 91 based on the interface type of the sensor 94. The sensor 94 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above interfaces.
In addition, the image sensor 934 may also send the raw image data to the sensor 94; the sensor 94 may provide the raw image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 may store the raw image data in the image memory 95.
The ISP processor 91 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits; the ISP processor 91 may perform one or more image processing operations on the raw image data and collect statistics about the image data. The image processing operations may be performed with the same or different bit-depth precision.
The ISP processor 91 may also receive image data from the image memory 95. For example, the sensor 94 interface sends the raw image data to the image memory 95, and the raw image data in the image memory 95 is then provided to the ISP processor 91 for processing. The image memory 95 may be the memory 50, part of the memory 50, a storage device, or a separate dedicated memory within the electronic device, and may include DMA (Direct Memory Access) features.
Upon receiving the raw image data from the image sensor 934 interface, from the sensor 94 interface, or from the image memory 95, the ISP processor 91 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 95 for additional processing before being displayed. The ISP processor 91 receives the processing data from the image memory 95 and processes the image data in the raw domain as well as in the RGB and YCbCr color spaces. The image data processed by the ISP processor 91 may be output to the display 97 (the display 97 may include the display screen 83) for viewing by the user and/or further processing by a graphics engine or GPU (Graphics Processing Unit). In addition, the output of the ISP processor 91 may also be sent to the image memory 95, and the display 97 may read image data from the image memory 95. In one embodiment, the image memory 95 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 91 may be sent to an encoder/decoder 96 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 97 device. The encoder/decoder 96 may be implemented by a CPU, a GPU, or a coprocessor.
The statistics determined by the ISP processor 91 may be sent to the control logic 92 unit. For example, the statistics may include image sensor 934 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction. The control logic 92 may include a processing element and/or microcontroller executing one or more routines (for example, firmware); the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistics. For example, the control parameters of the camera 93 may include sensor 94 control parameters (for example, gain, integration time for exposure control, anti-shake parameters, and so on), camera flash control parameters, lens 932 control parameters (for example, focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for auto white balance and color adjustment (for example, during RGB processing), as well as lens 932 shading correction parameters.
For example, the following are steps of implementing the control method using the processor 60 in FIG. 11 or the image processing circuit 90 (specifically the ISP processor 91) in FIG. 12:
determining a brightness level of the ambient brightness, the brightness levels including a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and taking the at least two adjacently arranged medium-exposure pixels as a first pixel unit;
performing imaging according to the pixel values output by the at least two medium-exposure pixels in the first pixel unit.
As another example, the following are steps of implementing the control method using the processor in FIG. 11 or the image processing circuit 90 (specifically the ISP processor) in FIG. 12:
controlling the at least two medium-exposure pixels in the first pixel unit to output pixel values;
generating a first composite pixel value according to the pixel values output by the same first pixel unit;
performing imaging according to the first composite pixel value.
In the description of this specification, descriptions with reference to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples" mean that specific features, structures, materials, or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the present disclosure. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, without contradiction, those skilled in the art may combine and group the different embodiments or examples described in this specification and the features of the different embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, features defined with "first" and "second" may explicitly or implicitly include at least one of those features. In the description of the present disclosure, "a plurality of" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing steps of a customized logic function or process, and the scope of the preferred embodiments of the present disclosure includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present disclosure belong.
The logic and/or steps represented in a flowchart or otherwise described herein, for example a sequenced list of executable instructions that may be considered to implement logic functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that the parts of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above implementations, a plurality of steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another implementation, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit with logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gate circuits, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
A person of ordinary skill in the art can understand that all or part of the steps carried by the method of the above embodiments can be implemented by a program instructing related hardware; the program may be stored in a computer-readable storage medium, and when executed, the program performs one of the steps of the method embodiment or a combination thereof.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and is sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like. Although the embodiments of the present disclosure have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting the present disclosure; those of ordinary skill in the art can make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present disclosure.

Claims (20)

  1. A control method, applied to an imaging device, wherein the imaging device comprises a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, wherein an exposure duration of the long-exposure pixel is greater than an exposure duration of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than an exposure duration of the short-exposure pixel, the method comprising the following steps:
    determining a brightness level of ambient brightness, the brightness levels comprising a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
    when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjusting arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and taking the at least two adjacently arranged medium-exposure pixels as a first pixel unit; and
    performing imaging according to pixel values output by the at least two medium-exposure pixels in the first pixel unit.
  2. The control method according to claim 1, wherein the performing imaging according to pixel values output by the at least two medium-exposure pixels in the first pixel unit comprises:
    controlling the at least two medium-exposure pixels in the first pixel unit to output pixel values;
    generating a first composite pixel value according to the pixel values output by the same first pixel unit; and
    performing imaging according to the first composite pixel value.
  3. The control method according to claim 2, wherein the performing imaging according to the first composite pixel value comprises:
    performing interpolation according to the first composite pixel value to obtain an imaging image.
  4. The control method according to any one of claims 1 to 3, further comprising, after the determining a brightness level of ambient brightness:
    when the brightness level of the shooting environment belongs to the medium brightness level, adjusting the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that the medium-exposure pixels are arranged at intervals, and taking, as a second pixel unit, at least two medium-exposure pixels covered by a same-color filter together with short-exposure pixels and/or long-exposure pixels adjacent to the at least two medium-exposure pixels covered by the same-color filter; and
    performing imaging according to pixel values output by the exposure pixels in the second pixel unit.
  5. The control method according to claim 4, wherein the performing imaging according to pixel values output by the exposure pixels in the second pixel unit comprises:
    controlling the exposure pixels in the second pixel unit to output pixel values;
    generating a second composite pixel value according to the pixel values output by the same second pixel unit; and
    performing imaging according to the second composite pixel value.
  6. The control method according to any one of claims 1 to 5, further comprising, before the determining a brightness level of ambient brightness:
    determining, according to a histogram of a captured preview image, that the current shooting environment is a backlit scene.
  7. The control method according to any one of claims 1 to 5, further comprising, before the determining a brightness level of ambient brightness:
    determining a brightness value of an imaging subject and a brightness value of a background according to ambient brightness values measured by the pixel unit array; and
    determining, according to the brightness value of the imaging subject and the brightness value of the background, that the current shooting environment is a backlit scene.
  8. The control method according to any one of claims 1 to 7, wherein the at least two medium-exposure pixels belonging to the same first pixel unit are covered by a same-color filter.
  9. The control method according to any one of claims 1 to 8, wherein the adjusting arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array comprises:
    swapping, in the pixel unit array, each medium-exposure pixel with an adjacent short-exposure pixel or long-exposure pixel.
  10. A control apparatus, applied to an imaging device, wherein the imaging device comprises a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, wherein an exposure duration of the long-exposure pixel is greater than an exposure duration of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than an exposure duration of the short-exposure pixel, the apparatus comprising:
    a determining module, configured to determine a brightness level of ambient brightness, the brightness levels comprising a low brightness level, a medium brightness level, and a high brightness level arranged from low to high brightness;
    an adjusting module, configured to, when the brightness level of the shooting environment belongs to the high brightness level or the low brightness level, adjust arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that at least two medium-exposure pixels are arranged adjacently, and take the at least two adjacently arranged medium-exposure pixels as a first pixel unit; and
    an imaging module, configured to perform imaging according to pixel values output by the at least two medium-exposure pixels in the first pixel unit.
  11. The control apparatus according to claim 10, wherein the imaging module is specifically configured to:
    control the at least two medium-exposure pixels in the first pixel unit to output pixel values;
    generate a first composite pixel value according to the pixel values output by the same first pixel unit; and
    perform imaging according to the first composite pixel value.
  12. The control apparatus according to claim 11, wherein the imaging module is further configured to:
    perform interpolation according to the first composite pixel value to obtain an imaging image.
  13. The control apparatus according to any one of claims 10 to 12, wherein
    the adjusting module is further configured to: after the brightness level of the ambient brightness is determined, when the brightness level of the shooting environment belongs to the medium brightness level, adjust the arrangement positions of the short-exposure pixels, medium-exposure pixels, and/or long-exposure pixels in the pixel unit array so that the medium-exposure pixels are arranged at intervals, and take, as a second pixel unit, at least two medium-exposure pixels covered by a same-color filter together with short-exposure pixels and/or long-exposure pixels adjacent to the at least two medium-exposure pixels covered by the same-color filter; and
    the imaging module is further configured to perform imaging according to pixel values output by the exposure pixels in the second pixel unit.
  14. The control apparatus according to claim 13, wherein the imaging module is specifically configured to:
    control the exposure pixels in the second pixel unit to output pixel values;
    generate a second composite pixel value according to the pixel values output by the same second pixel unit; and
    perform imaging according to the second composite pixel value.
  15. The control apparatus according to any one of claims 10 to 14, further comprising:
    a first backlit-scene determining module, configured to determine, before the brightness level of the ambient brightness is determined, that the current shooting environment is a backlit scene according to a histogram of a captured preview image.
  16. The control apparatus according to any one of claims 10 to 14, further comprising:
    a second backlit-scene determining module, configured to determine, before the brightness level of the ambient brightness is determined, a brightness value of an imaging subject and a brightness value of a background according to ambient brightness values measured by the pixel unit array, and determine, according to the brightness value of the imaging subject and the brightness value of the background, that the current shooting environment is a backlit scene.
  17. The control apparatus according to any one of claims 10 to 16, wherein the at least two medium-exposure pixels belonging to the same first pixel unit are covered by a same-color filter.
  18. The control apparatus according to any one of claims 10 to 17, wherein the adjusting module is specifically configured to:
    swap, in the pixel unit array, each medium-exposure pixel with an adjacent short-exposure pixel or long-exposure pixel.
  19. An electronic device, comprising an imaging device, wherein the imaging device comprises a pixel unit array composed of a plurality of exposure pixels, each exposure pixel being a short-exposure pixel, a medium-exposure pixel, or a long-exposure pixel, wherein an exposure duration of the long-exposure pixel is greater than an exposure duration of the medium-exposure pixel, and the exposure duration of the medium-exposure pixel is greater than an exposure duration of the short-exposure pixel, the electronic device further comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein, when the processor executes the program, the control method according to any one of claims 1 to 9 is implemented.
  20. A non-transitory computer-readable storage medium on which a computer program is stored, wherein, when the program is executed by a processor, the control method according to any one of claims 1 to 9 is implemented.
PCT/CN2019/088244 2018-08-13 2019-05-24 Control method and apparatus, electronic device, and computer-readable storage medium WO2020034702A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810915440.8A CN108965729A (zh) 2018-08-13 2018-08-13 Control method and apparatus, electronic device, and computer-readable storage medium
CN201810915440.8 2018-08-13

Publications (1)

Publication Number Publication Date
WO2020034702A1 true WO2020034702A1 (zh) 2020-02-20

Family

ID=64469406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/088244 WO2020034702A1 (zh) 2018-08-13 2019-05-24 控制方法、装置、电子设备和计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN108965729A (zh)
WO (1) WO2020034702A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113676635A (zh) * 2021-08-16 2021-11-19 Oppo广东移动通信有限公司 Method and apparatus for generating high dynamic range image, electronic device, and storage medium
CN114697537A (zh) * 2020-12-31 2022-07-01 浙江清华柔性电子技术研究院 Image acquisition method, image sensor, and computer-readable storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108965729A (zh) 2018-08-13 2018-12-07 Oppo广东移动通信有限公司 Control method and apparatus, electronic device, and computer-readable storage medium
CN109788207B (zh) * 2019-01-30 2021-03-23 Oppo广东移动通信有限公司 Image synthesis method and apparatus, electronic device, and readable storage medium
CN110176039A (zh) * 2019-04-23 2019-08-27 苏宁易购集团股份有限公司 Camera calibration method and system for face recognition
CN116847202B (zh) * 2023-09-01 2023-12-05 深圳市广和通无线通信软件有限公司 Exposure adjustment method and device based on four-color filter array algorithm

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1134565A1 (en) * 2000-03-13 2001-09-19 CSEM Centre Suisse d'Electronique et de Microtechnique SA Imaging pyrometer
CN108353134A (zh) * 2015-10-30 2018-07-31 三星电子株式会社 Photographing apparatus using multiple exposure sensor and photographing method thereof
CN108353140A (zh) * 2015-11-16 2018-07-31 微软技术许可有限责任公司 Image sensor system
CN105578065A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 Method for generating high dynamic range image, photographing apparatus, and terminal
CN105578075A (zh) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 Method for generating high dynamic range image, photographing apparatus, and terminal
CN108200354A (zh) * 2018-03-06 2018-06-22 广东欧珀移动通信有限公司 Control method and apparatus, imaging device, computer device, and readable storage medium
CN108322669A (zh) * 2018-03-06 2018-07-24 广东欧珀移动通信有限公司 Image acquisition method and apparatus, imaging apparatus, computer-readable storage medium, and computer device
CN108965729A (zh) * 2018-08-13 2018-12-07 Oppo广东移动通信有限公司 Control method and apparatus, electronic device, and computer-readable storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114697537A (zh) * 2020-12-31 2022-07-01 浙江清华柔性电子技术研究院 Image acquisition method, image sensor, and computer-readable storage medium
CN114697537B (zh) * 2020-12-31 2024-05-10 浙江清华柔性电子技术研究院 Image acquisition method, image sensor, and computer-readable storage medium
CN113676635A (zh) * 2021-08-16 2021-11-19 Oppo广东移动通信有限公司 Method and apparatus for generating high dynamic range image, electronic device, and storage medium
CN113676635B (zh) * 2021-08-16 2023-05-05 Oppo广东移动通信有限公司 Method and apparatus for generating high dynamic range image, electronic device, and storage medium

Also Published As

Publication number Publication date
CN108965729A (zh) 2018-12-07

Similar Documents

Publication Publication Date Title
WO2020034737A1 (zh) Imaging control method and apparatus, electronic device, and computer-readable storage medium
CN108322669B (zh) Image acquisition method and apparatus, imaging apparatus, and readable storage medium
CN109005364B (zh) Imaging control method and apparatus, electronic device, and computer-readable storage medium
KR102376901B1 (ko) Imaging control method and imaging device
US10630906B2 (en) Imaging control method, electronic device and computer readable storage medium
EP3609177B1 (en) Control method, control apparatus, imaging device, and electronic device
CN109788207B (zh) Image synthesis method and apparatus, electronic device, and readable storage medium
CN109040609B (zh) Exposure control method and apparatus, electronic device, and computer-readable storage medium
WO2020057199A1 (zh) Imaging method and apparatus, and electronic device
WO2020029732A1 (zh) Panoramic shooting method and apparatus, and imaging device
WO2020034702A1 (zh) Control method and apparatus, electronic device, and computer-readable storage medium
WO2020034701A1 (zh) Imaging control method and apparatus, electronic device, and readable storage medium
CN108632537B (zh) Control method and apparatus, imaging device, computer device, and readable storage medium
CN108683861A (zh) Shooting exposure control method and apparatus, imaging device, and electronic device
CN109040607B (zh) Imaging control method and apparatus, electronic device, and computer-readable storage medium
US11601600B2 (en) Control method and electronic device
WO2020029679A1 (zh) Control method and apparatus, imaging device, electronic device, and readable storage medium
CN109005369B (zh) Exposure control method and apparatus, electronic device, and computer-readable storage medium
CN109005363B (zh) Imaging control method and apparatus, electronic device, and storage medium
CN108900785A (zh) Exposure control method and apparatus, and electronic device
CN108259754B (zh) Image processing method and apparatus, computer-readable storage medium, and computer device
CN110276730B (zh) Image processing method and apparatus, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19849541

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19849541

Country of ref document: EP

Kind code of ref document: A1