CN114697584B - Image processing system and image processing method - Google Patents
- Publication number
- CN114697584B (application number CN202011634982.1A)
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
- H04N25/57—Control of the dynamic range
- H04N25/58—Control of the dynamic range involving two or more exposures
- H04N25/581—Control of the dynamic range involving two or more exposures acquired simultaneously
- H04N25/585—Control of the dynamic range involving two or more exposures acquired simultaneously with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/77—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
Abstract
The application provides an image processing system and an image processing method. An optical unit blocks part of the wavelength spectrum of incident light and outputs a target optical signal incident on the image sensor. The image sensor unit converts the light sensed by each pixel in the image sensor's pixel array from the target optical signal into an electrical signal, and outputs an image signal in a first format through the sensor's readout circuit. An exposure control unit outputs an exposure control signal to the image sensor unit to control the exposure of each channel of each pixel in the pixel array. An image processing unit performs first processing on the image signal in the first format to obtain a first processed image in the first format, then performs second processing on that image to obtain a second processed image in a second format. The image processing system can optimize the image processing effect.
Description
Technical Field
The present disclosure relates to the field of image sensor design, and in particular, to an image processing system and an image processing method.
Background
A conventional RGB-IR sensor senses part of the infrared component in its RGB channels, and this infrared component is large. In scenes with strong infrared light energy, removing the infrared component from the visible-light channels causes some loss of color information. Moreover, because the RGB channels and the infrared channel share the same exposure parameters, it is difficult to expose both signals properly.
Disclosure of Invention
In view of this, the present application provides an image sensor, an image processing system, and an image processing method.
Specifically, the application is realized by the following technical scheme:
according to a first aspect of embodiments of the present application, there is provided an image processing system including at least an image sensor unit, an optical unit, an image processing unit, an exposure control unit; wherein,
the optical unit is configured to block part of the wavelength spectrum of incident light and to output a target optical signal incident on the image sensor;
the image sensor unit includes an image sensor configured to convert the light sensed by each pixel in its pixel array from the target optical signal into an electrical signal and to output an image signal in a first format through the sensor's readout circuit, wherein the pixel array of the image sensor includes at least two types of pixels, and the first type of pixels allows stronger near-infrared light to pass than the second type of pixels;
The exposure control unit is used for outputting an exposure control signal to the image sensor unit so as to control the exposure of each channel of each pixel in the pixel array of the image sensor unit;
the image processing unit is used for performing first processing on the image signal in the first format to obtain a first processed image in the first format; and performing second processing on the first processed image in the first format to obtain a second processed image in a second format.
According to a second aspect of embodiments of the present application, there is provided an image processing method, including:
performing first processing on an image signal in a first format output by an image sensor to obtain a first processed image in the first format, wherein the pixel array of the image sensor includes at least two types of pixels, and the first type of pixels allows stronger near-infrared light to pass than the second type of pixels;
and performing second processing on the first processed image in the first format to obtain a second processed image in a second format.
In the image processing system of the present application, at least an image sensor unit, an optical unit, an image processing unit, and an exposure control unit are provided. The optical unit blocks part of the wavelength spectrum of incident light and outputs a target optical signal incident on the image sensor. The image sensor unit includes an image sensor that converts the light sensed by each pixel in its pixel array from the target optical signal into an electrical signal and outputs an image signal in a first format through the sensor's readout circuit. The exposure control unit outputs an exposure control signal to the image sensor unit to control the exposure of each channel of each pixel in the pixel array. The image processing unit performs first processing on the image signal in the first format to obtain a first processed image in the first format, then performs second processing on that image to obtain a second processed image in a second format, thereby optimizing the image processing effect.
The pixel array includes red, green, blue, dark, and white pixels, and each pixel includes a microlens, a color filter unit, and a photosensitive unit. The pixel array is composed of a plurality of RGBDW pixel units in which the white pixels account for 1/2 of all pixels and the red, green, blue, and dark pixels each account for 1/8 of all pixels. Within an RGBDW pixel unit, the white pixels occupy every other position, with adjacent rows staggered (a checkerboard arrangement); the red, green, blue, and dark pixels fill the remaining positions, and the two pixels belonging to the same channel among them are diagonally separated by one pixel.
Drawings
Fig. 1 is a schematic view showing a structure of an image sensor according to an exemplary embodiment of the present application;
fig. 2 is a schematic structural view of another image sensor according to still another exemplary embodiment of the present application;
fig. 3A to 3F are schematic views of pixel arrangements of RGBDW pixel units shown in exemplary embodiments of the present application;
FIG. 4 is a schematic diagram of a different channel spectral response curve as shown in an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of signal routing in a two-way control signal scenario according to an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of an exposure timing sequence shown in an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of an image processing system according to an exemplary embodiment of the present application;
FIG. 8 is a schematic diagram of another image processing system according to an exemplary embodiment of the present application;
fig. 9 is a schematic diagram of a structure of another image processing system shown in an exemplary embodiment of the present application;
FIG. 10 is a flow chart of an image processing method according to an exemplary embodiment of the present application;
FIG. 11 is a schematic diagram illustrating a convolution operation of a first processed image through different filter banks according to an exemplary embodiment of the present application;
FIG. 12 is a schematic diagram of a pixel arrangement according to an exemplary embodiment of the present application;
FIG. 13 is a schematic diagram of a different channel spectral response curve according to an exemplary embodiment of the present application;
FIG. 14 is a flow chart of a near infrared light compensation control method according to an exemplary embodiment of the present application;
fig. 15 is a flowchart illustrating an exposure control method according to an exemplary embodiment of the present application;
fig. 16 is a schematic structural view of an image processing apparatus according to an exemplary embodiment of the present application;
fig. 17 is a schematic diagram of a structure of an exposure control apparatus shown in an exemplary embodiment of the present application;
fig. 18 is a schematic diagram of a hardware structure of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In order to better understand the technical solutions provided by the embodiments of the present application and make the above objects, features and advantages of the embodiments of the present application more obvious, the technical solutions in the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a schematic structural diagram of an image sensor according to an embodiment of the present application is shown in fig. 1, where the image sensor may include: a pixel array including red, green, blue, dark, and white pixels, each of the pixels in the pixel array including a microlens 110, a color filter unit 120, and a light sensing unit 130.
Illustratively, the pixel array includes a plurality of RGBDW (Red, Green, Blue, Dark, White) pixel units, each a 4×4 unit composed of red, green, blue, dark, and white pixels.
Illustratively, the RGBDW pixel cell may be the smallest individual cell in the pixel array.
For example, in the RGBDW pixel unit, the white light pixel accounts for 1/2 of all pixels, and the red pixel, the green pixel, the blue pixel and the dark light pixel respectively account for 1/8 of all pixel numbers;
in the RGBDW pixel unit, the white pixels occupy every other position, with adjacent rows staggered by one pixel (a checkerboard arrangement);
the red, green, blue, and dark pixels in the RGBDW pixel unit fill the remaining positions, and the two pixels belonging to the same channel among them are diagonally separated by one pixel.
By way of example, a channel refers to the set of pixels of one color type in the image sensor; the sensor in this application can thus be understood as a 5-channel sensor comprising an R channel, a G channel, a B channel, a W channel, and a D channel.
For example, in an RGBDW pixel unit, a red pixel corresponds to an R channel (may be referred to as a red light channel), and two R pixels (i.e., red pixels) in the RGBDW pixel unit are diagonally spaced apart by one other pixel.
In the RGBDW pixel unit, the dark pixels correspond to a D channel (which may be referred to as the dark-light channel), and the two D pixels (i.e., dark pixels) in the RGBDW pixel unit are diagonally separated by one other pixel.
In this embodiment of the present application, the image sensor may include a pixel array, where the pixel array may include a red pixel (R pixel), a green pixel (G pixel), a blue pixel (B pixel), a dark pixel (D pixel) and a white pixel (W pixel), and each pixel in the pixel array includes a microlens, a color filter unit and a photosensitive unit.
Each pixel senses light through its photosensitive unit, which converts the optical signal into an electrical signal.
Illustratively, the W pixel accounts for 1/2 of all pixels in the pixel array, and the R pixel, the G pixel, the B pixel and the D pixel respectively account for 1/8 of all pixel numbers.
In the pixel array, the R, G, B, D, and W pixels may be arranged as repetitions of a 4×4 minimum independent unit (which may be referred to as an RGBDW pixel unit). The W pixels occupy every other position, with adjacent rows staggered; the R, G, B, and D pixels fill the remaining positions, with the two pixels of the same channel diagonally separated by one pixel, as illustrated in any one of figs. 3A to 3F.
Note that the RGBDW pixel unit arrangements shown in figs. 3A to 3F (i.e., the placements of the R, G, B, D, and W pixels within the 4×4 RGBDW pixel unit) are merely specific examples of the arrangements in the embodiments of the present application and do not limit its scope of protection. For instance, cyclically shifting the arrangement shown in fig. 3A by one, two, or three columns, by one row, or by one row combined with one, two, or three columns yields 7 new RGBDW pixel unit arrangements; that is, the arrangement of fig. 3A can be expanded into 8 arrangements in total. Similarly, the arrangement shown in any of figs. 3B to 3F can be expanded into 8 arrangements, so 48 RGBDW pixel unit arrangements may exist in total.
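The arrangement rules above can be checked mechanically. The sketch below builds a hypothetical 4×4 RGBDW unit (the actual layouts of figs. 3A to 3F are not reproduced in this text, so this particular placement is an assumption consistent with the stated rules), verifies the three constraints, and generates the 8 cyclically shifted variants described above:

```python
# A hypothetical 4x4 RGBDW unit consistent with the textual rules above
# (the actual layouts of figs. 3A-3F are not reproduced in this text).
UNIT = [
    ["W", "R", "W", "D"],
    ["G", "W", "B", "W"],
    ["W", "D", "W", "R"],
    ["B", "W", "G", "W"],
]

def counts_ok(p):
    """W pixels are 1/2 of the unit; R, G, B, D are 1/8 each."""
    flat = [ch for row in p for ch in row]
    return flat.count("W") == 8 and all(flat.count(c) == 2 for c in "RGBD")

def checkerboard_ok(p):
    """W occupies every other position, with adjacent rows staggered."""
    w_pos = [(r, c) for r in range(4) for c in range(4) if p[r][c] == "W"]
    parity = sum(w_pos[0]) % 2
    return len(w_pos) == 8 and all((r + c) % 2 == parity for r, c in w_pos)

def diagonal_ok(p):
    """Same-channel pairs sit diagonally with one pixel between them,
    i.e. at offset (2, 2) on the repeating (toroidal) 4x4 unit."""
    for ch in "RGBD":
        (r1, c1), (r2, c2) = [(r, c) for r in range(4) for c in range(4)
                              if p[r][c] == ch]
        if ((r2 - r1) % 4, (c2 - c1) % 4) != (2, 2):
            return False
    return True

def shifted(p, dr, dc):
    """Cyclically shift the unit by dr rows and dc columns."""
    return [[p[(r + dr) % 4][(c + dc) % 4] for c in range(4)]
            for r in range(4)]

# The 8 arrangements derived from one figure: the original plus the
# 7 shifts described in the text (1-3 columns, 1 row, 1 row + 1-3 columns).
variants = [shifted(UNIT, dr, dc) for dr in (0, 1) for dc in range(4)]
```

Because the unit tiles the sensor periodically, every cyclic shift preserves all three constraints, which is why each figure expands to 8 valid arrangements.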
On the one hand, the number of W pixels in the pixel array is half the total number of pixels, which ensures the signal-to-noise ratio of the luminance component of the interpolated image under low-illumination fill light. On the other hand, the remaining half of the pixels are distributed uniformly, so each visible channel has the same number of photosensitive pixels; this preserves, to the greatest extent, equal amounts of information for the image's color components, and color-channel interpolation can exploit the interpolation information of the W channel, so that both color and image detail are taken into account.
In some embodiments, in the RGBDW pixel unit, the R pixel and the D pixel in the same row are separated by one pixel.
For example, because the spectral responses of the R and D pixels are relatively close, distributing them uniformly in the pixel arrangement gives better neighborhood symmetry during interpolation, which makes it easier to use R-channel information when interpolating D pixels. Therefore, in the 4×4 minimum independent unit of the pixel array, the R and D pixels share rows and columns, with the R pixel and D pixel in the same row separated by one pixel, as shown in fig. 3A or fig. 3B.
For example, pixels of different channels are distinguished by coating different filter materials on their color filter units, so that the wavelength ranges of light allowed to pass by the color filter units of different channels differ.
In some embodiments, the filter material coated by the color filter unit corresponding to the W pixel allows red light, green light, blue light, and near infrared light to pass through;
the filter material of the color filter unit corresponding to the D pixel allows weaker visible light (red, green, and blue light) to pass than that of the W pixel, while still allowing near-infrared light to pass;
the filter material of the color filter unit corresponding to the R pixel allows red light and part of the near-infrared light to pass;
the filter material of the color filter unit corresponding to the G pixel allows green light and part of the near-infrared light to pass;
the filter material of the color filter unit corresponding to the B pixel allows blue light and part of the near-infrared light to pass.
By way of example, near-infrared light may refer to short-wave near-infrared light with a wavelength below 1100 nm.
Illustratively, the filter materials of the color filter units corresponding to the R, G, and B pixels allow only part of the near-infrared light to pass; that is, they pass less near-infrared energy than the W pixel's filter material allows.
Likewise, the filter materials of the color filter units corresponding to the R, G, and B pixels pass less near-infrared energy than the D pixel's filter material allows.
For example, the wavelength ranges of the near-infrared light that the W, D, R, G, and B pixels allow to pass need not be identical, and the ranges passed by the R, G, and B pixels are narrower than those passed by the W and D pixels.
Illustratively, the intensity of near infrared light allowed to pass through by the filter material applied by the filter unit corresponding to the R pixel, the G pixel, and the B pixel is weaker than the intensity of near infrared light allowed to pass through by the filter material applied by the filter unit corresponding to the W pixel.
In some embodiments, in the first wavelength range [ T1, T2], the integral value of the spectral response curve of the W pixel is greater than the integral value of the spectral response curve of the R pixel, the integral value of the spectral response curve of the G pixel, and the integral value of the spectral response curve of the B pixel, respectively, and the integral value of the spectral response curve of the D pixel is greater than the integral value of the spectral response curve of the R pixel, the integral value of the spectral response curve of the G pixel, and the integral value of the spectral response curve of the B pixel;
Illustratively, the first wavelength range [T1, T2] satisfies T1 > 700 nm and T2 < 1000 nm.
Light of different wavelengths but equal energy produces electrical signals of different magnitudes after passing through the color filter unit and the photoelectric conversion unit; the magnitude of the electrical signal produced by light of a given wavelength at a specified energy is referred to as the spectral response at that wavelength.
The larger the integral of the W spectral response curve over [T1, T2], the higher the upper limit of the image's signal-to-noise ratio; and the lower the integrals of the R, G, and B spectral response curves, the lower the infrared component in the RGB image signal. Thus, while the signal-to-noise ratio of the W-channel image is ensured, the visible-light pixels (R, G, and B pixels) carry less infrared component, which facilitates recovering the RGB color information of the RGB channels.
Further, the higher the integral of the D spectral response curve, the higher the upper limit of the infrared image's signal-to-noise ratio, and the less image noise remains after the infrared component is removed from the visible-light channels.
It should be noted that, unless otherwise specified, the infrared light mentioned in the embodiments of the present application may refer to near infrared light within 1100nm as described above.
For example, in [ T1, T2], the integral value of the spectral response curve of the W pixel in the image sensor may be made larger than the integral value of the spectral response curve of the R pixel, the integral value of the spectral response curve of the G pixel, and the integral value of the spectral response curve of the B pixel, respectively, so that the infrared component of the RGB pixel is reduced and the RGB color recovery effect of the RGB channel is improved under the condition of ensuring the signal-to-noise ratio of the W channel image; in addition, the integral value of the spectral response curve of the D pixel is respectively larger than the integral value of the spectral response curve of the R pixel, the integral value of the spectral response curve of the G pixel and the integral value of the spectral response curve of the B pixel, so that the image noise after the infrared component in the visible light channel is removed is reduced, and the image effect is optimized.
In one example, the integrated value of the spectral response curve of the D pixel is not greater than the integrated value of the spectral response curve of the W pixel in the first wavelength range [ T1, T2 ].
In one example, a ratio of an integrated value of the spectral response curve of the W pixel to integrated values of the spectral response curves of the R pixel, the G pixel, and the B pixel is 3 or more.
Illustratively, in [ T1, T2], a ratio of an integrated value of a spectral response curve of the W pixel to an integrated value of an RGB spectral response curve in the image sensor may be made 3 or more, that is, an integrated value of a spectral response curve of the W pixel to an integrated value of a spectral response curve of the R pixel is 3,W pixels or more to an integrated value of a spectral response curve of the G pixel or more to 3 or more, and an integrated value of a spectral response curve of the W pixel to an integrated value of a spectral response curve of the B pixel or more to 3 or more.
It should be noted that, in the image sensor, the integrals of the spectral response curves of the R, G, and B pixels are similar; that is, the pairwise differences are smaller than a preset threshold.
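The integral constraints above can be expressed directly in code. The sketch below uses made-up sampled spectral response values (the patent's real curves are shown in fig. 4 and are not reproduced here, so the numbers are purely illustrative) and a trapezoidal integral over a first wavelength range assumed to be [750 nm, 950 nm]:

```python
def band_integral(samples, lo, hi):
    """Trapezoidal integral of a sampled spectral response curve over
    [lo, hi] nm. `samples` maps wavelength (nm) -> relative response."""
    pts = sorted((w, r) for w, r in samples.items() if lo <= w <= hi)
    return sum((w2 - w1) * (r1 + r2) / 2
               for (w1, r1), (w2, r2) in zip(pts, pts[1:]))

# Hypothetical relative near-infrared responses (illustrative numbers
# only; the actual curves appear in fig. 4 of the patent).
NIR = {
    "W": {750: 0.90, 800: 0.85, 850: 0.80, 900: 0.70, 950: 0.60},
    "D": {750: 0.85, 800: 0.80, 850: 0.75, 900: 0.65, 950: 0.55},
    "R": {750: 0.20, 800: 0.15, 850: 0.10, 900: 0.08, 950: 0.05},
    "G": {750: 0.18, 800: 0.14, 850: 0.10, 900: 0.07, 950: 0.05},
    "B": {750: 0.19, 800: 0.14, 850: 0.09, 900: 0.07, 950: 0.04},
}

T1, T2 = 750, 950  # assumed first wavelength range (T1 > 700, T2 < 1000)
I = {ch: band_integral(curve, T1, T2) for ch, curve in NIR.items()}

# Constraints from this section: W and D each exceed R, G, B;
# D does not exceed W; and W is at least 3x each of R, G, B.
constraints_met = (all(I["W"] > I[c] and I["D"] > I[c] for c in "RGB")
                   and I["D"] <= I["W"]
                   and all(I["W"] / I[c] >= 3 for c in "RGB"))
```

A sensor design tool could run a check like this against measured curves to confirm that a candidate filter set satisfies the claimed inequalities.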
In some embodiments, in the second wavelength range [ T3, T4], the integrated values of the spectral response curves of the R pixel, the G pixel, and the B pixel are each greater than the integrated value of the spectral response curve of the D pixel.
Illustratively, the second wavelength range satisfies 380 nm ≤ T3 ≤ 480 nm, 600 nm ≤ T4 ≤ 700 nm, and T4 < T1.
Illustratively, in the second wavelength range (which may also be referred to as the visible wavelength range, herein denoted as [ T3, T4 ]), the integral value of the spectral response curve of the D pixel should be lower than the integral value of the spectral response curve of the R pixel, the integral value of the spectral response curve of the G pixel, and the integral value of the spectral response curve of the B pixel.
In one example, the ratio of the spectral response of R, G, and B pixels to the spectral response of D pixels is greater than 8.
Illustratively, to ensure that the visible-light response of the infrared (D) pixels does not interfere with the infrared-removal process, the ratio of the integral of the spectral response curves of the R, G, and B pixels to the integral of the D pixel's spectral response curve over [T3, T4] may be made greater than 8.
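A consequence of the D channel passing almost no visible light is that it can serve as an infrared reference when removing the infrared component from the visible channels. The patent does not spell out the removal formula here; the sketch below shows the common per-pixel approach that the constraint enables — subtracting a scaled D value from each visible sample — with hypothetical scale factors that, in practice, would be calibrated from the spectral response curves:

```python
def remove_infrared(rgb, d, k=(1.0, 1.0, 1.0)):
    """Remove the infrared component from one pixel's visible samples by
    subtracting the scaled dark-channel value, clamping at zero.
    `k` holds per-channel scale factors (hypothetical values here; real
    ones would come from calibrating the spectral responses)."""
    return tuple(max(v - ki * d, 0.0) for v, ki in zip(rgb, k))

# Example: visible samples carrying an infrared pedestal of ~40 counts.
corrected = remove_infrared((120.0, 90.0, 60.0), d=40.0)  # -> (80.0, 50.0, 20.0)
```

Because the D pixel's visible response is at least 8× weaker than that of the R, G, and B pixels, the subtracted term is almost purely infrared, so little genuine color signal is lost.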
For example, the spectral response curves of the different channels after the color filters are coated with the filter material meeting the above requirements can be shown in fig. 4.
For example, as shown in fig. 2, below the pixel array of the image sensor lies the circuit portion of the sensor, which includes two parts: a charge readout circuit and a control circuit.
The charge readout transfers the electrical signals (charges) accumulated during the sensor's exposure time; the charge of each pixel is read out under the control signal of the circuit.
Illustratively, the overall physical process in the image sensor is as follows: light entering the pixel array passes through the microlens array and the color filters, and the light energy is converted into a charge value according to the energy of the incident light and the sensor's spectral response curve. The charge readout module then outputs the charge accumulated by the photosensitive devices to obtain the sensing result.
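The physical process just described — incident energy weighted by the spectral response and accumulated into charge over the exposure time — can be modeled as a simple weighted sum. The sketch below uses an illustrative flat incident spectrum and hypothetical W and R responses (not the patent's actual curves) to show why, for the same illumination and exposure time, a W pixel accumulates more charge than an R pixel:

```python
def accumulated_charge(incident, response, exposure_s, step_nm=50):
    """Charge accumulated by one pixel (arbitrary units): incident
    energy per wavelength, weighted by the channel's spectral response,
    summed over wavelength and scaled by exposure time."""
    return exposure_s * step_nm * sum(
        incident[w] * response[w] for w in incident if w in response)

# Illustrative flat incident spectrum and two hypothetical responses.
E = {750: 1.0, 800: 1.0, 850: 1.0, 900: 1.0, 950: 1.0}
S_W = {750: 0.90, 800: 0.85, 850: 0.80, 900: 0.70, 950: 0.60}
S_R = {750: 0.20, 800: 0.15, 850: 0.10, 900: 0.08, 950: 0.05}

q_w = accumulated_charge(E, S_W, exposure_s=1.0)
q_r = accumulated_charge(E, S_R, exposure_s=1.0)
```

The linear dependence on exposure time in this model is what makes per-channel exposure control (described next) an effective lever for balancing the channels.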
In some embodiments, the image sensor may further include an exposure time control circuit (not shown in the figure), configured to output at least one exposure time control signal, the first of which is used at least to control the exposure time of the non-white pixels, where the non-white pixels include the R, G, B, and D pixels.
The image sensor may implement exposure time control of each pixel by an exposure time control circuit, for example.
For example, the exposure time control circuit of the image sensor may be configured to output at least one exposure time control signal.
As one possible implementation, the exposure time control circuit of the image sensor may be configured to output a first path of exposure time control signal that is configured to control the exposure times of both the non-white light pixels and the W pixels.
By way of example, the exposure time control circuit of the image sensor can control the exposure time of the W pixel, the D pixel, the R pixel, the G pixel and the B pixel through one path of exposure time control signal, so that the process implementation difficulty of the image sensor is reduced.
As another possible implementation, the exposure time control circuit of the image sensor may be configured to output two exposure time control signals, where a first one of the two exposure time control signals is configured to control the exposure time of the non-white light pixel and a second one of the two exposure time control signals is configured to control the exposure time of the W pixel.
By way of example, considering that the spectral response curves of the RGB channels, the D channel, and the W channel differ in shape, and that the W channel is highly independent, the exposure time of the non-white light pixels can be controlled by one path of exposure time control signal while the exposure time of the W pixels is controlled by the other path. Because the W exposure time is controlled independently, trailing blur of moving objects in the white light path (i.e., the light-supplement path) image can be reduced under the condition of sufficient visible light exposure, improving the image effect.
For example, a schematic diagram of the exposure time control circuit controlling the exposure time of the non-white light pixel and the W pixel by two paths of exposure time control signals, respectively, may be shown in fig. 5.
It should be noted that, in the control signals shown in fig. 5, the signal traces only represent connection relationships and do not represent the physical hardware or software implementation.
For example, let r be the row position of the pixel in the sensor array and c be the column position of the pixel in the sensor array.
All pixel positions where r % 2 = 1 and c % 2 = 0, or r % 2 = 0 and c % 2 = 1, are W pixels, where % is the remainder operation. The pixels at the remaining positions are R pixels, G pixels, B pixels, or D pixels, which are all connected by one control signal line.
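The parity rule above can be written as a one-line predicate; a minimal sketch (the "other" label is ours, since the parity rule alone does not fix the RGBD sub-pattern):

```python
def pixel_kind(r, c):
    """W pixels occupy positions where exactly one of row r and column c
    is odd, i.e. r % 2 != c % 2; every other position holds an R, G, B,
    or D pixel."""
    return "W" if r % 2 != c % 2 else "other"
```

Under this rule the W pixels form a checkerboard covering half of the array, e.g. 8 of the 16 positions in any 4 × 4 tile.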
The connection of the signal lines may be different for other RGBDW arrangement sensors that meet the requirements.
In one example, when the exposure time control circuit is configured to output at least one path of exposure time control signal and the first path of exposure time control signal is configured to control at least the exposure time of the non-white light pixels, the first path of exposure time control signal may be configured to control at least the exposure start time of the R pixels, G pixels, B pixels, and D pixels;
or the first path of exposure time control signal is at least used for controlling the exposure ending time of the R pixel, the G pixel, the B pixel and the D pixel;
or the first path of exposure time control signal is at least used for controlling the exposure start time and the exposure end time of the R pixel, the G pixel, the B pixel and the D pixel.
For example, the exposure time control signal output from the exposure time control circuit of the image sensor may be used to control the exposure start time and/or the exposure end time to control the exposure time of the corresponding pixel.
When the exposure time control circuit is used for outputting at least one path of exposure time control signal, the first path of exposure time control signal is at least used for controlling the exposure time of the non-white light pixels, and the first path of exposure time control signal can be at least used for controlling the exposure starting time of the R pixels, the G pixels, the B pixels and the D pixels; alternatively, the first path of exposure time control signal may be used to control at least the exposure end times of the R, G, B, and D pixels; alternatively, the first path of exposure time control signal may be used to control at least the exposure start time and the exposure end time of the R pixel, the G pixel, the B pixel, and the D pixel.
In some embodiments, the image sensor may further include: and the sensor gain control circuit is used for outputting at least two paths of gain control signals, wherein the first path of gain control signal is used for controlling the gains of the R pixel, the G pixel and the B pixel, and the second path of gain control signal is at least used for controlling the gain of the W pixel.
The image sensor may implement gain control of each pixel by a sensor gain control circuit, for example.
Illustratively, the gain may include analog gain and/or digital gain.
For example, considering the spectral response characteristics of the different channels, the difference in light sensitivity between the white light channel and the RGB channels is large. If the sensor is configured with a gain suited to the sensitivity of only one channel, the images of the other channels will be underexposed or overexposed, which is unfavorable for subsequent image processing. Therefore, the sensor gain control circuit of the image sensor can control the gain of each pixel through at least two paths of gain control signals to optimize the image effect.
For example, the sensor gain control circuit of the image sensor may output at least two gain control signals, the first gain control signal being used to control the gains of the R pixel, the G pixel, and the B pixel; the second path of gain control signal is at least used for controlling the gain of the W pixel.
As one possible implementation, the sensor gain control circuit may output two gain control signals, the second gain control signal being used to control the gains of the W pixel and the D pixel.
As another possible implementation, the sensor gain control circuit may output three gain control signals, a second gain control signal for controlling the gain of the W pixel and a third gain control signal for controlling the gain of the D pixel.
In one example, the exposure time control circuit is configured to output two paths of exposure time control signals, wherein a first path of exposure time control signal is configured to control exposure time of a non-white light pixel, a second path of exposure time control signal is configured to control exposure time of a W pixel, the sensor gain control circuit is configured to output three paths of gain control signals, wherein the first path of gain control signal is configured to control gains of an R pixel, a G pixel, and a B pixel, the second path of gain control signal is configured to control gains of the W pixel, and the third path of gain control signal is configured to control gains of a D pixel.
The image sensor may, for example, control the exposure time of each pixel by outputting two paths of exposure time control signals through the exposure time control circuit. One path (namely the first path of exposure time control signal) is used for controlling the exposure time of the non-white light pixels, and the other path (which may be referred to as the second path of exposure time control signal) is used for controlling the exposure time of the W pixels. Because the exposure time of the W pixels is controlled independently, trailing blur of moving objects in the white light path (i.e., the light-supplement path) image can be reduced under the condition of sufficient visible light exposure, improving the image effect.
By way of example, considering that the exposure times of the non-white light pixels and the W pixels are controlled separately, the exposure times of the RGBD channels and the white light channel may differ, so the RGBD channels and the white light channel may need different gains to be properly exposed.
In addition, if the gain of the D pixels is kept the same as that of the RGB pixels, the D pixels are liable to overexpose because their infrared component response is higher than that of the RGB pixels. Thus, the gain of the D pixels needs to be controlled separately from the RGB path.
Thus, the image sensor can control the gain of each pixel by outputting three gain control signals through the sensor gain control circuit. One path of gain control signal (may be referred to as a first path of gain control signal) is used for controlling the gains of the R pixel, the G pixel and the B pixel, the other path of gain control signal (may be referred to as a second path of gain control signal) is used for controlling the gain of the W pixel, and the last path of gain control signal (may be referred to as a third path of gain control signal) is used for controlling the gain of the D pixel so as to optimize the image processing effect.
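The three-path gain routing just described can be sketched as a lookup. The path numbers and function names below are illustrative, not from the patent:

```python
# Hypothetical routing of the three gain control signal paths to pixel
# types: the first path drives R/G/B, the second W, the third D.
GAIN_PATH_FOR = {"R": 1, "G": 1, "B": 1, "W": 2, "D": 3}

def apply_gain(pixel_type, raw_value, path_gains):
    """path_gains maps each gain control path (1, 2, 3) to its configured
    gain; each pixel is amplified by the gain of the path that drives it."""
    return raw_value * path_gains[GAIN_PATH_FOR[pixel_type]]
```

With illustrative per-path gains, pixels of the same group always receive identical amplification, while W and D can be tuned independently of RGB.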
It should be noted that, in the embodiment of the present application, when the exposure time control circuit outputs two paths of exposure time control signals, the sensor gain control circuit may output three paths of analog gain control signals, where the first path of analog gain control signal is used to control analog gains of R pixels, G pixels, and B pixels, the second path of analog gain control signal is used to control analog gains of W pixels, and the third path of analog gain control signal is used to control analog gains of D pixels.
Or,
under the condition that the exposure time control circuit outputs two paths of exposure time control signals, the sensor gain control circuit can output three paths of digital gain control signals, wherein the first path of digital gain control signal is used for controlling the digital gains of R pixels, G pixels and B pixels, the second path of digital gain control signal is used for controlling the digital gain of W pixels, and the third path of digital gain control signal is used for controlling the digital gain of D pixels.
Or,
under the condition that the exposure time control circuit outputs two paths of exposure time control signals, the sensor gain control circuit can output three paths of analog gain control signals and three paths of digital gain control signals, wherein the first path of analog gain control signals and the first path of digital gain control signals are used for controlling analog gains and digital gains of R pixels, G pixels and B pixels, the second path of analog gain control signals and the second path of digital gain control signals are used for controlling analog gains and digital gains of W pixels, and the third path of analog gain control signals and the third path of digital gain control signals are used for controlling analog gains and digital gains of D pixels.

In one example, the first path of exposure time control signals is used to control the exposure start times of the R pixels, G pixels, B pixels, and D pixels, and the second path of exposure time control signals is used to control the exposure start time of the W pixels;
or the first path of exposure time control signals is used for controlling the exposure end time of the R pixels, G pixels, B pixels, and D pixels, and the second path of exposure time control signals is used for controlling the exposure end time of the W pixels;
alternatively, the first path of exposure time control signal is used for controlling the exposure start time and exposure end time of the R pixels, G pixels, B pixels, and D pixels, and the second path of exposure time control signal is used for controlling the exposure start time and exposure end time of the W pixels.
As one possible implementation manner, the first path of exposure time control signal is used for controlling exposure start time of the R pixel, the G pixel, the B pixel and the D pixel; the second path of exposure time control signal is used for controlling the exposure starting time of W; the exposure end time of the R pixel, the G pixel, the B pixel, the D pixel and the W pixel is the same.
Illustratively, the first path of exposure time control signals controls the exposure start time of the R pixels, G pixels, B pixels, and D pixels, the second path controls the exposure start time of the W pixels, and the R, G, B, D, and W pixels use a unified exposure end time (e.g., the end of exposure is controlled by a unified exposure end signal). The two exposure time control signals (i.e., the first path and the second path) are independent of each other.
The exposure end times of the R pixels, G pixels, B pixels, and D pixels are the same, which ensures that the images of the R, G, B, and D channels are not misaligned by differing exposure times during the infrared removal operation, avoiding defects in motion areas after infrared removal. The advantage of individually controlling the exposure time of the W pixels is that, on the one hand, when ambient visible light is weak, the light input can be increased by lengthening the exposure time of the RGB channels; on the other hand, by shortening the exposure time of the W pixels, the loss of image dynamic range caused by mismatched exposure of the two groups of channels (the RGBD channels and the W channel) can be avoided, and motion blur in fast-moving regions of the image caused by a long exposure time can be reduced.
For example, taking the exposure timing (i.e., the exposure time control signals) shown in fig. 6 as an example: after exposure starts, the first and second paths of exposure time control signals gate the corresponding pixels to begin exposing (the exposure start time of the R, G, B, and D pixels is no later than that of the W pixels; in the figure it is earlier). Once the exposure time configured for the sensor is reached, the exposure end signal of the sensor's exposure time control circuit goes high at the unified exposure end time to end exposure. The time from the rising edge of the exposure start signal to the rising edge of the exposure end signal is the exposure time of the corresponding pixel. Since the two paths of exposure time control signals are independent, their pulse timings differ, corresponding to different exposure times.
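The timing relationship above (separate start edges, shared end edge) can be sketched as follows; the function name and the use of seconds are assumptions for illustration:

```python
def exposure_windows(rgbd_start, w_start, unified_end):
    """Exposure time of each pixel group per the timing described above:
    each group starts on the rising edge of its own control signal, all
    pixels share a unified exposure end time, and the RGBD start edge is
    no later than the W start edge."""
    assert rgbd_start <= w_start <= unified_end
    return {"RGBD": unified_end - rgbd_start, "W": unified_end - w_start}

# Illustrative timings: RGBD exposes for 40 ms, W for only 10 ms.
windows = exposure_windows(0.0, 0.030, 0.040)
```

A shorter W window with the same end edge is what keeps the white light path from smearing moving objects while the RGBD channels gather enough visible light.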
It should be noted that the above circuit control may be triggered by the rising edge or falling edge of a pulse, or according to a high or low level (i.e., exposure lasts while the signal is held at a high level, or at a low level, for a period of time), which is not limited in the embodiments of the present application.
In addition, the above example is only one example of exposure control in the embodiments of the present application. The exposure end time of the W pixels may instead be designed to differ from that of the R, G, B, and D pixels, and other modes are possible, such as controlling a common exposure start time with a unified exposure start signal, which is not limited in the embodiments of the present application.
In one example, the exposure time control circuit is configured to output a first path of exposure time control signal, where the first path of exposure time control signal is configured to control exposure times of the non-white light pixels and the W pixels;
the sensor gain control circuit is used for outputting two paths of gain control signals, wherein the first path of gain control signal is used for controlling gains of R pixels, G pixels and B pixels, and the second path of gain control signal is used for controlling gains of W pixels and D pixels.
By way of example, the image sensor may control the exposure time of each pixel by outputting, through the exposure time control circuit, one path of exposure time control signal (which may be referred to as the first path of exposure time control signal) for controlling the exposure times of both the non-white light pixels and the W pixels; thus, the control cost and the process implementation difficulty of the image sensor can be reduced.
Illustratively, considering that the exposure times of the RGBDW pixels are all the same, and that its infrared component makes the D channel respond more like the white light channel, the gain of the dark (D) channel may be controlled together with that of the white light channel.
Thus, the image sensor may output two paths of gain control signals through the sensor gain control circuit, one of which (which may be referred to as the first path of gain control signal) is used to control the gains of the R, G, and B pixels, and the other of which (the second path of gain control signal) is used to control the gains of the W and D pixels.
It should be noted that, in the embodiment of the present application, when the exposure time control circuit outputs one path of exposure time control signal, the sensor gain control circuit may output two paths of analog gain control signals, where the first path of analog gain control signal is used to control the analog gains of the R pixel, the G pixel, and the B pixel, and the second path of analog gain control signal is used to control the analog gains of the W pixel and the D pixel.
Or,
under the condition that the exposure time control circuit outputs one path of exposure time control signal, the sensor gain control circuit can output two paths of digital gain control signals, wherein the first path of digital gain control signal is used for controlling the digital gains of R pixels, G pixels and B pixels, and the second path of digital gain control signal is used for controlling the digital gains of W pixels and D pixels.
Or,
under the condition that the exposure time control circuit outputs one path of exposure time control signal, the sensor gain control circuit can output two paths of analog gain control signals and two paths of digital gain control signals, wherein the first path of analog gain control signal and the first path of digital gain control signal are used for controlling the analog gain and the digital gain of the R pixel, the G pixel and the B pixel, and the second path of analog gain control signal and the second path of digital gain control signal are used for controlling the analog gain and the digital gain of the W pixel and the D pixel.
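When both analog and digital gain paths are present, as in the variant just described, the two stages multiply. A minimal sketch under that assumption (path numbering and values are illustrative, not from the patent):

```python
def overall_gains(analog, digital):
    """Overall amplification per gain group: analog gain (applied before
    analog-to-digital conversion) times digital gain (applied after).
    Here path 1 drives the R/G/B pixels and path 2 drives the W and D
    pixels; `analog` and `digital` map path index to stage gain."""
    return {path: analog[path] * digital[path] for path in analog}

g = overall_gains({1: 4.0, 2: 8.0}, {1: 1.5, 2: 2.0})
```

This makes explicit why the first analog and first digital paths must target the same pixel group: the product, not either stage alone, determines each group's effective exposure level.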
In one example, the first path of exposure time control signal is used to control the exposure start time of the non-white light pixels and the W pixels,
or the first path of exposure time control signal is used for controlling the exposure end time of the non-white light and the W pixels;
alternatively, the first path of exposure time control signal is used for controlling the exposure start time and the exposure end time of the non-white light and the W pixels.
Referring to fig. 7, which shows a schematic structural diagram of an image processing system according to an embodiment of the present application, the image processing system may at least include an image sensor unit 710, an optical unit 720, an image processing unit 730, and an exposure control unit 740.
The image sensor unit 710 may include, for example, the image sensor described in any of the above embodiments.
An optical unit 720 for blocking a partial wavelength interval spectrum of the incident light and outputting a target optical signal of the incident image sensor;
an image sensor unit 710, configured to convert the light sensing of each pixel in the pixel array of the image sensor to an electrical signal, and output an image signal in a first format after passing through a readout circuit of the image sensor;
for example, the pixel array of the image processor sensor may include at least two types of pixels, wherein the first type of pixels allow the passage of near infrared light at a higher intensity than the second type of pixels allow the passage of near infrared light.
In one example, the first type of pixels includes W pixels and D pixels, and the second type of pixels includes R pixels, G pixels, and B pixels.
In another example, the first type of pixels includes D pixels and the second type of pixels includes R pixels, G pixels, and B pixels.
In another example, the first type of pixels includes W pixels and the second type of pixels includes R pixels, G pixels, and B pixels.
An exposure control unit 740 for outputting an exposure control signal to the image sensor unit 710 to control exposure of each channel of each pixel in the pixel array of the image sensor unit;
an image processing unit 730, configured to perform a first process on the image signal in the first format to obtain a first processed image in the first format; and performing second processing on the first processed image in the first format to obtain a second processed image in a second format.
For example, the optical unit 720 may be configured to block a portion of the spectrum of the wavelength interval of the incident light, resulting in an optical signal (referred to herein as a target optical signal) of the incident image sensor.
In some embodiments, the optical unit 720 may include an optical imaging lens and a filtering device.
Illustratively, the filtering device is positioned between the optical imaging lens and the image sensor, and the image sensor is positioned on the light-exit side of the filtering device. After the incident light passes through the optical imaging lens, the filtering device filters it to block part of the wavelength interval spectrum, and the filtered optical signal (namely the target optical signal) enters the image sensor.
In one example, the filter device includes a first filter, a second filter, and a switching member, each of the first filter and the second filter being connected to the switching member.
Illustratively, the switching component is configured to switch the second optical filter to the light-incident side of the image sensor or switch the first optical filter to the light-incident side of the image sensor.
For example, in order to increase the flexibility and controllability of the filtering device, the filtering device may comprise two filters (herein referred to as a first filter and a second filter) and a switching member. The first optical filter and the second optical filter are both connected with the switching component.
The switching component can control the first optical filter to be switched to the light incident side of the image sensor according to a specified strategy, or control the second optical filter to be switched to the light incident side of the image sensor so as to realize different optical filtering requirements.
In one example, the first filter passes visible light and a portion of near infrared light; the second filter passes visible light and blocks near infrared light.
As a possible implementation, the first optical filter is a bimodal optical filter, which passes visible light in the wavelength band [T3, T4] and infrared light in the first wavelength range [T1, T2], so as to filter out the differing parts of the spectral responses of the red, green, and blue light channels in the infrared band.
In another example, the optical imaging lens is coated with a filter film that passes visible light in the wavelength range [T3, T4] and infrared light in the first wavelength range [T1, T2]; the first optical filter is an all-pass optical filter, and the second optical filter passes visible light and blocks all near infrared light.
It should be noted that the first filter may be replaced by a glass slide.
For example, the optical lens in the embodiments of the present application needs to pass, according to requirements, either visible light in a specified wavelength range, or visible light in a specified wavelength range together with infrared light in a specified wavelength range, while blocking optical signals in other wavelength ranges.
In view of the above requirements, the blocking of specified optical signals can be achieved by adding a filter film to the optical imaging lens and by the different optical filters, respectively.
For example, the optical imaging lens may be coated with a filter film that passes visible light in the wavelength range [T3, T4] and infrared light in the first wavelength range [T1, T2], blocking optical signals in other wavelength ranges.
For a scene requiring near infrared light, a first optical filter which is an all-pass optical filter can be switched to the light inlet side of the image sensor; for a scene that does not require near infrared light, a second filter that passes visible light but blocks all near infrared light may be switched to the light-entering side of the image sensor to achieve blocking of near infrared light.
It should be noted that, because an all-pass first optical filter does not filter the incident light signal, the first optical filter may be kept at the light-incident side of the image sensor at all times, with the second optical filter switched to the light-incident side only as needed; alternatively, the all-pass first optical filter may be omitted entirely, and whether to switch the second optical filter to the light-incident side of the image sensor may be selected as needed.
In one example, as shown in fig. 8, the image processing system further includes a drive control unit 750, the drive control unit 750 being configured to control the switching means to switch the second optical filter to the light incident side of the image sensor when the image processing system uses the first operation mode; when the image processing system uses the second working mode, the switching component is controlled to switch the first optical filter to the light inlet side of the image sensor.
For example, to accommodate different image processing scenarios, the image processing system may be configured with at least two different modes of operation (referred to herein as a first mode of operation and a second mode of operation), where the requirements for the optical signal in the incident light are different.
For example, the image processing system may control the switching means to switch the first filter to the light incident side of the image sensor or to switch the second filter to the light incident side of the image sensor through the driving control unit 750 according to the operation mode used.
Illustratively, the drive control unit 750 may control the switching means to switch the second filter to the light incident side of the image sensor when the image processing system uses the first operation mode; when the image processing system uses the second working mode, the switching component is controlled to switch the first optical filter to the light inlet side of the image sensor.
For example, the second operation mode may be an operation mode for a low-illumination environment, and the first operation mode may be an operation mode for a non-low-illumination environment, and in the second operation mode, a portion of near infrared light may be allowed to pass through, so as to implement light filling, so as to optimize an image processing effect.
In one example, based on the image processing system shown in fig. 7 or fig. 8 (fig. 9 takes the image processing system shown in fig. 7 as an example), as shown in fig. 9, the image processing system may further include: a near infrared light compensation control unit 760 and a near infrared light compensating unit 770.
The near infrared light compensation control unit 760 is configured to send a light compensation control signal to the near infrared light compensating unit 770, where the light compensation control signal is at least used for controlling the near infrared light compensating unit 770 to turn on and off;
the near infrared light compensating unit 770 is configured to turn on or turn off near infrared light compensation based on the light compensation control signal; when the second optical filter is switched to the light-incident side of the image sensor, the light compensation control signal is used for controlling the near infrared light compensating unit to turn off near infrared light compensation; when the first optical filter is switched to the light-incident side of the image sensor, the light compensation control signal is used for controlling the near infrared light compensating unit to turn on near infrared light compensation.
The image processing system may further include a near infrared light compensation control unit and a near infrared light compensating unit.
The near infrared light compensation control unit may send, as required, a light compensation control signal for controlling the turning on and off of the near infrared light compensating unit.
For example, when near infrared light does not need to be incident on the image sensor, i.e., the second optical filter is switched to the light-incident side of the image sensor, the near infrared light compensation control unit can send a control signal to the near infrared light compensating unit to turn off near infrared light compensation, saving system resources. When the first optical filter, which does not block near infrared light or blocks only part of it, is switched to the light-incident side of the image sensor, i.e., when near infrared light needs to be incident on the image sensor, the near infrared light compensation control unit can send a control signal to the near infrared light compensating unit to turn on near infrared light compensation, thereby realizing light supplementation and improving the image processing effect.
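The coordination between operation mode, filter selection, and near infrared fill light described above can be sketched as a small state function. The mode labels and dictionary keys are illustrative, not from the patent:

```python
def configure_light_path(mode):
    """Per the description above: the first operation mode (non-low
    illumination) selects the visible-only second filter and turns the
    near infrared fill light off; the second mode (low illumination)
    selects the NIR-passing first filter and turns the fill light on."""
    if mode == "first":
        return {"filter": "second", "nir_fill_light": False}
    if mode == "second":
        return {"filter": "first", "nir_fill_light": True}
    raise ValueError(f"unknown mode: {mode}")
```

Tying the fill light to the filter selection guarantees the supplemental NIR is never emitted while the NIR-blocking filter would simply discard it.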
In some embodiments, the image sensor of the image sensor unit 710 includes an exposure time control circuit; the exposure control unit 740 includes an exposure time control unit for outputting an exposure time control signal to an exposure time control circuit;
the exposure time control circuit is used for outputting at least a first path of exposure time control signals according to the exposure time control signals output by the exposure time control unit, wherein the first path of exposure time control signals are at least used for controlling the exposure time of non-white light pixels, and the non-white light pixels comprise R pixels, G pixels, B pixels and D pixels.
For example, the exposure control unit 740 may include an exposure time control unit that may control exposure times of pixels in the image sensor by outputting an exposure time control signal to an exposure time control circuit in the image sensor.
For example, when the exposure time control circuit in the image sensor receives the exposure time control signal output by the exposure time control unit, at least a first path of exposure time control signal may be output, where the first path of exposure time control signal is used to control the exposure time of at least the non-white light pixel and the W pixel.
By way of example, the exposure time control circuit of the image sensor can control the exposure time of the W pixel, the D pixel, the R pixel, the G pixel and the B pixel through one path of exposure time control signal, so that the process implementation difficulty of the image sensor is reduced.
As another possible implementation, the exposure time control circuit of the image sensor may be configured to output two exposure time control signals, where a first one of the two exposure time control signals is configured to control the exposure time of the non-white light pixel and a second one of the two exposure time control signals is configured to control the exposure time of the W pixel.
By way of example, considering that the spectral response curves of the RGB channels, the D channel and the W channel differ in type, and that the W channel is relatively independent, the exposure time of the non-white light pixels may be controlled by one path of exposure time control signal while the exposure time of the W pixels is controlled by another path. Controlling the exposure time of the W pixels independently ensures that, under sufficient visible light exposure, trailing blur of moving objects in the white light path (i.e., the light supplementing path) image is reduced, which is beneficial to improving the image effect.
In one example, the first path of exposure time control signals is used to control at least exposure start times of the R, G, B, and D pixels;
Or the first path of exposure time control signal is at least used for controlling the exposure ending time of the R pixel, the G pixel, the B pixel and the D pixel;
or the first path of exposure time control signal is at least used for controlling the exposure start time and the exposure end time of the R pixel, the G pixel, the B pixel and the D pixel.
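The three variants above (the first-path signal fixing the exposure start time, the end time, or both, for the R, G, B and D pixels) can be sketched in Python. This is a minimal illustration; the mode and field names are hypothetical, not from the patent.

```python
from dataclasses import dataclass
from enum import Enum

class ExposureControlMode(Enum):
    START_ONLY = 1      # signal carries only the exposure start edge
    END_ONLY = 2        # signal carries only the exposure end edge
    START_AND_END = 3   # signal carries both edges

@dataclass
class ExposureWindow:
    start_us: float
    end_us: float

def apply_first_path_signal(mode, window, signal_start_us=None, signal_end_us=None):
    """Apply the first-path exposure-time control signal to the shared
    exposure window of the non-white pixels (R, G, B, D); only the
    edges the signal carries are overridden."""
    start, end = window.start_us, window.end_us
    if mode in (ExposureControlMode.START_ONLY, ExposureControlMode.START_AND_END):
        start = signal_start_us
    if mode in (ExposureControlMode.END_ONLY, ExposureControlMode.START_AND_END):
        end = signal_end_us
    return ExposureWindow(start, end)
```

The same shape would apply to a second-path signal for the W pixels in the two-path configuration.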
In one example, the image sensor of the image sensor unit 710 further includes: the gain control circuitry of the sensor is provided,
the exposure control unit 740 further includes a gain control unit for outputting a gain control signal to the sensor gain control circuit;
the sensor gain control circuit is used for outputting at least two paths of gain control signals according to the gain control signals output by the gain control unit, wherein the first path of gain control signals are used for controlling the gains of the R pixels, the G pixels and the B pixels, and the second path of gain control signals are at least used for controlling the gains of the W pixels.
Illustratively, the exposure control unit 740 may further include a gain control unit that may control the sensor gain of each pixel in the image sensor by outputting a gain control signal to a sensor gain control circuit in the image sensor.
For example, considering the spectral response characteristics of the different channels, the light sensitivity of the white light channel differs considerably from that of the RGB channels. If the sensor were configured with a single gain suited to the light sensitivity of one channel, the images of the other channels would be underexposed or overexposed, which is unfavorable for subsequent image processing. Therefore, the gain control unit may output a gain control signal, and the sensor gain control circuit of the image sensor may control the gain of each pixel through at least two paths of gain control signals, so as to optimize the image effect.
For example, when the sensor gain control circuit in the image sensor receives the gain control signal output by the gain control unit, at least two paths of gain control signals may be output.
Illustratively, the first path gain control signal is used to control the gains of the R, G, and B pixels. The second path of gain control signal is at least used for controlling the gain of the W pixel.
Illustratively, the gain may include analog gain and/or digital gain.
As a possible implementation manner, the gain control unit may, through the gain control signal, control the sensor gain control circuit to output two paths of gain control signals, where the second path of gain control signal is used to control the gains of the W pixels and the D pixels.
As another possible implementation manner, the gain control unit may, through the gain control signal, control the sensor gain control circuit to output three paths of gain control signals, where the second path of gain control signal is used to control the gain of the W pixels, and the third path of gain control signal is used to control the gain of the D pixels.
In one example, the exposure time control circuit is configured to output two paths of exposure time control signals according to the exposure time control signals output by the exposure time control unit, wherein a first path of exposure time control signals is configured to control exposure time of the non-white light pixels, a second path of exposure time control signals is configured to control exposure time of the W pixels,
the sensor gain control circuit is used for outputting three paths of gain control signals according to the gain control signals output by the gain control unit, wherein the first path of gain control signals are used for controlling the gains of R pixels, G pixels and B pixels, the second path of gain control signals are used for controlling the gain of W pixels, and the third path of gain control signals are used for controlling the gain of D pixels.
Illustratively, the exposure time control unit may, through the exposure time control signal, control the exposure time control circuit to output two paths of exposure time control signals so as to control the exposure time of each pixel. One path (i.e., the first path of exposure time control signal) controls the exposure time of the non-white light pixels, and the other path (which may be referred to as the second path of exposure time control signal) controls the exposure time of the W pixels. Controlling the exposure time of the W pixels independently ensures that, under sufficient visible light exposure, trailing blur of moving objects in the white light path (i.e., the light supplementing path) image is reduced, thereby improving the image effect.
By way of example, considering that the exposure times of the non-white light pixels and the W pixels are controlled separately, the exposure times of the RGBD channels and the white light channel may differ, so the RGBD channels and the white light channel may need different gains to be properly exposed.
In addition, if the gain of the D pixels were kept the same as that of the RGB pixels, the D pixels would be liable to overexposure, because their infrared component response is higher than that of the RGB pixels. Thus, the gain of the D pixels needs to be controlled separately from the RGB path.
Thus, the gain control unit may, through the gain control signal, control the sensor gain control circuit to output three paths of gain control signals so as to control the gain of each pixel. One path (which may be referred to as the first path of gain control signal) controls the gains of the R, G and B pixels, another path (the second path of gain control signal) controls the gain of the W pixels, and the last path (the third path of gain control signal) controls the gain of the D pixels, so as to optimize the image processing effect.
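The three-path gain routing described above can be sketched as a small Python function. The function and signal names are hypothetical, used only to make the path-to-pixel-group mapping concrete.

```python
def route_gain_signals(g_rgb, g_w, g_d):
    """Route the three paths of gain-control signals to pixel groups:
    path 1 -> R/G/B pixels, path 2 -> W pixels, path 3 -> D pixels."""
    return {'R': g_rgb, 'G': g_rgb, 'B': g_rgb, 'W': g_w, 'D': g_d}
```

In the two-path variant, the W and D pixels would simply share the second signal.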
Illustratively, a first path of exposure time control signals is used to control the exposure start times of the R, G, B, and D pixels, a second path of exposure time control signals is used to control the exposure start times of the W pixels,
Or the first path of exposure time control signals are used for controlling the exposure ending time of the R pixel, the G pixel, the B pixel and the D pixel, and the second path of exposure time control signals are used for controlling the exposure ending time of the W pixel;
alternatively, the first path of exposure time control signal is used for controlling the exposure start time and the exposure end time of the R pixel, the G pixel, the B pixel and the D pixel, and the second path of exposure time control signal is used for controlling the exposure start time and the exposure end time of the W pixel.
As a possible implementation, the exposure time of the R pixel, the G pixel, the B pixel, and the D pixel is not less than the exposure time of the W pixel when the image processing system is in the second operation mode.
Illustratively, when the image processing system uses the second operation mode, i.e., allows part of the near infrared light to be incident on the image sensor (for example, in scenes where ambient visible light is weak), and the exposure times of the W pixels and the non-white light pixels are controlled separately, the amount of incident light can be increased by lengthening the RGB channel exposure time and shortening the W pixel exposure time. This avoids the loss of image dynamic range caused by mismatched exposure of the two groups of channels (the RGBD channels and the W channel), and also reduces the motion blur that longer exposure times would cause in fast-moving areas of the image.
For example, taking the exposure timing sequence (i.e., the exposure time control signals) shown in fig. 6 as an example: after exposure starts, the first control signal and the second control signal gate the corresponding pixels to begin exposure (the exposure start time of the R, G, B and D pixels is not later than that of the W pixels; in the figure, it is earlier). When the exposure time configured for the sensor is reached, the rising edge of the exposure end signal of the sensor's exposure time control circuit unifies the exposure end time, and exposure ends. The interval from the rising edge of each gating control signal to the rising edge of the exposure end signal is the exposure time of the corresponding pixels. Because the two control signals are independent, their pulse timings differ and correspond to different exposure times.
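The fig. 6-style timing can be summarized numerically: each group's exposure time is the gap between its start edge and the shared end edge. A minimal sketch, with hypothetical group labels and microsecond units:

```python
def exposure_times(start_edges_us, end_edge_us):
    """Per-group exposure time under fig. 6-style timing: each group's
    control signal gates its exposure start, and a single end-signal
    rising edge from the exposure-time control circuit ends exposure
    for all groups simultaneously."""
    return {group: end_edge_us - t for group, t in start_edges_us.items()}
```

Starting the RGBD group earlier than the W group, as in the figure, directly yields the longer RGBD exposure described for the second operation mode.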
Illustratively, when the image processing system is in the second mode of operation, the gains of the R, G, and B pixels are greater than the gain of the W pixel; the gains of the R pixel, the G pixel, and the B pixel are not smaller than the gain of the D pixel.
Illustratively, the gain of the RGB pixels (sum of analog gain and digital gain) may be controlled to be greater than the gain of the W pixels (sum of analog gain and digital gain) by the gain control signal, and the gain of the RGB (sum of analog gain and digital gain) may be not less than the gain of the D pixels (sum of analog gain and digital gain).
For example, the analog gain of the R pixel is greater than the analog gain of the W pixel, and neither the R pixel nor the W pixel enables digital gain; the analog gain of the G pixel is greater than the analog gain of the W pixel, and neither the G pixel nor the W pixel enables digital gain; the analog gain of the B pixel is greater than the analog gain of the W pixel, and neither the B pixel nor the W pixel enables digital gain;
alternatively, the analog gain of the R pixel is the same as the analog gain of the W pixel, and the digital gain of the R pixel is greater than the digital gain of the W pixel; the analog gain of the G pixel is the same as the analog gain of the W pixel, and the digital gain of the G pixel is greater than the digital gain of the W pixel; the analog gain of the B pixel is the same as the analog gain of the W pixel, and the digital gain of the B pixel is greater than the digital gain of the W pixel.
Since the light sensitivity of the RGB channels is poor in low-illumination scenes, while the D and W channels, which receive infrared light, have better light sensitivity than the RGB channels, the gains need to be set so as to ensure the exposure effect of the RGB channels, the D channel and the W channel and to avoid underexposure or overexposure. The gains of the R, G and B pixels may therefore be controlled to be not smaller than the gain of the D pixels and greater than the gain of the W pixels, so as to optimize the exposure effect of the image.
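The gain ordering stated here for the second operation mode can be expressed as a simple predicate. A sketch for illustration; the function name and the convention that "gain" means total gain are assumptions.

```python
def gains_valid_second_mode(g_r, g_g, g_b, g_d, g_w):
    """Check the gain ordering stated for the second operation mode:
    each RGB gain is greater than the W gain and not smaller than the
    D gain (gain here meaning total gain, analog plus digital)."""
    rgb = (g_r, g_g, g_b)
    return all(g > g_w for g in rgb) and all(g >= g_d for g in rgb)
```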
In one example, the exposure time control circuit is used for outputting one path of exposure time control signals according to the exposure time control signals output by the exposure time control unit, and the first path of exposure time control signals are used for controlling the exposure time of the non-white light pixels and the W pixels;
the sensor gain control circuit is used for outputting two paths of gain control signals according to the gain control signals output by the gain control unit, wherein the first path of gain control signals are used for controlling the gains of the R pixel, the G pixel and the B pixel, and the second path of gain control signals are used for controlling the gains of the W pixel and the D pixel.
By way of example, the exposure time control unit may, through the exposure time control signal, control the exposure time control circuit to output one path of exposure time control signal (which may be referred to as the first path of exposure time control signal) for controlling the exposure times of both the non-white light pixels and the W pixels. In this way, both the control cost and the process implementation difficulty of the image sensor can be reduced.
Illustratively, considering that the exposure times of the RGBDW pixels are all the same, and that the infrared component of the dark channel makes it closer to the white light channel, the dark channel may be gain-controlled together with the white light channel.
Thus, the gain control unit may, through the gain control signal, control the sensor gain control circuit to output two paths of gain control signals, where one path (which may be referred to as the first path of gain control signal) is used to control the gains of the R, G and B pixels, and the other path (which may be referred to as the second path of gain control signal) is used to control the gains of the W and D pixels.
As a possible implementation, the exposure time of the R pixel, the G pixel, the B pixel, and the D pixel is equal to the exposure time of the W pixel when the image processing system is in the second operation mode.
For example, when the image processing system uses the second operation mode, i.e., allows part of the near infrared light to enter the image sensor (for example, in a scene where ambient visible light is weak), and the exposure times of the W pixels and the non-white light pixels are controlled uniformly while the gains of the RGB pixels and the DW pixels are controlled separately, the exposure times of the R, G, B, D and W pixels may be the same.
Illustratively, the gains of the R, G, and B pixels are greater than the gain of the W pixel when the image processing system is in the second mode of operation.
Illustratively, the gains of the R, G, and B pixels (sum of analog and digital gains) may be controlled to be greater than the gain of the W pixel (sum of analog and digital gains) by the gain control signal.
For example, the analog gain of the R pixel is greater than the analog gain of the W pixel, and neither the R pixel nor the W pixel enables digital gain; the analog gain of the G pixel is greater than the analog gain of the W pixel, and neither the G pixel nor the W pixel enables digital gain; the analog gain of the B pixel is greater than the analog gain of the W pixel, and neither the B pixel nor the W pixel enables digital gain;
alternatively, the analog gain of the R pixel is the same as the analog gain of the W pixel, and the digital gain of the R pixel is greater than the digital gain of the W pixel; the analog gain of the G pixel is the same as the analog gain of the W pixel, and the digital gain of the G pixel is greater than the digital gain of the W pixel; the analog gain of the B pixel is the same as the analog gain of the W pixel, and the digital gain of the B pixel is greater than the digital gain of the W pixel.
Because the light sensitivity of the RGB channels is poor in low-illumination scenes, while the W channel, which receives infrared light, has better light sensitivity than the RGB channels, underexposure or overexposure must be avoided when the exposure times of the RGBDW pixels are the same. To ensure the exposure effect of the RGB channels and the W channel, the gains (analog gain plus digital gain) of the R, G and B pixels may be controlled to be greater than the gain of the W pixels, so as to optimize the exposure effect of the image.
In one example, the near infrared light supplemental control unit 760 may include a near infrared light supplemental intensity control unit.
The near-infrared light-compensating intensity control unit can be used for determining a near-infrared light-compensating intensity related coefficient and determining the light-compensating intensity of the near-infrared light-compensating unit according to the near-infrared light-compensating intensity related coefficient;
illustratively, when the near-infrared light intensity correlation coefficient is greater than the first threshold, the current light intensity of the near-infrared light compensating unit is adjusted down; and when the near infrared light supplementing intensity related coefficient is smaller than a second threshold value, the current light supplementing intensity of the near infrared light supplementing unit is regulated to be high.
Illustratively, the first threshold is greater than the second threshold.
For example, in order to optimize the near infrared light filling effect, the light filling intensity of the near infrared light filling unit may be adjusted according to the light filling requirement.
For example, the light filling demand may be characterized by a near infrared light filling intensity correlation coefficient.
For example, the near-infrared light supplementing intensity correlation coefficient may be inversely related to the light supplementing demand. When the coefficient is large, for example greater than the first threshold, the light supplementing demand is determined to be low, and the current light supplementing intensity of the near-infrared light supplementing unit may be reduced; when the coefficient is small, for example smaller than the second threshold, the light supplementing demand is determined to be high, and the current light supplementing intensity may be increased. Controlling the near-infrared light supplementing intensity within a proper range improves the signal-to-noise ratio of the W channel and of the RGB channels after infrared removal processing, thereby optimizing the image effect.
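The two-threshold adjustment rule above can be sketched as a small control function. The step size and the floor at zero are illustrative assumptions, not from the patent; only the threshold logic follows the text.

```python
def adjust_fill_light(coefficient, current_intensity, t_high, t_low, step=1):
    """Adjust the near-infrared fill-light intensity from the
    intensity correlation coefficient: above t_high (the first
    threshold) demand is low, so step the intensity down; below t_low
    (the second threshold) demand is high, so step it up; otherwise
    hold. Assumes t_high > t_low, as stated in the text."""
    if coefficient > t_high:
        return max(current_intensity - step, 0)
    if coefficient < t_low:
        return current_intensity + step
    return current_intensity
```

The dead band between the two thresholds prevents the intensity from oscillating when the coefficient hovers near a single set point.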
As one possible implementation, the near-infrared supplemental light intensity correlation coefficient is an energy ratio of near-infrared light energy to visible light energy of the RGB channel.
By way of example, the higher the energy ratio of the near infrared energy to the visible energy of the RGB channel is, the lower the light supplementing demand is, and thus the energy ratio of the near infrared energy to the visible energy of the RGB channel may be taken as the near infrared light supplementing intensity-related coefficient.
For example, the near-infrared light supplementing intensity correlation coefficient may be determined in, but not limited to, the following two manners:
In the first manner, the energy ratio is determined based on the average luminance of the RGB channels and the average luminance of the dark channel in the image of the image signal in the first format output from the image sensor.
For example, assuming that the average luminance of the RGB channels in the image of the image signal in the first format output by the image sensor is y1 and the average luminance of the dark channel is y2, the energy ratio may be (y2/y1)×k', where k' is a correction coefficient.
When the energy ratio is determined based on the image signal in the first format output by the image sensor, the energy ratio may be calculated according to the original signal of the image signal in the first format output by the image sensor, so that the determined energy ratio matches the actual scene.
As an example, the first processing performed on the image signal in the first format to obtain the first processed image may be 3D (temporal-spatial domain) noise reduction, which optimizes the image effect. Accordingly, when calculating the energy ratio, the ratio may also be calculated from the first processed image after 3D noise reduction, so as to optimize the light supplementing effect in the case where 3D noise reduction is performed.
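The first manner reduces to the one-line formula (y2/y1)×k'. A minimal sketch, with hypothetical parameter names:

```python
def energy_ratio_from_luma(y1_rgb_mean, y2_dark_mean, k_correction):
    """First manner: estimate the near-infrared-to-visible energy
    ratio from the average luminance y1 of the RGB channels and y2 of
    the dark channel in the first-format image: ratio = (y2/y1) * k',
    where k' is a correction coefficient."""
    return (y2_dark_mean / y1_rgb_mean) * k_correction
```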
By way of example, a specific implementation of the 3D noise reduction processing for the image signal in the first format is explained below.
In the second manner, the respective sensitivities ISO of the R channel, the G channel and the B channel of the image sensor unit, the sensitivity ISO of the dark light channel, and the ratios of the integral values of the spectral response curves of the dark light channel and each of the R, G and B channels over a first wavelength range are acquired; the energy ratio is then determined based on the respective sensitivities ISO of the R, G and B channels, the sensitivity ISO of the dark light channel, and these integral-value ratios.
For example, assume that the sensitivity ISO of the R channel of the image sensor unit is g11, the sensitivity ISO of the G channel is g12, the sensitivity ISO of the B channel is g13, and the sensitivity ISO of the dark light channel is g2; the ratio of the integral values of the spectral response curves of the dark light channel and the R channel over the first wavelength range [T1, T2] is K1, that of the dark light channel and the G channel is K2, and that of the dark light channel and the B channel is K3. The energy ratio may then be g1/(g2×K).
For example, g1 may be determined from g11, g12 and g13, e.g. g1 is the average of g11, g12 and g 13.
For example, K may be determined from K1, K2, and K3, e.g., K is the average of K1, K2, and K3.
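Putting the second manner together, with g1 and K taken as averages per the examples above (a sketch; the function name is hypothetical):

```python
def energy_ratio_from_sensitivity(g11, g12, g13, g2, k1, k2, k3):
    """Second manner: g1 is the mean ISO of the R/G/B channels and K
    the mean of the dark-to-R/G/B spectral-response integral ratios
    over [T1, T2]; the energy ratio is g1 / (g2 * K)."""
    g1 = (g11 + g12 + g13) / 3.0
    k = (k1 + k2 + k3) / 3.0
    return g1 / (g2 * k)
```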
Referring to fig. 10, a flowchart of an image processing method according to an embodiment of the present application is provided, where the image processing method may be applied to an image processing system, and the image processing system may include the image sensor described in any of the foregoing embodiments, and as shown in fig. 10, the image processing method may include the following steps:
step S1000, performing a first process on the image signal in the first format output by the image sensor to obtain a first processed image in the first format.
Illustratively, the pixel array of the image sensor includes at least two types of pixels, wherein the intensity of near infrared light that the first type of pixels allow to pass through is stronger than the intensity that the second type of pixels allow to pass through.
For example, the first type of pixels include W pixels and D pixels, and the second type of pixels include R pixels, G pixels, and B pixels.
Alternatively, the first type of pixels include D pixels, and the second type of pixels include R pixels, G pixels, and B pixels.
Alternatively, the first type of pixels include W pixels, and the second type of pixels include R pixels, G pixels, and B pixels.
In this embodiment of the present application, the image sensor may convert an incident target optical signal into an electrical signal through photoelectric conversion, and read out the electrical signal of the specified pixel to perform signal processing, so as to obtain an image signal (referred to herein as a first format image signal).
The image signal is acquired by the image sensor after adjusting the exposure parameters of each channel according to the received exposure control signal.
Specific implementation of the image sensor for adjusting the exposure parameters of the respective channels according to the received exposure control signals can be seen from the relevant description in the above embodiments.
In the embodiment of the present application, the image signal in the first format output by the image sensor may be subjected to a specified process (referred to herein as the first processing) to obtain a processed image in the first format (referred to herein as the first processed image in the first format).
In one example, the first process in step S1000 may be 3D noise reduction.
For example, in order to reduce noise in the image signal of the first format acquired by the image sensor, the image signal of the first format may be subjected to 3D noise reduction. For example, raw domain 3D noise reduction.
Illustratively, the main effect of Raw-domain 3D noise reduction is to perform weighted temporal and spatial noise reduction on the image in the Raw domain. It is generally divided into noise estimation, motion estimation, Raw-domain spatial noise reduction and Raw-domain temporal noise reduction.
Motion estimation distinguishes the moving areas from the static areas in the image signal in the first format. The moving areas undergo spatial noise reduction, where the noise reduction intensity can be adapted to the noise level output by the noise estimation; a specific algorithm may reduce image noise using spatial filtering techniques such as BM3D. The static areas undergo temporal noise reduction, i.e., the historical frame and the current frame are weighted so as to reduce the noise of the static areas.
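The motion-dependent branching above can be illustrated with a deliberately simplified 1-D sketch: a threshold on the frame difference stands in for motion estimation, a 3-tap mean stands in for the spatial filter (real implementations would use noise estimation and filters such as BM3D), and a history/current blend stands in for temporal noise reduction. All names and parameters are illustrative assumptions.

```python
def raw_3d_denoise(prev, curr, motion_threshold, temporal_weight=0.7):
    """Minimal 1-D sketch of Raw-domain 3D noise reduction: samples
    judged as moving get spatial averaging within the current frame,
    while static samples get a temporal blend of the historical and
    current frames."""
    out = []
    n = len(curr)
    for i in range(n):
        if abs(curr[i] - prev[i]) > motion_threshold:
            # moving region: spatial 3-tap mean within the current frame
            lo, hi = max(0, i - 1), min(n, i + 2)
            out.append(sum(curr[lo:hi]) / (hi - lo))
        else:
            # static region: weighted blend of history and current frame
            out.append(temporal_weight * prev[i] + (1 - temporal_weight) * curr[i])
    return out
```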
Step S1010, performing a second process on the processed image in the first format to obtain a second processed image in a second format.
In the embodiment of the present application, for the first processed image in the first format obtained by the processing in step S1000, it may be processed into an image in another format (herein referred to as a second format) by processing (herein referred to as a second processing).
For example, a full resolution 3-channel (R-channel, G-channel, and B-channel) visible light image (i.e., the second processed image of the second format) from which the near infrared component in the first processed image is removed may be obtained by performing the second processing on the first processed image of the first format.
Illustratively, performing the first processing on the image signal of the first format may include: carrying out noise reduction processing on the image signal in the first format to obtain a first processed image in the first format after noise reduction;
performing the second processing on the first processed image may include: and converting the noise-reduced first processed image from the first format to the second format, and removing the near infrared component in the first processed image to obtain a second processed image in the second format.
In some embodiments, in step S1010, performing the second processing on the processed image in the first format to obtain a second processed image in the second format may include:
and inputting the first processed image in the first format into one or more convolution filter banks to carry out convolution operation, so as to obtain a visible light full-resolution image output by the one or more convolution filter banks, wherein the visible light full-resolution image is a second processed image in the second format.
Illustratively, the near infrared component of the first processed image may be removed by performing convolution operation on the first processed image through a convolution filter bank, and the visible light full resolution image may be obtained by interpolation.
In one example, the visible full resolution image may include at least a red full resolution image without an infrared light component, a green full resolution image without an infrared light component, and a blue full resolution image without an infrared light component.
For the first processed image in the first format obtained in step S1000, a convolution operation may be performed by using a convolution filter bank to obtain a full-resolution 3-channel (R-channel, G-channel and B-channel) visible light image (i.e., the second processed image in the second format) from which the near infrared component in the first processed image is removed.
In one example, when the pixel array of the image sensor includes RGBDW pixel units, the convolution filter bank is used to output a visible light full resolution image, and the convolution filter bank includes at least the following convolution filters:
convolution filters for interpolating and removing infrared from R pixels, G pixels and B pixels, respectively, for white light locations in RGBDW pixel units;
convolution filters for interpolating and removing infrared from R pixels, G pixels and B pixels, respectively, for dark light locations in RGBDW pixel units;
a convolution filter for interpolating and removing infrared from R and B pixels for green locations in RGBDW pixel units;
a convolution filter for interpolating and removing infrared from G and B pixels for red locations in RGBDW pixel units;
a convolution filter for interpolating and removing infrared from R and G pixels for blue locations in RGBDW pixel units.
Illustratively, when the pixel array of the image sensor includes RGBDW pixel units, the first processed image may be convolved with the convolution filter bank to obtain a visible light full resolution image with the infrared light component removed.
The convolution filter bank may include, for example, convolution filters that interpolate and de-infrared different pixels for different pixel locations.
The convolution filters for performing pixel interpolation and infrared removal on different pixel positions may be the same convolution filter, or may be different convolution filters, that is, one or more convolution filters may be included in the convolution filter bank.
For example, the convolution filter set may include 5 convolution filters, and the pixel interpolation and the infrared removal are respectively performed on different pixel positions, so as to improve the image processing performance.
For example, when the convolution filter bank includes a plurality of convolution filters, each convolution filter can synchronously perform interpolation and infrared removal processing of different pixel positions, so as to improve image processing efficiency.
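The de-infrared step that each of these filters performs can be sketched in isolation. The sketch below assumes that, once R/G/B and dark-channel (D) values have been interpolated at a position, the near infrared component can be approximated by a scaled D response and subtracted per channel; this subtraction model and the alpha calibration are illustrative assumptions, not the patent's actual filter coefficients.

```python
def remove_infrared(rgb_interp, d_interp, alpha=(1.0, 1.0, 1.0)):
    """Minimal sketch of the de-infrared step: subtract a per-channel
    fraction (alpha, a hypothetical calibration) of the interpolated
    dark-channel value, which stands in for the near infrared
    component, clamping at zero."""
    return tuple(max(c - a * d_interp, 0.0)
                 for c, a in zip(rgb_interp, alpha))
```

In the filter bank itself, the interpolation and this subtraction would be folded into a single convolution kernel per pixel location.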
In another example, when the pixel array of the image sensor includes RGBW pixel units, the above convolution filter bank is used to output a visible light full resolution image, and the above convolution filter bank includes at least the following convolution filters:
convolution filters for respectively interpolating R, G, and B pixels and removing infrared for white light locations in RGBW pixel units;
a convolution filter for interpolating R and B pixels and removing infrared for green locations in the RGBW pixel unit;
a convolution filter for interpolating G and B pixels and removing infrared for red locations in the RGBW pixel unit;
a convolution filter for interpolating R and G pixels and removing infrared for blue locations in the RGBW pixel unit.
Illustratively, when the pixel array of the image sensor includes RGBW pixel units, the first processed image may be subjected to a convolution operation by a convolution filter bank, resulting in a visible light full resolution image from which the infrared light component is removed.
The convolution filter bank may include, for example, convolution filters that interpolate and de-infrared different pixels for different pixel locations.
The convolution filters for performing pixel interpolation and infrared removal on different pixel positions may be the same convolution filter, or may be different convolution filters, that is, one or more convolution filters may be included in the convolution filter bank.
For example, the convolution filter set may include 4 convolution filters, and the pixel interpolation and the infrared removal are respectively performed on different pixel positions, so as to improve the image processing performance.
For example, when the convolution filter bank includes a plurality of convolution filters, each convolution filter can synchronously perform interpolation and infrared removal processing of different pixel positions, so as to improve image processing efficiency.
In another example, when the pixel array of the image sensor includes RGBD pixel units, the above convolution filter bank is used to output a full resolution image of visible light, and the above convolution filter bank includes at least the following convolution filters:
convolution filters for respectively interpolating R, G, and B pixels and removing infrared for dark light locations in RGBD pixel units;
a convolution filter for interpolating R and B pixels and removing infrared for green locations in the RGBD pixel unit;
a convolution filter for interpolating G and B pixels and removing infrared for red locations in the RGBD pixel unit;
a convolution filter for interpolating R and G pixels and removing infrared for blue locations in the RGBD pixel unit.
Illustratively, when the pixel array of the image sensor includes RGBD pixel units, the first processed image may be convolved by a convolution filter bank to obtain a visible full resolution image with the infrared light component removed.
The convolution filter bank may include, for example, convolution filters that interpolate and de-infrared different pixels for different pixel locations.
The convolution filters for performing pixel interpolation and infrared removal on different pixel positions may be the same convolution filter, or may be different convolution filters, that is, one or more convolution filters may be included in the convolution filter bank.
For example, the convolution filter set may include 4 convolution filters, and the pixel interpolation and the infrared removal are respectively performed on different pixel positions, so as to improve the image processing performance.
For example, when the convolution filter bank includes a plurality of convolution filters, each convolution filter can synchronously perform interpolation and infrared removal processing of different pixel positions, so as to improve image processing efficiency.
In one example, the above-described convolution filter bank includes a first convolution filter bank, a second convolution filter bank, and a third convolution filter bank.
The first convolution filter bank is used for outputting a red full-resolution image without infrared light components;
the second convolution filter set is used for outputting a green full-resolution image without infrared light components;
The third convolution filter bank is configured to output a blue full resolution image that does not contain an infrared light component.
Illustratively, the first convolution filter bank includes at least the following convolution filters that perform convolution filtering operations in synchronization:
a convolution filter for R-pixel interpolation and infrared removal for white light pixel locations in the RGBDW pixel unit; a convolution filter for R-pixel interpolation and infrared removal for dark light pixel locations in the RGBDW pixel unit; a convolution filter for R-pixel interpolation and infrared removal for green pixel locations in the RGBDW pixel unit; a convolution filter for R-pixel interpolation and infrared removal for blue pixel locations in the RGBDW pixel unit;
illustratively, the second convolution filter bank includes at least the following convolution filters that perform convolution filtering operations in synchronization:
a convolution filter for G-pixel interpolation and infrared removal for white light pixel locations in the RGBDW pixel unit; a convolution filter for G-pixel interpolation and infrared removal for dark light pixel locations in the RGBDW pixel unit; a convolution filter for G-pixel interpolation and infrared removal for red pixel locations in the RGBDW pixel unit; a convolution filter for G-pixel interpolation and infrared removal for blue pixel locations in the RGBDW pixel unit;
Illustratively, the third convolution filter bank includes at least the following convolution filters that perform convolution filtering operations in synchronization:
a convolution filter for B-pixel interpolation and infrared removal for white light pixel locations in the RGBDW pixel unit; a convolution filter for B-pixel interpolation and infrared removal for dark light pixel locations in the RGBDW pixel unit; a convolution filter for B-pixel interpolation and infrared removal for red pixel locations in the RGBDW pixel unit; a convolution filter for B-pixel interpolation and infrared removal for green pixel locations in the RGBDW pixel unit.
For example, the first processed image may be subjected to convolution operations by different filter banks to obtain a red full resolution image, a green full resolution image, and a blue full resolution image, respectively, which do not contain infrared light components, and a schematic diagram thereof may be shown in fig. 11.
As shown in fig. 11, the first processed image may be subjected to a convolution operation by the convolution filter bank 1 (i.e., the first convolution filter bank described above), resulting in a red full-resolution image containing no infrared light component.
The first processed image may be convolved by a convolution filter bank 2 (i.e., the second convolution filter bank described above) to obtain a green full resolution image that does not contain infrared light components.
The first processed image may be convolved by the convolution filter bank 3 (i.e., the third convolution filter bank described above) to obtain a blue full resolution image without an infrared light component.
Illustratively, each convolution filter bank may include a plurality of convolution filters that synchronously perform convolution operations.
For example, the first convolution filter bank may include at least a plurality of convolution filters that simultaneously perform convolution filtering operations for R-pixel interpolation and infrared removal for different pixel locations.
For example, the first convolution filter bank may include at least a plurality of convolution filters for performing the convolution operation for R-pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit, performing the convolution operation for R-pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit, performing the convolution operation for R-pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit, and performing the convolution operation for R-pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit in synchronization.
For example, the second convolution filter bank may include at least a plurality of convolution filters that synchronously perform convolution filtering operations for G pixel interpolation and infrared removal at different pixel locations.
For example, the second convolution filter bank may include at least a plurality of convolution filters for performing a convolution filter operation for G-pixel interpolation and infrared removal for white light pixel positions in an RGBDW pixel unit, a convolution filter operation for G-pixel interpolation and infrared removal for dark light pixel positions in an RGBDW pixel unit, a convolution filter operation for G-pixel interpolation and infrared removal for red pixel positions in an RGBDW pixel unit, and a convolution filter operation for G-pixel interpolation and infrared removal for blue pixel positions in an RGBDW pixel unit in synchronization.
For example, the third convolution filter bank may include at least a plurality of convolution filters that simultaneously perform convolution filtering operations for B-pixel interpolation and infrared-removal for different pixel locations.
For example, the third convolution filter bank may include at least a plurality of convolution filters that synchronously perform convolution filtering operations for B-pixel interpolation and infrared removal for white light pixel locations in an RGBDW pixel unit, convolution filtering operations for B-pixel interpolation and infrared removal for dark light pixel locations in an RGBDW pixel unit, convolution filtering operations for B-pixel interpolation and infrared removal for red pixel locations in an RGBDW pixel unit, and convolution filtering operations for B-pixel interpolation and infrared removal for green pixel locations in an RGBDW pixel unit.
In one example, when any convolution filter performs infrared removal, the dark light component in the first processed image may first be corrected according to the dark light component exposure correction coefficient, after which the infrared removal is performed.
Illustratively, the darkness component exposure correction coefficient is determined according to the sensitivity ISO of the R channel, the sensitivity ISO of the G channel, the sensitivity ISO of the B channel, and the sensitivity ISO of the darkness channel of the image sensor unit.
For example, assume that the sensitivity ISO of the R channel is g11, the sensitivity ISO of the G channel is g12, and the sensitivity ISO of the B channel is g13; the sensitivity ISO of the dark light channel is g2. The dark light component exposure correction coefficient may then be g1/g2.
For example, g1 may be determined from g11, g12 and g13, e.g. g1 is the average of g11, g12 and g 13.
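A minimal sketch of this coefficient, assuming (as the text suggests) that g1 is taken as the average of the three RGB channel sensitivities:

```python
def dark_exposure_correction(g11, g12, g13, g2):
    """Dark light component exposure correction coefficient lambda = g1/g2,
    with g1 taken as the average of the R/G/B channel sensitivities
    g11, g12, g13, and g2 the dark light channel sensitivity."""
    g1 = (g11 + g12 + g13) / 3.0
    return g1 / g2
```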
As one possible implementation, for any convolution filter that performs infrared removal, the filter may remove infrared from the D pixels and R pixels in the neighborhood of the current pixel position based on the R-channel infrared removal coefficient and the dark light component exposure correction coefficient.
By way of example, considering that the spectral responses of the R channel and the D channel are close, when interpolating the D channel, the R component in the neighborhood can be used to remove infrared from the D pixels and R pixels in the neighborhood of the current pixel position, improving the accuracy of the infrared component estimation for the D channel component and optimizing the interpolation accuracy.
1) Interpolating R, G, and B pixels and removing infrared at the W positions of RGBDW.
For example, the following takes the interpolation of R pixels as an example; the interpolation of G pixels and B pixels follows the same idea as the R interpolation.
Referring to fig. 12, taking the interpolation of R45 at the W45 position as an example, part of the R pixels and part of the W pixels in the 7×7 neighborhood are used to interpolate the pixel at the W position, and the D pixels and R pixels are used to perform infrared removal. The 7×7 neighborhood matrix is denoted P and contains 49 pixels, starting from W12 at the top left and ending at W78 at the bottom right in fig. 12.
The distance between adjacent pixels in the horizontal and vertical directions is 1, and the distance between adjacent pixels in the 45-degree oblique direction is √2.
1.1) High-frequency component of the W pixel. The high-frequency component of the white pixel is calculated as follows:
1.2) Low-frequency component of the R pixel. Taking the W45 pixel as the center, the 3×5 neighborhood containing R24, R46, and R64 is weighted, and the low-frequency estimate of the R45 pixel is obtained:
1.3) Infrared component estimate. Centered on the W45 pixel, the pixels R24, R46, R64, D44, R26, and R66 in the 3×5 neighborhood containing R24, R46, and R64 are weighted:
where α and β are weighting coefficients, γR is the R-channel infrared removal coefficient, and λ is the D component exposure correction coefficient.
Let the sensitivity ISO of the RGB channel sensors be g1 and the sensitivity ISO of the dark light channel sensor be g2; the converted D component is then λ = g1/g2 times its original value.
The above calculation may be unified into a single convolution filter for format conversion; the convolution filter X with α=1 and β=0 is denoted as:
where the operator denotes a matrix dot product.
For example, when interpolating the G pixel at a W position, part of the G pixels and part of the W pixels within the 7×7 neighborhood may be used for the pixel interpolation at the W position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the B pixel at a W position, part of the B pixels and part of the W pixels within the 7×7 neighborhood may be used for the pixel interpolation at the W position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
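The equations in 1.1) through 1.3) appear as figures in the original patent and are not reproduced in this text. The sketch below therefore only illustrates the combining form implied by the surrounding description, with α = 1 and β = 0 as in the text and with λ rescaling the D samples before the infrared estimate; the weights used in it are placeholders, and the whole fragment is an assumption, not the patent's filter:

```python
import numpy as np

def ir_estimate(r_samples, d_samples, weights_r, weights_d, lam):
    """Infrared component estimate around the current pixel: weight the
    neighboring R pixels and the D pixels, the D pixels first rescaled
    by the exposure correction coefficient lambda = g1/g2 so both
    channels sit on a common exposure scale. The actual weights are
    defined by figures in the patent; callers supply placeholders."""
    r = np.asarray(r_samples, dtype=float)
    d = np.asarray(d_samples, dtype=float) * lam
    return float(np.dot(weights_r, r) + np.dot(weights_d, d))

def interp_r_at_w(r_low, w_high, ir_est, gamma_r):
    """Assumed combining form for interpolating R at a W position with
    alpha = 1, beta = 0: low-frequency R estimate plus high-frequency W
    component, minus the R-channel infrared removal coefficient gamma_r
    times the infrared estimate."""
    return r_low + w_high - gamma_r * ir_est
```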
2) Interpolating R, G, and B pixels and removing infrared at the D positions of RGBDW
For example, when interpolating the R pixel at a D position, part of the R pixels and part of the D pixels within the 7×7 neighborhood may be used for the pixel interpolation at the D position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the G pixel at a D position, part of the G pixels and part of the D pixels within the 7×7 neighborhood may be used for the pixel interpolation at the D position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the B pixel at a D position, part of the B pixels and part of the D pixels within the 7×7 neighborhood may be used for the pixel interpolation at the D position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
Taking the interpolation of the R44 pixel at D44 as an example, the format-conversion convolution filter with α=1, β=0, calculated following the idea in 1), may be:
3) Interpolating R pixels and B pixels and removing infrared at the G positions of RGBDW
For example, when interpolating the R pixel at a G position, part of the R pixels and part of the G pixels within the 7×7 neighborhood may be used for the pixel interpolation at the G position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the B pixel at a G position, part of the B pixels and part of the G pixels within the 7×7 neighborhood may be used for the pixel interpolation at the G position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
Taking the interpolation of the R35 pixel at G35 as an example, the format-conversion convolution filter with α=1, β=0, calculated following the idea in 1), may be:
4) Interpolating G pixels and B pixels and removing infrared at the R positions of RGBDW, and interpolating G pixels and R pixels and removing infrared at the B positions
For example, when interpolating the G pixel at an R position, part of the R pixels and part of the G pixels within the 7×7 neighborhood may be used for the pixel interpolation at the R position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the B pixel at an R position, part of the R pixels and part of the B pixels within the 7×7 neighborhood may be used for the pixel interpolation at the R position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the R pixel at a B position, part of the B pixels and part of the R pixels within the 7×7 neighborhood may be used for the pixel interpolation at the B position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
For example, when interpolating the G pixel at a B position, part of the B pixels and part of the G pixels within the 7×7 neighborhood may be used for the pixel interpolation at the B position, and the D pixels and R pixels may be used for infrared removal. For the infrared component estimation, the D pixels and R pixels in the 3×5 neighborhood may be weighted; the specific implementation is similar to that of interpolating the R pixel at the W position.
Taking the interpolation of the R55 pixel at B55 as an example, the format-conversion convolution filter with α=1, β=0, calculated following the idea in 1), may be:
Interpolation at other pixel positions is handled in the same way as in the above cases.
It should be appreciated that the above-mentioned format conversion calculation procedure is only an example of an implementation manner provided in the embodiments of the present application, and is not intended to limit the scope of protection of the present application, and other convolution filtering operations may be included in the embodiments of the present application, or a format conversion result may be obtained through multiple convolution operations.
RGBDW pixel arrangements of other formats can also be format-converted using this idea; the main difference lies in the accuracy of the infrared component estimation for the D channel component. Because the spectral responses of the R and D channels are close, the R component in the neighborhood can be used to optimize the interpolation accuracy when interpolating the D channel.
This arrangement has two advantages over other arrangements:
i) In this arrangement format, the D and R components used for interpolation are distributed uniformly and symmetrically in space, which better supports symmetric interpolation estimation using the same row or column and avoids the information loss caused by single-direction interpolation;
ii) Taking W pixels occupying half of the sensor as an example, in the 3×5 or 5×3 neighborhood of a W45 containing 3 D pixels, this arrangement has 3 D pixels and 3 R pixels in the neighborhood, which is superior to other arrangements with 3 D pixels and 1 R pixel in the neighborhood; more interpolation pixels yield a better interpolation effect.
In some embodiments, when in the second operation mode, step S1010 is performed to carry out the first processing on the image signal in the first format to obtain a first processed image in the first format, and step S1020 is performed to carry out the second processing on the first processed image in the first format to obtain a second processed image in the second format.
For example, when the image processing system is in the second operation mode, e.g., currently in a low-illumination environment, part of the near infrared light generally needs to be incident on the image sensor to realize light supplementing. Accordingly, when in the second operation mode, the image signal in the first format may be subjected to the first processing to obtain a first processed image in the first format, as described in the above embodiments, and the first processed image in the first format may be subjected to the second processing to obtain a second processed image in the second format.
In one example, when in the first operation mode, the image processing method provided in the embodiment of the present application may further include:
and obtaining the target wide dynamic image based on the wide dynamic processing of the image signal in the first format.
For example, when the image processing system is in a first mode of operation, e.g., currently in a non-low light environment, all near infrared light in the optical signal incident on the image sensor may be filtered out, limiting the wavelengths entering the image sensor to the visible light band.
The visible light band may be, for example, 380 to 650 nm.
In this case, the image signal of the first format acquired by the image sensor may be subjected to wide dynamic processing to obtain a wide dynamic image (referred to herein as a target wide dynamic image).
For example, the spectral response in this case may be as shown in fig. 13. The RGB channel and white light channel response curves may be the same as those in the embodiment shown in fig. 4, but in fig. 13 the dark light channel has a certain spectral response over the entire visible band, and its spectral response is 0 in the infrared band, e.g., at wavelengths greater than 650 nm.
In this case, for example, a visible light image, a white light image, and a dark light image may each be obtained by interpolation processing of the image signal. The dark light image (denoted img_D) is fused with the visible light image (denoted img_vis) in a wide dynamic manner.
In one example, a wide dynamic fusion may be achieved by the following policies:
img_wdrY=w*img_visY+(1-w)*img_D
illustratively, img_wdry is a wide dynamic processing result, w is a wide dynamic synthesized weight map, and the weight map of each point on the image is calculated according to img_visy and img_d. When img_visy reaches the upper limit of the processing bit width luminance, w=0, i.e. the full luminance is derived from img_d.
For an environment with sufficient illuminance, all near infrared light in the optical signal incident on the image sensor can be filtered out, so both the RGB channels and the D channel receive visible light. By wide dynamic fusion of the dark light image and the visible light image, the visible light information (e.g., gray information) in the D channel can compensate the visible light information of the RGB channels; thus, even when the RGB channels are overexposed, the D channel can still provide visible light information for the overexposed area, optimizing the image effect.
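The fusion strategy above can be sketched as follows. The text only fixes the boundary condition w = 0 at full brightness, so the linear weight ramp used here is a placeholder assumption, not the patent's weight rule:

```python
import numpy as np

def wide_dynamic_fuse(img_visY, img_D, max_val=255.0):
    """Wide dynamic fusion img_wdrY = w*img_visY + (1-w)*img_D.

    The weight map w is computed per pixel from img_visY. As a
    placeholder, w ramps linearly down to 0 as img_visY approaches the
    bit-width upper limit max_val, which satisfies the stated boundary
    condition: at full brightness the output comes entirely from img_D.
    """
    vis = np.asarray(img_visY, dtype=float)
    dark = np.asarray(img_D, dtype=float)
    w = np.clip(1.0 - vis / max_val, 0.0, 1.0)  # w = 0 where vis is saturated
    return w * vis + (1.0 - w) * dark
```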
In one example, the wide dynamic processing of the image signal in the first format may include:
the image signal of the first format is input into a convolution filter bank, the convolution filter bank carries out wide dynamic processing on the image signal of the first format, and a target wide dynamic image with full resolution is output.
For example, for an image signal of a first format output from the image sensor, when in the first operation mode, the image signal of the first format may be input to the convolution filter bank, the image signal of the first format may be subjected to wide dynamic processing by the convolution filter bank, and a full-resolution target wide dynamic image may be output.
Referring to fig. 14, a flow chart of a near infrared light compensation control method according to an embodiment of the present application is provided, wherein the near infrared light compensation control method may be applied to the image processing system described in any of the above embodiments, and as shown in fig. 14, the near infrared light compensation control method may include the following steps:
step S1400, generating a light compensation control signal.
Step S1410, controlling to turn on the near infrared light compensation or turn off the near infrared light compensation according to the light compensation control signal.
For example, when the second filter is switched to the light-entering side of the image sensor, the light-compensating control signal is used for controlling the near-infrared light-compensating to be turned off; when the first optical filter is switched to the light incident side of the image sensor, the light supplementing control signal is used for controlling the on of near infrared light supplementing.
In the embodiment of the application, in order to optimize the image processing effect, the near infrared light supplement can be selectively turned on or turned off according to actual requirements.
For example, in a low-illumination environment, the near infrared light supplementing is turned on; in a non-low-illumination environment, the near infrared light supplementing is turned off.
For example, in the case where the first filter allows a part of the near infrared light to pass and the second filter blocks the near infrared light, the near infrared light may be selectively turned on or off according to whether the first filter is switched to the light-incident side of the image sensor or the second filter is switched to the light-incident side of the image sensor.
For example, when the second filter is switched to the light incident side of the image sensor, it may be determined that near-infrared light compensation is not currently required, for example, a non-low illumination environment is currently present, and at this time, a control signal for controlling the near-infrared light compensation to be turned off may be generated and the near-infrared light compensation to be turned off may be controlled according to the control signal.
For example, when the first optical filter is switched to the light incident side of the image sensor, it may be determined that near infrared light compensation is currently required, for example, a low-illumination environment is currently required, at this time, a control signal for controlling the start of the near infrared light compensation may be generated, and the start of the near infrared light compensation may be controlled according to the control signal.
In some embodiments, the near infrared light supplementing control method provided in the embodiments of the present application may further include:
determining a near infrared light supplementing intensity correlation coefficient;
determining the light supplementing intensity of the near infrared light supplementing according to the near infrared light supplementing intensity related coefficient;
Illustratively, when the near infrared light supplementing intensity correlation coefficient is greater than the first threshold, the current light supplementing intensity of the near infrared light supplementing is decreased; when the near infrared light supplementing intensity correlation coefficient is less than the second threshold, the current light supplementing intensity of the near infrared light supplementing is increased.
Illustratively, the first threshold is greater than the second threshold.
For example, to optimize the near-infrared light filling effect, the light filling intensity of the near-infrared light filling may be adjusted according to the light filling requirement.
For example, the light filling demand may be characterized by a near infrared light filling intensity correlation coefficient.
For example, the near infrared light supplementing intensity correlation coefficient may be inversely related to the light supplementing demand. When the coefficient is large, e.g., greater than the first threshold, a low light supplementing demand is determined and the current light supplementing intensity may be decreased; when the coefficient is small, e.g., less than the second threshold, a high light supplementing demand is determined and the current light supplementing intensity may be increased. Controlling the near infrared light supplementing intensity within a suitable range improves the signal-to-noise ratio of the W channel and, after infrared removal is performed on the RGB channels, improves the signal-to-noise ratio of the RGB channels, optimizing the image effect.
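The threshold control can be sketched as follows; the adjustment step size and the hold behavior between the two thresholds are assumptions, since the text only specifies the two threshold comparisons:

```python
def adjust_fill_light(coeff, current_intensity, t_high, t_low, step=1):
    """Threshold control of near-infrared fill-light intensity.

    coeff above t_high  -> demand is low,  decrease intensity;
    coeff below t_low   -> demand is high, increase intensity;
    otherwise hold (assumed behavior). Requires t_high > t_low,
    matching the text's first/second thresholds.
    """
    if coeff > t_high:
        return max(current_intensity - step, 0)
    if coeff < t_low:
        return current_intensity + step
    return current_intensity
```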
In one example, the near-infrared supplemental light intensity correlation coefficient is an energy ratio of near-infrared light energy to visible light energy of the RGB channel;
the determining the near infrared light compensation intensity correlation coefficient may include:
determining an energy duty ratio based on the average brightness of the RGB channels and the average brightness of the dark channels in the image of the image signal in the first format output by the image sensor;
or,
acquiring the respective sensitivities ISO of the R channel, G channel, and B channel of the image sensor unit, the sensitivity ISO of the dark light channel, and the ratios of the integral value of the spectral response curve of the dark light channel to those of the R channel, G channel, and B channel over a first wavelength range; and determining the energy ratio based on these sensitivities and ratios.
By way of example, the higher the energy ratio of near infrared energy to visible energy in the RGB channels, the lower the light supplementing demand; the energy ratio may therefore be taken as the near infrared light supplementing intensity correlation coefficient.
For example, the manner of determining the near infrared light supplementing intensity correlation coefficient may include, but is not limited to, the following two ways:
in the first manner, the energy ratio is determined based on the average luminance of the RGB channels and the average luminance of the dark channel in the image of the image signal in the first format output from the image sensor.
For example, assuming that the average luminance of the RGB channels in the image of the image signal in the first format output by the image sensor is y1 and the average luminance of the dark channel is y2, the energy ratio may be (y2/y1)×k', where k' is a correction coefficient.
When the energy ratio is determined based on the image signal in the first format output by the image sensor, the energy ratio may be calculated according to the original signal of the image signal in the first format output by the image sensor, so that the determined energy ratio matches the actual scene.
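The first manner can be sketched as follows; the function name and the default value of the correction coefficient k' are illustrative assumptions, not values from this application:

```python
def energy_ratio(y_rgb_mean, y_dark_mean, k=1.0):
    """Estimate the near-infrared to visible energy ratio from channel means.

    y_rgb_mean : average luminance y1 of the R/G/B pixels in the raw image
    y_dark_mean: average luminance y2 of the dark-light (D) pixels
    k          : correction coefficient k' (sensor dependent; 1.0 is a placeholder)
    """
    if y_rgb_mean <= 0:
        raise ValueError("RGB average luminance must be positive")
    return (y_dark_mean / y_rgb_mean) * k
```

A larger ratio indicates a stronger near infrared component relative to visible light, i.e. a lower light supplementing demand.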
Referring to fig. 15, a flowchart of an exposure control method according to an embodiment of the present application is provided, where the exposure control method may be applied to the image processing system described in any of the foregoing embodiments, and as shown in fig. 15, the exposure control method may include the following steps:
step S1500, an exposure control signal is generated.
Step S1510, controlling the exposure of each channel of each pixel in the pixel array of the image sensor according to the exposure control signal.
In the embodiment of the application, the exposure of each channel of each pixel in the pixel array of the image sensor can be controlled by the exposure control signal so as to optimize the exposure effect of each channel and further optimize the image effect.
In some embodiments, the image sensor includes an exposure time control circuit; the exposure control signals include exposure time control signals;
in step S1510, controlling the exposure of each channel of each pixel in the pixel array of the image sensor according to the exposure control signal may include:
the exposure time control signal is output to an exposure time control circuit.
The exposure time control circuit is used for outputting at least a first path of exposure time control signals according to the exposure time control signals output by the exposure time control unit, wherein the first path of exposure time control signals are at least used for controlling the exposure time of non-white light pixels, and the non-white light pixels comprise R pixels, G pixels, B pixels and D pixels.
For example, the image sensor may include an exposure time control circuit, and the exposure time control circuit of the image sensor may be controlled to output at least one exposure time control signal by the exposure time control signal to control the exposure time of each channel of each pixel in the pixel array of the image sensor.
For example, when the exposure time control circuit in the image sensor receives the exposure time control signal, it may output at least a first path of exposure time control signal, which controls the exposure time of at least the non-white light pixels (and, in the single-path case, the W pixel as well).
As one possible implementation manner, the exposure time control circuit of the image sensor can control the exposure time of the W pixel, the D pixel, the R pixel, the G pixel and the B pixel through one path of exposure time control signal, so as to reduce the process implementation difficulty of the image sensor.
As another possible implementation, the exposure time control circuit of the image sensor may be configured to output two exposure time control signals, where a first one of the two exposure time control signals is configured to control the exposure time of the non-white light pixel and a second one of the two exposure time control signals is configured to control the exposure time of the W pixel.
By way of example, the spectral response curves of the RGB channels, the D channel and the W channel differ in type, and the W channel is highly independent. The exposure time of the non-white light pixels can therefore be controlled by one path of exposure time control signal and the exposure time of the W pixel by another, so that the W pixel's exposure time is controlled independently. This ensures that, with sufficient visible light exposure, trailing blur of moving objects in the white light path (i.e., the light supplementing path) image is reduced, which helps improve the image effect.
In one example, the first path of exposure time control signals is used to control at least exposure start times of the R, G, B, and D pixels;
or the first path of exposure time control signal is at least used for controlling the exposure ending time of the R pixel, the G pixel, the B pixel and the D pixel;
or the first path of exposure time control signal is at least used for controlling the exposure start time and the exposure end time of the R pixel, the G pixel, the B pixel and the D pixel.
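The single-path and dual-path options above can be sketched as a routing table mapping each pixel type to the exposure time control path that drives it; the function name and path numbering are hypothetical illustrations, not the actual circuit:

```python
NON_WHITE = ("R", "G", "B", "D")  # non-white light pixels per the embodiment

def route_exposure_paths(dual_path):
    """Map each pixel type to the exposure-time control path driving it.

    dual_path=False: one shared path drives all pixels (lower process cost).
    dual_path=True : path 1 drives the non-white light pixels and path 2
                     drives the W pixels, so the W exposure time can be
                     shortened independently to reduce motion trailing.
    """
    if not dual_path:
        return {p: 1 for p in NON_WHITE + ("W",)}
    paths = {p: 1 for p in NON_WHITE}
    paths["W"] = 2
    return paths
```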
In one example, the image sensor may further include a sensor gain control circuit; the exposure control signal may further include a gain control signal;
in step S1510, controlling the exposure of each channel of each pixel in the pixel array of the image sensor according to the exposure control signal, including:
the gain control signal is output to the sensor gain control circuit.
The sensor gain control circuit is used for outputting at least two paths of gain control signals according to the received gain control signals, wherein the first path of gain control signal is used for controlling the gains of the R pixel, the G pixel and the B pixel, and the second path of gain control signal is at least used for controlling the gain of the W pixel.
The image sensor may further include a sensor gain control circuit; the gain control signal may control this circuit to output gain control signals that control the gain of each channel of each pixel in the pixel array of the image sensor.
For example, given the spectral response characteristics of the different channels, the light sensitivity of the white light channel differs considerably from that of the RGB channels. If the sensor were configured with a single gain suited to one channel, the images of the other channels would be underexposed or overexposed, which harms subsequent image processing. The sensor gain control circuit of the image sensor can therefore be controlled by the gain control signal to output at least two gain control signals that control the pixel gains separately, so as to optimize the image effect.
For example, when the gain control signal is received by the sensor gain control circuit in the image sensor, at least two paths of gain control signals can be output.
Illustratively, the first path gain control signal is used to control the gains of the R, G, and B pixels. The second path of gain control signal is at least used for controlling the gain of the W pixel.
Illustratively, the gain may include analog gain and/or digital gain.
As one possible implementation, the sensor gain control circuit may be controlled by a gain control signal to output two gain control signals, the second gain control signal being used to control the gains of the W pixel and the D pixel.
As another possible implementation manner, the sensor gain control circuit may be controlled by a gain control signal to output three gain control signals, with the second gain control signal controlling the gain of the W pixel and the third gain control signal controlling the gain of the D pixel.
In one example, the exposure time control circuit is configured to output two exposure time control signals according to the received exposure time control signals, wherein a first exposure time control signal is configured to control exposure time of the non-white light pixels, a second exposure time control signal is configured to control exposure time of the W pixels,
the sensor gain control circuit is used for outputting three paths of gain control signals according to the received gain control signals, wherein the first path of gain control signals are used for controlling the gains of R pixels, G pixels and B pixels, the second path of gain control signals are used for controlling the gains of W pixels, and the third path of gain control signals are used for controlling the gains of D pixels.
The exposure time control circuit may output two exposure time control signals to control the exposure time of each pixel: one path (the first path of exposure time control signal) controls the exposure time of the non-white light pixels, and the other path (which may be called the second path of exposure time control signal) controls the exposure time of the W pixels. With the W pixel exposure time controlled independently, trailing blur of moving objects in the white light path (i.e., the light supplementing path) image can be reduced while the visible light exposure remains sufficient, improving the image effect.
By way of example, since the exposure times of the non-white light pixels and the W pixels are controlled separately and may therefore differ, the RGBD channels and the white light channel may need different gains to be properly exposed.
In addition, if the D pixel kept the same gain as the RGB pixels, it would be liable to overexpose, because its infrared response is higher than that of the RGB pixels. The D pixel therefore needs its gain controlled separately from the RGB path.
Thus, the sensor gain control circuit can be controlled to output three gain control signals to control the gain of each pixel through the gain control signal. One path of gain control signal (may be referred to as a first path of gain control signal) is used for controlling the gains of the R pixel, the G pixel and the B pixel, the other path of gain control signal (may be referred to as a second path of gain control signal) is used for controlling the gain of the W, and the last path of gain control signal (may be referred to as a third path of gain control signal) is used for controlling the gain of the D pixel so as to optimize the image processing effect.
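The two-path and three-path gain fan-outs described above can be sketched in the same routing-table style; path numbering is again a hypothetical illustration:

```python
def route_gain_paths(num_paths):
    """Map each pixel type to the gain control path driving it."""
    if num_paths == 3:
        # Separate D path: the D pixel's infrared response is higher than that
        # of the RGB pixels, so sharing the RGB gain would risk overexposure.
        return {"R": 1, "G": 1, "B": 1, "W": 2, "D": 3}
    if num_paths == 2:
        # D follows W: both channels carry an infrared component.
        return {"R": 1, "G": 1, "B": 1, "W": 2, "D": 2}
    raise ValueError("expected 2 or 3 gain control paths")
```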
Illustratively, a first path of exposure time control signals is used to control the exposure start times of the R, G, B, and D pixels, a second path of exposure time control signals is used to control the exposure start times of the W pixels,
Or the first path of exposure time control signals are used for controlling the exposure ending time of the R pixel, the G pixel, the B pixel and the D pixel, and the second path of exposure time control signals are used for controlling the exposure ending time of the W pixel;
alternatively, the first path of exposure time control signal is used for controlling the exposure start time and the exposure end time of the R pixel, the G pixel, the B pixel and the D pixel, and the second path of exposure time control signal is used for controlling the exposure start time and the exposure end time of the W pixel.
As a possible implementation, the exposure time of the R pixel, the G pixel, the B pixel, and the D pixel is not less than the exposure time of the W pixel when the image processing system is in the second operation mode.
Illustratively, when the image processing system uses the second operation mode, i.e., allows part of the near infrared light to reach the image sensor (for example, in scenes where ambient visible light is weak), and the exposure times of the W pixels and the non-white light pixels are controlled separately, the amount of light can be increased by lengthening the RGB channel exposure time while shortening the W pixel exposure time. This avoids the loss of image dynamic range caused by mismatched exposure of the two channel groups (RGBD channels and W channel), and also reduces the motion blur that long exposure times cause in fast-moving regions of the image.
For example, take the exposure timing (i.e., the exposure time control signals) shown in fig. 6. After exposure starts, the first and second control signals gate the corresponding pixels to begin exposing; the exposure start time of the R, G, B and D pixels is not later than that of the W pixel (in the figure it is earlier). When the exposure time configured for the sensor is reached, the rising edge of the exposure-end signal of the sensor's exposure time control circuit unifies the exposure end times, and exposure ends. The time from the rising edge of a pixel's gating control signal to the rising edge of the exposure-end signal is that pixel's exposure time; because the two control signals are independent, their pulse timings differ and correspond to different exposure times.
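Under this timing, each pixel group's exposure time is the common end edge minus its own gating (start) edge; a small sketch under that assumption, with illustrative names:

```python
def exposure_times(start_edges, end_edge):
    """Per-pixel-group exposure time for a shared exposure-end edge.

    start_edges: pixel type -> rising-edge time of its gating control signal
    end_edge:    rising-edge time of the common exposure-end signal
    """
    if any(t > end_edge for t in start_edges.values()):
        raise ValueError("a gating edge lies after the common end edge")
    return {p: end_edge - t for p, t in start_edges.items()}
```

Gating the W pixels later yields a shorter W exposure while the RGBD exposure stays long, matching the second-mode behavior described above.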
Illustratively, when the image processing system is in the second mode of operation, the gains of the R, G, and B pixels are greater than the gain of the W pixel; the gain of the RGB pixels is not smaller than the gain of the D pixels.
Illustratively, the gain control signals may control the total gain (analog gain plus digital gain) of the RGB pixels to be greater than the total gain of the W pixels, and the total gain of the RGB pixels to be not less than the total gain of the D pixels.
For example, the analog gain of the RGB pixels is greater than the analog gain of the W pixels, and neither enables digital gain; alternatively, the analog gain of the RGB pixels is the same as that of the W pixels, and the digital gain of the RGB pixels is greater than the digital gain of the W pixels.
In a low-illumination scene, the RGB channels have poor light sensitivity, while the D and W channels, which also receive infrared light, are more sensitive than the RGB channels.
Therefore, to ensure the exposure effect of the RGB, D and W channels and avoid underexposure or overexposure, the gain of the RGB pixels can be controlled to be not smaller than the gain of the D pixels and greater than the gain of the W pixels, so as to optimize the exposure effect of the image.
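The second-mode gain ordering (RGB gain greater than W gain and not smaller than D gain) can be expressed as a small predicate; a sketch with illustrative names:

```python
def gains_valid_second_mode(g_rgb, g_d, g_w):
    """Check the second-mode ordering over total (analog + digital) gains:
    gain(RGB) > gain(W) and gain(RGB) >= gain(D)."""
    return g_rgb > g_w and g_rgb >= g_d
```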
In one example, the exposure time control circuit is configured to output a path of exposure time control signal according to the received exposure time control signal, where the first path of exposure time control signal is configured to control exposure times of the non-white light pixel and the W pixel;
the sensor gain control circuit is used for outputting two paths of gain control signals according to the received gain control signals, wherein the first path of gain control signals are used for controlling the gains of R pixels, G pixels and B pixels, and the second path of gain control signals are used for controlling the gains of W pixels and D pixels.
By way of example, the exposure time control circuit may output one path of exposure time control signal (which may be called the first path of exposure time control signal) for controlling the exposure time of both the non-white light pixels and the W pixels. Controlling the exposure time of all pixels with a single signal reduces the control cost of the image sensor and the process implementation difficulty.
Illustratively, since the exposure times of the RGBDW pixels are all the same, and the infrared component makes the dim light channel closer to the white light channel, the dim light channel may share gain control with the white light channel.
Thus, the sensor gain control circuit may be controlled by the gain control signal to output two gain control signals, one of which (which may be called the first gain control signal) controls the gains of the R, G and B pixels, and the other (which may be called the second gain control signal) controls the gains of the W and D pixels.
As a possible implementation, the exposure time of the R pixel, the G pixel, the B pixel, and the D pixel is equal to the exposure time of the W pixel when the image processing system is in the second operation mode.
For example, when the image processing system uses the second operation mode, i.e., allows part of the near infrared light to reach the image sensor (for example, in scenes where ambient visible light is weak), and the exposure times of the W pixels and the non-white light pixels are controlled uniformly while the gains of the RGB pixels and the DW pixels are controlled separately, the exposure times of the R, G, B, D and W pixels may be the same.
Illustratively, the gains of the R, G, and B pixels are greater than the gain of the W pixel when the image processing system is in the second mode of operation.
Illustratively, the gain of RGB (sum of analog gain and digital gain) may be controlled to be greater than the gain of W pixels (sum of analog gain and digital gain) by the gain control signal.
For example, the analog gain of the RGB pixels is greater than the analog gain of the W pixels, and neither enables digital gain; alternatively, the analog gain of the RGB pixels is the same as that of the W pixels, and the digital gain of the RGB pixels is greater than the digital gain of the W pixels.
Because the RGB channels have poor light sensitivity in a low-illumination scene, while the W channel, which also receives infrared light, is more sensitive than the RGB channels, the gain (analog and digital) of the RGB pixels can be controlled to be greater than that of the W pixels when the exposure times of the RGBDW pixels are the same. This ensures the exposure effect of the RGB and W channels, avoids underexposure or overexposure, and optimizes the exposure effect of the image.
Referring to fig. 16, a schematic structural diagram of an image processing apparatus according to an embodiment of the present application is provided, and as shown in fig. 16, the image processing apparatus may include the image sensor, the first processing unit, and the second processing unit described in any of the above embodiments; wherein:
the first processing unit is used for performing first processing on the image signal in the first format output by the image sensor to obtain a first processed image in the first format;
and the second processing unit is used for performing second processing on the processed image in the first format to obtain a second processed image in a second format.
In some embodiments, the second processing unit performs a second process on the first processed image in the first format to obtain a second processed image in a second format, including:
and inputting the first processed image in a first format into one or more convolution filter groups to carry out convolution operation, so as to obtain a visible light full-resolution image output by the one or more convolution filter groups, wherein the visible light full-resolution image is the second processed image in a second format.
In some embodiments, the visible light full resolution image includes at least an R pixel full resolution image without an infrared light component, a G pixel full resolution image without an infrared light component, and a B pixel full resolution image without an infrared light component.
In some embodiments, the convolution filter bank is configured to output the full resolution image of visible light, the convolution filter bank including at least the following convolution filters:
convolution filters for R pixel, G pixel and B pixel interpolation and infrared removal at the white light pixel positions in the RGBDW pixel unit;
convolution filters for R pixel, G pixel and B pixel interpolation and infrared removal at the dark light pixel positions in the RGBDW pixel unit;
a convolution filter for R pixel and B pixel interpolation and infrared removal at the green pixel positions in the RGBDW pixel unit;
a convolution filter for G pixel and B pixel interpolation and infrared removal at the red pixel positions in the RGBDW pixel unit;
a convolution filter for R pixel and G pixel interpolation and infrared removal at the blue pixel positions in the RGBDW pixel unit.
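The position-dependent filtering these banks perform can be sketched as follows. The kernels here are identity placeholders, not the actual interpolation and infrared-removal coefficients, and the function name is an illustration:

```python
# Hypothetical sketch of one convolution filter bank: at each pixel, the 3x3
# kernel applied is selected by the mosaic position (R/G/B/D/W) of that pixel,
# producing one full-resolution output plane.
def apply_filter_bank(mosaic, positions, bank):
    """mosaic: 2-D list of raw samples; positions: same-shape list of
    'R'/'G'/'B'/'D'/'W' labels; bank: label -> 3x3 kernel (list of lists)."""
    h, w = len(mosaic), len(mosaic[0])

    def sample(y, x):                      # replicate edge pixels at the border
        y = min(max(y, 0), h - 1)
        x = min(max(x, 0), w - 1)
        return mosaic[y][x]

    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            k = bank[positions[y][x]]      # kernel chosen by mosaic position
            out[y][x] = sum(
                k[dy][dx] * sample(y + dy - 1, x + dx - 1)
                for dy in range(3) for dx in range(3)
            )
    return out
```

Running three such banks (one per output plane, each holding per-position kernels) yields the R, G and B full-resolution images without the infrared component.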
In some embodiments, the convolution filter bank includes a first convolution filter bank for outputting an R-pixel full resolution image without an infrared light component, a second convolution filter bank for outputting a G-pixel full resolution image without an infrared light component, and a third convolution filter bank for outputting a B-pixel full resolution image without an infrared light component;
Wherein the first convolution filter bank at least comprises the following convolution filters which synchronously perform convolution filtering operation:
a convolution filter for R pixel interpolation and infrared removal at the white light pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal at the dark light pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal at the green pixel positions in the RGBDW pixel unit; and a convolution filter for R pixel interpolation and infrared removal at the blue pixel positions in the RGBDW pixel unit;
wherein the second convolution filter bank at least comprises the following convolution filters which synchronously perform convolution filtering operation:
a convolution filter for G pixel interpolation and infrared removal at the white light pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal at the dark light pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal at the red pixel positions in the RGBDW pixel unit; and a convolution filter for G pixel interpolation and infrared removal at the blue pixel positions in the RGBDW pixel unit;
Wherein the third convolution filter bank at least comprises the following convolution filters which synchronously perform convolution filtering operation:
a convolution filter for B pixel interpolation and infrared removal at the white light pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal at the dark light pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal at the red pixel positions in the RGBDW pixel unit; and a convolution filter for B pixel interpolation and infrared removal at the green pixel positions in the RGBDW pixel unit.
In some embodiments, when any of the convolution filters performs infrared removal, the dark component in the first processed image is first corrected according to a dark component exposure correction coefficient, which is determined from the sensitivity ISO of the RGB channels and the sensitivity ISO of the dark channel of the image sensor unit.
In some embodiments, any of the convolution filters performing infrared removal does so using the D pixels and R pixels in the neighbourhood of the current pixel position, based on the R channel infrared removal coefficient and the dim light component exposure correction coefficient.
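One plausible reading of this per-position step, under the assumption that the exposure-corrected dark value approximates the infrared component at the current position, is the following sketch (names and the clamp are illustrative, not from this application):

```python
def remove_infrared(r, d, k_r, exp_corr):
    """Subtract the estimated near-infrared component from an R sample.

    r        : interpolated R value at the current position
    d        : dark-light (D) value from the current neighbourhood
    k_r      : R channel infrared removal coefficient (calibration value)
    exp_corr : dark component exposure correction coefficient, derived from
               the ISO ratio of the RGB and dark channels
    """
    return max(r - k_r * (d * exp_corr), 0.0)  # clamp is an added safeguard
```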
In some embodiments, the first processing unit is specifically configured to perform a first process on the image signal in the first format to obtain a first processed image in the first format when the first processing unit is in the second mode;
the second processing unit is specifically configured to perform a second process on the first processed image in the first format to obtain a second processed image in a second format.
In some embodiments, the first processing unit is further configured to obtain, when in the first working mode, a target wide dynamic image based on wide dynamic processing performed on the image signal in the first format.
In some embodiments, first processing the image signal in the first format includes: performing noise reduction processing on the image signal in the first format to obtain a first processed image in the first format after noise reduction;
performing a second process on the first processed image includes: and converting the first processed image after noise reduction from the first format to the second format, and removing the near infrared component in the first processed image to obtain a second processed image in the second format.
The embodiment of the application also provides a near infrared light supplementing device, which may include:
a module for generating a light supplementing control signal, where the light supplementing control signal controls turning the near infrared light supplementing on or off; when the second optical filter is switched to the light inlet side of the image sensor, the light supplementing control signal is used for controlling the near infrared light supplementing to be turned off; when the first optical filter is switched to the light inlet side of the image sensor, the light supplementing control signal is used for controlling the near infrared light supplementing to be turned on.
In some embodiments, the apparatus further comprises means for performing the steps of:
determining a near infrared light supplementing intensity correlation coefficient;
determining the light supplementing intensity of the near infrared light supplementing according to the near infrared light supplementing intensity related coefficient;
when the near infrared light supplementing intensity related coefficient is larger than a first threshold value, the current light supplementing intensity of near infrared light supplementing is regulated down; when the correlation coefficient of the near infrared light supplementing intensity is smaller than a second threshold value, the current light supplementing intensity of the near infrared light supplementing is increased; the first threshold is greater than the second threshold.
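The two-threshold control above amounts to a hysteresis loop; a sketch with illustrative names and an assumed unit step size:

```python
def adjust_fill_light(coeff, intensity, t_high, t_low, step=1):
    """Adjust the near-infrared fill-light intensity by the correlation coefficient.

    coeff > t_high -> demand is low, step the intensity down
    coeff < t_low  -> demand is high, step the intensity up
    otherwise hold; t_high > t_low forms the dead band.
    """
    if t_high <= t_low:
        raise ValueError("the first threshold must exceed the second")
    if coeff > t_high:
        return max(intensity - step, 0)
    if coeff < t_low:
        return intensity + step
    return intensity
```

The dead band between the two thresholds prevents the fill light from oscillating when the coefficient hovers near a single threshold.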
In some embodiments, the near-infrared supplemental light intensity correlation coefficient is an energy ratio of near-infrared light energy to visible light energy of the RGB channel; the determining the near infrared light supplementing intensity correlation coefficient comprises the following steps:
Determining the energy duty ratio based on the average brightness of RGB channels and the average brightness of dark channels in the image of the image signal in the first format output by the image sensor;
or alternatively,
acquiring the sensitivity of the RGB channels of the image sensor unit, the sensitivity of the dim light channel, and the ratio of the integral values of the spectral response curves of the dim light channel and the RGB channels in the first wavelength range; and determining the energy duty ratio based on the sensitivity of the RGB channels, the sensitivity of the dim light channel, and that ratio.
Referring to fig. 17, a schematic structural diagram of an exposure control apparatus according to an embodiment of the present application is provided, where the exposure control apparatus may be applied to the image processing system described in any of the foregoing embodiments, and as shown in fig. 17, the exposure control apparatus may include:
a generation unit configured to generate an exposure control signal;
and the control unit is used for controlling the exposure of each channel of each pixel in the pixel array of the image sensor according to the exposure control signal.
In some embodiments, the image sensor includes an exposure time control circuit; the exposure control signals comprise exposure time control signals;
The control unit controls exposure of each channel of each pixel in the pixel array of the image sensor according to the exposure control signal, and the control unit comprises:
outputting the exposure time control signal to the exposure time control circuit; the exposure time control circuit is used for outputting at least a first path of exposure time control signals according to the exposure time control signals output by the exposure time control unit, wherein the first path of exposure time control signals are at least used for controlling the exposure time of the non-white light pixels, and the non-white light pixels comprise the red pixels, the green pixels, the blue pixels and the dark light pixels.
In some embodiments, the first path of exposure time control signal is used to control at least exposure start times of red pixels, green pixels, blue pixels, and dark pixels;
or the first path of exposure time control signal is at least used for controlling the exposure ending time of the red pixel, the green pixel, the blue pixel and the dark light pixel;
or the first path of exposure time control signal is at least used for controlling the exposure start time and the exposure end time of the red pixel, the green pixel, the blue pixel and the dark light pixel.
In some embodiments, the image sensor further comprises a sensor gain control circuit; the exposure control signal also comprises a gain control signal;
the control unit controls exposure of each channel of each pixel in the pixel array of the image sensor according to the exposure control signal, and the control unit comprises:
outputting the gain control signal to the sensor gain control circuit; the sensor gain control circuit is used for outputting at least two paths of gain control signals according to the gain control signals, wherein the first path of gain control signals are used for controlling gains of red pixels, green pixels and blue pixels, and the second path of gain control signals are at least used for controlling gains of white pixels.
In some embodiments, the exposure time control circuit is configured to output two exposure time control signals according to the exposure time control signal, wherein a first exposure time control signal is configured to control the exposure time of the non-white light pixel, a second exposure time control signal is configured to control the exposure time of the white light pixel,
the sensor gain control circuit is used for outputting three gain control signals according to the gain control signals, wherein the first gain control signal is used for controlling the gains of the red pixels, the green pixels and the blue pixels, the second gain control signal is used for controlling the gains of the white pixels, and the third gain control signal is used for controlling the gains of the dark pixels.
In some embodiments, the first exposure time control signal controls the exposure start times of the red, green, blue, and dark light pixels, and the second exposure time control signal controls the exposure start time of the white light pixels;
or the first exposure time control signal controls the exposure end times of the red, green, blue, and dark light pixels, and the second exposure time control signal controls the exposure end time of the white light pixels;
or the first exposure time control signal controls both the exposure start times and the exposure end times of the red, green, blue, and dark light pixels, and the second exposure time control signal controls both the exposure start time and the exposure end time of the white light pixels.
In some embodiments, the exposure time control circuit is configured to output a first exposure time control signal that controls the exposure times of both the non-white light pixels and the white light pixels;
the sensor gain control circuit is configured to output two gain control signals, where the first gain control signal controls the gains of the red, green, and blue pixels, and the second gain control signal controls the gains of the white light pixels and the dark light pixels.
In some embodiments, the exposure time of the red, green, blue, and dark pixels is not less than the exposure time of the white pixels when the image processing system is in the second mode of operation.
In some embodiments, the exposure time of the red, green, blue, and dark pixels is equal to the exposure time of the white light pixels when the image processing system is in the second mode of operation.
In some embodiments, when the image processing system is in the second mode of operation, the gains of the red, green, and blue pixels are greater than the gain of the white light pixels, and the gains of the red, green, and blue pixels are not less than the gain of the dark light pixels.
In some embodiments, the gains of the red, green, and blue pixels are greater than the gain of the white light pixel when the image processing system is in the second mode of operation.
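The exposure-time and gain relationships stated in the embodiments above can be collected into a small validity check. The following is an illustrative sketch only, not part of the patent; the `ExposureConfig` type, field names, and all numeric values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ExposureConfig:
    """Hypothetical per-path exposure/gain settings (not from the patent)."""
    t_rgbd: float  # exposure time for red/green/blue/dark pixels (first path)
    t_w: float     # exposure time for white pixels (second path)
    g_rgb: float   # gain for red/green/blue pixels (first gain path)
    g_w: float     # gain for white pixels (second gain path)
    g_d: float     # gain for dark pixels (third gain path)

def valid_second_mode(cfg: ExposureConfig) -> bool:
    """Check the second-mode constraints stated above: RGBD exposure time
    not less than white exposure time, RGB gain greater than white gain,
    and RGB gain not less than dark gain."""
    return (cfg.t_rgbd >= cfg.t_w
            and cfg.g_rgb > cfg.g_w
            and cfg.g_rgb >= cfg.g_d)

cfg = ExposureConfig(t_rgbd=20.0, t_w=10.0, g_rgb=8.0, g_w=2.0, g_d=4.0)
print(valid_second_mode(cfg))  # prints: True
```

Such a check would sit naturally in the exposure control unit, rejecting configurations that violate the claimed ordering before they reach the sensor registers.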
Fig. 18 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application. The electronic device may include a processor 1801 and a memory 1802 storing machine-executable instructions. The processor 1801 and the memory 1802 may communicate via a system bus 1803. By reading and executing the machine-executable instructions in the memory 1802, the processor 1801 can perform the image processing, near-infrared light compensation control, or exposure control methods described above.
The memory 1802 referred to herein may be any electronic, magnetic, optical, or other physical storage device that can contain or store information, such as executable instructions or data. For example, the machine-readable storage medium may be RAM (Random Access Memory), volatile memory, non-volatile memory, flash memory, a storage drive (e.g., a hard drive), a solid state drive, any type of storage disc (e.g., an optical disc or DVD), a similar storage medium, or a combination thereof.
In some embodiments, there is also provided a machine-readable storage medium, such as memory 1802 in fig. 18, having stored therein machine-executable instructions that when executed by a processor implement the image processing, near infrared light compensation control, or exposure control methods described above. For example, the machine-readable storage medium may be ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing describes only preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the scope of the present invention.
Claims (28)
1. An image processing system, characterized by comprising at least an image sensor unit, an optical unit, an image processing unit and an exposure control unit, wherein,
the optical unit is used for blocking part of the spectrum of incident light in a wavelength interval and outputting a target light signal that is incident on the image sensor;
the image sensor unit comprises an image sensor and is used for converting the light sensed by each pixel in a pixel array of the image sensor from the target light signal into an electrical signal and outputting an image signal in a first format after the electrical signal passes through a readout circuit of the image sensor, wherein the pixel array of the image sensor comprises red pixels, green pixels, blue pixels, dark light pixels, and white light pixels, the white light pixels are uniformly and dispersedly arranged, and the red, green, blue, and dark light pixels are arranged around the white light pixels; the first type of pixels of the pixel array allows stronger near-infrared light to pass than the second type of pixels; the first type of pixels comprises the white light pixels and the dark light pixels; the second type of pixels comprises at least one of the red, green, and blue pixels;
The exposure control unit is used for outputting an exposure control signal to the image sensor unit so as to control the exposure of each channel of each pixel in the pixel array of the image sensor unit;
the image processing unit is used for performing first processing on the image signal in the first format to obtain a first processed image in the first format; and performing second processing on the first processed image in the first format to obtain a second processed image in a second format.
2. The image processing system of claim 1, wherein the optical unit comprises an optical imaging lens and a filter device, wherein the filter device is positioned between the optical imaging lens and the image sensor, and wherein the image sensor is positioned on a light exit side of the filter device.
3. The image processing system according to claim 2, wherein the filter device includes a first optical filter, a second optical filter, and a switching component, the first optical filter and the second optical filter each being connected to the switching component; the switching component is used for switching the second optical filter to the light incident side of the image sensor or switching the first optical filter to the light incident side of the image sensor.
4. The image processing system of claim 3, wherein the first optical filter passes visible light and part of the near-infrared light, and the second optical filter passes visible light and blocks near-infrared light.
5. The image processing system according to claim 4, wherein the first optical filter is a bimodal filter for passing visible light in the wavelength range [T3, T4] and infrared light whose wavelength lies within a first wavelength range [T1, T2], so as to filter out the portions of the infrared band in which the spectral responses of the red, green, and blue light channels differ.
6. The image processing system of claim 3, wherein the optical imaging lens is coated with a filter film for passing visible light in the wavelength range [T3, T4] and infrared light whose wavelength lies within a first wavelength range [T1, T2]; the first optical filter is an all-pass optical filter, and the second optical filter passes visible light and blocks all near-infrared light.
7. The image processing system according to claim 3, further comprising a drive control unit for controlling the switching component to switch the second optical filter to the light incident side of the image sensor when the image processing system uses the first working mode, and controlling the switching component to switch the first optical filter to the light incident side of the image sensor when the image processing system uses the second working mode.
8. The image processing system according to any one of claims 1 to 7, further comprising: a near infrared light supplementing control unit and a near infrared light supplementing unit;
the near-infrared light supplementing control unit is used for sending a light supplementing control signal to the near-infrared light supplementing unit, and the light supplementing control signal is at least used for controlling the near-infrared light supplementing unit to be turned on and turned off;
the near-infrared light supplementing unit is used for switching on near-infrared light supplementing or switching off near-infrared light supplementing based on the light supplementing control signal; when the second optical filter is switched to the light entering side of the image sensor, the light supplementing control signal is used for controlling the near infrared light supplementing unit to turn off near infrared light supplementing; when the first optical filter is switched to the light inlet side of the image sensor, the light supplementing control signal is used for controlling the near infrared light supplementing unit to start near infrared light supplementing.
9. The image processing system of claim 8, wherein
the image sensor of the image sensor unit includes an exposure time control circuit; the exposure control unit comprises an exposure time control unit, and the exposure time control unit is used for outputting an exposure time control signal to the exposure time control circuit;
the exposure time control circuit is used for outputting at least a first exposure time control signal according to the exposure time control signal output by the exposure time control unit, where the first exposure time control signal is used for controlling at least the exposure time of the non-white light pixels, the non-white light pixels comprising the red, green, blue, and dark light pixels.
10. The image processing system of claim 9, wherein
the image sensor of the image sensor unit further includes a sensor gain control circuit;
the exposure control unit also comprises a gain control unit, wherein the gain control unit is used for outputting a gain control signal to the sensor gain control circuit;
the sensor gain control circuit is used for outputting at least two gain control signals according to the gain control signal output by the gain control unit, where the first gain control signal is used for controlling the gains of the red, green, and blue pixels, and the second gain control signal is used for controlling at least the gain of the white light pixels.
11. The image processing system of claim 1, wherein performing a second process on the first processed image in the first format to obtain a second processed image in a second format comprises:
inputting the first processed image in the first format into one or more convolution filter banks for convolution operations to obtain a visible light full resolution image output by the one or more convolution filter banks, where the visible light full resolution image is the second processed image in the second format.
12. The image processing system of claim 11, wherein the visible full resolution image comprises a red full resolution image without an infrared light component, a green full resolution image without an infrared light component, and a blue full resolution image without an infrared light component.
13. The image processing system of claim 11, wherein, when the pixel array of the image sensor comprises RGBDW pixel units, the convolution filter bank for outputting the visible light full resolution image comprises at least the following convolution filters:
a convolution filter for performing R, G, and B pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit;
a convolution filter for performing R, G, and B pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit;
a convolution filter for performing R and B pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit;
a convolution filter for performing G and B pixel interpolation and infrared removal for red pixel positions in the RGBDW pixel unit;
a convolution filter for performing R and G pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit.
14. The image processing system of claim 11, wherein the convolution filter bank comprises a first convolution filter bank for outputting a red full resolution image without an infrared light component, a second convolution filter bank for outputting a green full resolution image without an infrared light component, and a third convolution filter bank for outputting a blue full resolution image without an infrared light component.
15. The image processing system of claim 14, wherein, when the pixel array of the image sensor comprises RGBDW pixel units, the first convolution filter bank comprises at least the following convolution filters that perform convolution filtering operations in synchronization:
a convolution filter for R pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit;
when the pixel array of the image sensor comprises RGBDW pixel units, the second convolution filter bank comprises at least the following convolution filters that perform convolution filtering operations in synchronization:
a convolution filter for G pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal for red pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit;
when the pixel array of the image sensor comprises RGBDW pixel units, the third convolution filter bank comprises at least the following convolution filters that perform convolution filtering operations in synchronization:
a convolution filter for B pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal for red pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit.
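The claims above name per-position convolution filters for RGBDW mosaics but do not give their coefficients. The following toy sketch illustrates the underlying principle such a filter combines: interpolate one color channel from same-color neighbors, then subtract an infrared estimate derived from nearby dark (D) pixels. The 3x3 box neighborhood, the masks, and the IR coefficient are placeholder assumptions, not values from the patent.

```python
import numpy as np

def box_sum(a: np.ndarray) -> np.ndarray:
    """Sum over each pixel's 3x3 neighborhood (edge-replicated borders)."""
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def interp_minus_ir(mosaic, color_mask, dark_mask, ir_coeff):
    """Neighborhood-average the chosen color channel, then subtract a
    scaled dark-channel (infrared) estimate at every pixel position."""
    color = box_sum(mosaic * color_mask) / np.maximum(box_sum(color_mask), 1e-6)
    ir = box_sum(mosaic * dark_mask) / np.maximum(box_sum(dark_mask), 1e-6)
    return color - ir_coeff * ir

# Toy mosaic: color pixels carry signal 10 plus IR 2; dark pixels carry IR 2.
color_mask = np.indices((6, 6)).sum(axis=0) % 2 == 0   # checkerboard layout
dark_mask = ~color_mask
mosaic = np.where(color_mask, 12.0, 2.0)
full_res = interp_minus_ir(mosaic, color_mask.astype(float),
                           dark_mask.astype(float), ir_coeff=1.0)
# full_res is approximately 10.0 everywhere: the IR component is removed.
```

In the claimed system each filter bank would run such an operation once per color channel and per pixel position class, producing a full-resolution visible-light plane in a single pass.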
16. The image processing system according to any one of claims 11 to 15, wherein,
and when any convolution filter performs infrared removal, the dark light component in the first processed image is first corrected according to a dark light component exposure correction coefficient and the corrected component is then used in the synchronous infrared removal, wherein the dark light component exposure correction coefficient is determined according to the sensitivity of the R channel, the sensitivity of the G channel, the sensitivity of the B channel, and the sensitivity of the dark light channel D of the image sensor unit.
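Claim 16 names the inputs to the dark light component exposure correction coefficient (the R, G, B, and D channel sensitivities) but not the formula. One plausible construction, offered purely as a hedged illustration, is a ratio of near-infrared sensitivities, so the dark-channel measurement is rescaled to match the color channel it is subtracted from. The function name and all numbers below are hypothetical.

```python
def dark_correction_coeff(s_channel_nir: float, s_dark_nir: float) -> float:
    """Ratio of a color channel's near-infrared sensitivity to the dark
    channel's, so the dark (IR) measurement can be rescaled before it is
    subtracted from that color channel. Illustrative assumption only."""
    return s_channel_nir / s_dark_nir

# Example: if the R channel is half as sensitive to NIR as the D channel,
# the dark component is halved before subtraction.
k_r = dark_correction_coeff(0.5, 1.0)          # 0.5
r_without_ir = 12.0 - k_r * 4.0                # raw R minus scaled dark component
```

The same coefficient could be folded directly into the dark-pixel taps of each convolution filter, which is consistent with the claim's statement that correction happens as part of the synchronous infrared removal.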
17. The image processing system according to any one of claims 1 to 7, wherein,
when the image processing system is in the second working mode, performing the first processing on the image signal in the first format to obtain a first processed image in the first format, and performing the second processing on the first processed image in the first format to obtain a second processed image in a second format.
18. The image processing system of claim 17, wherein
when the image processing system is in the first working mode, a target wide dynamic image is obtained based on wide dynamic processing of the image signal in the first format.
19. The image processing system according to any one of claims 1 to 7, wherein,
performing a first process on the image signal in the first format includes: performing noise reduction processing on the image signal in the first format to obtain a first processed image in the first format after noise reduction;
performing a second process on the first processed image includes: converting the noise-reduced first processed image from the first format to the second format and removing the near-infrared component therein to obtain the second processed image in the second format.
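Claim 19 fixes the pipeline order: noise reduction in the first (mosaic) format, then conversion to the second format with the near-infrared component removed. A minimal sketch of that ordering follows; the box-filter denoiser is a placeholder for the unspecified noise reduction, and the second stage is reduced to IR subtraction (demosaicing omitted for brevity). Nothing here is the patented implementation.

```python
import numpy as np

def first_process(raw: np.ndarray) -> np.ndarray:
    """First processing: placeholder noise reduction on the first-format
    image (a 3x3 box average with edge-replicated borders)."""
    p = np.pad(raw, 1, mode="edge")
    h, w = raw.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def second_process(denoised: np.ndarray, ir_estimate: np.ndarray) -> np.ndarray:
    """Second processing: subtract the near-infrared estimate and clamp
    at zero (the first-to-second format conversion is omitted here)."""
    return np.clip(denoised - ir_estimate, 0.0, None)

raw = np.full((5, 5), 12.0)            # toy first-format frame
out = second_process(first_process(raw), np.full((5, 5), 2.0))
```

Denoising before IR removal matters in this ordering: subtracting a noisy IR estimate from a noisy frame would compound the noise, which is presumably why the claim places noise reduction first.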
20. An image processing method, applied to the image processing system of any one of claims 1 to 19, the method comprising:
performing first processing on the image signal in the first format output by the image sensor to obtain a first processed image in the first format;
and performing second processing on the first processed image in the first format to obtain a second processed image in a second format.
21. The image processing method according to claim 20, wherein performing the second processing on the first processed image in the first format to obtain a second processed image in a second format, comprises:
inputting the first processed image in the first format into one or more convolution filter banks for convolution operations to obtain a visible light full resolution image output by the one or more convolution filter banks, where the visible light full resolution image is the second processed image in the second format.
22. The image processing method according to claim 21, wherein the visible light full resolution image includes at least a red full resolution image containing no infrared light component, a green full resolution image containing no infrared light component, and a blue full resolution image containing no infrared light component.
23. The image processing method of claim 22, wherein, when the pixel array of the image sensor comprises RGBDW pixel units, the convolution filter bank for outputting the visible light full resolution image comprises at least the following convolution filters:
a convolution filter for performing R, G, and B pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit;
a convolution filter for performing R, G, and B pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit;
a convolution filter for performing R and B pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit;
a convolution filter for performing G and B pixel interpolation and infrared removal for red pixel positions in the RGBDW pixel unit;
a convolution filter for performing R and G pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit.
24. The image processing method according to claim 21, wherein the convolution filter banks include a first convolution filter bank for outputting a red full resolution image containing no infrared light component, a second convolution filter bank for outputting a green full resolution image containing no infrared light component, and a third convolution filter bank for outputting a blue full resolution image containing no infrared light component;
wherein the first convolution filter bank at least comprises the following convolution filters which synchronously perform convolution filtering operation:
a convolution filter for R pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit; a convolution filter for R pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit;
wherein the second convolution filter bank at least comprises the following convolution filters which synchronously perform convolution filtering operation:
a convolution filter for G pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal for red pixel positions in the RGBDW pixel unit; a convolution filter for G pixel interpolation and infrared removal for blue pixel positions in the RGBDW pixel unit;
wherein the third convolution filter bank at least comprises the following convolution filters which synchronously perform convolution filtering operation:
a convolution filter for B pixel interpolation and infrared removal for white light pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal for dark light pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal for red pixel positions in the RGBDW pixel unit; a convolution filter for B pixel interpolation and infrared removal for green pixel positions in the RGBDW pixel unit.
25. The image processing method according to claim 23 or 24, wherein,
and when any convolution filter performs infrared removal, the dark light component in the first processed image is first corrected according to a dark light component exposure correction coefficient and the corrected component is then used in the synchronous infrared removal, wherein the dark light component exposure correction coefficient is determined according to the sensitivity of the R channel, the sensitivity of the G channel, the sensitivity of the B channel, and the sensitivity of the dark light channel of the image sensor unit.
26. The image processing method according to claim 20, wherein,
when the image processing system is in the second working mode, performing the first processing on the image signal in the first format to obtain a first processed image in the first format, and performing the second processing on the first processed image in the first format to obtain a second processed image in a second format.
27. The image processing method according to claim 26, wherein,
while in the first mode of operation, the method further comprises: obtaining a target wide dynamic image based on wide dynamic processing of the image signal in the first format.
28. The image processing method according to any one of claims 20 to 24 or 26 to 27, wherein
performing a first process on the image signal in the first format includes: performing noise reduction processing on the image signal in the first format to obtain a first processed image in the first format after noise reduction;
performing a second process on the first processed image includes: converting the noise-reduced first processed image from the first format to the second format and removing the near-infrared component therein to obtain the second processed image in the second format.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011634982.1A CN114697584B (en) | 2020-12-31 | 2020-12-31 | Image processing system and image processing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114697584A (en) | 2022-07-01
CN114697584B true CN114697584B (en) | 2023-12-26 |
Family
ID=82134049
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011634982.1A Active CN114697584B (en) | 2020-12-31 | 2020-12-31 | Image processing system and image processing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114697584B (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015199163A1 (en) * | 2014-06-24 | 2015-12-30 | 日立マクセル株式会社 | Image pickup sensor and image pickup device |
JP2016052116A (en) * | 2014-08-28 | 2016-04-11 | パナソニックIpマネジメント株式会社 | Image processing device, image processing method and computer program |
JP6834907B2 (en) * | 2017-10-25 | 2021-02-24 | トヨタ自動車株式会社 | Imaging method |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102542281A (en) * | 2010-12-27 | 2012-07-04 | 北京北科慧识科技股份有限公司 | Non-contact biometric feature identification method and system |
KR20160132209A (en) * | 2015-05-07 | 2016-11-17 | (주)이더블유비엠 | Method and apparatus for extraction of depth information of image using fast convolution based on multi-color sensor |
WO2017101641A1 (en) * | 2015-12-14 | 2017-06-22 | 比亚迪股份有限公司 | Imaging method of image sensor, imaging apparatus and electronic device |
WO2017193738A1 (en) * | 2016-05-09 | 2017-11-16 | 比亚迪股份有限公司 | Image sensor, imaging method, and imaging device |
WO2018145575A1 (en) * | 2017-02-10 | 2018-08-16 | 杭州海康威视数字技术股份有限公司 | Image fusion apparatus and image fusion method |
CN108419062A (en) * | 2017-02-10 | 2018-08-17 | 杭州海康威视数字技术股份有限公司 | Image co-registration equipment and image interfusion method |
CN109685742A (en) * | 2018-12-29 | 2019-04-26 | 哈尔滨理工大学 | A kind of image enchancing method under half-light environment |
CN110493491A (en) * | 2019-05-31 | 2019-11-22 | 杭州海康威视数字技术股份有限公司 | A kind of image collecting device and image capture method |
CN110505377A (en) * | 2019-05-31 | 2019-11-26 | 杭州海康威视数字技术股份有限公司 | Image co-registration device and method |
CN110519489A (en) * | 2019-06-20 | 2019-11-29 | 杭州海康威视数字技术股份有限公司 | Image-pickup method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6725613B2 (en) | Imaging device and imaging processing method | |
US8624997B2 (en) | Alternative color image array and associated methods | |
US8125543B2 (en) | Solid-state imaging device and imaging apparatus with color correction based on light sensitivity detection | |
US8164651B2 (en) | Concentric exposure sequence for image sensor | |
EP2415254B1 (en) | Exposing pixel groups in producing digital images | |
US9681059B2 (en) | Image-capturing device | |
US8022994B2 (en) | Image sensor with high dynamic range in down-sampling mode | |
US20090200451A1 (en) | Color pixel arrays having common color filters for multiple adjacent pixels for use in cmos imagers | |
US8294797B2 (en) | Apparatus and method of generating a high dynamic range image | |
US20090174792A1 (en) | Solid-state imaging device for enlargement of dynamic range | |
US20090109306A1 (en) | High dynamic range sensor with reduced line memory for color interpolation | |
US20070046807A1 (en) | Capturing images under varying lighting conditions | |
US20070159542A1 (en) | Color filter array with neutral elements and color image formation | |
US20080291312A1 (en) | Imaging signal processing apparatus | |
CN116074643A (en) | Image sensor and method for HDR image capturing using image sensor system | |
CN114697584B (en) | Image processing system and image processing method | |
CN114697586B (en) | Image processing system, near infrared light supplementing control method and device | |
CN114697585B (en) | Image sensor, image processing system and image processing method | |
JPWO2011132619A1 (en) | Solid-state imaging device and imaging apparatus | |
CN116801088A (en) | Image sensor, image processing system, image processing method and device | |
JP2003087804A (en) | Solid-state image pickup device | |
JP2006060462A (en) | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||