WO2015190021A1 - Imaging control device, imaging device, imaging system, and imaging control method - Google Patents
Imaging control device, imaging device, imaging system, and imaging control method
- Publication number
- WO2015190021A1 (PCT/JP2015/001811, JP2015001811W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- image
- unit
- image processing
- control
- Prior art date
Links
- 238000003384 imaging method Methods 0.000 title claims abstract description 172
- 238000000034 method Methods 0.000 title claims abstract description 84
- 238000012545 processing Methods 0.000 claims abstract description 212
- 230000008569 process Effects 0.000 claims abstract description 70
- 238000012854 evaluation process Methods 0.000 claims description 27
- 230000037361 pathway Effects 0.000 abstract 1
- 238000011156 evaluation Methods 0.000 description 21
- 238000005516 engineering process Methods 0.000 description 15
- 238000005375 photometry Methods 0.000 description 15
- 238000001514 detection method Methods 0.000 description 10
- 238000010586 diagram Methods 0.000 description 9
- 230000003287 optical effect Effects 0.000 description 7
- 238000004422 calculation algorithm Methods 0.000 description 5
- 230000001360 synchronised effect Effects 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 230000000295 complement effect Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 229910044991 metal oxide Inorganic materials 0.000 description 2
- 150000004706 metal oxides Chemical class 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- 238000012546 transfer Methods 0.000 description 2
- 230000008859 change Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000001276 controlling effect Effects 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
- H04N25/445—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by skipping some contiguous pixels within the read portion of the array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
- H04N25/51—Control of the gain
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
- H04N25/53—Control of the integration time
- H04N25/531—Control of the integration time by controlling rolling shutters in CMOS SSIS
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/71—Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
- H04N25/75—Circuitry for providing, modifying or processing image signals from the pixel array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/77—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/78—Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
Definitions
- the present technology relates to an imaging control apparatus and an imaging control method for controlling imaging by an imaging element, and an imaging apparatus and an imaging system including the imaging control apparatus.
- the camera obtains an optimal exposure state and in-focus state by multiple exposures.
- the detection process takes time.
- the detection process is a photometric process (for example, calculation of a brightness value) in the AE process, or a distance measurement process (for example, calculation of a contrast value) in the AF process.
- the imaging device described in Patent Document 1 divides the pixel region of a CMOS (Complementary Metal-Oxide Semiconductor) image sensor into first to eighth divided areas, and reads out and processes image signals from these divided areas in parallel. Specifically, this imaging device reads out the image signal from the analog end, stores it in a memory, and calculates AF evaluation values in parallel (see, for example, paragraph [0028] of the specification of Patent Document 1).
- the imaging device described in Patent Document 2 is provided with two photosensitive regions, of low sensitivity and high sensitivity, for each pixel of the solid-state imaging device. Thereby, the range of signal brightness from each photosensitive region can be widened at the time of a single photometry (exposure), and an appropriate exposure parameter can be obtained with a single photometry (see, for example, paragraph [0045] of the specification of Patent Document 2 and FIGS. 2 and 3).
- An object of the present technology is to provide an imaging control device, an imaging device, an imaging system, and an imaging control method that can increase the speed or accuracy of imaging processing.
- an imaging control apparatus includes a control unit and a plurality of image processing units.
- the control unit is configured to execute readout control of an image pickup unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different image pickup conditions for each pixel line group among the plurality of pixel lines.
- the plurality of image processing units are configured to respectively process the image signals read from the imaging unit.
- the control unit executes control for reading in parallel image signals generated under different imaging conditions for each pixel line group, and the plurality of image processing units respectively process the image signals. Therefore, it is possible to increase the processing speed or accuracy according to the processing content of the image processing unit.
- the control unit may be configured to perform row thinning readout from each pixel line group, and the plurality of image processing units may be configured to respectively process the thinned images read out by the row thinning readout.
- the control unit may be configured to execute control to read from the imaging unit an image signal generated under different exposure conditions for each pixel line group.
- the plurality of image processing units may be configured to execute an evaluation process in an AE (Automatic Exposure) process. Since evaluation processing in AE is executed in parallel for image signals generated under different exposure conditions, high-speed evaluation processing (photometry processing) can be realized, so that AE processing can be speeded up.
- the plurality of image processing units may be configured to execute evaluation processing in AF (Automatic Focus) processing, respectively.
- for example, when one image processing unit executes the evaluation process on an image generated under an exposure condition suited to the AF process, a highly accurate or optimal AF process can be realized.
- the control unit may execute row thinning readout from a first pixel line group among the plurality of pixel lines, and may read out a partial image, that is, an image of a partial region of the entire image generated by a second pixel line group different from the first pixel line group, without thinning.
- the plurality of image processing units may be configured to process the thinned image and the partial image read by the row thinning readout, respectively.
- the control unit is configured to execute control to read out an image signal generated under different exposure conditions for each pixel line group from the imaging unit, and the plurality of image processing units respectively perform evaluation processing in AE processing. It may be configured to execute. Since the resolution of the partial image is higher than the resolution of the thinned image, the photometric accuracy in the partial image can be increased in the AE process, and as a result, a partially highly accurate AE process can be performed.
- the plurality of image processing units may be configured to execute evaluation processing in AF processing, respectively.
- for example, when the thinned image is used as a so-called through image displayed on the display unit, one of the plurality of image processing units can execute the evaluation process of the AF processing on a partial image that is generated under exposure conditions suited to AF and is not displayed on the display unit. As a result, high-precision or optimal AF processing can be realized.
- the control unit may be configured to execute the readout control so that the image signal is read out at a predetermined readout cycle, with the parallel readout timings shifted from each other by a time shorter than the readout cycle (for example, a half cycle).
- the plurality of image processing units may be configured to execute an evaluation process in the AF process. As a result, the plurality of image processing units can, as a whole, execute the processing at intervals shorter than the readout cycle (for example, every half cycle), so that the AF processing can be speeded up.
- the control unit may be configured to read image signals in parallel through a plurality of output paths connected to the imaging unit.
- Another imaging control device of the present technology includes a control unit and a plurality of image processing units.
- the control unit is configured to read out, at a predetermined readout cycle, an image signal generated for each pixel line group of the plurality of pixel lines of an imaging unit having a plurality of pixel lines, and to execute readout control of the imaging unit so that the parallel readout timings are shifted from each other by a time shorter than the readout cycle (for example, a half cycle).
- the plurality of image processing units are configured to respectively process the image signals read out from the imaging unit. Since the plurality of image processing units can execute the processing at intervals shorter than the readout cycle (for example, every half cycle), the processing speed can be increased.
- the plurality of image processing units may be configured to execute an evaluation process in an AF (Automatic Focus) process. Thereby, the AF process can be speeded up.
- the imaging device includes an imaging unit, a plurality of output paths, a control unit, and a plurality of image processing units.
- the imaging unit has a plurality of pixel lines.
- the plurality of output paths are configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines.
- the control unit is configured to set different imaging conditions for each of the pixel line groups, and reads out image signals respectively generated in the pixel line groups under the different imaging conditions via the plurality of output paths. Further, it is configured to execute readout control of the imaging unit.
- the plurality of image processing units are configured to respectively process the image signals read from the imaging unit.
- An imaging system includes an imaging unit, a plurality of output paths, a control unit, and a plurality of image processing units of the imaging apparatus.
- an imaging control method according to the present technology includes executing readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines, and processing each of the image signals read out from the imaging unit.
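- as a rough, non-authoritative illustration of the configuration summarized above, the following Python sketch models an imaging unit whose pixel lines are split into two groups, reads the two groups under different imaging (exposure) conditions, and hands each readout to its own processing function; all names (Sensor, read_group, process_a, process_b) and the exposure values are hypothetical and not taken from the patent.

```python
# Minimal sketch of the summarized configuration, using a simulated sensor.
# All identifiers and parameter values are illustrative assumptions.
import numpy as np

class Sensor:
    """Simulated imaging unit whose rows are split into group A (rows 1, 3, 5, ...)
    and group B (rows 2, 4, 6, ...), each readable under its own condition."""
    def __init__(self, height=8, width=12, seed=0):
        self.scene = np.random.default_rng(seed).uniform(0.0, 1.0, (height, width))

    def read_group(self, group, exposure):
        rows = slice(0, None, 2) if group == "A" else slice(1, None, 2)
        # The group is exposed under its own condition before being read out.
        return np.clip(self.scene[rows] * exposure, 0.0, 1.0)

def process_a(image):  # stands in for the first image processing unit
    return float(image.mean())

def process_b(image):  # stands in for the second image processing unit
    return float(image.mean())

sensor = Sensor()
image_a = sensor.read_group("A", exposure=0.25)  # imaging condition for group A
image_b = sensor.read_group("B", exposure=4.0)   # different condition for group B
print("evaluation from unit 1:", process_a(image_a))
print("evaluation from unit 2:", process_b(image_b))
```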
- FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus according to an embodiment of the present technology.
- FIG. 2 shows a configuration example of the solid-state image sensor when the solid-state image sensor is a CMOS image sensor, for example.
- FIG. 3 is a block diagram mainly showing the configuration of the image processing circuit.
- FIG. 4 is a flowchart showing processing according to the first embodiment.
- FIG. 5 shows an example of an image for explaining the first embodiment.
- FIG. 6 shows an example of an image for explaining the second embodiment.
- FIG. 7 shows an example of an image for explaining the third embodiment.
- FIG. 8 is a diagram illustrating the timing for each image frame read in the third embodiment.
- FIG. 9 is a flowchart showing the processing of the third embodiment.
- FIG. 10 shows an example image for explaining the fourth embodiment.
- FIG. 11 shows a configuration of a solid-state imaging device according to another configuration example of the present technology.
- FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus according to an embodiment of the present technology.
- the imaging apparatus 100 mainly includes an optical component 10, a solid-state imaging device 20, an image processing circuit 30, an output unit 51, a display unit 52, a driving unit 53, a timing generator 54, a control unit 40, and the like.
- the optical component 10 includes a lens 11, a diaphragm 12, and the like, and may include an optical filter (not shown). Light from the subject enters the solid-state image sensor 20 through the optical component 10.
- the solid-state imaging device 20 functions as an "imaging unit" and is an image sensor such as a CMOS (Complementary Metal-Oxide Semiconductor) image sensor. In this case, it may or may not be a stacked type in which the pixel array and the drive circuit are stacked.
- the image processing circuit 30 takes in the image signal output from the solid-state imaging device 20 and executes predetermined image processing.
- the image processing circuit 30 according to the present embodiment is a parallel processing type circuit as will be described later. For example, two image processing units execute image processing in parallel.
- the output unit 51 performs a conversion process necessary for outputting a digital signal obtained via the image processing circuit 30 to the display unit 52.
- the display unit 52 displays the image processed by the output unit 51.
- the display unit 52 according to the present embodiment mainly displays a so-called through image, in which the thinned image read out and processed by the image processing circuit 30 is displayed as it is, as will be described later.
- the driving unit 53 drives the lens 11 based on an instruction from the control unit 40 in order to execute, for example, an AF (Automatic Focus) process by the control unit 40.
- the drive unit 53 may further drive the aperture 12 based on an instruction from the control unit 40 for AE (Automatic Exposure) processing.
- FIG. 2 shows a configuration example of the solid-state imaging device 20 when the solid-state imaging device 20 is a CMOS image sensor, for example.
- the timing generator 54 (see FIG. 1) generates a drive synchronization signal for the solid-state imaging device 20 based on an instruction from the control unit 40. Further, the control unit 40 sets register values by serial communication in a register group, present in the solid-state imaging device 20, that determines its internal operation. Based on the drive synchronization signal from the timing generator 54 and the register settings from the control unit 40, the solid-state imaging device 20 internally generates the pulses necessary for extracting the charge accumulated in the photodiodes 22. For example, ON/OFF of the vertical transfer pixel selection switch 24 provided for each pixel 21 is controlled for each pixel line (that is, row) 205. Further, ON/OFF of the horizontal transfer column selection switches 28a and 28b for selecting the vertical signal lines 213a and 213b is controlled.
- control unit 40 mainly includes a CPU (Central Processing Unit) and a memory (not shown).
- the control unit 40 controls the operations of the solid-state imaging device 20, the image processing circuit 30, the output unit 51, the drive unit 53, and the timing generator 54.
- control unit 40 and the image processing circuit 30 function as an “imaging control device”.
- the imaging apparatus 100 may include an operation unit to which an operation by a user is input, an external terminal for outputting an image processed by the image processing circuit 30 to an external device, and the like.
- FIG. 3 is a block diagram mainly showing the configuration of the image processing circuit 30.
- the image processing circuit 30 includes a plurality of image processing units 31 and 32 configured to be capable of parallel processing. For convenience of explanation, it is assumed that there are two image processing units, and these are referred to as a first image processing unit 31 and a second image processing unit 32.
- the image processing circuit 30 has a determination/instruction unit 35 that makes determinations and sends a processing instruction to the first image processing unit 31 or the second image processing unit 32 according to the status of the processing.
- the operation of the determination / instruction unit 35 is controlled by the control unit 40.
- the determination / instruction unit 35 may be an element incorporated in the control unit.
- a plurality of output paths 36 and 37 for outputting image signals are connected to the solid-state imaging device 20.
- the image signal generated by the solid-state imaging device 20 is input to the first image processing unit 31 via the first output path 36, and to the second image processing unit 32 via the second output path 37.
- the pixel array 201 of the solid-state imaging device 20 has the above-described plurality of pixel lines 205 arranged in the vertical direction.
- One pixel line 205 includes a plurality of pixels 21 arranged in the horizontal direction.
- Each pixel 21 includes the photodiode 22 and the amplifier 23.
- the first output path 36 is configured to be connectable by the pixel selection switches 24 to the pixel line group of odd rows, for example, rows 1, 3, 5, ..., 2n−1.
- the pixel line group of the odd rows is referred to as a pixel line group A.
- the second output path 37 is configured to be connectable by the pixel selection switches 24 to the pixel line group of even rows, for example, rows 2, 4, 6, ..., 2n.
- the pixel line group of even-numbered rows is referred to as a pixel line group B.
- the arrangement is not limited to one in which the first pixel line group A consists of the odd-numbered pixel lines and the second pixel line group B consists of the even-numbered pixel lines, that is, one in which the lines of the pixel line groups A and B are arranged every other row.
- the lines of the pixel line group A and the lines of the pixel line group B may be arranged every predetermined number of lines such as every two lines or every several lines.
- the solid-state imaging device 20 includes a CDS (Correlated Double Sampling) circuit 25 and an ADC (Analog-to-Digital Converter) 26 for each column, that is, for each vertical signal line 213a and each vertical signal line 213b.
- the ADC 26 includes a PGC (Programmable Gain Control) circuit, and can control the gain based on the register settings from the control unit 40.
- the CDS circuit 25 removes noise from the pixel signal, and the ADC 26 converts the analog signal into a digital signal.
- the vertical signal lines 213a and 213b are configured to be connectable to the horizontal signal lines 214a and 214b via the column selection switches 28a and 28b, respectively.
- the horizontal signal line 214a is connected to the first output path 36, for example.
- the horizontal signal line 214b is connected to the second output path 37, for example.
- FIG. 4 is a flowchart showing AE processing by the control unit 40 and the image processing circuit 30.
- the control unit 40 sets different imaging conditions, that is, exposure conditions for the pixel line group A and the pixel line group B (step 101).
- the exposure condition is set by at least one of the electronic shutter speed and the gain by the ADC 26 described above. For convenience of explanation, it is assumed that the exposure condition set for the pixel line group A is set A and the exposure condition set for the pixel line group B is set B.
- the setting A is initially set to minimize the exposure amount, and the setting B is initially set to maximize the exposure amount. Of course, these may be reversed.
- the controller 40 reads out pixel signals from the pixel line group A and the pixel line group B in parallel via the first output path 36 and the second output path 37 (step 102). That is, the readout timing of the pixel signal from the pixel line group A under the setting A exposure condition and the readout timing of the pixel signal from the pixel line group B under the setting B exposure condition are simultaneous.
- to each image processing unit, the image signal is read out with thinning at a predetermined thinning rate (described later); that is, row thinning readout is executed. For example, pixel signals from a smaller number of pixel lines 205 than the total number of rows of the pixel line group A are read out to the first image processing unit 31, and pixel signals from a smaller number of pixel lines 205 than the total number of rows of the pixel line group B are read out to the second image processing unit 32.
- the thinning rate can be set to 3, 5, 7, or more, for example.
- when the thinning rate is 5, an image signal is read out from one pixel line 205 out of every five odd rows and from one pixel line 205 out of every five even rows.
- the thinning rate may be the same between the first image processing unit 31 and the second image processing unit 32, or may be different.
- note that an image generated using the same number of pixel lines as the pixel line group A (or B) is 1/2 the size of the image generated by the entire pixel array 201; such an image is not regarded as a "thinned" image in this specification.
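- the following sketch illustrates, under assumed dimensions, how row thinning readout at a thinning rate of 5 from the odd-row group A and the even-row group B could be modeled; the frame size and variable names are illustrative only.

```python
# Hypothetical model of row thinning readout (thinning rate 5) from pixel line
# groups A (odd rows) and B (even rows); dimensions are illustrative.
import numpy as np

frame = np.arange(40 * 6).reshape(40, 6)    # stand-in for the full pixel array 201

group_a_rows = np.arange(0, 40, 2)          # rows 1, 3, 5, ... (1-indexed) -> group A
group_b_rows = np.arange(1, 40, 2)          # rows 2, 4, 6, ... (1-indexed) -> group B

rate = 5                                    # read one line out of every five in each group
thinned_a = frame[group_a_rows[::rate]]     # goes to the first image processing unit 31
thinned_b = frame[group_b_rows[::rate]]     # goes to the second image processing unit 32

print(thinned_a.shape, thinned_b.shape)     # (4, 6) each: 40 rows / 2 groups / rate 5
```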
- the determination / instruction unit 35 causes the first image processing unit 31 and the second image processing unit 32 to execute the evaluation processing in AE independently in parallel (step 102).
- This evaluation process includes a photometry process using an arbitrary photometry method such as a division photometry method, a center-weighted photometry method, an average photometry method, or a specific partial photometry method.
- the imaging apparatus 100 may have a program that can select one of these methods by a user operation.
- the first image processing unit 31 and the second image processing unit 32 perform photometry processing using a divided photometry method in which an image of one frame is divided into, for example, 4 × 4 regions.
- in one image processing unit, for example, the exposure amount (luminance) of a region marked with a circle exceeds a reference threshold value and is judged to pass (OK).
- in the other image processing unit, for example, the luminance of a region marked with a circle is equal to or lower than a reference threshold value and is judged OK.
- each of the image processing units 31 and 32 creates an evaluation value indicating whether the area is OK or NG as a result of the photometric processing.
- the threshold may be the same or different between the first image processing unit 31 and the second image processing unit 32.
- the threshold value may have a range (width). In this case, an evaluation value is created depending on whether or not the luminance of the measurement target is within a predetermined range.
- in the example of FIG. 5, in one image processing unit, the luminance in the vicinity of the image area of the person is equal to or less than the threshold value and is determined to be NG, while the luminance values in the other areas exceed the threshold value and are determined to be OK.
- in the other image processing unit, the brightness near the image area of the person is equal to or less than the threshold value and is determined to be OK, while the brightness of the other areas exceeds the threshold value and is determined to be NG.
- when the determination/instruction unit 35 determines that there is at least one NG among the evaluation values of all the divided regions (No in step 103), it sends that information to the control unit 40.
- the control unit 40 then sets the exposure amount of setting A one step higher than the initial setting described above, and sets the exposure amount of setting B one step lower than the initial setting described above (step 104).
- the image processing circuit 30 further executes steps 102 and 103 in order.
- the threshold value for creating the evaluation value may be changed for each loop.
- in the second round of step 102, the determination/instruction unit 35 may perform photometry only on the divided areas that were determined to be NG by both the first image processing unit 31 and the second image processing unit 32 in the previous (first) round of step 102.
- each image processing unit may create again the evaluation values of all the divided regions regardless of the evaluation result in each image processing unit in the previous step 102.
- in step 103, when the evaluation values of all the divided areas are OK, the control unit 40 ends the AE process.
- control unit 40 may control the aperture value of the aperture 12 by the drive unit 53 in addition to the electronic shutter and / or gain control.
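- the overall loop of FIG. 4 could be sketched as follows; the initial settings, the step factor, and the OK criterion (region mean within a range) are assumptions made only to keep the example self-contained.

```python
# A sketch of the AE loop of FIG. 4 under stated assumptions: setting A starts at a
# minimum exposure and is raised one step per loop, setting B starts at a maximum and
# is lowered one step, until every divided region evaluates to OK.
import numpy as np

rng = np.random.default_rng(2)
scene = rng.uniform(0.2, 0.8, (64, 64))                  # latent scene luminance

def capture(exposure):                                   # simulated readout under one setting
    return np.clip(scene * exposure, 0.0, 1.0)

def regions_ok(image, lo=0.25, hi=0.95, grid=(4, 4)):    # OK when region mean is inside [lo, hi]
    h, w = image.shape
    means = image.reshape(grid[0], h // grid[0], grid[1], w // grid[1]).mean(axis=(1, 3))
    return bool(((means >= lo) & (means <= hi)).all())

setting_a, setting_b, step = 0.1, 4.0, 1.25              # step 101: initial exposure settings
for _ in range(20):                                      # steps 102-104 repeated
    ok_a = regions_ok(capture(setting_a))                # evaluation in first image processing unit
    ok_b = regions_ok(capture(setting_b))                # evaluation in second image processing unit
    if ok_a and ok_b:                                    # step 103: all divided regions OK
        break
    setting_a *= step                                    # step 104: raise exposure of setting A
    setting_b /= step                                    #          lower exposure of setting B
print("converged settings:", round(setting_a, 3), round(setting_b, 3))
```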
- as described above, the control unit 40 executes parallel readout control of image signals generated under different exposure conditions for the pixel line group A and the pixel line group B, and the first image processing unit 31 and the second image processing unit 32 process those image signals in parallel. That is, since the AE evaluation processes in the first image processing unit 31 and the second image processing unit 32 are executed in parallel at high speed, the AE processing can be speeded up.
- the control unit 40 can cause the display unit 52 to display an image processed by either the first image processing unit 31 or the second image processing unit 32 as a through image.
- FIG. 6 is a diagram for explaining another processing example of AE processing by the control unit 40 and the image processing unit as the second embodiment.
- the description will focus on the differences from the first embodiment, and the same matters will be omitted or simplified.
- the control unit 40 executes row thinning readout from the pixel line group A (first pixel line group) via the first output path 36, in the same manner as in the first embodiment.
- via the second output path 37, the control unit 40 reads out a partial image, that is, an image of a partial region (partial pixel lines) of the entire image generated by the pixel line group B, without thinning.
- the partial image is, for example, an image of the following region in the entire image generated by the pixel line group B.
- the region may be, for example, the central region, the upper-end region, the lower-end region, an arbitrary region set by the user, or some other statically set region, or it may be a region dynamically set by the control unit 40 according to a predetermined algorithm.
- the region dynamically set by the control unit 40 is a region including an image of an object detected by an algorithm such as face detection, car license plate detection, or a specific object detection.
- the number of pixel lines for generating a partial image is typically determined according to the thinning rate in the first image processing unit 31.
- for example, when the thinning rate in the first image processing unit 31 is 5, the number of pixel lines is set so that the size of the partial image is 1/5 the size of the entire image generated by the pixel line group B.
- the partial image read out to the second image processing unit 32 is an image of an area including a human face detected by the face detection algorithm.
- the face detection process causes a delay of one frame compared to the processing by the first image processing unit 31.
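- the readout geometry described above can be sketched as follows; the frame size, the detected face box, and the helper names are hypothetical, and only the relation "partial-image line count = group-B line count / thinning rate" follows the description.

```python
# Hedged sketch of the second embodiment's readout: group A is row-thinned for the
# through image, while group B supplies an un-thinned partial image around a face box.
import numpy as np

full = np.random.default_rng(3).uniform(0, 1, (400, 600))   # stand-in full-resolution frame
group_a, group_b = full[0::2], full[1::2]                    # odd / even pixel line groups

rate = 5
thinned_a = group_a[::rate]                                  # 40 lines -> through image source

face_box = (90, 150, 60, 80)                                 # (top, left, h, w) from a detector (assumed)
roi_height = group_b.shape[0] // rate                        # partial image height = 200 / 5 = 40 lines
center = face_box[0] + face_box[2] // 2
top = int(np.clip(center - roi_height // 2, 0, group_b.shape[0] - roi_height))
partial_b = group_b[top:top + roi_height]                    # un-thinned lines around the face

print(thinned_a.shape, partial_b.shape)                      # both paths deliver 40 lines per frame
```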
- the evaluation process in AE in each image processing unit is basically the same as that in the first embodiment, and an evaluation value is created by, for example, a division photometry method (see FIG. 4).
- since the resolution of the partial image is higher than that of the thinned image, the photometric accuracy within the partial image can be increased in the AE process, and as a result a partially high-accuracy AE process can be performed.
- control unit 40 can cause the display unit 52 to display the thinned image output from the first image processing unit 31 as a through image.
- FIG. 7 is a diagram for explaining the AF processing.
- the image read out from the pixel line group A to the first image processing unit 31 via the first output path 36, and the image read out from the pixel line group B to the second image processing unit 32 via the second output path 37, are both thinned images, as in the first embodiment.
- the exposure condition setting A in the pixel line group A and the exposure condition setting B in the pixel line group B may be the same or different. Different cases will be described in a sixth embodiment to be described later.
- control unit 40 and the image processing circuit 30 execute AF processing (for example, contrast AF processing).
- the control unit 40 sends an instruction to the driving unit 53 to move the optical component 10 in a predetermined direction, for example.
- the solid-state imaging device 20 outputs image frames with different focus.
- FIG. 8 is a diagram showing the timing for each image frame read based on an instruction from the control unit 40.
- the readout period of the image frame to the first image processing unit 31 is synchronized with the synchronization signal A generated from the timing generator 54.
- the readout period of the image frame to the second image processing unit 32 is synchronized with the synchronization signal B.
- the synchronization signal B is a signal generated inside the solid-state imaging device 20 based on the synchronization signal A, for example.
- the periods of the synchronization signals A and B are the same, but their generation timings are shifted from each other by a time shorter than the period; typically, they are shifted by a 1/2 period.
- control unit 40 performs the read control so that the two output paths 36 and 37 are shifted by a time shorter than the read cycle (for example, a 1/2 cycle).
- the shift in readout cycle between the two output paths 36 and 37 is not limited to a 1/2 cycle and may be, for example, 1/3 or 1/4 of a cycle.
- more generally, the shift in readout cycle between the output paths may be 1/n of a cycle.
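- the effect of the shifted readout timing can be seen in the small sketch below; the concrete cycle length (here 1/30 s) is an assumption, and only the half-cycle offset between the two paths follows the description.

```python
# Illustrative timing of the third embodiment: two readout paths share the same readout
# cycle T but are offset by T/2, so evaluation results arrive at twice the single-path rate.
T = 1 / 30                                            # readout cycle of each path (e.g. 30 fps)
frames = 6

path_a = [k * T for k in range(frames)]               # synchronization signal A timings
path_b = [k * T + T / 2 for k in range(frames)]       # synchronization signal B, shifted by T/2

merged = sorted(path_a + path_b)                      # order in which evaluation values appear
gaps = [round(b - a, 4) for a, b in zip(merged, merged[1:])]
print(gaps)                                           # every gap is T/2 -> doubled evaluation rate
```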
- FIG. 9 is a flowchart showing operations performed by the control unit 40 and the image processing circuit 30 in the third embodiment.
- control unit 40 reads out each image signal by shifting the readout cycle between the two output paths 36 and 37 (step 201).
- the first image processing unit 31 and the second image processing unit 32 sequentially acquire image frames with different focus output from the solid-state imaging device 20 based on the synchronization signal transmitted from the control unit 40.
- the image processing circuit 30 executes an evaluation process in the contrast AF process (step 202).
- the second image processing unit 32 executes each of the following steps 202-1, 202-2, 202-3, and 202-4 with a delay of a 1/2 cycle relative to the processing of the first image processing unit 31.
- both image processing units 31 and 32 detect, for example, a specific area in the image (step 202-1), for instance by executing the face detection process described above, and set the focus position in the area including the face image (step 202-2).
- the focus position is not limited to face detection, and the focus position may be set in a statically or dynamically set region as described in the second embodiment.
- both image processing units 31 and 32 calculate the contrast value (evaluation value) of the luminance in a predetermined area including the set focus position (step 202-3), and output the evaluation value to the control unit 40 (step 202-4).
- the control unit 40 controls the driving of the optical component 10 so that the evaluation value obtained from the determination / instruction unit 35 is maximized.
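- a minimal sketch of such a contrast-maximizing (hill-climbing) control loop is given below; in the actual processing the evaluation value of step 202-3 would be computed from the luminance contrast within the focus area, whereas here the contrast-versus-lens-position curve is modeled analytically purely to keep the example self-contained, and the step size and peak position are assumptions.

```python
# Hill-climbing sketch: step the lens position in the direction that increases the
# contrast evaluation value, reversing and shrinking the step after passing the peak.
def contrast_value(lens_pos, best_pos=0.37):
    # Stand-in for step 202-3: in practice this would be the luminance contrast of the
    # focus area; here it is modeled as a smooth peak around the in-focus position.
    return 1.0 / (1.0 + 25.0 * (lens_pos - best_pos) ** 2)

pos, step = 0.0, 0.05
best = contrast_value(pos)
for _ in range(40):                        # each iteration: one readout + evaluation (step 202-4)
    candidate = contrast_value(pos + step)
    if candidate > best:                   # still climbing: keep moving in this direction
        pos, best = pos + step, candidate
    else:                                  # passed the peak: reverse direction, halve the step
        step *= -0.5
print("estimated in-focus lens position:", round(pos, 3))
```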
- since the control unit 40 executes the readout control so that the readouts through the two output paths are shifted by a half cycle, the frame rate of the readout and evaluation process is twice that obtained with one image processing unit. As a result, the speed of the evaluation process in the image processing circuit 30 is twice the speed of the evaluation process by one image processing unit, and the AF process can be speeded up.
- the thinning-out rate of the image signal readout of the pixel line groups A and B may be the same or different.
- FIG. 10 is a diagram for explaining another processing example of AF processing by the control unit 40 and the image processing circuit 30 as a fourth embodiment.
- the description will focus on the differences from the third embodiment, and the same matters will be omitted or simplified.
- control unit 40 performs row thinning readout via the first output path 36.
- via the second output path 37, the control unit 40 reads out a partial image of the entire image generated by the pixel line group B, without thinning, in response to synchronization signals received in parallel (that is, received at the same timing); this is the same as in the second embodiment.
- the exposure condition setting A in the pixel line group A and the exposure condition setting B in the pixel line group B may be the same or different.
- the image processing circuit 30 executes an evaluation process for AF processing similar to step 202 in FIG. 9.
- the first image processing unit 31 and the second image processing unit 32 execute each process according to the synchronization signal received at the same timing.
- the partial image processed by the second image processing unit 32 is a high-resolution image that is not thinned out. Therefore, the evaluation accuracy in the partial image can be increased in the AF process. As a result, partially highly accurate AF processing can be performed.
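- the gain in evaluation accuracy can be illustrated with the sketch below, in which a fine horizontal stripe pattern keeps its line-to-line contrast in the un-thinned partial image from the pixel line group B but is aliased away in the row-thinned image; the stripe pattern and the contrast metric are assumptions.

```python
# Illustrative comparison: fine vertical detail survives in the un-thinned partial image
# (line pitch 2) but is lost by row thinning (effective line pitch 10).
import numpy as np

full_rows = np.arange(400)
stripe = ((full_rows // 5) % 2).astype(float)            # horizontal stripes, 10-line period
frame = np.tile(stripe.reshape(-1, 1), (1, 64))          # stand-in full-resolution frame

group_a, group_b = frame[0::2], frame[1::2]              # odd / even pixel line groups
thinned_a = group_a[::5]                                 # row-thinned image (pitch: 10 lines)
partial_b = group_b[40:80]                               # un-thinned partial image (pitch: 2 lines)

def vertical_contrast(img):                              # mean absolute line-to-line difference
    return float(np.abs(np.diff(img, axis=0)).mean())

print("thinned :", vertical_contrast(thinned_a))         # 0.0 -> stripes aliased away
print("partial :", vertical_contrast(partial_b))         # > 0 -> stripes still measurable
```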
- as a fifth embodiment, the above-described fourth embodiment and third embodiment may be combined. That is, the control unit 40 reads out the row-thinned image via the first output path 36, shifts the readout cycle as in the third embodiment, and reads out the partial image via the second output path 37. Then, the image processing circuit 30 executes the respective processes on the row-thinned image and the partial image at shifted timings, as in the third embodiment. Thereby, partially high-accuracy AF processing and high-speed AF processing can be realized.
- as a sixth embodiment, the imaging control apparatus can also execute a photometric process (an evaluation process in the AE process) before the AF process of the fourth embodiment and the like described above.
- the control unit 40 reads out the thinned image through the first output path 36 and reads out the partial image through the second output path 37.
- when the difference in brightness between the two images is within a threshold value, that is, when the brightness of the two images does not differ much, the above-described exposure condition setting A and setting B may be the same setting.
- when the angle of view (image range) of the thinned image obtained by row thinning readout differs from the angle of view of the partial image, the brightness of each is expected to differ.
- for example, the thinned image may include the top of a mountain in the background, and the vicinity of the mountain top may be very bright.
- in such a case, the exposure condition settings A and B are preferably different.
- in these cases, the imaging control apparatus can perform the photometry processing with the exposure condition settings A and B set to different values, and then execute the AF processing of the fourth or fifth embodiment under the exposure conditions of settings A and B.
- the first image processing unit 31 executes an evaluation process for AF processing, and outputs the thinned image as a through image to the display unit 52 via the output unit 51.
- the second image processing unit 32 acquires a partial image that is not displayed on the display unit 52, and executes the evaluation process for high-precision AF processing on it in the background.
- the control unit 40 can set a setting B that is suitable for AF processing and increases the exposure amount. Thereby, highly accurate AF processing becomes possible.
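- the division of labor in this embodiment could be sketched as follows, with all names and exposure values assumed: the first path yields a thinned image for display as the through image, while the second path yields a brighter, un-thinned partial image on which the AF evaluation runs without ever being displayed.

```python
# Rough sketch of the seventh embodiment's routing; identifiers and gains are assumptions.
import numpy as np

rng = np.random.default_rng(7)
scene = rng.uniform(0.05, 0.25, (400, 600))                     # dim stand-in scene

def expose(rows, gain):
    return np.clip(scene[rows] * gain, 0.0, 1.0)

# First path (unit 31): setting A, thinned image shown on the display unit as a through image.
group_a_rows = np.arange(0, 400, 2)
through_image = expose(group_a_rows[::5], gain=2.0)

# Second path (unit 32): setting B raises the exposure amount for AF and is never displayed;
# the evaluation runs "in the background" on an un-thinned partial image around the AF area.
group_b_rows = np.arange(1, 400, 2)
af_partial = expose(group_b_rows[80:120], gain=8.0)

def af_evaluation(image):                                       # contrast-style evaluation value
    return float(np.abs(np.diff(image, axis=0)).mean())

print("through image for display:", through_image.shape)
print("AF evaluation value (background):", round(af_evaluation(af_partial), 4))
```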
- FIG. 11 shows a configuration of a solid-state imaging device according to another configuration example of the present technology.
- in the configuration example described above, each pixel line 205 and its output path 36 or 37 are determined on a one-to-one basis in advance.
- each pixel 21 has a plurality of output lines 29a and 29b, which can be connected to the vertical signal lines 213a and 213b by the pixel selection switches 24a and 24b, respectively.
- the vertical signal line 213a can be connected to the horizontal signal line 214a via the column selection switch 28a
- the vertical signal line 213b can be connected to the horizontal signal line 214b via the column selection switch 28b.
- the output path of one pixel line 205 can be switched by switching the pixel selection switches 24a and 24b.
- three or more vertical signal lines may be provided for the pixels of one column, each of which is connectable to a pixel by a pixel selection switch. In this case, for example, even when two output paths are provided, the greater the number of vertical signal lines, the higher the degree of freedom in selecting the pixel lines from which an image signal is output.
- the solid-state imaging device 20 may be a CCD (Charge Coupled Device) image sensor. Also in this case, the imaging apparatus 100 can implement each of the above embodiments by including a plurality of output paths and a plurality of image processing units.
- in the above embodiments, row thinning readout is executed from at least one of the output paths; however, all the image signals generated in the pixel line group A may instead be read out via its output path and processed by the first image processing unit 31.
- the solid-state imaging device 20 has two output paths, but may have, for example, four or more output paths.
- the image processing circuit 30 may include four or more image processing units according to the number of output paths.
- in the above configuration examples, the selection of the pixel line groups is limited by the hardware.
- a frame buffer that can buffer the image signal generated by the solid-state imaging device in units of frames may be provided.
- the control unit can access the frame buffer and freely select pixel lines to be output from the output paths 36 and 37 by software processing.
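- a sketch of this frame-buffer variation is given below; the buffer dimensions and the two selection patterns are purely illustrative, the point being that the line sets for the two paths are chosen in software rather than fixed by wiring.

```python
# Hedged sketch of the frame-buffer variation: buffer one whole frame, then let software
# assign arbitrary pixel lines to each output path (no fixed odd/even wiring assumed).
import numpy as np

frame_buffer = np.random.default_rng(8).uniform(0, 1, (480, 640))   # one buffered frame

# Freely chosen line sets for the two "output paths":
lines_for_path_1 = np.arange(0, 480, 3)                 # every 3rd line across the frame
lines_for_path_2 = np.arange(200, 280)                  # a contiguous band around a subject

path_1_image = frame_buffer[lines_for_path_1]           # goes to the first image processing unit
path_2_image = frame_buffer[lines_for_path_2]           # goes to the second image processing unit
print(path_1_image.shape, path_2_image.shape)
```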
- the present technology can be configured as follows.
- (1) An imaging control device including: a control unit configured to execute readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- (2) The imaging control device according to (1), in which the control unit is configured to execute row thinning readout from each pixel line group, and the plurality of image processing units are configured to respectively process the thinned images read out by the row thinning readout.
- (3) The imaging control device according to (2), in which the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AE (Automatic Exposure) processing.
- (4) The imaging control device according to (2), in which the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AF (Automatic Focus) processing.
- (5) The imaging control device according to (1), in which the control unit is configured to execute row thinning readout from a first pixel line group among the plurality of pixel lines and to execute readout of a partial image that is an image of a partial region of the entire image generated by a second pixel line group different from the first pixel line group and is not thinned, and the plurality of image processing units are configured to respectively process the thinned image read out by the row thinning readout and the partial image.
- (6) The imaging control device according to (5), in which the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AE processing.
- (7) The imaging control device according to (5), in which the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AF processing.
- (8) The imaging control device according to (1), in which the control unit is configured to read out the image signal at a predetermined readout cycle and to execute the readout control so that the parallel readout timings are shifted by a time shorter than the readout cycle, and the plurality of image processing units are configured to execute an evaluation process in AF processing.
- (9) The imaging control device according to any one of (1) to (8), in which the control unit is configured to read out image signals in parallel via a plurality of output paths connected to the imaging unit.
- (10) An imaging control device including: a control unit configured to read out, at a predetermined readout cycle, image signals generated for each pixel line group of the plurality of pixel lines of an imaging unit having a plurality of pixel lines, and to execute readout control of the imaging unit so that the parallel readout timings of the image signals are shifted by a time shorter than the readout cycle; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- (11) The imaging control device according to (9), in which the plurality of image processing units are configured to execute an evaluation process in AF (Automatic Focus) processing.
- (12) An imaging device including: an imaging unit having a plurality of pixel lines; a plurality of output paths configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines; a control unit configured to set different imaging conditions for each of the pixel line groups and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- (13) An imaging system including: an imaging unit having a plurality of pixel lines; a plurality of output paths configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines; a control unit configured to set different imaging conditions for each of the pixel line groups and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- (14) An imaging control method including: executing readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines; and processing each of the image signals read out from the imaging unit.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Studio Devices (AREA)
Abstract
Description
An object of the present technology is to provide an imaging control device, an imaging device, an imaging system, and an imaging control method capable of increasing the speed or accuracy of imaging processing.
The control unit is configured to execute readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines.
The plurality of image processing units are configured to respectively process the image signals read out from the imaging unit.
The control unit executes control for reading out, in parallel, image signals generated under different imaging conditions for each pixel line group, and the plurality of image processing units respectively process those image signals. Therefore, the processing speed or accuracy can be increased according to the processing content of the image processing units.
Since the evaluation processing in AE is executed in parallel on image signals generated under different exposure conditions, high-speed evaluation processing (photometry processing) can be realized, and as a result the AE processing can be speeded up.
For example, when one image processing unit executes the evaluation processing on an image generated under an exposure condition suited to the AF processing, highly accurate or optimal AF processing can be realized.
Since the resolution of the partial image is higher than that of the thinned image, the photometric accuracy within the partial image can be increased in the AE processing, and as a result partially high-accuracy AE processing can be performed.
For example, when the thinned image is used as a so-called through image displayed on the display unit, one of the plurality of image processing units can execute the evaluation processing of the AF processing on a partial image that is generated under exposure conditions suited to AF and is not displayed on the display unit. As a result, highly accurate or optimal AF processing can be realized.
Thereby, the plurality of image processing units can, as a whole, execute processing at intervals shorter than the readout cycle (for example, every half cycle), so that the AF processing can be speeded up.
The control unit is configured to read out, at a predetermined readout cycle, image signals generated for each pixel line group of the plurality of pixel lines of an imaging unit having a plurality of pixel lines, and to execute readout control of the imaging unit so that the parallel readout timings of the image signals are shifted by a time shorter than the readout cycle (for example, a half cycle).
The plurality of image processing units are configured to respectively process the image signals read out from the imaging unit.
Since the plurality of image processing units can execute processing at intervals shorter than the readout cycle (for example, every half cycle), the processing can be speeded up.
Thereby, the AF processing can be speeded up.
The imaging unit has a plurality of pixel lines.
The plurality of output paths are configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines.
The control unit is configured to set different imaging conditions for each of the pixel line groups, and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths.
The plurality of image processing units are configured to respectively process the image signals read out from the imaging unit.
The image signals read out from the imaging unit are each processed.
1. Configuration example 1 of the imaging apparatus
Based on the drive synchronization signal from the timing generator 54 and the register settings from the control unit 40, the solid-state imaging device 20 internally generates the pulses necessary for extracting the charge accumulated in the photodiodes 22. For example, ON/OFF of the vertical transfer pixel selection switch 24 provided for each pixel 21 is controlled for each pixel line (that is, row) 205. Further, ON/OFF of the horizontal transfer column selection switches 28a and 28b for selecting the vertical signal lines 213a and 213b is controlled.
Next, embodiments of the processing performed mainly by the image processing circuit 30 in the imaging apparatus 100 configured as described above will be described.
1) First embodiment
FIG. 4 is a flowchart showing AE processing by the control unit 40 and the image processing circuit 30.
FIG. 6 is a diagram for explaining, as a second embodiment, another example of AE processing by the control unit 40 and the image processing unit. Here, the description focuses on the differences from the first embodiment, and the same matters are omitted or simplified.
Next, an example of AF processing performed mainly by the image processing circuit 30 in the imaging apparatus 100 will be described. FIG. 7 is a diagram for explaining that AF processing.
FIG. 10 is a diagram for explaining, as a fourth embodiment, another example of AF processing by the control unit 40 and the image processing circuit 30. Here, the description focuses on the differences from the third embodiment, and the same matters are omitted or simplified.
As processing according to a fifth embodiment, the fourth embodiment and the third embodiment described above may be combined. That is, the control unit 40 reads out the row-thinned image via the first output path 36, shifts the readout cycle as in the third embodiment, and reads out the partial image via the second output path 37. Then, the image processing circuit 30 executes the respective processes on the row-thinned image and the partial image at shifted timings, as in the third embodiment. Thereby, partially high-accuracy AF processing and high-speed AF processing can be realized.
As processing according to a sixth embodiment, for example, the imaging control apparatus can also execute photometry processing (evaluation processing in AE processing) before the AF processing of the fourth embodiment and the like described above.
A seventh embodiment describes an example in which, as in the sixth embodiment above, photometry processing (evaluation processing in AE processing) is executed before the AF processing.
(1)
An imaging control device including:
a control unit configured to execute readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines; and
a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
(2)
The imaging control device according to (1), in which
the control unit is configured to execute row thinning readout from each pixel line group, and
the plurality of image processing units are configured to respectively process the thinned images read out by the row thinning readout.
(3)
The imaging control device according to (2), in which
the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and
the plurality of image processing units are configured to respectively execute an evaluation process in AE (Automatic Exposure) processing.
(4)
The imaging control device according to (2), in which
the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and
the plurality of image processing units are configured to respectively execute an evaluation process in AF (Automatic Focus) processing.
(5)
The imaging control device according to (1), in which
the control unit is configured to execute row thinning readout from a first pixel line group among the plurality of pixel lines, and to execute readout of a partial image that is an image of a partial region of the entire image generated by a second pixel line group different from the first pixel line group and is not thinned, and
the plurality of image processing units are configured to respectively process the thinned image read out by the row thinning readout and the partial image.
(6)
The imaging control device according to (5), in which
the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and
the plurality of image processing units are configured to respectively execute an evaluation process in AE processing.
(7)
The imaging control device according to (5), in which
the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and
the plurality of image processing units are configured to respectively execute an evaluation process in AF processing.
(8)
The imaging control device according to (1), in which
the control unit is configured to read out the image signal at a predetermined readout cycle and to execute the readout control so that the parallel readout timings are shifted by a time shorter than the readout cycle, and
the plurality of image processing units are configured to execute an evaluation process in AF processing.
(9)
The imaging control device according to any one of (1) to (8), in which
the control unit is configured to read out image signals in parallel via a plurality of output paths connected to the imaging unit.
(10)
An imaging control device including:
a control unit configured to read out, at a predetermined readout cycle, image signals generated for each pixel line group of the plurality of pixel lines of an imaging unit having a plurality of pixel lines, and to execute readout control of the imaging unit so that the parallel readout timings of the image signals are shifted by a time shorter than the readout cycle; and
a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
(11)
The imaging control device according to (9), in which
the plurality of image processing units are configured to execute an evaluation process in AF (Automatic Focus) processing.
(12)
An imaging device including:
an imaging unit having a plurality of pixel lines;
a plurality of output paths configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines;
a control unit configured to set different imaging conditions for each of the pixel line groups and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths; and
a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
(13)
An imaging system including:
an imaging unit having a plurality of pixel lines;
a plurality of output paths configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines;
a control unit configured to set different imaging conditions for each of the pixel line groups and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths; and
a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
(14)
An imaging control method including:
executing readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines; and
processing each of the image signals read out from the imaging unit.
30 … Image processing circuit
31 … First image processing unit
32 … Second image processing unit
36 … First output path
37 … Second output path
40 … Control unit
100 … Imaging apparatus
201 … Pixel array
205 … Pixel line
Claims (14)
- 1. An imaging control device comprising: a control unit configured to execute readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- 2. The imaging control device according to claim 1, wherein the control unit is configured to execute row thinning readout from each pixel line group, and the plurality of image processing units are configured to respectively process the thinned images read out by the row thinning readout.
- 3. The imaging control device according to claim 2, wherein the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AE (Automatic Exposure) processing.
- 4. The imaging control device according to claim 2, wherein the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AF (Automatic Focus) processing.
- 5. The imaging control device according to claim 1, wherein the control unit is configured to execute row thinning readout from a first pixel line group among the plurality of pixel lines and to execute readout of a partial image that is an image of a partial region of the entire image generated by a second pixel line group different from the first pixel line group and is not thinned, and the plurality of image processing units are configured to respectively process the thinned image read out by the row thinning readout and the partial image.
- 6. The imaging control device according to claim 5, wherein the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AE processing.
- 7. The imaging control device according to claim 5, wherein the control unit is configured to execute control to read out, from the imaging unit, image signals generated under different exposure conditions for each pixel line group, and the plurality of image processing units are configured to respectively execute an evaluation process in AF processing.
- 8. The imaging control device according to claim 1, wherein the control unit is configured to read out the image signal at a predetermined readout cycle and to execute the readout control so that the parallel readout timings are shifted by a time shorter than the readout cycle, and the plurality of image processing units are configured to execute an evaluation process in AF processing.
- 9. The imaging control device according to claim 1, wherein the control unit is configured to read out image signals in parallel via a plurality of output paths connected to the imaging unit.
- 10. An imaging control device comprising: a control unit configured to read out, at a predetermined readout cycle, image signals generated for each pixel line group of the plurality of pixel lines of an imaging unit having a plurality of pixel lines, and to execute readout control of the imaging unit so that the parallel readout timings of the image signals are shifted by a time shorter than the readout cycle; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- 11. The imaging control device according to claim 9, wherein the plurality of image processing units are configured to execute an evaluation process in AF (Automatic Focus) processing.
- 12. An imaging device comprising: an imaging unit having a plurality of pixel lines; a plurality of output paths configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines; a control unit configured to set different imaging conditions for each of the pixel line groups and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- 13. An imaging system comprising: an imaging unit having a plurality of pixel lines; a plurality of output paths configured to be connectable to the imaging unit for each pixel line group among the plurality of pixel lines; a control unit configured to set different imaging conditions for each of the pixel line groups and to execute readout control of the imaging unit so that the image signals generated in the respective pixel line groups under the different imaging conditions are read out via the plurality of output paths; and a plurality of image processing units configured to respectively process the image signals read out from the imaging unit.
- 14. An imaging control method comprising: executing readout control of an imaging unit having a plurality of pixel lines so as to read out, in parallel, image signals generated under different imaging conditions for each pixel line group among the plurality of pixel lines; and processing each of the image signals read out from the imaging unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/313,394 US10623669B2 (en) | 2014-06-10 | 2015-03-30 | Image capturing control apparatus, image capturing apparatus, image capturing system and image capturing control method |
JP2016527614A JP6583268B2 (ja) | 2014-06-10 | 2015-03-30 | Imaging control device, imaging device, imaging system, and imaging control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014120014 | 2014-06-10 | ||
JP2014-120014 | 2014-06-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015190021A1 true WO2015190021A1 (ja) | 2015-12-17 |
Family
ID=54833140
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/001811 WO2015190021A1 (ja) | Imaging control device, imaging device, imaging system, and imaging control method | 2014-06-10 | 2015-03-30 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10623669B2 (ja) |
JP (1) | JP6583268B2 (ja) |
WO (1) | WO2015190021A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN110413805B (zh) * | 2018-04-25 | 2022-02-01 | 杭州海康威视数字技术股份有限公司 | Image storage method and apparatus, electronic device, and storage medium |
- WO2020061813A1 (zh) * | 2018-09-26 | 2020-04-02 | 深圳市大疆创新科技有限公司 | Image processing system and image processing method |
- KR20220097967A (ko) * | 2019-11-29 | 2022-07-08 | 엘지전자 주식회사 | Radiation detector and radiographic imaging method using the same |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP4426885B2 (ja) | 2004-03-23 | 2010-03-03 | オリンパス株式会社 | Solid-state imaging device |
- JP2007150643A (ja) * | 2005-11-28 | 2007-06-14 | Sony Corp | Solid-state imaging element, method for driving solid-state imaging element, and imaging apparatus |
US7653299B2 (en) | 2006-04-28 | 2010-01-26 | Eastman Kodak Company | Imaging apparatus |
-
2015
- 2015-03-30 JP JP2016527614A patent/JP6583268B2/ja active Active
- 2015-03-30 US US15/313,394 patent/US10623669B2/en active Active
- 2015-03-30 WO PCT/JP2015/001811 patent/WO2015190021A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2006245784A (ja) * | 2005-03-01 | 2006-09-14 | Canon Inc | Solid-state imaging device, driving method thereof, and imaging system |
- JP2007318708A (ja) * | 2006-04-28 | 2007-12-06 | Eastman Kodak Co | Imaging apparatus |
- JP2008141610A (ja) * | 2006-12-04 | 2008-06-19 | Matsushita Electric Ind Co Ltd | Solid-state imaging device and imaging system |
- WO2013164915A1 (ja) * | 2012-05-02 | 2013-11-07 | 株式会社ニコン | Imaging device |
- JP2014023114A (ja) * | 2012-07-23 | 2014-02-03 | Nikon Corp | Imaging device and program |
- WO2015001646A1 (ja) * | 2013-07-04 | 2015-01-08 | 株式会社ニコン | Electronic device, electronic device control method, and control program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015190021A1 (ja) | 2017-04-20 |
US20180234645A1 (en) | 2018-08-16 |
US10623669B2 (en) | 2020-04-14 |
JP6583268B2 (ja) | 2019-10-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8823851B2 (en) | Image capturing apparatus and control method for image capturing apparatus | |
CN105530427B (zh) | 摄像元件、摄像装置 | |
KR102652933B1 (ko) | 촬상 제어 장치, 촬상 장치 및 촬상 제어 방법 | |
US9160934B2 (en) | Image capturing apparatus obtaining high-exposure and low-exposure images, and method for controlling the same | |
US20130308044A1 (en) | Imaging apparatus, image sensor, imaging control method, and program | |
US9854178B2 (en) | Image pickup apparatus with flicker detection and having plurality of unit pixel areas, control method therefor, and storage medium | |
JP2013223054A (ja) | 撮像素子、撮像素子の制御方法、および、撮像装置 | |
US10063762B2 (en) | Image sensor and driving method thereof, and image capturing apparatus with output signal control according to color | |
US9025065B2 (en) | Solid-state imaging device, image processing method, and camera module for creating high dynamic range images | |
US10582113B2 (en) | Image pickup device, image pickup apparatus, image pickup apparatus control method and computer-readable non-transitory recording medium in which processing program is recorded | |
JP5406889B2 (ja) | 撮像装置及びその制御方法 | |
JP2019068351A (ja) | 撮像装置、撮像装置の制御方法 | |
US8223255B2 (en) | Imaging apparatus, auto-focusing method and recording medium | |
JP6583268B2 (ja) | 撮像制御装置、撮像装置、撮像システムおよび撮像制御方法 | |
KR20150137984A (ko) | 고체 촬상 장치 및 촬상 방법 | |
JP2020170923A (ja) | 撮像素子およびその制御方法、撮像装置 | |
JP2019197985A (ja) | 撮像装置及び撮像装置の制御方法 | |
US11368610B2 (en) | Image capture apparatus and control method therefor | |
KR20150098547A (ko) | 고체 촬상 장치 및 카메라 시스템 | |
JP2017098790A (ja) | 撮像装置及びその制御方法、プログラム、記憶媒体 | |
JP2008042573A (ja) | 撮像装置及びその制御方法、撮像システム並びにプログラム | |
JP2015154153A (ja) | 撮像装置を備えたカメラ | |
JP2015203774A (ja) | 撮像装置及びその制御方法 | |
JP6167473B2 (ja) | 撮像装置 | |
JP6021544B2 (ja) | 焦点検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15805983 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016527614 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15313394 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15805983 Country of ref document: EP Kind code of ref document: A1 |