WO2022269999A1 - Control device, control method, and program - Google Patents
Control device, control method, and program
- Publication number
- WO2022269999A1 (PCT/JP2022/006303; JP 2022-006303 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- main
- main subject
- subjects
- subject
- shooting
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/671—Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/676—Bracketing for image capture at varying focusing conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/743—Bracketing, i.e. taking a series of images with varying exposure conditions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- the present technology relates to a control device, a control method, and a program, and more particularly to a control device, a control method, and a program that enable efficient continuous shooting of a main subject.
- Focus bracket shooting is a shooting method in which a plurality of shots are continuously performed while changing the focus position.
- the user can shoot multiple images in which each subject is in focus with a single release operation.
- Improvements to the autofocus function have made it possible to perform focus bracket shooting while shifting the focus position (focal length) in small steps, but exploiting this fine resolution to shoot at every focusable distance is not efficient.
- This technology was created in view of this situation, and enables efficient continuous shooting of the main subject.
- The control device includes a main subject determination unit that, based on the distances to the respective main subjects included in the shooting range, determines main subjects separated from each other by a threshold distance or more to be different main subjects, and a shooting control unit that controls continuous shooting targeting each main subject determined to be a different main subject.
- the main subjects separated by a threshold distance or more are determined as different main subjects based on the distances to the main subjects included in the shooting range. Continuous shooting is controlled for each main subject determined as a different main subject.
- FIG. 1 is a diagram showing an example of subjects photographed by an imaging device.
- FIG. 2 is a diagram showing an example of main subject determination processing.
- FIG. 3 is a diagram showing an example of focus bracket photography by the imaging device.
- FIG. 4 is a block diagram showing a hardware configuration example of the imaging device.
- FIG. 5 is a block diagram showing a functional configuration example of a control unit.
- FIG. 6 is a diagram showing an example of a shooting scene.
- FIG. 7 is a diagram showing an example of a main subject degree map.
- FIG. 8 is a flowchart for explaining imaging processing of the imaging device.
- FIG. 9 is a flowchart for explaining main subject determination processing performed in step S3 of FIG. 8.
- FIG. 10 is a block diagram showing a hardware configuration example of a computer.
- FIG. 1 is a diagram showing an example of a subject photographed by an imaging device 1 according to an embodiment of the present technology.
- the photographing device 1 is a device having a photographing function, such as a smartphone, digital still camera, or surveillance camera.
- the imaging device 1 has a function of focus bracket imaging. Focus bracket shooting is a shooting method in which a plurality of shots are continuously performed while changing the focus position.
- The photographing device 1 has a function of detecting the main subject, which is the principal subject of the shot, from among the subjects included in the shooting range. For example, the photographing device 1 detects the main subject based on the image photographed during preview photography, before execution of focus bracket photography. A specific type of subject, such as a person or a building, or a large subject is detected as the main subject. Detection of the main subject will be described later.
- In the example of FIG. 1, subjects #1-1, #1-2, #2, #3, and #4-1 to #4-4 are detected as main subjects from among the subjects included in the imaging range of the imaging device 1.
- Subjects #1-1 and #1-2 are people, and subject #2 is an automobile.
- Subject #3 is a house, and subjects #4-1 to #4-4 are buildings.
- Subjects #1-1 and #1-2 are located at approximately the same distance from the photographing device 1.
- Subject #2 is located farther from the photographing device 1 than subjects #1-1 and #1-2.
- Subject #3 is located farther from the photographing device 1 than subject #2.
- Subjects #4-1 to #4-4 are each located farther from the photographing device 1 than subject #3.
- Focus bracket shooting is performed with these subjects as the main subjects.
- a plurality of images in which the respective main subjects are in focus are captured by one focus bracket shooting.
- Focus bracket shooting by the shooting device 1 refers to depth information (distance in the depth direction) so that main subjects located close to each other are captured in focus in a single shot.
- Main subject determination processing for determining whether a plurality of main subjects are to be photographed as different main subjects or collectively as the same main subject is performed before execution of focus bracket photography.
- FIG. 2 is a diagram showing an example of main subject determination processing.
- In the main subject determination process, main subjects separated from each other by a threshold distance or more are determined to be different main subjects. Conversely, when a certain main subject is in focus, the main subjects located within its depth of field are determined to be the same main subject.
- the subjects #1-1 and #1-2 are positioned within the depth of field when the subject #1-1 is focused, for example.
- a bidirectional arrow A1 indicates the range of depth of field when subject #1-1 is focused.
- the photographing device 1 determines that subjects #1-1 and #1-2 are the same main subject.
- the subjects #4-1 to #4-4 are located within the depth of field when the subject #4-1 is focused, for example.
- a bidirectional arrow A2 indicates the range of depth of field when subject #4-1 is focused.
- the photographing device 1 determines that subjects #4-1 to #4-4 are the same main subject.
- Subjects #2 and #3 are determined as different main subjects because they are located at positions separated from the other main subject by a threshold distance or more.
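As a rough illustration of the determination rules above, the grouping can be sketched as a single pass over the subject distances sorted from nearest to farthest: a subject joins the current group while it stays within the depth-of-field range of the group's nearest member, and otherwise starts a new group. The function name, the single `dof_m` parameter, and the distance values are illustrative assumptions, not details from the publication.

```python
def group_main_subjects(distances_m, dof_m=2.0):
    """Group subject distances (metres), nearest first. A subject joins the
    current group while it lies within dof_m of the group's nearest member;
    otherwise it starts a new group, i.e. it is a 'different main subject'."""
    groups = []
    for d in sorted(distances_m):
        if groups and d - groups[-1][0] <= dof_m:
            groups[-1].append(d)   # same main subject: shot together
        else:
            groups.append([d])     # different main subject: its own shot
    return groups


# Distances loosely modelled on FIG. 1: two people, a car, a house, buildings.
print(group_main_subjects([3.0, 3.2, 10.0, 20.0, 40.0, 40.5, 41.0, 41.5]))
```

With these example distances the sketch yields four groups, matching the four shots of the series described below.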
- the depth map is map information in which the depth information up to each position of each subject included in the shooting range is recorded as the pixel value of each pixel.
- the photographing device 1 is equipped with a distance sensor such as a ToF (Time-of-Flight) sensor capable of measuring the distance to each position of each subject.
- FIG. 3 is a diagram showing an example of focus bracket photography by the photography device 1.
- Focus bracket shooting is performed by, for example, focusing on the main subjects in order from the shortest distance, as indicated by upward arrows A11 to A14. Four shots are taken in a single series of focus bracket shooting.
- the first shooting is performed with subject #1-1 in focus, for example.
- Subject #1-2 is positioned within the depth of field when the position of subject #1-1 is used as a reference.
- The image obtained by the first shooting is therefore one in which subject #1-2 is also in focus.
- the second shot is taken with subject #2 in focus.
- the image obtained by the second shooting is an image in which subject #2 is in focus.
- the third shooting is performed with subject #3 in focus.
- the image obtained by the third shooting is an image in which subject #3 is in focus.
- the fourth shooting is performed with subject #4-1 in focus, for example.
- Subjects #4-2 to #4-4 are located within the depth of field when the position of subject #4-1 is used as a reference. The image obtained by the fourth shooting is therefore one in which subject #4-1 as well as subjects #4-2 to #4-4 are in focus.
- In this way, focus bracket photography by the photographing device 1 collectively photographs main subjects that lie within a short distance range, such as within the depth of field. Compared to focusing on and continuously photographing every one of subjects #1-1, #1-2, #2, #3, and #4-1 to #4-4 individually, this reduces the number of shots and enables efficient focus bracket shooting.
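The depth-of-field ranges used for grouping (arrows A1 and A2 in FIG. 2) can be estimated with the standard thin-lens hyperfocal-distance approximation. This formula is general optics background rather than anything specified in the publication, and the focal length, f-number, and circle-of-confusion values below are arbitrary examples.

```python
def dof_limits(s_m, f_mm=50.0, n=2.8, coc_mm=0.03):
    """Near and far depth-of-field limits (metres) for a subject focused at
    s_m metres, via the hyperfocal-distance approximation."""
    f = f_mm / 1000.0
    c = coc_mm / 1000.0
    h = f * f / (n * c) + f                      # hyperfocal distance (m)
    near = s_m * (h - f) / (h + s_m - 2 * f)
    far = s_m * (h - f) / (h - s_m) if s_m < h else float("inf")
    return near, far


near, far = dof_limits(3.0)   # a 50 mm f/2.8 lens focused at 3 m
```

For this example configuration the in-focus range around 3 m is roughly 0.6 m deep, which is the kind of window within which nearby main subjects would be merged into one shot.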
- FIG. 4 is a block diagram showing a hardware configuration example of the imaging device 1.
- the photographing device 1 is configured by connecting a photographing unit 12 , a microphone 13 , a sensor 14 , a display 15 , an operation unit 16 , a speaker 17 , a storage unit 18 , and a communication unit 19 to the control unit 11 .
- the control unit 11 is composed of a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), and the like.
- the control unit 11 executes a predetermined program and controls the overall operation of the imaging device 1 according to user's operations.
- the imaging device 1 having the control unit 11 functions as a control device that controls imaging.
- the photographing unit 12 is composed of a lens, an imaging device, etc., and performs photographing according to control by the control unit 11 .
- the imaging unit 12 outputs image data obtained by imaging to the control unit 11 .
- the microphone 13 outputs audio data such as collected sound to the control unit 11 .
- the sensor 14 is configured by a ToF sensor or the like.
- the sensor 14 measures the distance to each position of the subject included in the shooting range and outputs sensor data to the control unit 11 .
- the display 15 is configured by an LCD or the like, and displays various information such as menu screens and images being captured according to control by the control unit 11 .
- the operation unit 16 is composed of operation buttons, a touch panel, etc. provided on the surface of the housing of the photographing device 1 .
- the operation unit 16 outputs to the control unit 11 information representing the details of the user's operation.
- the speaker 17 outputs sound based on the audio signal supplied from the control unit 11.
- the storage unit 18 is composed of a flash memory or a memory card inserted into a card slot provided in the housing.
- the storage unit 18 stores various data such as image data supplied from the control unit 11 .
- the communication unit 19 performs wireless or wired communication with an external device.
- the communication unit 19 transmits various data such as image data supplied from the control unit 11 to a computer, a smartphone, or the like.
- FIG. 5 is a block diagram showing a functional configuration example of the control unit 11.
- control unit 11 is composed of a main subject detection unit 31, a depth map generation unit 32, a main subject determination unit 33, and a shooting control unit 34.
- Image data supplied from the photographing unit 12 is input to the main subject detection unit 31
- sensor data supplied from the sensor 14 is input to the depth map generation unit 32 .
- the main subject detection unit 31 detects a predetermined subject as the main subject from among the subjects included in the shooting range.
- The method of detecting the main subject and the main subject degree map will be explained using FIGS. 6 and 7.
- FIG. 6 is a diagram showing an example of a scene shot by the shooting device 1.
- subjects #1-1, #1-2, #2, #3, #4-1 to #4-4 are included in the imaging range of the imaging device 1.
- subjects not shown in FIG. 1 and the like are also included in the photographing range.
- a main subject degree map as shown in FIG. 7 is generated as the main subject detection result based on the image obtained by photographing such a scene.
- The main subject degree map is map information in which the pixel value of each pixel is the main subject degree, a value representing how strongly the content at that pixel resembles the main subject. For example, a brightly colored pixel indicates that the content captured at that pixel has a high main subject degree.
- In the example of FIG. 7, pixels in which subjects #1-1, #1-2, #2, #3, and #4-1 to #4-4 appear are detected as pixels with a high main subject degree, while pixels containing other subjects are detected as pixels with a low main subject degree.
- Such a main subject degree map is generated, for example, using an inference model obtained by machine learning.
- The inference model for the main subject degree map is generated by performing machine learning using, as training data, a plurality of images labeled with information indicating which subject is the main subject.
- the main subject detection unit 31 inputs the image obtained by shooting to the inference model for the main subject degree map, and acquires the main subject degree map based on the output of the inference model.
- the main subject detection unit 31 detects a predetermined subject as the main subject based on the main subject degree map.
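A minimal sketch of this detection step, assuming the main subject degree map is a 2-D array of values in [0, 1]; the function name and the threshold value are illustrative assumptions, not from the publication.

```python
def detect_main_subject_region(degree_map, threshold=0.5):
    """Return a boolean mask marking pixels whose main subject degree is at
    least the threshold; the True region is the 'main subject area'."""
    return [[v >= threshold for v in row] for row in degree_map]


# Tiny 2x2 degree map: bright (high-degree) pixels become True.
mask = detect_main_subject_region([[0.1, 0.9], [0.8, 0.2]], 0.5)
```

The resulting mask is what the later determination step divides into small areas.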
- a main subject degree map may be generated by analyzing the image captured by the imaging unit 12 . For example, an image captured by the imaging unit 12 is analyzed, and a specific type of subject, a large subject, or the like is detected as the main subject.
- the depth map generation unit 32 generates a depth map based on sensor data representing the distance to each position of each subject included in the shooting range.
- Depth map generation may be performed using the output of the ToF sensor, or may be performed using AI (Artificial Intelligence).
- When AI is used, an inference model that takes the image captured by the imaging unit 12 as input and outputs the distance to each position of each subject included in the imaging range is prepared in the depth map generation unit 32.
- the depth map generation unit 32 inputs the image captured by the imaging unit 12 to the inference model generated by machine learning, and acquires the distance to each position of each subject.
- Alternatively, when the photographing unit 12 is configured as a stereo camera, the distance to each position of each subject may be obtained based on the images photographed by the stereo camera.
- the depth map information generated by the depth map generation unit 32 is output to the main subject determination unit 33 .
- the main subject determination unit 33 performs the main subject determination process described above based on the supplied main subject information and depth map information. Information representing the determination result is output to the imaging control unit 34 . It should be noted that another method may be used to determine whether different main subjects are to be photographed or the same main subject is to be photographed. Determination of the main subject is disclosed in Japanese Patent Application Laid-Open No. 2013-120949, for example.
- the shooting control unit 34 controls the shooting unit 12 to perform focus bracket shooting. That is, based on the information supplied from the main subject determination section 33, the shooting control section 34 performs continuous shooting while focusing on each main subject determined as a different main subject.
- In step S1, the main subject detection section 31 detects the main subject based on the captured image.
- In step S2, the depth map generation unit 32 generates a depth map based on sensor data representing the measurement results of the ToF sensor.
- Steps S1 and S2 are processing performed during preview photography; the processing from step S3 onward is performed during execution of focus bracket photography.
- In step S3, the main subject determination unit 33 performs main subject determination processing based on the information on the main subject and the information on the depth map.
- The main subject determination process will be described later with reference to the flowchart of FIG. 9.
- In step S4, the shooting control unit 34 determines whether or not there are two or more main subjects at different distances based on the determination result of the main subject determination unit 33.
- When it is determined in step S4 that there are two or more main subjects at different distances, in step S5 the shooting control unit 34 focuses on each main subject at a different distance and performs continuous shooting.
- When it is determined in step S4 that there are not two or more main subjects at different distances, in step S6 the shooting control unit 34 controls shooting with normal autofocus processing: one shot is taken with one main subject in focus.
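The branch across steps S4 to S6 can be sketched as follows, where `focus_and_shoot` and `autofocus_shoot` are hypothetical stand-ins for the camera-control calls, not names from the publication.

```python
def run_capture(main_subject_groups, focus_and_shoot, autofocus_shoot):
    """Steps S4-S6 in sketch form: bracket only when main subjects exist at
    two or more different distances; otherwise fall back to normal AF."""
    if len(main_subject_groups) >= 2:          # S4: two or more groups?
        # S5: one shot per group, focused on the nearest member of each.
        return [focus_and_shoot(group[0]) for group in main_subject_groups]
    return [autofocus_shoot()]                 # S6: single normal-AF shot


# Two groups -> two bracketed shots; one group -> one normal-AF shot.
shots = run_capture([[3.0, 3.2], [10.0]], lambda d: f"shot@{d}", lambda: "af")
```

The point of the branch is that bracketing is skipped entirely when grouping leaves only one focus position.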
- the photographing device 1 can efficiently perform focus bracket photography targeting the main subject.
- The main subject determination process performed in step S3 of FIG. 8 will be described with reference to the flowchart of FIG. 9.
- In step S11, the main subject determination unit 33 divides the main subject area (the area of the main subject) into small areas of a predetermined size on the main subject degree map. For example, an area composed of pixels whose main subject degree is equal to or greater than a predetermined threshold is determined to be the main subject area, and that area is divided into small areas.
- In step S12, the main subject determination unit 33 acquires depth information corresponding to each small area based on the depth map.
- In step S13, the main subject determination unit 33 determines main subjects having a depth difference to be different main subjects. Main subjects separated from each other by a predetermined threshold distance or more are determined to be different main subjects.
- In step S14, the main subject determination unit 33 determines main subjects located within the depth of field when a certain main subject is in focus to be the same main subject.
- As a result, subject #1-1 and subject #2 in FIG. 1 are determined to be different main subjects, while subject #1-1 and subject #1-2 are determined to be the same main subject. After that, the process returns to step S3 in FIG. 8 and the subsequent processing is performed.
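Steps S11 and S12 can be sketched as follows, assuming the main subject area is given as a boolean mask and the depth map as a 2-D array of distances. Splitting into fixed-size tiles and taking a median depth per tile is one plausible reading of "small areas", not the publication's exact method; all names are illustrative.

```python
from statistics import median

def small_area_depths(mask, depth_map, block=2):
    """Steps S11-S12 in sketch form: split the main subject mask into
    block x block tiles and take the median depth of the masked pixels
    in each tile that contains any main subject pixel."""
    h, w = len(mask), len(mask[0])
    depths = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            vals = [depth_map[j][i]
                    for j in range(y, min(y + block, h))
                    for i in range(x, min(x + block, w))
                    if mask[j][i]]
            if vals:
                depths.append(median(vals))
    return depths
```

The per-tile depths returned here are what steps S13 and S14 would then compare against the threshold distance and the depth of field.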
- the photographing device 1 can efficiently perform focus bracket photography targeting the main subject.
- the above-described focus bracket photography in a device having a camera function may be performed according to control by an external device.
- the external control device is provided with the same configuration as the configuration of the control section 11 described with reference to FIG.
- the series of processes described above can be executed by hardware or by software.
- a program that constitutes the software is installed from a program recording medium into a computer built into dedicated hardware or a general-purpose personal computer.
- FIG. 10 is a block diagram showing a hardware configuration example of a computer that executes the series of processes described above by a program.
- In the computer, a CPU (Central Processing Unit) 51, a ROM (Read Only Memory) 52, and a RAM (Random Access Memory) 53 are interconnected by a bus 54.
- An input/output interface 55 is further connected to the bus 54 .
- An input unit 56 , an output unit 57 , a storage unit 58 , a communication unit 59 and a drive 60 are connected to the input/output interface 55 .
- a drive 60 drives a removable medium 61 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
- The CPU 51 loads, for example, a program stored in the storage unit 58 into the RAM 53 via the input/output interface 55 and the bus 54 and executes it, whereby the above-described series of processes is performed.
- Programs executed by the CPU 51 are, for example, recorded on the removable media 61 or provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and installed in the storage unit 58 .
- The program executed by the computer may be a program in which processing is performed in chronological order in the sequence described in this specification, or a program in which processing is performed in parallel or at a necessary timing, such as when a call is made.
- Embodiments of the present technology are not limited to the above-described embodiments, and various modifications are possible without departing from the gist of the present technology.
- this technology can take the configuration of cloud computing in which one function is shared by multiple devices via a network and processed jointly.
- each step described in the flowchart above can be executed by a single device, or can be shared by a plurality of devices.
- Furthermore, when one step includes multiple processes, the multiple processes included in that step can be executed by one device or shared among multiple devices.
- the present technology can also take the following configurations.
- (1) A control device comprising: a main subject determination unit that determines, based on the distance to each main subject included in the shooting range, that main subjects separated from each other by a threshold distance or more are different main subjects; and a shooting control unit that controls continuous shooting targeting each of the main subjects determined to be different main subjects.
- (2) The control device according to (1), wherein the main subject determination unit determines that other main subjects falling within the depth of field when a predetermined main subject is in focus are the same main subject, and the shooting control unit performs shooting once targeting the plurality of main subjects determined to be the same main subject.
- (3) The control device according to (1) or (2), further comprising a depth map generation unit that generates a depth map representing the distance to each of the main subjects included in the shooting range.
- (4) The control device according to (3), wherein the depth map generation unit generates the depth map based on information obtained from a ToF sensor.
- (5) The control device according to (3) or (4), further comprising a main subject detection unit that generates a main subject degree map in which the main subject degree, indicating the degree of main-subject likeness, is the pixel value of each pixel.
- (6) The control device according to (5), wherein the main subject determination unit divides the area of the main subject on the main subject degree map into small areas of a predetermined size and determines the main subjects based on the distance of each small area obtained from the depth map.
- (7) The control device according to (5) or (6), wherein the main subject detection unit generates the main subject degree map using an inference model that takes an image as input and outputs the main subject degree map.
- (8) The control device according to any one of (1) to (7), wherein the shooting control unit controls focus bracket shooting in which continuous shooting is performed while focusing on each main subject.
- (9) The control device according to any one of (1) to (7), wherein the shooting control unit controls exposure bracket shooting in which continuous shooting is performed while adjusting the exposure for each main subject.
- (10) A control method in which a control device determines, based on the distance to each main subject included in the shooting range, that main subjects separated by a threshold distance or more are different main subjects, and controls continuous shooting targeting each of the main subjects determined to be different main subjects.
- (11) A program for causing a computer to execute processing of determining, based on the distance to each main subject included in the shooting range, that main subjects separated by a threshold distance or more are different main subjects, and controlling continuous shooting targeting each of the main subjects determined to be different main subjects.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Studio Devices (AREA)
Abstract
Description
1. Focus bracket shooting
2. Configuration of the photographing device
3. Operation of the photographing device
4. Others
FIG. 1 is a diagram showing an example of subjects photographed by the imaging device 1 according to an embodiment of the present technology.
FIG. 4 is a block diagram showing a hardware configuration example of the imaging device 1.
The imaging processing of the imaging device 1 will be described with reference to the flowchart of FIG. 8.
Although the case where focus bracket shooting is performed based on the result of the main subject determination process has been described, the technology described above can also be applied to exposure bracket shooting, in which continuous shooting is performed while adjusting the exposure. That is, when continuous shooting is performed while adjusting the exposure according to the brightness of each main subject, main subjects within a short distance range, such as within the depth of field, are photographed together using the same exposure adjustment value.
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed from a program recording medium onto a computer incorporated in dedicated hardware or a general-purpose personal computer.
The present technology can also take the following configurations.
(1)
A control device comprising: a main subject determination unit that determines, based on the distance to each main subject included in the shooting range, that main subjects separated from each other by a threshold distance or more are different main subjects; and a shooting control unit that controls continuous shooting targeting each of the main subjects determined to be different main subjects.
(2)
The control device according to (1), wherein the main subject determination unit determines that other main subjects falling within the depth of field when a predetermined main subject is in focus are the same main subject, and the shooting control unit performs shooting once targeting the plurality of main subjects determined to be the same main subject.
(3)
The control device according to (1) or (2), further comprising a depth map generation unit that generates a depth map representing the distance to each of the main subjects included in the shooting range.
(4)
The control device according to (3), wherein the depth map generation unit generates the depth map based on information obtained from a ToF sensor.
(5)
The control device according to (3) or (4), further comprising a main subject detection unit that generates a main subject degree map in which the main subject degree, indicating the degree of main-subject likeness, is the pixel value of each pixel.
(6)
The control device according to (5), wherein the main subject determination unit divides the area of the main subject on the main subject degree map into small areas of a predetermined size and determines the main subjects based on the distance of each small area obtained from the depth map.
(7)
The control device according to (5) or (6), wherein the main subject detection unit generates the main subject degree map using an inference model that takes an image as input and outputs the main subject degree map.
(8)
The control device according to any one of (1) to (7), wherein the shooting control unit controls focus bracket shooting in which continuous shooting is performed while focusing on each main subject.
(9)
The control device according to any one of (1) to (7), wherein the shooting control unit controls exposure bracket shooting in which continuous shooting is performed while adjusting the exposure for each main subject.
(10)
A control method in which a control device determines, based on the distance to each main subject included in the shooting range, that main subjects separated by a threshold distance or more are different main subjects, and controls continuous shooting targeting each of the main subjects determined to be different main subjects.
(11)
A program for causing a computer to execute processing of determining, based on the distance to each main subject included in the shooting range, that main subjects separated by a threshold distance or more are different main subjects, and controlling continuous shooting targeting each of the main subjects determined to be different main subjects.
Claims (11)
- A control device including:
a main subject determination unit that determines, on the basis of a distance to each main subject included in a shooting range, main subjects separated from each other by a distance equal to or greater than a threshold to be different main subjects; and
a shooting control unit that controls continuous shooting targeting each of the main subjects determined to be the different main subjects. - The control device according to claim 1, in which
the main subject determination unit determines another main subject that falls within the depth of field when a predetermined main subject is brought into focus to be the same main subject, and
the shooting control unit performs shooting once targeting a plurality of the main subjects determined to be the same main subject. - The control device according to claim 1, further including
a Depth map generation unit that generates a Depth map representing a distance to each of the main subjects included in the shooting range. - The control device according to claim 3, in which
the Depth map generation unit generates the Depth map on the basis of information obtained from a ToF sensor. - The control device according to claim 3, further including
a main subject detection unit that generates a main subject degree map in which a main subject degree, representing a degree of likeliness of being a main subject, is used as the pixel value of each pixel. - The control device according to claim 5, in which
the main subject determination unit divides a region of the main subject on the main subject degree map into small regions of a predetermined size, and performs the determination of the main subject on the basis of a distance of each of the small regions acquired from the Depth map. - The control device according to claim 5, in which
the main subject detection unit generates the main subject degree map by using an inference model that takes an image as an input and outputs the main subject degree map. - The control device according to claim 1, in which
the shooting control unit controls focus bracket shooting, in which continuous shooting is performed with each of the main subjects brought into focus in turn. - The control device according to claim 1, in which
the shooting control unit controls exposure bracket shooting, in which continuous shooting is performed while the exposure is adjusted for each of the main subjects. - A control method in which a control device:
determines, on the basis of a distance to each main subject included in a shooting range, main subjects separated from each other by a distance equal to or greater than a threshold to be different main subjects; and
controls continuous shooting targeting each of the main subjects determined to be the different main subjects. - A program for causing a computer to execute processing of:
determining, on the basis of a distance to each main subject included in a shooting range, main subjects separated from each other by a distance equal to or greater than a threshold to be different main subjects; and
controlling continuous shooting targeting each of the main subjects determined to be the different main subjects.
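The per-small-region processing of claim 6 might be sketched as below, assuming a dense Depth map and a binary main-subject mask obtained by thresholding the main subject degree map of claim 5; all names here (`tile_depths`, `depth_map`, `mask`, `tile`) are hypothetical and not taken from the publication.

```python
def tile_depths(depth_map, mask, tile):
    """Divide the masked main-subject region into tile x tile blocks
    and return the mean depth of the masked pixels in each block.
    The caller can compare the block depths against a distance
    threshold to decide whether the region covers one main subject
    or several main subjects at different distances."""
    h, w = len(depth_map), len(depth_map[0])
    means = []
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            vals = [depth_map[j][k]
                    for j in range(y, min(y + tile, h))
                    for k in range(x, min(x + tile, w))
                    if mask[j][k]]
            if vals:  # skip blocks with no main-subject pixels
                means.append(sum(vals) / len(vals))
    return means

# A 4x4 map: left half at 1 m, right half at 5 m, all pixels masked.
depth_map = [[1, 1, 5, 5]] * 4
mask = [[True] * 4] * 4
blocks = tile_depths(depth_map, mask, 2)
```

Since the block depths here span 1 m to 5 m, a threshold smaller than that spread would split the masked region into two different main subjects, as in claim 1.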
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/571,277 US20240284039A1 (en) | 2021-06-25 | 2022-02-17 | Control device, control method, and program |
JP2023529505A JPWO2022269999A1 (ja) | 2021-06-25 | 2022-02-17 | |
EP22827922.0A EP4362448A4 (en) | 2021-06-25 | 2022-02-17 | CONTROL DEVICE, CONTROL METHOD AND PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-105418 | 2021-06-25 | ||
JP2021105418 | 2021-06-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022269999A1 true WO2022269999A1 (ja) | 2022-12-29 |
Family
ID=84543792
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/006303 WO2022269999A1 (ja) | Control device, control method, and program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240284039A1 (ja) |
EP (1) | EP4362448A4 (ja) |
JP (1) | JPWO2022269999A1 (ja) |
WO (1) | WO2022269999A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008244991A * | 2007-03-28 | 2008-10-09 | Fujifilm Corp | Imaging device and imaging method |
JP2013120949A | 2011-12-06 | 2013-06-17 | Sony Corp | Image processing device, image processing method, and program |
JP2014127966A * | 2012-12-27 | 2014-07-07 | Canon Inc | Image processing device and image processing method |
WO2015156149A1 * | 2014-04-10 | 2015-10-15 | Sony Corporation | Image processing device and image processing method |
JP2016040578A * | 2014-08-12 | 2016-03-24 | Ricoh Imaging Company, Ltd. | Imaging device |
JP2016208530A * | 2016-07-15 | 2016-12-08 | Casio Computer Co., Ltd. | Image generation device, image generation method, and program |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5163446B2 (ja) * | 2008-11-25 | 2013-03-13 | Sony Corporation | Imaging device, imaging method, and program |
JP6204660B2 (ja) * | 2012-12-21 | 2017-09-27 | Canon Inc. | Imaging apparatus and control method therefor |
CN104919791A (zh) * | 2013-01-09 | 2015-09-16 | Sony Corporation | Image processing device, image processing method, and program |
US10984513B1 (en) * | 2019-09-30 | 2021-04-20 | Google Llc | Automatic generation of all-in-focus images with a mobile camera |
- 2022
- 2022-02-17 JP JP2023529505A patent/JPWO2022269999A1/ja active Pending
- 2022-02-17 EP EP22827922.0A patent/EP4362448A4/en active Pending
- 2022-02-17 US US18/571,277 patent/US20240284039A1/en active Pending
- 2022-02-17 WO PCT/JP2022/006303 patent/WO2022269999A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008244991A * | 2007-03-28 | 2008-10-09 | Fujifilm Corp | Imaging device and imaging method |
JP2013120949A | 2011-12-06 | 2013-06-17 | Sony Corp | Image processing device, image processing method, and program |
JP2014127966A * | 2012-12-27 | 2014-07-07 | Canon Inc | Image processing device and image processing method |
WO2015156149A1 * | 2014-04-10 | 2015-10-15 | Sony Corporation | Image processing device and image processing method |
JP2016040578A * | 2014-08-12 | 2016-03-24 | Ricoh Imaging Company, Ltd. | Imaging device |
JP2016208530A * | 2016-07-15 | 2016-12-08 | Casio Computer Co., Ltd. | Image generation device, image generation method, and program |
Non-Patent Citations (1)
Title |
---|
See also references of EP4362448A4 |
Also Published As
Publication number | Publication date |
---|---|
US20240284039A1 (en) | 2024-08-22 |
EP4362448A1 (en) | 2024-05-01 |
EP4362448A4 (en) | 2024-10-16 |
JPWO2022269999A1 (ja) | 2022-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI549501B (zh) | An imaging device, and a control method thereof | |
WO2018201809A1 (zh) | Image processing device and method based on dual cameras | |
US9225947B2 (en) | Image pickup apparatus, method of providing composition of image pickup and computer-readable recording medium | |
US8917333B2 (en) | Digital image processing apparatus, digital image processing method, and recording medium storing the digital image processing method | |
JP5713055B2 (ja) | Imaging device, imaging method, and program | |
US8648960B2 (en) | Digital photographing apparatus and control method thereof | |
JP2008288975A (ja) | Imaging device, imaging method, and imaging program | |
US8610812B2 (en) | Digital photographing apparatus and control method thereof | |
US10999489B2 (en) | Image processing apparatus, image processing method, and image capture apparatus | |
US8571404B2 (en) | Digital photographing apparatus, method of controlling the same, and a computer-readable medium storing program to execute the method | |
KR20120080376A (ko) | Digital image photographing apparatus and control method thereof | |
JP2017011451A (ja) | Detection device, detection method, and program | |
WO2022269999A1 (ja) | Control device, control method, and program | |
JP6483661B2 (ja) | Imaging control device, imaging control method, and program | |
US12114078B2 (en) | Low-light autofocus technique | |
JP5949591B2 (ja) | Imaging device, control method, and program | |
JP5832618B2 (ja) | Imaging device, control method therefor, and program | |
JP2019152807A (ja) | Focus detection device and focus detection method | |
JP2014225763A (ja) | Imaging device, control method therefor, program, and storage medium | |
WO2023106119A1 (ja) | Control device, control method, information processing device, generation method, and program | |
US20230199299A1 (en) | Imaging device, imaging method and program | |
WO2023106118A1 (ja) | Information processing device, information processing method, and program | |
US20240089597A1 (en) | Image capturing apparatus for capturing and compositing images different in in-focus position, control method, and storage medium | |
JP2012242759A (ja) | Imaging device, control method therefor, and program | |
KR101660838B1 (ko) | Imaging apparatus and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22827922 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023529505 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18571277 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022827922 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022827922 Country of ref document: EP Effective date: 20240125 |