WO2015182021A1 - Imaging control apparatus, imaging apparatus, and imaging control method - Google Patents
Imaging control apparatus, imaging apparatus, and imaging control method
- Publication number
- WO2015182021A1 (PCT/JP2015/001300)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detection
- display
- image
- imaging
- subject image
- Prior art date: 2014-05-26
Links
- 238000000034 method Methods 0.000 title claims abstract description 21
- 238000001514 detection method Methods 0.000 claims abstract description 235
- 238000003702 image correction Methods 0.000 claims abstract description 30
- 238000003384 imaging method Methods 0.000 claims description 102
- 230000004075 alteration Effects 0.000 claims description 13
- 230000003287 optical effect Effects 0.000 claims description 13
- 210000001747 pupil Anatomy 0.000 claims description 5
- 230000000875 corresponding effect Effects 0.000 description 20
- 238000010586 diagram Methods 0.000 description 10
- 238000012634 optical imaging Methods 0.000 description 8
- 230000006870 function Effects 0.000 description 5
- 238000007781 pre-processing Methods 0.000 description 5
- 238000005259 measurement Methods 0.000 description 4
- 238000006243 chemical reaction Methods 0.000 description 2
- 230000001276 controlling effect Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000005401 electroluminescence Methods 0.000 description 2
- 230000000295 complement effect Effects 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 238000009432 framing Methods 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
Definitions
- The present technology relates to an imaging control apparatus, an imaging apparatus, and an imaging control method that execute an autofocus function and its display control.
- Electronic devices such as digital cameras typically include a monitor that displays the captured image so that the user can focus or determine the composition during shooting.
- A range-finding frame is displayed on the monitor so that the user can focus on the subject more easily; for example, the color of the in-focus range-finding frame is changed, or only the in-focus range-finding frame is displayed.
- Patent Literature 1 describes an imaging device in which the image area corresponding to an in-focus focusing area of a captured image is preferentially enlarged and displayed on the LCD, so that the user can be shown which part of the captured image is in focus.
- An object of the present technology is to provide an imaging control device, an imaging device, and an imaging control method capable of realizing appropriate imaging control such as focusing control.
- An imaging control device includes a detection control unit, an image correction unit, and a display control unit.
- the detection control unit controls detection based on a detection region corresponding to a predetermined region of the subject image acquired by the image sensor.
- the image correction unit corrects the subject image displayed on the screen based on lens information regarding the photographing lens.
- the display control unit displays a detection area display indicating the detection area at a predetermined position on the screen based on the detection area and the lens information.
- the imaging control apparatus includes a display control unit that displays a detection area display indicating the detection area at a predetermined position on the screen based on the detection area and the lens information. As a result, it is possible to display the detection area display in association with the subject image on the screen, so that, for example, focusing control intended by the user can be realized.
- the detection area display typically includes a range-finding frame for displaying an auto-focus (AF) area, a display frame for displaying an auto-exposure (AE) area, and the like.
- the display control unit may be configured to determine whether or not the correction of the display position of the detection area display is necessary based on the lens information. By configuring so that the display position is corrected only when correction is necessary, the efficiency of display processing can be improved.
- The display control unit may be configured to display the detection area display on the screen when the subject image area corresponding to the detection area is brought into focus. Thereby, the focus position can be displayed to the user.
- the detection area may include a plurality of detection areas.
- the detection control unit may be configured to control detection based on a detection region selected from the plurality of detection regions. Thereby, for example, focus control of the subject in the detection region intended by the user can be realized.
- the detection control unit may be configured to determine whether correction of the selected detection area is necessary based on the lens information. Thereby, it is possible to suppress the position shift of the detection region due to distortion.
- the lens information typically includes photographic lens distortion information.
- the image correction unit is configured to correct distortion of the subject image based on the distortion information.
- the lens information may be acquired from the photographing lens.
- The detection control unit typically causes the in-focus position of the subject image area corresponding to the detection area to be detected.
- The present technology is not limited to this; the detection control unit may also cause the exposure amount of the area of the subject image corresponding to the detection area to be detected.
- the imaging control apparatus may further include an acquisition unit that acquires the lens information.
- the imaging control apparatus may further include a detection unit controlled by the detection control unit.
- The detection unit executes detection processing based on the detection area.
- An imaging apparatus includes an imaging optical system, an imaging element, a detection control unit, an image correction unit, a display unit, and a display control unit.
- the photographing optical system includes a photographing lens.
- the image pickup device acquires a subject image formed by a photographing optical system.
- the detection control unit controls detection based on a detection region corresponding to a predetermined region of a subject image acquired by the imaging element.
- the image correction unit corrects the subject image acquired by the image sensor based on lens information regarding the photographing lens.
- the display unit includes a screen that displays the subject image corrected by the image correction unit and a detection area display indicating the detection area.
- the display control unit displays the detection area display at a predetermined position on the screen based on the detection area and the lens information.
- the image sensor may include a plurality of phase difference detection pixels that perform pupil division of the photographing lens.
- high-speed autofocus control can be realized.
- the calculation load required for detection can be reduced and the time required for detection processing can be shortened compared to the case where detection control is executed based on the corrected subject image.
- the detection area may include a plurality of detection areas.
- the display unit may further include a touch sensor that detects a selection operation for any detection area display among the plurality of detection area displays indicating the plurality of detection areas.
- An imaging control method includes controlling detection based on a detection area corresponding to a predetermined area of a subject image acquired by an imaging element.
- the subject image displayed on the screen is corrected based on the lens information regarding the photographing lens.
- a detection area display indicating the detection area is displayed at a predetermined position on the screen.
- FIG. 1 is a block diagram illustrating the overall configuration of an imaging apparatus according to an embodiment. FIG. 2 is a schematic diagram showing an example of the arrangement of normal pixels and phase difference detection pixels in the image sensor of the imaging apparatus.
- FIG. 1 is a block diagram illustrating an overall configuration of an imaging apparatus 100 according to the present embodiment.
- The imaging apparatus 100 includes an optical imaging system 101, an imaging element 102, a preprocessing circuit 103, a camera processing circuit 104, an image memory 105, a control unit 106, a graphic I/F (interface) 107, a display unit 108, an input unit 109, and an R/W (reader/writer) 110.
- the control unit 106 functions as the imaging control device 150.
- The optical imaging system 101 includes a photographing lens 101A for condensing light from the subject onto the imaging element 102, a drive mechanism that moves the photographing lens 101A to perform focusing and zooming, a shutter mechanism, an iris mechanism, and the like. These are driven based on control signals from the control unit 106.
- the optical image of the subject obtained through the optical imaging system 101 is formed on the imaging element 102 as an imaging device.
- The image sensor 102 includes R (Red) pixels, G (Green) pixels, and B (Blue) pixels, which are normal imaging pixels, and phase difference detection pixels for phase difference detection.
- Each pixel constituting the image sensor 102 photoelectrically converts incident light from the subject into a charge amount and outputs a pixel signal. The image sensor 102 then outputs an image signal composed of these pixel signals to the preprocessing circuit 103.
- As the image sensor 102, a CCD (Charge-Coupled Device) sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) sensor, or the like is used.
- FIG. 2 is a schematic diagram illustrating an example of the arrangement of normal pixels and phase difference detection pixels in the image sensor 102.
- In FIG. 2, R, G, and B denote R (Red), G (Green), and B (Blue) pixels, respectively, each representing a normal imaging pixel, while P1 and P2 denote the first and second phase difference detection pixels, respectively.
- The phase difference detection pixels form pairs of P1 and P2, and each pair performs pupil division of the photographing lens 101A.
- the phase difference detection pixels P1 and P2 have different optical characteristics from normal imaging pixels.
- some of the G pixels are used as phase difference detection pixels. This is because there are twice as many G pixels as there are R pixels and B pixels.
- the phase difference detection pixel is not limited to the G pixel.
- the image pickup element 102 has a plurality of phase difference detection pixels P1 and P2 that perform pupil division of the photographing lens 101A in addition to the normal pixels R, G, and B.
- the imaging apparatus 100 is configured to perform a so-called image plane phase difference AF (Auto-Focus) based on outputs from the plurality of phase difference detection pixels P1 and P2.
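Purely for orientation, the sketch below constructs a hypothetical pixel layout of this kind: a Bayer (RGGB) mosaic in which some G sites are replaced by the paired phase difference detection pixels P1 and P2. The row stride and pairing used here are assumptions for illustration, not the arrangement defined in FIG. 2.

```python
import numpy as np

def make_cfa_with_pd_pixels(rows=8, cols=8, pd_row_stride=4):
    """Build a hypothetical RGGB Bayer layout in which some G sites on every
    pd_row_stride-th row are replaced by phase-difference pixels P1/P2.
    Illustrative only; the actual arrangement is the one shown in FIG. 2."""
    cfa = np.empty((rows, cols), dtype=object)
    for r in range(rows):
        for c in range(cols):
            if r % 2 == 0:
                cfa[r, c] = "R" if c % 2 == 0 else "G"   # even rows: R G R G ...
            else:
                cfa[r, c] = "G" if c % 2 == 0 else "B"   # odd rows:  G B G B ...
    for r in range(0, rows, pd_row_stride):
        for c in range(1, cols, 4):                       # G sites on even rows
            cfa[r, c] = "P1"                              # first pixel of the pair
            if c + 2 < cols:
                cfa[r, c + 2] = "P2"                      # its pupil-division partner
    return cfa

if __name__ == "__main__":
    for row in make_cfa_with_pd_pixels():
        print(" ".join(f"{p:>2}" for p in row))
```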
- The preprocessing circuit 103 performs sample-and-hold processing and the like on the image signal output from the image sensor 102 so as to maintain a good S/N (signal/noise) ratio by CDS (correlated double sampling) processing. It further controls the gain by AGC (auto gain control) processing, performs A/D (analog/digital) conversion, and outputs a digital image signal.
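As a rough illustration of this front-end chain (not the actual circuitry of the preprocessing circuit 103), the sketch below applies correlated double sampling, a fixed gain standing in for AGC, and uniform quantization standing in for A/D conversion; all numeric values are placeholders.

```python
import numpy as np

def cds_agc_adc(reset_level, signal_level, gain=4.0, full_scale=1.0, bits=12):
    """Illustrative front-end chain: correlated double sampling (difference of
    the signal and reset samples), analog gain, and A/D quantization."""
    cds = signal_level - reset_level                    # CDS removes the reset offset
    amplified = np.clip(cds * gain, 0.0, full_scale)    # AGC-style gain with clipping
    return np.round(amplified / full_scale * (2 ** bits - 1)).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reset = 0.10 + 0.002 * rng.standard_normal(8)       # per-pixel reset samples
    signal = reset + 0.05 + 0.002 * rng.standard_normal(8)
    print(cds_agc_adc(reset, signal))
```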
- The camera processing circuit 104 performs signal processing such as white balance adjustment, color correction, gamma correction, Y/C conversion, and AE (auto exposure) processing on the image signal from the preprocessing circuit 103.
- The image memory 105 is a volatile memory, for example a buffer memory composed of DRAM (dynamic random access memory), and temporarily stores image data that has been subjected to predetermined processing by the preprocessing circuit 103 and the camera processing circuit 104.
- the control unit 106 includes, for example, a CPU, a RAM, a ROM, and the like.
- the ROM stores a program that is read and operated by the CPU.
- the RAM is used as a work memory for the CPU.
- the CPU controls the entire imaging apparatus 100 by executing various processes in accordance with programs stored in the ROM.
- control unit 106 functions as the imaging control device 150 by executing a predetermined program.
- The imaging control device 150 is not limited to being realized by a program; it may be realized as a dedicated device by hardware having the functions of an imaging control device.
- In that case, the imaging apparatus 100 includes the imaging control device as hardware.
- the imaging control device 150 includes an acquisition unit 151, a detection unit 152, a detection control unit 153, an image correction unit 154, and a display control unit 155.
- the acquisition unit 151 and the detection unit 152 are not limited to being configured as a part of the imaging control device 150, and may be configured independently of the imaging control device.
- the acquisition unit 151 is configured to acquire control parameters necessary for focus adjustment by the detection unit 152 and correction values necessary for correcting distortion aberration of the subject image by the image correction unit 154.
- the acquisition unit 151 acquires lens information and a phase difference signal.
- The lens information is information related to the photographing lens 101A and includes, in addition to the zoom (focal length), focus (shooting distance), and aperture (F value) of the photographing lens 101A, distortion information such as the amount of distortion aberration.
- the acquisition unit 151 acquires lens information from a microcomputer (not shown) that manages the photographing lens 101A or the like, and acquires a phase difference signal from the image sensor 102 having the phase difference detection pixels P1 and P2.
- the lens information, phase difference signal, and the like acquired by the acquisition unit 151 may be stored in the ROM of the control unit 106 that functions as the imaging control device 150, or may be stored in a separate storage medium.
- The detection unit 152 is configured to perform detection on a detection region (detection area) corresponding to a predetermined region of the subject image acquired by the image sensor 102, based on the lens information, the phase difference signal, and the like acquired by the acquisition unit 151.
- the detection control unit 153 controls detection based on the detection region. In other words, the detection control unit 153 controls the detection process of the detection region executed by the detection unit 152. Typically, a plurality of detection areas are set so as to respectively correspond to a plurality of predetermined areas of the subject image. Each detection area is displayed as a detection area display of a predetermined form on the screen of the display unit 108.
- the detection area display is typically displayed at a plurality of locations on the screen of the display unit 108.
- Each detection area includes a plurality of phase difference detection pixels, and the detection area display is displayed on the screen as a distance measurement frame for AF and a display frame for AE.
- the plurality of detection area displays may be displayed in a grid pattern on the screen, or may be displayed discretely at arbitrary positions.
- the detection unit 152 executes a detection process for a predetermined detection region that is set in advance or selected by the user.
- The detection control unit 153 causes the detection unit 152 to detect the in-focus position of the subject image region corresponding to the detection region. That is, based on the phase difference signal in the detection region, the detection unit 152 calculates the difference between the focal position obtained by the image sensor 102 and the best image plane position of the photographing lens 101A, and determines the amount of movement of the photographing lens 101A in the optical axis direction.
- the detection unit 152 outputs an AF control signal for driving the photographing lens 101A to the optical imaging system 101, and executes focus control of the subject image.
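The kind of computation described above can be illustrated roughly as follows: the sketch estimates the relative shift between the P1 and P2 signal sequences of one detection area by a sum-of-absolute-differences search and converts it into a lens movement amount. The signal layout, the search, and the conversion factor are illustrative assumptions, not the device's actual algorithm or calibration.

```python
import numpy as np

def estimate_phase_shift(p1, p2, max_shift=8):
    """Return the relative shift (in samples) between the two pupil-divided
    signal sequences that minimizes the sum of absolute differences."""
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = p1[s:], p2[:len(p2) - s]
        else:
            a, b = p1[:len(p1) + s], p2[-s:]
        cost = np.mean(np.abs(a - b))
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

def lens_movement(p1, p2, k_defocus_per_shift=0.02):
    """Convert the detected image shift into a signed lens movement amount
    along the optical axis. The scale factor is a placeholder."""
    return k_defocus_per_shift * estimate_phase_shift(p1, p2)

if __name__ == "__main__":
    x = np.linspace(0, 4 * np.pi, 64)
    p1 = np.sin(x)
    p2 = np.sin(x - 3 * (4 * np.pi / 63))   # p2 is p1 delayed by about 3 samples
    print("estimated lens movement:", lens_movement(p1, p2))
```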
- Since the detection unit 152 is configured to execute image plane phase difference autofocus control using the phase difference signals from the phase difference detection pixels, high-speed autofocus control can be realized.
- the detection unit 152 is configured to perform autofocus control on an image before distortion correction by the image correction unit 154. Thereby, compared with the case where detection control is performed based on the corrected subject image, the calculation load necessary for detection can be reduced and the time required for detection processing can be shortened.
- the image correction unit 154 is configured to correct the distortion aberration of the subject image displayed on the screen of the display unit 108 based on the lens information (distortion information) acquired by the acquisition unit 151.
- optical distortion occurs in the subject image formed through the taking lens 101A. Since the image sensor 102 converts the subject image formed by the optical imaging system 101 into an image signal as it is, the image signal is also affected by the optical distortion of the subject image.
- Typical optical distortion includes, for example, distortion aberration such as “barrel distortion” as shown in FIG. 3A and “pincushion distortion” as shown in FIG. 3B.
- FIGS. 3A and 3B show a state in which an image of a subject in the range indicated by the two-dot chain line is deformed into the solid-line shape by distortion.
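For reference, such barrel and pincushion distortion is often approximated by a radial polynomial model; a minimal two-coefficient form (a common convention, not necessarily the model of the photographing lens 101A or of the image correction unit 154) is

```latex
\begin{aligned}
x_d &= x_u\,(1 + k_1 r^2 + k_2 r^4),\\
y_d &= y_u\,(1 + k_1 r^2 + k_2 r^4),\\
r^2 &= x_u^2 + y_u^2,
\end{aligned}
```

where (x_u, y_u) are ideal (undistorted) normalized image coordinates, (x_d, y_d) are the distorted coordinates actually formed on the sensor, and k_1 < 0 corresponds to barrel distortion while k_1 > 0 corresponds to pincushion distortion.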
- the image correction unit 154 is configured to correct the subject image displayed on the screen based on the lens information regarding the photographing lens. That is, the image correction unit 154 determines the amount of distortion of the photographing lens 101A from the acquired lens information, and corrects the subject image acquired by the image sensor 102 to an image without aberration.
- the control unit 106 generates an image signal for displaying the subject image corrected by the image correction unit 154 on the screen of the display unit 108.
- the distortion of the subject image can be corrected by, for example, converting the subject image into digital data and rearranging the pixel data in accordance with the amount of distortion. For example, when correcting an image in which a barrel distortion has occurred, pixels at points a to d are read out as pixels at points A to D, respectively, as shown in FIG. As a result, an image with corrected distortion can be generated. At this time, the number of pixels may be adjusted by interpolation processing, thinning-out processing, or the like.
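The remapping idea can be sketched in its simplest form as follows: for each output (corrected) pixel, the corresponding position in the distorted source image is computed with a simple radial distortion model (the same form as sketched above) and the nearest source pixel is copied. The coefficient values and the nearest-neighbor sampling are illustrative assumptions; an actual implementation would typically interpolate, as the text notes.

```python
import numpy as np

def undistort_nearest(img, k1=-0.15, k2=0.0):
    """Correct barrel distortion (k1 < 0) by inverse mapping: for every output
    pixel, look up the distorted source position and copy the nearest source
    pixel. Coefficients are illustrative placeholders, not lens data."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            # Normalized undistorted coordinates of the output pixel.
            xu, yu = (x - cx) / cx, (y - cy) / cy
            r2 = xu * xu + yu * yu
            scale = 1 + k1 * r2 + k2 * r2 * r2
            # Corresponding point in the distorted source image.
            xs = int(round(cx + xu * scale * cx))
            ys = int(round(cy + yu * scale * cy))
            if 0 <= xs < w and 0 <= ys < h:
                out[y, x] = img[ys, xs]
    return out

if __name__ == "__main__":
    frame = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
    corrected = undistort_nearest(frame)
    print(corrected.shape)
```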
- control unit 106 may be configured to generate a control signal that causes the subject image acquired by the image sensor 102 to be displayed on the screen of the display unit 108 as it is without correcting the aberration according to the selection of the user.
- the display control unit 155 displays a detection area display indicating the detection area at a predetermined position on the screen based on the detection area and the lens information. That is, the display control unit 155 displays an AF frame (detection frame) on the screen of the display unit 108 as a detection area display.
- FIG. 5 shows a display example of the AF frame.
- the subject image V whose distortion has been corrected by the image correction unit 154 is displayed.
- the AF frame 120 is displayed superimposed on the subject image. All the AF frames 120 may be displayed, or only the focused AF frame may be displayed. In the former case, the focused AF frame may be displayed in a different form from the unfocused AF frame.
- In the illustrated example, the AF frames 120 are arranged in a grid of seven frames vertically by eight frames horizontally, but the number and positions of the AF frames are not limited to this example.
- the AF frames 120 are not limited to being displayed adjacent to each other, and may be displayed apart from each other.
- the detection unit 152 performs autofocus control of the subject image for the predetermined AF frame 120.
- the predetermined AF frame 120 may be all the AF frames 120, or may be one or a plurality of AF frames designated by the user.
- the detection unit 152 performs AF frame ranging control based on the subject image before distortion correction. Therefore, the detection area is set based on the subject image before distortion correction.
- Meanwhile, the subject image after distortion correction is displayed on the screen 108A. Therefore, if the AF frames set based on the subject image before distortion correction were displayed on the screen 108A as they are, a deviation would occur in the relative position between the subject image V and the AF frames 120 on the screen 108A. In that case, the proper AF control intended by the user might not be realized.
- the display control unit 155 is configured to correct the display position of the detection region displayed on the screen of the display unit 108 based on the lens information acquired by the acquisition unit 151. Accordingly, the AF frame 120 can be displayed in correspondence with the subject image V on the screen 108A, so that the focus control intended by the user can be realized.
- The display control unit 155 determines whether it is necessary to correct the display position of the AF frame 120 (detection area display) on the screen 108A based on the lens information of the photographing lens 101A. That is, whether the display position of the AF frame 120 needs to be corrected is determined according to the distortion aberration amount of the photographing lens 101A, and the display position is corrected only when correction is necessary, so that the efficiency of display processing can be improved.
- the distortion amount is determined based on the lens information of the photographing lens 101A, but may be determined based on the correction amount by the image correction unit 154.
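The decision and remapping described here can be sketched minimally as follows, assuming the AF-frame centers are kept in normalized image coordinates and that the distortion amount is summarized by a single radial coefficient; the threshold and coefficient values are placeholders, not values taken from the patent.

```python
def needs_frame_correction(k1, threshold=0.01):
    """Skip display-position correction when the distortion amount is negligible."""
    return abs(k1) >= threshold

def corrected_frame_centers(distorted_centers, k1):
    """Map AF-frame centers defined on the pre-correction (distorted) image to
    display positions on the distortion-corrected image, using a first-order
    approximate inverse of the radial model x_d = x_u * (1 + k1 * r^2)."""
    corrected = []
    for xd, yd in distorted_centers:
        r2 = xd * xd + yd * yd
        s = 1 + k1 * r2
        corrected.append((xd / s, yd / s))
    return corrected

if __name__ == "__main__":
    frames = [(-0.8, -0.6), (0.0, 0.0), (0.8, 0.6)]   # normalized frame centers
    k1 = -0.12                                         # illustrative barrel distortion
    if needs_frame_correction(k1):
        print(corrected_frame_centers(frames, k1))
    else:
        print(frames)                                  # display the frames as they are
```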
- The graphic I/F 107 generates an image signal to be displayed on the display unit 108 from the image signal supplied from the control unit 106, and supplies this signal to the display unit 108 to display the image.
- the display unit 108 has a screen for displaying a subject image whose distortion has been corrected by the image correction unit 154, a detection region whose display position has been corrected by the display control unit 155, and the like.
- the display unit 108 includes, for example, an LCD (Liquid Crystal Display), an organic EL (Electro Luminescence) panel, or the like.
- the display unit 108 may include a touch sensor that can detect a touch input on the screen. On the screen, a live view image being captured, an image recorded in the storage medium 111, and the like are displayed.
- The input unit 109 includes, for example, a power button for switching the power on and off, a release button for instructing the start of recording of a captured image, an operation element for zoom adjustment, and a touch screen (touch panel) configured integrally with the display unit 108.
- When an input is made to the input unit 109, a control signal corresponding to the input is generated and output to the control unit 106.
- the control unit 106 performs arithmetic processing and control corresponding to the control signal.
- The R/W 110 is an interface to which the storage medium 111 that records image data generated by imaging is connected.
- The R/W 110 writes data supplied from the control unit 106 to the storage medium 111 and outputs data read from the storage medium 111 to the control unit 106.
- the storage medium 111 is composed of a large-capacity storage medium such as a hard disk, a card-type storage medium, or a USB memory.
- FIG. 6 is a flowchart for explaining an operation example of the imaging control apparatus 150.
- A case where the imaging control device 150 automatically performs focusing control on a specific area of the subject image will be described here as an example.
- the subject image formed by the optical imaging system 101 is acquired by the imaging element 102, and the framing image is displayed on the screen 108A.
- the acquisition unit 151 acquires the lens information and the phase difference signal of the photographing lens 101A (Steps 101 and 102).
- the lens information may be acquired directly from the optical imaging system 101 or may be read from a memory that stores lens information acquired in advance.
- the phase difference signal is acquired from the phase difference detection pixels P1 and P2 of the image sensor 102.
- the detection control unit 153 causes the detection unit 152 to perform distance measurement control for all AF frames (detection areas) 120, and determines the main subject (steps 103 and 104). As the main subject, for example, a subject closest to the taking lens 101A or a subject located in the center is selected.
- The detection unit 152 drives the photographing lens 101A to perform focusing control of the determined main subject (step 105).
- the image correction unit 154 performs distortion correction of the subject image based on the lens information acquired by the acquisition unit 151, and displays the subject image without distortion on the screen 108A.
- the display control unit 155 determines whether or not the display position of the AF frame 120 needs to be corrected based on the lens information (step 106).
- If the amount of positional deviation due to the distortion is equal to or larger than a predetermined value, the display position of the AF frame 120 is corrected according to that amount; if the deviation is smaller than the predetermined value, the AF frame 120 is displayed at its position as it is, without correction (steps 107 and 108).
- FIG. 7A shows a state in which the images belonging to the AF frames 124 and 125 before the distortion correction are moved into the AF frames 121 and 122 by the distortion aberration correction.
- For focusing control of the image V0 before distortion correction, the detection unit 152 detects the phase difference signals in the AF frames 124 and 125.
- the image correction unit 154 displays the image V1 corrected for distortion on the screen 108A instead of the image V0.
- As shown in FIG. 7B, the display control unit 155 lights up the AF frames 121 and 122, to which the image V1 belongs, on the screen 108A.
- the focus position can be properly displayed. As a result, it is possible to realize proper focusing control intended by the user.
- FIG. 8 is a flowchart for explaining another example of the operation of the imaging control apparatus 150.
- a case where focus control is performed on a specific area of a subject image designated by the user will be described as an example.
- The acquisition unit 151 acquires the lens information of the photographing lens 101A and the phase difference signal (steps 201 and 202).
- the image correction unit 154 performs distortion correction of the subject image based on the lens information acquired by the acquisition unit 151, and displays the subject image without distortion on the screen 108A.
- the display control unit 155 receives an input of a specific focusing frame (detection region) designated by the user.
- The focusing frame may be any one or more of the plurality of AF frames 120 displayed on the screen 108A as shown in FIG. 5, or may be one or more arbitrarily designated areas. In the latter case, the AF frames 120 themselves do not need to be displayed on the screen 108A, and the focusing frame corresponding to the position designated by the user may be determined.
- the designation of the focusing frame may be a touch input on the screen 108A or a selection input operation (for example, a cursor operation) via a predetermined operation button.
- the touch sensor detects which in-focus frame is selected from the plurality of in-focus frames, and outputs the detection result to the control unit 106.
- the focusing frame to be selected and operated is not limited to one and may be a plurality.
- After detecting the focusing frame designated by the user (step 203), the detection control unit 153 determines whether or not the designated focusing frame needs to be corrected based on the lens information (step 204).
- If the amount of positional deviation is equal to or larger than a predetermined amount, the detection control unit 153 corrects the position of the focusing frame according to that amount; if the deviation is smaller than the predetermined amount, the focusing frame at the designated position is adopted without correction (steps 205 and 206).
- The detection control unit 153 then causes the detection unit 152 to perform distance measurement control for the designated focusing frame, and drives the photographing lens 101A to control focusing on the image in the focusing frame (steps 207 and 208).
- the focusing frame is lit on the screen 108A (step 209). In this case, the display position of the focusing frame is determined according to the determination result of step 204.
- FIG. 9 is a schematic diagram for explaining the relationship between the display position of the focusing frame and the detection area.
- the detection area (focusing frame) is selected by the user.
- Detection control of the subject image is performed on the image V0 before distortion correction, while the image V1 after distortion correction is displayed on the screen 108A. Therefore, in the flow of FIG. 8, for example, when the focusing frame 122 on the screen 108A is designated by the user, the detection unit 152 performs detection control on the assumption that the focusing frame 125, where the image V0 before distortion correction is located, has been designated. After focusing is completed, the focusing frame 122 corresponding to the display position of the image V1 after distortion correction is lit.
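The coordinate bookkeeping implied by this paragraph can be sketched as follows, under an assumed radial distortion model: each display-side (corrected) frame is paired with its sensor-side (pre-correction) counterpart, the frame nearest the touched point is selected, detection runs on the sensor-side frame, and the display-side frame is the one lit after focusing. All positions and the coefficient are hypothetical.

```python
def distort(xu, yu, k1):
    """Forward radial model: display-side (corrected) -> sensor-side (pre-correction)."""
    s = 1 + k1 * (xu * xu + yu * yu)
    return xu * s, yu * s

def select_detection_area(touch_xy, display_frames, sensor_frames):
    """Pick the frame whose display-side center is closest to the touched point.
    Detection runs on the sensor-side (pre-correction) frame; the display-side
    frame is the one lit after focusing completes."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    i = min(range(len(display_frames)), key=lambda k: d2(touch_xy, display_frames[k]))
    return display_frames[i], sensor_frames[i]

if __name__ == "__main__":
    k1 = -0.12                                        # illustrative distortion amount
    display_frames = [(-0.5, 0.0), (0.0, 0.0), (0.5, 0.0)]
    sensor_frames = [distort(x, y, k1) for x, y in display_frames]
    shown, detected = select_detection_area((0.48, 0.02), display_frames, sensor_frames)
    print("run detection on:", detected, "| light up:", shown)
```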
- In the above description, autofocus control (AF control) of the subject image has been described as an example, but the present technology can also be applied to automatic exposure control (AE control) instead of or in addition to AF control.
- the detection control unit 153 causes the detection unit 152 to detect at least one of the in-focus position and the exposure amount of the subject image region corresponding to the detection region.
- In the above embodiment, the phase difference detection pixels P1 and P2 for phase difference detection are incorporated in the image sensor 102, but the apparatus may instead be configured to perform AF control using a dedicated phase difference detection image sensor provided separately from the image sensor that captures the subject image.
- the present invention is not limited to the image plane phase difference detection method, and an AF control technique based on a so-called contrast method may be employed.
- In addition, the present technology may also adopt the following configurations.
- (1) An imaging control device including: a detection control unit that controls detection based on a detection area corresponding to a predetermined area of a subject image acquired by an image sensor; an image correction unit that corrects the subject image displayed on a screen based on lens information on a photographing lens; and a display control unit that displays a detection area display indicating the detection area at a predetermined position on the screen based on the detection area and the lens information.
- (2) The imaging control device according to (1) above, in which the display control unit determines, based on the lens information, whether correction of the display position of the detection area display is necessary.
- (3) The imaging control device according to (1) or (2) above, in which the display control unit displays the detection area display on the screen when the area of the subject image corresponding to the detection area is brought into focus.
- (4) The imaging control device according to any one of (1) to (3) above, in which the detection area includes a plurality of detection areas, and the detection control unit controls detection based on a detection area selected from the plurality of detection areas.
- (5) The imaging control device according to (4) above, in which the detection control unit determines, based on the lens information, whether correction of the selected detection area is necessary.
- (6) The imaging control device according to any one of (1) to (5) above, in which the lens information includes distortion information of the photographing lens, and the image correction unit corrects distortion aberration of the subject image based on the distortion information.
- (7) The imaging control device according to any one of (1) to (6) above, in which the lens information is acquired from the photographing lens.
- (8) The imaging control device according to any one of (1) to (7) above, in which the detection control unit causes an in-focus position of the area of the subject image corresponding to the detection area to be detected.
- (9) The imaging control device according to any one of (1) to (8) above, in which the detection control unit causes an exposure amount of the area of the subject image corresponding to the detection area to be detected.
- (10) The imaging control device according to any one of (1) to (9) above, further including an acquisition unit that acquires the lens information.
- (11) The imaging control device according to any one of (1) to (10) above, further including a detection unit that is controlled by the detection control unit and executes detection processing based on the detection area.
- (12) An imaging apparatus including: a photographing optical system having a photographing lens; an image sensor that acquires a subject image formed by the photographing optical system; a detection control unit that controls detection based on a detection area corresponding to a predetermined area of the subject image acquired by the image sensor; an image correction unit that corrects the subject image acquired by the image sensor based on lens information on the photographing lens; a display unit having a screen that displays the subject image corrected by the image correction unit and a detection area display indicating the detection area; and a display control unit that displays the detection area display at a predetermined position on the screen based on the detection area and the lens information.
- (13) The imaging apparatus according to (12) above, in which the image sensor includes a plurality of phase difference detection pixels that perform pupil division of the photographing lens.
- (14) The imaging apparatus according to (12) or (13) above, in which the detection area includes a plurality of detection areas, and the display unit further includes a touch sensor that detects a selection operation on any one of a plurality of detection area displays indicating the plurality of detection areas.
- (15) An imaging control method including: controlling detection based on a detection area corresponding to a predetermined area of a subject image acquired by an image sensor; correcting the subject image displayed on a screen based on lens information on a photographing lens; and displaying, based on the detection area and the lens information, a detection area display indicating the detection area at a predetermined position on the screen.
- DESCRIPTION OF SYMBOLS: 100 ... imaging apparatus; 101 ... optical imaging system; 101A ... photographing lens; 102 ... imaging element; 106 ... control unit; 108 ... display unit; 108A ... screen; 150 ... imaging control device; 151 ... acquisition unit; 152 ... detection unit; 153 ... detection control unit; 154 ... image correction unit; 155 ... display control unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Geometry (AREA)
- Focusing (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
Description
Note that the effects described here are not necessarily limited, and any of the effects described in the present disclosure may be obtained.
Hereinafter, details of the imaging control device 150 will be described together with the operation of the imaging apparatus 100.
Claims (15)
- 1. An imaging control device comprising: a detection control unit that controls detection based on a detection area corresponding to a predetermined area of a subject image acquired by an image sensor; an image correction unit that corrects the subject image displayed on a screen based on lens information on a photographing lens; and a display control unit that displays a detection area display indicating the detection area at a predetermined position on the screen based on the detection area and the lens information.
- 2. The imaging control device according to claim 1, wherein the display control unit determines, based on the lens information, whether correction of the display position of the detection area display is necessary.
- 3. The imaging control device according to claim 1, wherein the display control unit displays the detection area display on the screen when the area of the subject image corresponding to the detection area is brought into focus.
- 4. The imaging control device according to claim 1, wherein the detection area includes a plurality of detection areas, and the detection control unit controls detection based on a detection area selected from the plurality of detection areas.
- 5. The imaging control device according to claim 4, wherein the detection control unit determines, based on the lens information, whether correction of the selected detection area is necessary.
- 6. The imaging control device according to claim 1, wherein the lens information includes distortion information of the photographing lens, and the image correction unit corrects distortion aberration of the subject image based on the distortion information.
- 7. The imaging control device according to claim 1, wherein the lens information is acquired from the photographing lens.
- 8. The imaging control device according to claim 1, wherein the detection control unit causes an in-focus position of the area of the subject image corresponding to the detection area to be detected.
- 9. The imaging control device according to claim 1, wherein the detection control unit causes an exposure amount of the area of the subject image corresponding to the detection area to be detected.
- 10. The imaging control device according to claim 1, further comprising an acquisition unit that acquires the lens information.
- 11. The imaging control device according to claim 1, further comprising a detection unit that is controlled by the detection control unit and executes detection processing based on the detection area.
- 12. An imaging apparatus comprising: a photographing optical system having a photographing lens; an image sensor that acquires a subject image formed by the photographing optical system; a detection control unit that controls detection based on a detection area corresponding to a predetermined area of the subject image acquired by the image sensor; an image correction unit that corrects the subject image acquired by the image sensor based on lens information on the photographing lens; a display unit having a screen that displays the subject image corrected by the image correction unit and a detection area display indicating the detection area; and a display control unit that displays the detection area display at a predetermined position on the screen based on the detection area and the lens information.
- 13. The imaging apparatus according to claim 12, wherein the image sensor includes a plurality of phase difference detection pixels that perform pupil division of the photographing lens.
- 14. The imaging apparatus according to claim 12, wherein the detection area includes a plurality of detection areas, and the display unit further includes a touch sensor that detects a selection operation on any one of a plurality of detection area displays indicating the plurality of detection areas.
- 15. An imaging control method comprising: controlling detection based on a detection area corresponding to a predetermined area of a subject image acquired by an image sensor; correcting the subject image displayed on a screen based on lens information on a photographing lens; and displaying, based on the detection area and the lens information, a detection area display indicating the detection area at a predetermined position on the screen.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580025940.9A CN106464783B (zh) | 2014-05-26 | 2015-03-10 | Image pickup control device, image pickup device, and image pickup control method |
US15/311,560 US10277796B2 (en) | 2014-05-26 | 2015-03-10 | Imaging control apparatus, imaging apparatus, and imaging control method |
JP2016523105A JP6547742B2 (ja) | 2014-05-26 | 2015-03-10 | Imaging control apparatus, imaging apparatus, and imaging control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-108198 | 2014-05-26 | ||
JP2014108198 | 2014-05-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015182021A1 (ja) | 2015-12-03 |
Family
ID=54698386
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/001300 WO2015182021A1 (ja) | 2014-05-26 | 2015-03-10 | Imaging control apparatus, imaging apparatus, and imaging control method |
Country Status (4)
Country | Link |
---|---|
US (1) | US10277796B2 (ja) |
JP (1) | JP6547742B2 (ja) |
CN (1) | CN106464783B (ja) |
WO (1) | WO2015182021A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7271316B2 (ja) * | 2019-05-31 | 2023-05-11 | Canon Inc. | Imaging apparatus and control method therefor |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005295418A (ja) * | 2004-04-05 | 2005-10-20 | Sony Corp | Imaging apparatus and imaging display method |
WO2007129444A1 (ja) * | 2006-04-14 | 2007-11-15 | Nikon Corporation | Image distortion correction method, distortion correction program, and optical device |
JP2008118387A (ja) * | 2006-11-02 | 2008-05-22 | Canon Inc | Imaging apparatus |
JP2008292715A (ja) * | 2007-05-24 | 2008-12-04 | Sony Corp | Imaging apparatus, detection area adjustment method for imaging apparatus, and detection area adjustment program for imaging apparatus |
JP2009109548A (ja) * | 2007-10-26 | 2009-05-21 | Sony Corp | Imaging apparatus |
JP2009284462A (ja) * | 2008-04-21 | 2009-12-03 | Sony Corp | Imaging apparatus and distance measurement area control method |
JP2009302657A (ja) * | 2008-06-10 | 2009-12-24 | Canon Inc | Image processing apparatus, control method, and program |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009104390A1 (ja) * | 2008-02-22 | 2009-08-27 | Panasonic Corporation | Imaging apparatus |
JP5254904B2 (ja) * | 2009-08-20 | 2013-08-07 | Canon Inc. | Imaging apparatus and method |
JP5604160B2 (ja) * | 2010-04-09 | 2014-10-08 | Panasonic Corporation | Imaging apparatus |
JP5789091B2 (ja) * | 2010-08-20 | 2015-10-07 | Canon Inc. | Imaging apparatus and method for controlling imaging apparatus |
WO2014010324A1 (ja) * | 2012-07-13 | 2014-01-16 | Fujifilm Corporation | Image deformation device and operation control method thereof |
JP5901801B2 (ja) * | 2013-01-04 | 2016-04-13 | Fujifilm Corporation | Image processing device, imaging device, program, and image processing method |
JP6137840B2 (ja) * | 2013-01-18 | 2017-05-31 | Olympus Corporation | Camera system |
JP5833794B2 (ja) * | 2013-03-27 | 2015-12-16 | Fujifilm Corporation | Imaging apparatus |
KR102146856B1 (ko) * | 2013-12-31 | 2020-08-21 | Samsung Electronics Co., Ltd. | Method for presenting a shooting mode based on lens characteristics, computer-readable storage medium recording the method, and digital photographing apparatus |
- 2015
- 2015-03-10 CN CN201580025940.9A patent/CN106464783B/zh active Active
- 2015-03-10 JP JP2016523105A patent/JP6547742B2/ja active Active
- 2015-03-10 US US15/311,560 patent/US10277796B2/en active Active
- 2015-03-10 WO PCT/JP2015/001300 patent/WO2015182021A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP6547742B2 (ja) | 2019-07-24 |
US20170085779A1 (en) | 2017-03-23 |
CN106464783B (zh) | 2020-01-21 |
CN106464783A (zh) | 2017-02-22 |
JPWO2015182021A1 (ja) | 2017-04-20 |
US10277796B2 (en) | 2019-04-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6476065B2 (ja) | Imaging apparatus and method for controlling imaging apparatus | |
US9843735B2 (en) | Image processing apparatus, imaging apparatus comprising the same, and image processing method | |
JP5911445B2 (ja) | Imaging apparatus and control method therefor | |
JP2014029353A (ja) | Focus adjustment device and focus adjustment method | |
JP2008262049A (ja) | Autofocus device, imaging apparatus, and autofocus method | |
JP5572700B2 (ja) | Imaging apparatus and imaging method | |
JP5657184B2 (ja) | Imaging apparatus and signal processing method | |
KR100819811B1 (ko) | Imaging apparatus and imaging method | |
JP5378283B2 (ja) | Imaging apparatus and control method therefor | |
JP6600458B2 (ja) | Image sensor, focus detection device, and focus detection method | |
JP2005269130A (ja) | Imaging apparatus having camera shake correction function | |
JP2015018084A (ja) | Imaging apparatus, image processing method, and image processing program | |
US9274402B2 (en) | Imaging device that executes auto focus control by referring to distance image data regarding a depth of a region | |
JP5933306B2 (ja) | Imaging apparatus | |
US20220400215A1 (en) | Image pickup apparatus, image pickup method, and storage medium | |
JP2014230121A (ja) | Imaging apparatus and pixel defect detection method | |
JP2016095416A (ja) | Focus adjustment device, imaging apparatus, control method for focus adjustment device, and program | |
WO2015182021A1 (ja) | Imaging control apparatus, imaging apparatus, and imaging control method | |
JP2008283477A (ja) | Image processing apparatus and image processing method | |
JP2014003417A (ja) | Imaging apparatus | |
JP2011075841A (ja) | Imaging apparatus | |
JP2018098764A (ja) | Imaging apparatus and image composition method | |
JP6148575B2 (ja) | Imaging apparatus, control method therefor, program, and storage medium | |
JP2014180000A (ja) | Imaging apparatus and method for controlling imaging apparatus | |
JP2011180545A (ja) | Imaging apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15799350 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016523105 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15311560 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15799350 Country of ref document: EP Kind code of ref document: A1 |