WO2014132748A1 - Imaging device and vehicle control device - Google Patents
Imaging device and vehicle control device
- Publication number
- WO2014132748A1 (PCT/JP2014/052378, JP2014052378W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional object
- relative speed
- calculates
- reliability
- imaging device
- Prior art date
Links
- 238000003384 imaging method Methods 0.000 title claims abstract description 53
- 238000004364 calculation method Methods 0.000 claims abstract description 48
- 238000001514 detection method Methods 0.000 claims abstract description 16
- 239000007787 solid Substances 0.000 claims abstract description 10
- 230000001133 acceleration Effects 0.000 claims description 8
- 238000000034 method Methods 0.000 description 7
- 238000012545 processing Methods 0.000 description 7
- 210000003127 knee Anatomy 0.000 description 3
- 238000005259 measurement Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 2
- 238000012544 monitoring process Methods 0.000 description 2
- 230000005856 abnormality Effects 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 238000005452 bending Methods 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 238000010606 normalization Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000001629 suppression Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/223—Analysis of motion using block-matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8086—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for vehicle path indication
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present invention relates to an imaging device and a vehicle control device including two imaging units.
- Patent Document 1 is cited as background art in this technical field.
- Patent Document 1 relates to a conditional expression for the start of pedestrian movement, using the knee position movement speed Vkt and the shoulder position movement speed Vst, in which the pedestrian's intention is reflected, in relation to pedestrian behavior (road-crossing behavior).
- an object of the present invention is to provide an imaging device that reduces erroneous recognition of moving bodies and prevents erroneous brake operation even when a moving body crosses.
- an imaging apparatus of the present invention includes a correlation value calculation unit that performs a correlation value calculation on two images captured by two imaging units, a three-dimensional object detection unit that detects a three-dimensional object from the two images, a region dividing unit that divides an image region including the three-dimensional object into a plurality of regions, a relative speed calculation unit that calculates a relative speed for each of the regions, and a reliability calculation unit that calculates the reliability of the three-dimensional object based on the relative speed calculated for each region.
- the vehicle control device of the present invention includes an imaging device having a correlation value calculation unit that calculates a correlation value from two images captured by two imaging units, a three-dimensional object detection unit that detects a three-dimensional object from the two images, a region dividing unit that divides an image region including the three-dimensional object into a plurality of regions, a relative speed calculation unit that calculates a relative speed for each of the regions, and a reliability calculation unit that calculates the reliability of the three-dimensional object based on the relative speed calculated for each region, and a control unit that controls the deceleration acceleration based on the reliability calculated by the imaging device.
- the present invention can provide an imaging apparatus capable of reducing recognition of an erroneous moving body and preventing an erroneous brake operation even when the moving body crosses.
- an example of an imaging apparatus that detects a three-dimensional object, calculates a relative speed, and outputs a reliability
- the three-dimensional object is a moving body such as a pedestrian
- a moving body detection apparatus that detects a moving body such as a pedestrian or a vehicle using a plurality of imaging units (cameras) such as a stereo camera has been put into practical use.
- a stereo camera calculates the positional deviation (parallax) of the same object (three-dimensional object) on a plurality of images captured at the same time by template matching, and calculates the position of the object (three-dimensional object) in real space based on the calculated parallax.
- such a stereo camera calculates the distance of an object such as a pedestrian using a pair of images captured by two imaging units and recognizes the object; the present invention can be applied to a monitoring system that detects the intrusion of a suspicious person or an abnormality, and to an in-vehicle system that supports safe driving of a car.
- a stereo camera used in the above-described monitoring system or in-vehicle system obtains distance by applying a triangulation technique to a pair of images captured from positions separated by an interval, and includes at least two imaging units and a stereo image processing LSI (Large Scale Integration) as an image processing unit.
- this stereo image processing LSI performs triangulation by superimposing the pixel information contained in the pair of images and obtaining the shift amount (parallax) between the coincident positions of the two images.
- FIG. 1 is a diagram showing a general principle of a stereo camera.
- ⁇ is parallax
- Z is a measurement distance (distance from the lens to the measurement point)
- f is a focal length (distance from the imaging surface to the lens)
- b is a baseline length (a length between two imaging elements).
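As a rough illustration of the triangulation relation in FIG. 1 (Equation 1, Z = b·f/δ), the distance can be computed as follows. This is a minimal sketch; the function name and the numerical values for the baseline, focal length, and parallax are illustrative assumptions, not values from the embodiments.

```python
# Distance from parallax via stereo triangulation: Z = b * f / δ (Equation 1).
#   b: baseline length between the two imaging elements (meters)
#   f: focal length, expressed here in pixels
#   δ: parallax in pixels
def distance_from_parallax(baseline_m: float, focal_px: float, parallax_px: float) -> float:
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return baseline_m * focal_px / parallax_px

# Illustrative values: b = 0.5 m, f = 1400 px, δ = 7 px -> Z = 100 m
print(distance_from_parallax(0.5, 1400.0, 7.0))  # 100.0
```

Note that a larger parallax corresponds to a closer object, which is why δ appears in the denominator.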
- FIG. 2 shows the configuration of the entire system of the distance measuring method of the imaging apparatus which is a stereo camera according to the first embodiment of the present invention.
- the imaging apparatus includes the camera 201 as the first imaging unit and the camera 202 as the second imaging unit, and can measure distance in stereo.
- the imaging apparatus further includes a camera control unit 203 that controls the camera 201 and the camera 202, a RAM 204 that is a temporary storage area, a ROM 205 that stores programs and various initial values, an external IF 206 that provides communication means for notifying a control system such as a brake and the user of the camera recognition state, a correlation value calculation unit 207, a CPU 208 that controls the entire system, a three-dimensional object detection unit 209 that performs three-dimensional object detection using the output result of the correlation value calculation unit 207, a region dividing unit 210 that divides an arbitrary result among the detection results of the three-dimensional object detection unit 209 into regions, a relative speed calculation unit 211, and a reliability calculation unit 212.
- the cameras 201 and 202 are image sensors such as CCD (Charge Coupled Device Image Sensor) and CMOS (Complementary Metal Oxide Semiconductor) sensors.
- CCD Charge Coupled Device Image Sensor
- CMOS Complementary Metal Oxide Semiconductor
- the camera control unit 203 is a block that controls the camera 201 and the camera 202, and has a function of controlling the two cameras so that their imaging timings are the same and a function of controlling the cameras so that their exposure amounts are the same. This is so that, when measuring distance using the camera 201 and the camera 202 and searching for corresponding points between the two cameras, the luminance values match.
- the correlation value calculation unit 207 will be described in detail with reference to FIG.
- the correlation value calculation is an operation for searching a position with high correlation in the horizontal direction from the images obtained from the camera 201 and the camera 202 and specifying the position.
- the parallax δ is the difference between the search start position in the image of the camera 201 and the position of highest correlation found in the image of the camera 202.
- the correlation value calculation method in this embodiment uses the sum of absolute differences (SAD), but the present invention is not limited to this.
- for the calculation of the correlation value, several pixels are grouped into one block, and comparison is performed for each block.
- FIG. 5 shows an image when one block is 4 pixels long and 4 pixels wide.
- a region surrounded by a dotted line represents the state of searching the image of the camera 202 using one block of the camera 201 as a template. This search process is performed for each block of the camera 201 to generate a parallax image.
- the parallax image gives the distance of each pixel, but since one parallax value is obtained for every four vertical by four horizontal pixels, the resolution becomes one quarter.
- the correlation value calculation unit 207 performs a correlation value calculation, calculates parallax, and generates a parallax image.
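The block-matching search described above can be sketched as follows. This is a simplified pure-Python illustration, assuming rectified images represented as nested lists of integer luminance values, a 4×4 block, and a bounded horizontal search; all names and parameters are illustrative, not taken from the stereo image processing LSI.

```python
# SAD (sum of absolute differences) over two equally sized blocks.
def sad(a, b):
    return sum(abs(x - y) for row_a, row_b in zip(a, b) for x, y in zip(row_a, row_b))

# Extract a size x size block whose top-left corner is (top, left).
def block(img, top, left, size):
    return [row[left:left + size] for row in img[top:top + size]]

# For one block of the left image, scan horizontally in the right image and
# keep the offset with the lowest SAD as the disparity.
def disparity_for_block(left_img, right_img, top, left, size=4, max_disp=8):
    template = block(left_img, top, left, size)
    best_disp, best_cost = 0, float("inf")
    for d in range(0, min(max_disp, left) + 1):  # shift toward smaller columns
        cost = sad(template, block(right_img, top, left - d, size))
        if cost < best_cost:
            best_disp, best_cost = d, cost
    return best_disp

# Example: a bright 4x4 patch at columns 4-7 in the left image appears at
# columns 1-4 in the right image, i.e. a disparity of 3 pixels.
left = [[9 if 4 <= c <= 7 else 0 for c in range(12)] for _ in range(4)]
right = [[9 if 1 <= c <= 4 else 0 for c in range(12)] for _ in range(4)]
print(disparity_for_block(left, right, 0, 4))  # 3
```

Since one disparity value is produced per 4×4 block, the resulting parallax image has the quarter resolution described above.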
- the three-dimensional object detection unit 209 can find an arbitrary three-dimensional object (a moving body such as a pedestrian) by calculating three-dimensional positions from the parallax image and grouping points that are close in distance. Object identification processing and the like are also generally performed by pattern matching using the luminance information of the original image, that is, the captured image from the camera 201.
- the region dividing unit 210 divides the region based on the output result from the three-dimensional object detection unit 209.
- the division method is not limited, but as an example, as shown in FIG. 3, the image is divided into a first region (upper body 301) and a second region (lower body 303).
- the division position is not limited to half the height. However, when the division is performed at the shoulder height position, the influence of arm spread on the relative speed error is small, so that arm movement can easily be distinguished from walking.
- the relative speed calculation unit 211 searches for correspondence between the past image stored in the storage unit and the current image for the output result of the region dividing unit 210, and identifies the same object.
- the storage unit stores past positions and images.
- the movement amount is then calculated for the corresponding object.
- the calculation method of the movement amount is not limited; for example, the center-of-gravity positions 302 and 304 of the regions in which the three-dimensional object is detected (the first region and the second region) may be obtained, and the movement amounts of those center-of-gravity positions may be used. From the time (dt) elapsed between the previous frame and the current frame and the moving distance (dx), the relative speed (dv) is obtained as shown in Equation 2 below.
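The center-of-gravity-based movement amount and Equation 2 (dv = dx ÷ dt) can be sketched as follows; the point-list representation of a region and the numerical values are illustrative assumptions.

```python
# Centroid (center of gravity) of a region given as a list of (x, y) points,
# assumed to be already converted into road-plane coordinates in meters.
def centroid(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Equation 2: dv = dx / dt, where dx is the movement of the centroid between
# the previous frame and the current frame, and dt is the frame interval.
def relative_speed(prev_points, curr_points, dt_s):
    (x0, y0), (x1, y1) = centroid(prev_points), centroid(curr_points)
    dx = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # moving distance
    return dx / dt_s

# A region that moved 0.05 m laterally between frames 50 ms apart -> about 1.0 m/s
prev = [(1.0, 5.0), (1.2, 5.0), (1.1, 5.2)]
curr = [(x + 0.05, y) for (x, y) in prev]
print(relative_speed(prev, curr, 0.05))  # approximately 1.0
```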
- the reliability calculation unit 212 calculates the reliability from the relative speed obtained from the relative speed calculation unit 211.
- the reliability calculation method outputs the degree of coincidence of the relative speeds obtained for each of the regions into which one three-dimensional object was divided. For example, the reciprocal of the relative speed difference between regions can be used as the degree of coincidence. If the relative speed of the upper body 301 and the relative speed of the lower body 303 are the same, the movement is certain, and the reliability is high. On the other hand, when only the upper body 301 has a relative speed and the lower body 303 has a small relative speed, there is a high possibility that the body is not really moving, for example just bending and stretching. This is particularly effective for calculating the reliability of a three-dimensional object whose shape changes greatly. However, the reliability calculation method is not limited to this.
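The reciprocal-of-difference degree of coincidence mentioned above can be sketched as follows; the epsilon guard against division by zero is an added assumption, not part of the patent.

```python
# Degree of coincidence of the per-region relative speeds: the closer the
# upper-body and lower-body speeds, the more certain the object really moves.
def reliability(upper_speed: float, lower_speed: float, eps: float = 1e-3) -> float:
    return 1.0 / (abs(upper_speed - lower_speed) + eps)

# Whole body moving together (walking) -> high reliability
walking = reliability(1.2, 1.15)
# Only the upper body moving (e.g. bending and stretching) -> low reliability
bending = reliability(1.2, 0.1)
assert walking > bending
```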
- FIG. 4 shows a configuration of the entire system of the distance measuring method of the imaging apparatus which is a stereo camera according to the second embodiment of the present invention.
- in the second embodiment, the reliability calculation method is based on the similarity of images. Description of the blocks already described in the first embodiment is omitted.
- the feature amount calculation unit 410 calculates a feature amount for the position on the image from the position information of the three-dimensional object output from the three-dimensional object detection unit 209.
- the feature amount calculation unit 410 stores the past image in order to calculate a feature amount for determining the degree of coincidence between the past image and the current image.
- the coincidence degree calculation unit 413 compares the past feature amount with the current feature amount obtained in this way, and calculates a degree of coincidence according to whether or not the feature amounts match. If the degree of coincidence is calculated, for example, as the sum of differences between the past and current feature values, the value approaches 0 when the feature values are similar and increases when they are not.
- the relative speed calculation unit 411 obtains the relative speed according to the output result of the coincidence degree calculation unit 413.
- the relative speed can be obtained from the difference from the previous position, but if the past several frames are held and a frame with a high degree of coincidence is selected to calculate the relative speed, the shape is stable and the relative speed can be calculated with high accuracy.
- the degree of coincidence may also be calculated for each divided region. Then, if the relative speed is calculated using not the whole image but only the portions with a high degree of coincidence, it can be obtained with high accuracy.
- the reliability calculation unit 412 outputs the reliability based on the degree of coincidence used when the relative speed calculation unit 411 calculated the relative speed. For example, the case where the first frame and the third frame among the past three frames have a high degree of coincidence will be described.
- if the relative speed (Va) obtained from the first and third frames is given twice the weight of the relative speed (Vb) obtained from the second and third frames for the third frame, Equation 3 below is used.
- the relative speed can be calculated with high accuracy, and the reliability at that time can also be output.
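The weighting of Equation 3 can be sketched as follows; the function name and the sample speeds are illustrative.

```python
# Equation 3: weight the relative speed Va from the high-coincidence frame pair
# twice as heavily as Vb before averaging:
#   relative_speed = (Va * 2 + Vb) / 3
def weighted_relative_speed(va: float, vb: float) -> float:
    return (va * 2 + vb) / 3

print(weighted_relative_speed(1.2, 0.9))  # approximately 1.1
```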
- in the third embodiment, an example of a vehicle control device including the imaging device 1 described in the first and second embodiments and a control unit 2 that controls the vehicle based on the output of the imaging device 1 will be described. That is, this is an example in which the automobile is controlled based on the reliability output in the first and second embodiments.
- a scene in which a car is likely to collide with a pedestrian, as shown in FIG. 6, will be described.
- the arrow indicates the moving direction, showing a person crossing the traveling path of the car.
- the imaging device, which is the stereo camera described in the first and second embodiments, is attached to the car, and the brake is automatically applied when there is a possibility of a collision.
- the possibility of a collision can be determined from the distance to the pedestrian, the vehicle speed of the automobile, and the moving speed of the pedestrian.
- when the reliability output from the reliability calculation units 212 and 412 of the imaging apparatuses according to the first and second embodiments is lower than a predetermined value, it is judged that there is a high possibility that the pedestrian is not actually moving (crossing) and that an erroneous measurement has been made.
- the control unit 2 is configured to control the deceleration acceleration based on the reliability. Accordingly, as shown in FIG. 7, the deceleration of the brake differs between the normal state and the deceleration-acceleration-suppressed state, and the control unit 2 controls the deceleration acceleration so that the vehicle does not stop suddenly.
- the control unit 2 includes a determination unit 3 that determines whether to control deceleration acceleration or to output an alarm signal based on the reliability calculated by the imaging device 1.
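The deceleration-acceleration suppression of FIG. 7 can be sketched as the following decision rule; the threshold and the two deceleration magnitudes are illustrative assumptions, not values from the patent.

```python
# Illustrative constants (assumed, not from the patent).
NORMAL_DECEL = 6.0        # m/s^2, normal automatic braking
SUPPRESSED_DECEL = 2.0    # m/s^2, gentler braking when detection may be wrong
RELIABILITY_THRESHOLD = 0.5

# Below the reliability threshold, the brake request is limited to a gentler
# deceleration instead of a hard stop, to avoid braking for a false mover.
def commanded_deceleration(collision_possible: bool, reliability: float) -> float:
    if not collision_possible:
        return 0.0
    if reliability < RELIABILITY_THRESHOLD:
        return SUPPRESSED_DECEL
    return NORMAL_DECEL

print(commanded_deceleration(True, 0.9))  # 6.0
print(commanded_deceleration(True, 0.2))  # 2.0
```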
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Traffic Control Systems (AREA)
- Mechanical Engineering (AREA)
- Image Analysis (AREA)
- Transportation (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Image Processing (AREA)
- Automation & Control Theory (AREA)
Abstract
Description
Z = b f / δ (1)
FIG. 2 shows the configuration of the entire system of the distance measuring method of the imaging apparatus, which is a stereo camera according to the first embodiment of the present invention.
dv = dx ÷ dt (2)
That is, the relative speed calculation unit 211 calculates the movement amount, and calculates the relative speed from the calculated movement amount.
Relative speed = (Va × 2 + Vb) ÷ 3 (3)
When the relative speed is calculated with weight placed on the high-coincidence frames in this way, a high reliability is output.
202 Camera
203 Camera control unit
207 Correlation value calculation unit
209 Three-dimensional object detection unit
210 Region dividing unit
211 Relative speed calculation unit
212 Reliability calculation unit
Claims (10)
- 1. An imaging device having two imaging units, the imaging device comprising: a correlation value calculation unit that performs a correlation value calculation from two images captured by the two imaging units; a three-dimensional object detection unit that detects a three-dimensional object from the two images; a region dividing unit that divides an image region including the three-dimensional object into a plurality of regions; a relative speed calculation unit that calculates a relative speed for each of the plurality of regions; and a reliability calculation unit that calculates a reliability of the three-dimensional object based on the relative speed calculated for each region.
- 2. The imaging device according to claim 1, wherein the region dividing unit vertically divides the image region including the three-dimensional object into two.
- 3. The imaging device according to claim 2, wherein, when the three-dimensional object detected by the three-dimensional object detection unit is a pedestrian, the region dividing unit divides the region into two with reference to the position of the pedestrian's shoulders.
- 4. The imaging device according to claim 1, wherein the correlation value calculation unit generates a parallax image from the result of the correlation value calculation, and the three-dimensional object detection unit detects the three-dimensional object using the parallax image.
- 5. The imaging device according to claim 1, wherein the reliability calculation unit calculates a reliability of whether or not the three-dimensional object is moving, based on the degree of coincidence of the relative speeds calculated for the respective regions.
- 6. The imaging device according to claim 1, further comprising: a feature amount calculation unit that calculates a feature amount from an image including the three-dimensional object detected by the three-dimensional object detection unit; and a coincidence degree calculation unit that calculates a degree of coincidence between a stored past feature amount and the calculated current feature amount, wherein the relative speed calculation unit calculates the relative speed based on the calculated degree of coincidence.
- 7. The imaging device according to claim 6, wherein the reliability calculation unit calculates the reliability of the three-dimensional object based on the degree of coincidence of the feature amounts.
- 8. The imaging device according to claim 6, wherein the relative speed calculation unit selects, from the stored past frames, the frame with the highest degree of coincidence of the feature amounts, and calculates the relative speed.
- 9. A vehicle control device comprising: an imaging device having a correlation value calculation unit that performs a correlation value calculation from two images captured by two imaging units, a three-dimensional object detection unit that detects a three-dimensional object from the two images, a region dividing unit that divides an image region including the three-dimensional object into a plurality of regions, a relative speed calculation unit that calculates a relative speed for each of the plurality of regions, and a reliability calculation unit that calculates a reliability of the three-dimensional object based on the relative speed calculated for each region; and a control unit that controls deceleration acceleration based on the reliability calculated by the imaging device.
- 10. The vehicle control device according to claim 9, wherein the control unit has a determination unit that determines, based on the reliability calculated by the imaging device, whether to control the deceleration acceleration or to output an alarm signal.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015502825A JP6072892B2 (ja) | 2013-02-27 | 2014-02-03 | Imaging device and vehicle control device |
EP14756720.0A EP2963615B1 (en) | 2013-02-27 | 2014-02-03 | Imaging device, and vehicle control device |
US14/770,033 US10140717B2 (en) | 2013-02-27 | 2014-02-03 | Imaging apparatus and vehicle controller |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013036563 | 2013-02-27 | ||
JP2013-036563 | 2013-02-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014132748A1 true WO2014132748A1 (ja) | 2014-09-04 |
Family
ID=51428022
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/052378 WO2014132748A1 (ja) | 2013-02-27 | 2014-02-03 | Imaging device and vehicle control device |
Country Status (4)
Country | Link |
---|---|
US (1) | US10140717B2 (ja) |
EP (1) | EP2963615B1 (ja) |
JP (1) | JP6072892B2 (ja) |
WO (1) | WO2014132748A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017199556A1 (ja) * | 2016-05-17 | 2017-11-23 | Fujifilm Corporation | Stereo camera and method of controlling stereo camera |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9619803B2 (en) | 2015-04-30 | 2017-04-11 | Google Inc. | Identifying consumers in a transaction via facial recognition |
US10397220B2 (en) | 2015-04-30 | 2019-08-27 | Google Llc | Facial profile password to modify user account data for hands-free transactions |
US10733587B2 (en) | 2015-04-30 | 2020-08-04 | Google Llc | Identifying consumers via facial recognition to provide services |
KR101832189B1 (ko) * | 2015-07-29 | 2018-02-26 | Yamaha Motor Co., Ltd. | Abnormal image detection device, image processing system including an abnormal image detection device, and vehicle equipped with the image processing system |
US11062304B2 (en) | 2016-10-20 | 2021-07-13 | Google Llc | Offline user identification |
US10670680B2 (en) * | 2017-04-06 | 2020-06-02 | Case Western Reserve University | System and method for motion insensitive magnetic resonance fingerprinting |
JP6570791B2 (ja) * | 2017-04-26 | 2019-09-04 | Mitsubishi Electric Corporation | Processing device |
WO2018222232A1 (en) * | 2017-05-31 | 2018-12-06 | Google Llc | Providing hands-free data for interactions |
JP7139431B2 (ja) * | 2018-08-22 | 2022-09-20 | Hitachi Astemo, Ltd. | Image processing device and image processing method |
DE102019212022B4 (de) * | 2019-08-09 | 2021-03-04 | Volkswagen Aktiengesellschaft | Method and device for detecting a parallax problem in sensor data from two sensors |
DE102019212021B4 (de) * | 2019-08-09 | 2024-02-08 | Volkswagen Aktiengesellschaft | Method and device for detecting a parallax problem in sensor data from two sensors |
JP7431623B2 (ja) * | 2020-03-11 | 2024-02-15 | Subaru Corporation | Vehicle exterior environment recognition device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007316790A (ja) * | 2006-05-24 | 2007-12-06 | Nissan Motor Co Ltd | Pedestrian detection device and pedestrian detection method |
JP2008045974A (ja) * | 2006-08-14 | 2008-02-28 | Fuji Heavy Ind Ltd | Object detection device |
JP2010066810A (ja) | 2008-09-08 | 2010-03-25 | Mazda Motor Corp | Pedestrian detection device for vehicle |
JP2012043271A (ja) * | 2010-08-20 | 2012-03-01 | Aisin Aw Co Ltd | Intersection information acquisition device, intersection information acquisition method, and computer program |
JP2012123667A (ja) * | 2010-12-09 | 2012-06-28 | Panasonic Corp | Posture estimation device and posture estimation method |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2779257B1 (fr) | 1998-05-27 | 2000-08-11 | France Telecom | Method for detecting the relative depth between objects in an image from a pair of images |
DE102004018813A1 (de) * | 2004-04-19 | 2006-02-23 | Ibeo Automobile Sensor Gmbh | Method for detecting and/or tracking objects |
JP4177826B2 (ja) * | 2005-03-23 | 2008-11-05 | Toshiba Corporation | Image processing apparatus and image processing method |
WO2006121088A1 (ja) * | 2005-05-10 | 2006-11-16 | Olympus Corporation | Image processing device, image processing method, and image processing program |
JP2008203992A (ja) * | 2007-02-16 | 2008-09-04 | Omron Corporation | Detection device, detection method, and program |
JP5172314B2 (ja) * | 2007-12-14 | 2013-03-27 | Hitachi Automotive Systems, Ltd. | Stereo camera device |
CN101983389B (zh) * | 2008-10-27 | 2012-11-21 | Panasonic Corporation | Moving body detection method and moving body detection device |
CN102227750B (zh) * | 2009-07-31 | 2014-06-25 | Panasonic Corporation | Moving body detection device and moving body detection method |
JP2013537661A (ja) * | 2010-06-30 | 2013-10-03 | Tata Consultancy Services Limited | Automatic detection of moving objects using stereo vision technology |
JP5503578B2 (ja) * | 2011-03-10 | 2014-05-28 | Panasonic Corporation | Object detection device and object detection method |
WO2013042205A1 (ja) * | 2011-09-20 | 2013-03-28 | Toyota Motor Corporation | Pedestrian behavior prediction device and pedestrian behavior prediction method |
CN104424634B (zh) * | 2013-08-23 | 2017-05-03 | Ricoh Company, Ltd. | Object tracking method and apparatus |
CN104794733B (zh) * | 2014-01-20 | 2018-05-08 | Ricoh Company, Ltd. | Object tracking method and apparatus |
- 2014
- 2014-02-03 WO PCT/JP2014/052378 patent/WO2014132748A1/ja active Application Filing
- 2014-02-03 JP JP2015502825A patent/JP6072892B2/ja active Active
- 2014-02-03 US US14/770,033 patent/US10140717B2/en active Active
- 2014-02-03 EP EP14756720.0A patent/EP2963615B1/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007316790A (ja) * | 2006-05-24 | 2007-12-06 | Nissan Motor Co Ltd | Pedestrian detection device and pedestrian detection method |
JP2008045974A (ja) * | 2006-08-14 | 2008-02-28 | Fuji Heavy Ind Ltd | Object detection device |
JP2010066810A (ja) | 2008-09-08 | 2010-03-25 | Mazda Motor Corp | Pedestrian detection device for vehicle |
JP2012043271A (ja) * | 2010-08-20 | 2012-03-01 | Aisin Aw Co Ltd | Intersection information acquisition device, intersection information acquisition method, and computer program |
JP2012123667A (ja) * | 2010-12-09 | 2012-06-28 | Panasonic Corp | Posture estimation device and posture estimation method |
Non-Patent Citations (1)
Title |
---|
See also references of EP2963615A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017199556A1 (ja) * | 2016-05-17 | 2017-11-23 | Fujifilm Corporation | Stereo camera and method of controlling stereo camera |
JPWO2017199556A1 (ja) * | 2016-05-17 | 2019-04-04 | Fujifilm Corporation | Stereo camera and method of controlling stereo camera |
US10863164B2 (en) | 2016-05-17 | 2020-12-08 | Fujifilm Corporation | Stereo camera and method of controlling stereo camera |
Also Published As
Publication number | Publication date |
---|---|
EP2963615B1 (en) | 2019-04-10 |
EP2963615A1 (en) | 2016-01-06 |
JP6072892B2 (ja) | 2017-02-01 |
EP2963615A4 (en) | 2016-11-09 |
US10140717B2 (en) | 2018-11-27 |
JPWO2014132748A1 (ja) | 2017-02-02 |
US20160005180A1 (en) | 2016-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6072892B2 (ja) | Imaging device and vehicle control device | |
JP6623044B2 (ja) | Stereo camera device | |
JP7025912B2 (ja) | In-vehicle environment recognition device | |
US9704047B2 (en) | Moving object recognition apparatus | |
US10210400B2 (en) | External-environment-recognizing apparatus | |
US20120081542A1 (en) | Obstacle detecting system and method | |
JP6722051B2 (ja) | Object detection device and object detection method | |
JP2019078716A (ja) | Distance measuring device, distance measuring system, imaging device, moving body, distance measuring device control method, and program | |
WO2013132947A1 (ja) | Distance calculation device and distance calculation method | |
EP2924655B1 (en) | Disparity value deriving device, equipment control system, movable apparatus, robot, disparity value deriving method, and computer-readable storage medium | |
JPWO2009099022A1 (ja) | Periphery monitoring device and periphery monitoring method | |
JP6315308B2 (ja) | Control object identification device, mobile device control system, and control object recognition program | |
JP2015232455A (ja) | Image processing device, image processing method, program, parallax data production method, and device control system | |
JP2009169776A (ja) | Detection device | |
JP6722084B2 (ja) | Object detection device | |
US8160300B2 (en) | Pedestrian detecting apparatus | |
WO2017047282A1 (ja) | Image processing device, object recognition device, device control system, image processing method, and program | |
JPWO2017145541A1 (ja) | Moving body | |
KR20060021922A (ko) | Obstacle detection technique and apparatus using two cameras | |
JP2014154898A (ja) | Object detection device | |
JP2020109560A (ja) | Traffic light recognition method and traffic light recognition device | |
WO2017154305A1 (ja) | Image processing device, device control system, imaging device, image processing method, and program | |
CN109308442B (zh) | Vehicle exterior environment recognition device | |
JP2017167970A (ja) | Image processing device, object recognition device, device control system, image processing method, and program | |
JP5717416B2 (ja) | Driving support control device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14756720 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015502825 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014756720 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14770033 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |