WO2014050286A1 - Moving object recognition device - Google Patents
Moving object recognition device
- Publication number
- WO2014050286A1 (PCT application PCT/JP2013/070274)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- moving object
- unit
- area
- pedestrian
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to a moving object recognition device that detects a moving object from image information outside a vehicle.
- Three-dimensional information about an object is obtained from the disparity between the left and right cameras of a stereo camera, a three-dimensional object is detected from that information, and it is judged whether the detected object has the shape and size of a pedestrian. Further, by determining the moving speed of the detected three-dimensional object, it is determined whether or not the detected three-dimensional object is a crossing pedestrian who may enter the traveling path of the vehicle.
- In Patent Document 1, when obtaining the disparity between the left and right cameras of a stereo camera, the corresponding points in the left and right images are found by image processing. It takes time for the detected moving-speed values of the three-dimensional object to converge, and therefore it takes time to detect a crossing pedestrian who may enter the vehicle's own path.
- A moving object recognition device includes a first imaging unit, a second imaging unit, and a moving object detection unit that detects a moving object based on the images captured by the first imaging unit and the second imaging unit.
- When the non-overlapping area, in which the imaging area of the first imaging unit and the imaging area of the second imaging unit do not overlap, is taken as the first area, and the overlapping area, in which the imaging area of the first imaging unit and the imaging area of the second imaging unit overlap, is taken as the second area, the device is configured so that the method of detecting a moving object differs between the first area and the second area.
- FIG. 1 is a diagram showing one embodiment of a moving object recognition device according to the present invention. Further figures explain the moving object recognition device according to the present invention and show the processing flow for the first area.
- One embodiment of a moving object recognition apparatus that detects a moving object crossing a road, in this embodiment a pedestrian, using the images of a stereo camera mounted on a vehicle will be described with reference to FIG. 1.
- FIG. 1 is a block diagram of the stereo camera that realizes the moving object recognition device of the present invention.
- The stereo camera includes the left imaging unit 101 as the first imaging unit and the right imaging unit 102 as the second imaging unit, and detects a moving object based on the images captured by the left imaging unit 101 and the right imaging unit 102.
- The moving object detection unit 108 detects a moving object from images of the scene ahead of the vehicle on which the stereo camera is mounted.
- the moving object detection unit 108 includes a distance calculation unit 103, a first area processing unit 104, and a second area processing unit 105.
- The distance calculation unit 103 receives the first image captured by the left imaging unit 101 and the second image captured by the right imaging unit 102, and, for an object captured by both the left imaging unit 101 and the right imaging unit 102, calculates the distance to the object from its displacement between the first image and the second image. Details of the distance calculation unit 103 will be described later.
- the first area processing unit 104 detects a pedestrian crossing (a pedestrian crossing a road) from images of non-overlapping areas of the left imaging unit 101 and the right imaging unit 102.
- In FIG. 2, the non-overlapping regions are the regions where the imaging range 201 of the left imaging unit 101 and the imaging range 202 of the right imaging unit 102 do not overlap (the non-overlapping region 206 of the left imaging unit and the non-overlapping region 207 of the right imaging unit).
- On the images, the non-overlapping region of the left imaging unit 101 is the first region 203, and the non-overlapping region of the right imaging unit 102 is the first region 204. Details of the first area processing unit 104 will be described later.
- the second area processing unit 105 detects a pedestrian crossing from the image of the overlapping area of the left imaging unit 101 and the right imaging unit 102.
- the overlapping area is an overlapping area 208 in which the imaging range 201 of the left imaging unit 101 and the imaging range 202 of the right imaging unit 102 overlap in FIG.
- the overlapping area is the second area 205. Details of the second area processing unit 105 will be described later.
- the feature of the present invention is that the method of detecting a moving object is different between the first area 203 and the second area 205 described above.
- In the first area, a moving object is detected monocularly, based on a single image captured by either the left imaging unit 101 as the first imaging unit or the right imaging unit 102 as the second imaging unit.
- In the second area, a moving object is detected using the two cameras, the left imaging unit 101 and the right imaging unit 102, as a stereo camera. This makes it possible to detect moving objects such as pedestrians crossing the road earlier.
- The crossing pedestrian determination unit 106 is a collision determination unit that determines whether the crossing pedestrian detected by the second area processing unit 105 will enter the vehicle's path and collide with the vehicle, or the probability of such a collision. If the collision probability is judged to be high, the pedestrian information output unit 107 outputs the position and velocity of the crossing pedestrian, and based on that information the vehicle warns the driver and performs automatic braking control to avoid a collision with the pedestrian crossing the road.
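The collision judgment above can be sketched as a simple geometric test. The function below is a hedged illustration, not the patented method: it assumes constant speeds and a straight-ahead corridor of hypothetical half-width `half_width_m`, and checks whether the pedestrian's lateral position at the vehicle's arrival time falls inside that corridor.

```python
# Illustrative collision-probability test (all thresholds hypothetical):
# estimate when the vehicle reaches the pedestrian's distance, then check
# whether the pedestrian will be inside the vehicle's corridor at that time.
def collision_risk(ped_lateral_m, ped_speed_mps, ped_dist_m,
                   ego_speed_mps, half_width_m=1.0):
    if ego_speed_mps <= 0:
        return False
    t_vehicle = ped_dist_m / ego_speed_mps           # time until the vehicle arrives
    lateral_then = ped_lateral_m - ped_speed_mps * t_vehicle
    return abs(lateral_then) <= half_width_m         # inside the corridor?

print(collision_risk(3.0, 1.5, 20.0, 10.0))  # True: paths cross at ~2 s
print(collision_risk(3.0, 0.0, 20.0, 10.0))  # False: pedestrian stays aside
```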
- The left image captured by the left imaging unit 101 as the first imaging unit is acquired as the first image, and the right image captured by the right imaging unit 102 as the second imaging unit is acquired as the second image.
- In the first image (left image) captured by the left imaging unit 101, moving pedestrian candidates are detected using the image of the first region 203, the non-overlapping area of the left and right imaging units.
- Time-sequential images of the first region 203 are compared, and portions of the image that have changed are detected.
- The optical flow of the time-sequential images is calculated. Optical flow is computed by comparing the images of two frames and identifying the corresponding pixels of the same object in each of the two images; this is a well-known, established technique.
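As a rough illustration of this corresponding-pixel idea, the following sketch estimates per-block motion between two frames by exhaustive block matching; a production system would use an established optical flow algorithm, and the block size, search range, and synthetic frames here are invented for the example.

```python
import numpy as np

def block_flow(prev, curr, block=8, search=4):
    """Estimate per-block motion between two grayscale frames via
    exhaustive block matching (a minimal stand-in for optical flow)."""
    h, w = prev.shape
    flows = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev[by:by + block, bx:bx + block]
            if ref.std() == 0:        # skip textureless blocks
                continue
            best, best_dv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    cand = curr[y:y + block, x:x + block]
                    cost = np.abs(cand.astype(int) - ref.astype(int)).sum()
                    if best is None or cost < best:
                        best, best_dv = cost, (dx, dy)
            flows[(bx, by)] = best_dv
    return flows

# Two synthetic frames: a textured patch moving 3 px to the right.
patch = np.arange(64).reshape(8, 8) * 3
prev = np.zeros((32, 32), np.uint8)
curr = np.zeros((32, 32), np.uint8)
prev[8:16, 8:16] = patch
curr[8:16, 11:19] = patch
flow = block_flow(prev, curr)
print(flow[(8, 8)])   # prints (3, 0): the block moved 3 px right, 0 px down
```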
- Next, portions showing motion different from the background motion that accompanies the vehicle's own movement are extracted.
- The background motion is estimated using vehicle information such as the vehicle's speed and yaw rate, the camera's internal parameters (cell size of the imaging device, focal length, distortion parameters), and the camera's geometric mounting conditions (such as the depression angle and rotation angle).
- Among the extracted portions, those having a size within a certain range are taken as candidates for a moving pedestrian.
- The "size within a certain range" is calculated in advance from how large pedestrians of average build, from children to adults, appear on the screen. Further, among the candidates, only those moving from the first area 203 toward the second area 205 are kept as moving pedestrian candidates.
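For intuition, this expected on-screen size follows from the pinhole model: an object of real height H at distance Z projects to f·H/(Z·a) pixels, where f is the focal length and a the cell size. The numeric values below (8 mm lens, 6 µm cells, heights 1.0-1.9 m, 20 m range) are hypothetical, not values from the patent.

```python
# Expected on-screen height (pixels) of a pedestrian of real height H_m at
# distance Z_m, from the pinhole model: h_px = f * H / (Z * a).
def pixel_height(H_m, Z_m, f_mm=8.0, cell_mm=0.006):
    return f_mm * H_m / (Z_m * cell_mm)

lo = pixel_height(1.0, 20.0)  # small child at 20 m
hi = pixel_height(1.9, 20.0)  # tall adult at 20 m
print(round(lo), round(hi))   # prints 67 127: the acceptance band in pixels
```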
- It is determined whether the moving pedestrian candidate detected in the moving pedestrian detection process 302 for the first area of the left image will enter the second area 205 from the first area 203 in the next frame.
- If it is determined that the candidate will enter the second area 205 in the next frame, the process proceeds to the moving pedestrian position/speed output processing 304; if it is determined that the candidate will not enter the second area 205 in the next frame, the process moves to the moving pedestrian detection processing 305 for the first area of the right image.
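The entry determination can be sketched as a one-frame prediction on the image: given the candidate's column position and horizontal velocity in pixels per frame, check whether it crosses the border between the first and second areas. `boundary_x` is a hypothetical column index of that border, not a value from the patent.

```python
# Sketch of the second-area entry test: predict the candidate's x-position
# one frame ahead and compare it against the (hypothetical) area boundary.
def enters_second_area(x, vx, boundary_x, from_left=True):
    nxt = x + vx                                   # predicted position next frame
    return nxt >= boundary_x if from_left else nxt <= boundary_x

print(enters_second_area(310, 15, 320))   # True: 325 crosses the border
print(enters_second_area(290, 10, 320))   # False: 300 stays in the first area
```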
- the position and the velocity on the image of the mobile pedestrian candidate detected in the mobile pedestrian detection processing 302 in the first region of the left image are output.
- the output result is used as an initial value when the second area processing unit 105 detects a moving pedestrian.
- Similarly, moving pedestrian candidates are detected using the image of the first region 204, the non-overlapping region of the left and right imaging units in the second image (right image) captured by the right imaging unit 102.
- The method of detecting moving pedestrian candidates is the same as in the moving pedestrian detection processing 302 for the first region of the left image: first, time-sequential images of the first region 204 are compared to detect portions where an object has moved, and among the portions showing motion different from the background motion caused by the vehicle's movement, those with a size within a certain range are taken as moving pedestrian candidates. Further, among the candidates, only those moving from the first area 204 toward the second area 205 are kept as moving pedestrian candidates.
- It is determined whether the moving pedestrian candidate detected in the moving pedestrian detection process 305 for the first area of the right image will enter the second area 205 from the first area 204 in the next frame. If it is determined that the candidate will enter the second area 205 in the next frame, the process proceeds to the moving pedestrian position and speed output process 307, using the moving velocity on the image of the pedestrian detected in the moving pedestrian detection processing 305; if it is determined that the candidate will not enter the second area 205 in the next frame, the process ends.
- the position and the velocity on the image of the candidate of the mobile pedestrian detected in the mobile pedestrian detection processing 305 in the first region of the right image are output.
- the output result is used as an initial value when the second area processing unit 105 detects a moving pedestrian.
- In the distance information acquisition processing 401, the three-dimensional distance information to objects ahead of the host vehicle, calculated by the distance calculation unit 103, is acquired.
- the method of acquiring three-dimensional distance information to the target object ahead of the host vehicle by the distance calculation unit 103 will be described later.
- For each strip obtained by vertically dividing the distance image, the distance closest to the host vehicle is extracted as a representative value. Representative values that lie near each other are then grouped, and a group of at least a certain size is taken as a three-dimensional object. The traveling plane on which the vehicle travels is estimated, and three-dimensional objects located above that plane are detected.
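The strip-wise grouping step can be sketched as follows; the strip width, distance tolerance, and minimum group size are hypothetical parameters, and the "distance image" is a toy array standing in for the stereo output.

```python
import numpy as np

# Minimal sketch of strip-wise grouping: take the nearest distance per
# vertical strip of the distance image, then merge neighbouring strips whose
# representative distances are close into object groups.
def group_strips(dist_img, strip_w=4, tol=1.0, min_strips=2):
    h, w = dist_img.shape
    reps = [dist_img[:, x:x + strip_w].min() for x in range(0, w, strip_w)]
    groups, cur = [], [0]
    for i in range(1, len(reps)):
        if abs(reps[i] - reps[i - 1]) <= tol:
            cur.append(i)          # same object: distances agree within tol
        else:
            groups.append(cur)     # distance jump: close the current group
            cur = [i]
    groups.append(cur)
    return [g for g in groups if len(g) >= min_strips]

# Toy 16-column distance image: the right half holds a nearby object at 10 m
# against a 50 m background.
img = np.full((8, 16), 50.0)
img[:, 8:16] = 10.0
print(group_strips(img))   # prints [[0, 1], [2, 3]]: background and object
```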
- It is then determined whether the detected three-dimensional object is a pedestrian.
- Regarding size, the average sizes of pedestrians from children to adults are learned in advance, and an object whose size falls within the range of the learned values is judged to be a pedestrian.
- Regarding shape, if the upper part of the detected three-dimensional object resembles the shape of a human head and shoulders, it is judged to be a pedestrian.
- Three-dimensional objects moving toward the central part of the second region are extracted from the portions of the second region 205 adjacent to the first regions 203 and 204.
- If a pedestrian entering the second area 205 from the first areas 203 and 204 exists, the process proceeds to the moving pedestrian position/speed initial value input processing 404; if there is no such pedestrian, the process proceeds to the moving pedestrian speed calculation process 405.
- The on-screen position and velocity of the moving pedestrian output by the moving pedestrian position/velocity output processes 304 and 307 are obtained for the candidate detected in the pedestrian candidate detection process 402.
- The initial position and speed values obtained in the moving pedestrian position/speed initial value input processing 404 are given to the three-dimensional object selected as a candidate in the pedestrian candidate detection processing 402, its moving speed is calculated, and the result is taken as the moving pedestrian speed.
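One way to use the monocular result as an initial value is recursive smoothing: each stereo-measured frame-to-frame displacement is blended into a running estimate that starts at the monocular value instead of zero, so the speed is usable from the first stereo frame. This is an illustrative sketch, not the calculation prescribed by the patent; the blending weight `alpha` is hypothetical.

```python
# Seeded speed estimate: blend successive position differences (per frame)
# into a running speed that starts from the monocular initial value v_init.
def track_speed(positions, v_init, alpha=0.5):
    v = v_init
    for p_prev, p_curr in zip(positions, positions[1:]):
        v = alpha * (p_curr - p_prev) + (1 - alpha) * v
    return v

# Stereo positions (metres, one per frame) and a monocular seed of 2.0 m/frame:
print(track_speed([0.0, 1.0, 2.0, 3.0], 2.0))   # prints 1.125
```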
- FIG. 5 shows the camera geometry for a corresponding point 601 (the same object captured by the left and right imaging units) in the image 209 captured by the left imaging unit and the image 210 captured by the right imaging unit of the stereo camera device serving as the moving object recognition device.
- The left imaging unit 101 is a camera with focal length f, consisting of a lens 602 and an imaging surface 603, whose imaging device has the optical axis 608; the right imaging unit 102 is a camera with focal length f, consisting of a lens 604 and an imaging surface 605, whose imaging device has the optical axis 609.
- The point 601 in front of the cameras is imaged at the point 606 on the imaging surface 603 of the left imaging unit 101 (at distance d2 from the optical axis 608), and appears in the left image 209 at the point 606 located d4 pixels from the optical axis 608.
- The point 601 in front of the cameras is likewise imaged at the point 607 on the imaging surface 605 of the right imaging unit 102 (at distance d3 from the optical axis 609), and appears in the right image 210 at the point 607 located d5 pixels from the optical axis 609.
- Here, a is the size of one element of the imaging devices of the imaging surfaces 603 and 605.
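With these quantities, the distance Z to point 601 follows from the standard similar-triangles stereo relation Z = f·B / (a·(d4 + d5)), where B, the baseline between the optical axes 608 and 609, is not named in the excerpt above and is introduced here as an assumption; the numeric values below are likewise hypothetical.

```python
# Stereo distance from the pixel offsets d4 and d5 of the corresponding
# point: Z = f * B / (a * (d4 + d5)). B (baseline) is an assumed quantity,
# and the lens/sensor numbers are illustrative only.
def stereo_distance(d4_px, d5_px, f_mm=8.0, baseline_mm=350.0, cell_mm=0.006):
    disparity_mm = (d4_px + d5_px) * cell_mm   # disparity on the sensor
    return f_mm * baseline_mm / disparity_mm / 1000.0  # metres

print(round(stereo_distance(20, 20), 2))   # prints 11.67 (metres)
```

Note the behaviour the description relies on: the larger the disparity d4 + d5, the nearer the object, so a crossing pedestrian close to the vehicle produces a large, quickly measurable offset.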
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Claims (7)
- 1. A moving object recognition device comprising:
a first imaging unit;
a second imaging unit; and
a moving object detection unit that detects a moving object based on images captured by the first imaging unit and the second imaging unit,
wherein, when a non-overlapping area in which the imaging area of the first imaging unit and the imaging area of the second imaging unit do not overlap is taken as a first area, and an overlapping area in which the imaging area of the first imaging unit and the imaging area of the second imaging unit overlap is taken as a second area, the method of detecting a moving object differs between the first area and the second area.
- 2. The moving object recognition device according to claim 1, wherein
in the first area, a moving object is detected based on one image captured by the first imaging unit or the second imaging unit, and
in the second area, a moving object is detected based on two images captured by the first imaging unit and the second imaging unit.
- 3. The moving object recognition device according to claim 1, wherein the moving object detection unit comprises:
a distance calculation unit that calculates the distance to a moving object based on the images captured by the first imaging unit and the second imaging unit;
a first area processing unit that detects a moving object from the image of the first area; and
a second area processing unit that detects a moving object from the image of the second area based on the distance to the moving object calculated by the distance calculation unit and the detection result of the first area processing unit.
- 4. The moving object recognition device according to claim 3, wherein the first area processing unit comprises:
an image acquisition unit that acquires a first image captured by the first imaging unit and a second image captured by the second imaging unit;
a moving pedestrian detection unit that detects a moving object candidate using time-series image information in the first area of the first image or the second image;
a second area entry determination unit that determines whether the moving object candidate detected by the moving pedestrian detection unit enters the second area; and
a moving pedestrian position/speed output unit that outputs information on the position and speed of the moving object candidate on the image when the second area entry determination unit determines that the moving object candidate enters the second area.
- 5. The moving object recognition device according to claim 4, wherein the second area processing unit comprises:
a distance information acquisition processing unit that acquires the distance information to the moving object calculated by the distance calculation unit;
a pedestrian candidate detection unit that generates a distance image from the distance information acquired by the distance information acquisition processing unit, determines from the distance image whether the moving object is a pedestrian, and detects pedestrian candidates;
a moving pedestrian entry determination unit that determines whether, among the pedestrian candidates detected by the pedestrian candidate detection unit, there is a pedestrian moving from the first area to the second area;
a moving pedestrian position/speed initial value input unit that, when the moving pedestrian entry determination unit determines that such a pedestrian exists, sets the position and speed information of the moving object candidate on the image output by the moving pedestrian position/speed output unit as the pedestrian's initial values; and
a moving pedestrian speed calculation unit that calculates the moving speed of the pedestrian so set.
- 6. The moving object recognition device according to claim 3, wherein the distance calculation unit calculates three-dimensional distance information in the second area from the disparity between the images captured by the first imaging unit and the second imaging unit.
- 7. The moving object recognition device according to claim 3, further comprising:
a collision determination unit that determines whether the moving object detected by the second area processing unit will collide with the host vehicle, or the probability of such a collision; and
a moving object information output unit that outputs information on the moving object based on the determination result of the collision determination unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/422,450 US9704047B2 (en) | 2012-09-26 | 2013-07-26 | Moving object recognition apparatus |
DE112013004720.9T DE112013004720T5 (de) | 2012-09-26 | 2013-07-26 | Vorrichtung zum Erkennen sich bewegender Objekte |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012211663A JP6014440B2 (ja) | 2012-09-26 | 2012-09-26 | 移動物体認識装置 |
JP2012-211663 | 2012-09-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014050286A1 true WO2014050286A1 (ja) | 2014-04-03 |
Family
ID=50387700
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/070274 WO2014050286A1 (ja) | 2012-09-26 | 2013-07-26 | 移動物体認識装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9704047B2 (ja) |
JP (1) | JP6014440B2 (ja) |
DE (1) | DE112013004720T5 (ja) |
WO (1) | WO2014050286A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106170828A (zh) * | 2014-04-24 | 2016-11-30 | 日立汽车系统株式会社 | 外界识别装置 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6110256B2 (ja) * | 2013-08-21 | 2017-04-05 | 株式会社日本自動車部品総合研究所 | 対象物推定装置および対象物推定方法 |
KR101601475B1 (ko) | 2014-08-25 | 2016-03-21 | 현대자동차주식회사 | 야간 주행 시 차량의 보행자 검출장치 및 방법 |
JP6623044B2 (ja) * | 2015-11-25 | 2019-12-18 | 日立オートモティブシステムズ株式会社 | ステレオカメラ装置 |
US10643464B2 (en) * | 2016-04-25 | 2020-05-05 | Rami B. Houssami | Pace delineation jibe iota |
RU2699716C1 (ru) * | 2016-05-30 | 2019-09-09 | Ниссан Мотор Ко., Лтд. | Способ обнаружения объектов и устройство обнаружения объектов |
CN108256404B (zh) * | 2016-12-29 | 2021-12-10 | 北京旷视科技有限公司 | 行人检测方法和装置 |
US10699139B2 (en) * | 2017-03-30 | 2020-06-30 | Hrl Laboratories, Llc | System for real-time object detection and recognition using both image and size features |
EP4283575A3 (en) * | 2017-10-12 | 2024-02-28 | Netradyne, Inc. | Detection of driving actions that mitigate risk |
JP7025912B2 (ja) | 2017-12-13 | 2022-02-25 | 日立Astemo株式会社 | 車載環境認識装置 |
JP7079182B2 (ja) | 2018-10-26 | 2022-06-01 | 株式会社クボタ | 電子燃料噴射式ディーゼルエンジン |
CN111814513B (zh) * | 2019-04-11 | 2024-02-13 | 富士通株式会社 | 行人物品检测装置及方法、电子设备 |
DE102019212022B4 (de) * | 2019-08-09 | 2021-03-04 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zum Feststellen eines Parallaxenproblems in Sensordaten zweier Sensoren |
DE102019212021B4 (de) * | 2019-08-09 | 2024-02-08 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zum Feststellen eines Parallaxenproblems in Sensordaten zweier Sensoren |
JP7499140B2 (ja) | 2020-10-14 | 2024-06-13 | 日立Astemo株式会社 | 物体認識装置 |
JP2022121377A (ja) * | 2021-02-08 | 2022-08-19 | フォルシアクラリオン・エレクトロニクス株式会社 | 外界認識装置 |
JP2022182335A (ja) * | 2021-05-28 | 2022-12-08 | 株式会社Subaru | 車両の車外撮像装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06325180A (ja) * | 1993-05-14 | 1994-11-25 | Matsushita Electric Ind Co Ltd | 移動体自動追跡装置 |
JP2005228127A (ja) * | 2004-02-13 | 2005-08-25 | Fuji Heavy Ind Ltd | 歩行者検出装置、及び、その歩行者検出装置を備えた車両用運転支援装置 |
JP2006041939A (ja) * | 2004-07-27 | 2006-02-09 | Victor Co Of Japan Ltd | 監視装置及び監視プログラム |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6873251B2 (en) * | 2002-07-16 | 2005-03-29 | Delphi Technologies, Inc. | Tracking system and method employing multiple overlapping sensors |
EP1751495A2 (en) * | 2004-01-28 | 2007-02-14 | Canesta, Inc. | Single chip red, green, blue, distance (rgb-z) sensor |
US7576639B2 (en) * | 2006-03-14 | 2009-08-18 | Mobileye Technologies, Ltd. | Systems and methods for detecting pedestrians in the vicinity of a powered industrial vehicle |
US8633810B2 (en) * | 2009-11-19 | 2014-01-21 | Robert Bosch Gmbh | Rear-view multi-functional camera system |
- 2012
- 2012-09-26: JP application JP2012211663A filed; granted as patent JP6014440B2 (active)
- 2013
- 2013-07-26: US application 14/422,450 filed; granted as patent US9704047B2 (active)
- 2013-07-26: WO application PCT/JP2013/070274 filed as WO2014050286A1 (application filing)
- 2013-07-26: DE application 112013004720.9 filed as DE112013004720T5 (pending)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06325180A (ja) * | 1993-05-14 | 1994-11-25 | Matsushita Electric Ind Co Ltd | 移動体自動追跡装置 |
JP2005228127A (ja) * | 2004-02-13 | 2005-08-25 | Fuji Heavy Ind Ltd | 歩行者検出装置、及び、その歩行者検出装置を備えた車両用運転支援装置 |
JP2006041939A (ja) * | 2004-07-27 | 2006-02-09 | Victor Co Of Japan Ltd | 監視装置及び監視プログラム |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106170828A (zh) * | 2014-04-24 | 2016-11-30 | 日立汽车系统株式会社 | 外界识别装置 |
EP3136368A4 (en) * | 2014-04-24 | 2017-11-29 | Hitachi Automotive Systems, Ltd. | External-environment-recognizing apparatus |
US10210400B2 (en) | 2014-04-24 | 2019-02-19 | Hitachi Automotive Systems, Ltd. | External-environment-recognizing apparatus |
CN106170828B (zh) * | 2014-04-24 | 2020-04-03 | 日立汽车系统株式会社 | 外界识别装置 |
Also Published As
Publication number | Publication date |
---|---|
DE112013004720T5 (de) | 2015-06-11 |
US20150235093A1 (en) | 2015-08-20 |
JP6014440B2 (ja) | 2016-10-25 |
JP2014067198A (ja) | 2014-04-17 |
US9704047B2 (en) | 2017-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014050286A1 (ja) | 移動物体認識装置 | |
US10043082B2 (en) | Image processing method for detecting objects using relative motion | |
CN106485233B (zh) | 可行驶区域检测方法、装置和电子设备 | |
KR20200127219A (ko) | 검출된 배리어에 기반한 차량의 항법 | |
JP5421072B2 (ja) | 接近物体検知システム | |
KR20190080885A (ko) | 차로 병합 및 차로 분리의 항법을 위한 시스템 및 방법 | |
WO2010032523A1 (ja) | 道路境界検出判断装置 | |
EP4187523A1 (en) | Systems and methods for curb detection and pedestrian hazard assessment | |
EP2928178B1 (en) | On-board control device | |
WO2013136878A1 (ja) | 物体検出装置 | |
CN109997148B (zh) | 信息处理装置、成像装置、设备控制系统、移动对象、信息处理方法和计算机可读记录介质 | |
WO2015189847A1 (en) | Top-down refinement in lane marking navigation | |
JP5482672B2 (ja) | 移動物体検出装置 | |
JP6816401B2 (ja) | 画像処理装置、撮像装置、移動体機器制御システム、画像処理方法、及びプログラム | |
JP6032034B2 (ja) | 物体検知装置 | |
JP4937844B2 (ja) | 歩行者検出装置 | |
CN109522779B (zh) | 图像处理装置 | |
JP3651419B2 (ja) | 環境認識装置 | |
JP5587852B2 (ja) | 画像処理装置及び画像処理方法 | |
JP2005309660A (ja) | 車両用右左折支援装置 | |
CN109308442B (zh) | 车外环境识别装置 | |
JP6253175B2 (ja) | 車両の外部環境認識装置 | |
JP6295868B2 (ja) | 車両用表示装置 | |
JP6361347B2 (ja) | 車両用表示装置 | |
JP6564682B2 (ja) | 対象物検出装置、対象物検出方法、及び、対象物検出プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13842006 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14422450 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112013004720 Country of ref document: DE Ref document number: 1120130047209 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13842006 Country of ref document: EP Kind code of ref document: A1 |