
WO2017122688A1 - Device for detecting abnormality of lens of onboard camera - Google Patents

Device for detecting abnormality of lens of onboard camera

Info

Publication number
WO2017122688A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
bird
vehicle
abnormality
eye
Prior art date
Application number
PCT/JP2017/000659
Other languages
French (fr)
Japanese (ja)
Inventor
Daisuke Sugiura
Muneaki Matsumoto
Original Assignee
DENSO CORPORATION
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DENSO CORPORATION
Publication of WO2017122688A1 publication Critical patent/WO2017122688A1/en


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to a lens abnormality detection device that detects an abnormality of a lens of a vehicle-mounted camera.
  • a system is known in which an in-vehicle camera is used to capture an image of the surroundings of the vehicle so as to monitor the running state of the vehicle and notify an abnormality, or to generate a bird's-eye view of the surroundings of the vehicle as viewed from above.
  • For example, in Patent Document 1 below, two on-vehicle cameras are used to image the roads ahead of and behind the vehicle in the traveling direction, road lane markings such as a road center line and lane boundary lines are extracted from the images captured by the on-vehicle cameras, and an apparatus for calculating the reliability of the road lane markings has been proposed.
  • In this proposed apparatus, the reliability histories of the road lane markings calculated for each camera are compared, and if there is an in-vehicle camera whose reliability has relatively decreased, it is determined that the lens of that in-vehicle camera is abnormal.
  • According to the proposed apparatus, a lens abnormality can be detected more accurately than with an apparatus that performs abnormality determination by extracting and comparing overlapping image regions from two captured images obtained by two cameras. This is because, if overlapping image regions are extracted from two captured images for abnormality determination, an abnormality of a lens portion that does not correspond to the overlapping image regions (for example, dirt on the lens) cannot be detected.
  • a lens abnormality of an in-vehicle camera can be accurately detected using captured images obtained from a plurality of in-vehicle cameras even when the vehicle is traveling on a road without a road marking line.
  • the lens abnormality detection device for an in-vehicle camera includes a plurality of in-vehicle cameras, a viewpoint conversion unit, a storage unit, and an abnormality determination unit.
  • the plurality of in-vehicle cameras capture images of the surroundings of the vehicle, and the viewpoint conversion unit generates a bird's-eye view image by converting the viewpoints of images captured by the plurality of in-vehicle cameras.
  • The storage unit sequentially stores, as history images and in accordance with the travel of the vehicle, the front bird's-eye image covering the area ahead of the vehicle in the traveling direction among the bird's-eye images generated by the viewpoint conversion unit.
  • The abnormality determination unit compares the rear bird's-eye image covering the area behind the vehicle in the traveling direction, among the bird's-eye images generated by the viewpoint conversion unit, with the history image stored in the storage unit.
  • The history image used for this comparison is a history image corresponding to the rear bird's-eye image, and the abnormality determination unit determines that a lens abnormality has occurred in one of the plurality of in-vehicle cameras when the difference between the rear bird's-eye image and the history image is equal to or greater than an abnormality determination value.
  • the lens abnormality is an abnormality of the lens that makes it impossible to capture a clear image with the in-vehicle camera, and includes, for example, scratches, damage, dirt, and the like of the lens.
  • a lens abnormality detection device determines a lens abnormality of the in-vehicle camera by comparing bird's-eye images obtained from captured images captured by a plurality of in-vehicle cameras. Therefore, in such a lens abnormality detection device, it is possible to determine lens abnormality even if the cameras that are subject to lens abnormality determination are not arranged so that the imaging regions overlap.
  • In addition, the lens abnormality determination only requires comparing the rear bird's-eye image with the history image corresponding to the rear bird's-eye image to obtain the difference between them; there is no need to extract road lane markings from the bird's-eye images and calculate their straightness or parallelism. For this reason, such a lens abnormality detection device can perform the abnormality determination even if there is no specific target such as a road lane marking on the road on which the vehicle travels.
  • Furthermore, since the lens abnormality determination can be performed by comparing the bird's-eye images obtained from the respective in-vehicle cameras and obtaining the difference between them, the determination can be carried out more simply and in a shorter time than in the related art described above.
  • the imaging system 2 of the present embodiment includes a front camera 4f and a rear camera 4r as vehicle-mounted cameras, a control unit 10, and a display unit 30.
  • the front camera 4f and the rear camera 4r are attached to the front part and the rear part of the vehicle 40 so as to image the road ahead and behind the vehicle 40, respectively. Further, in the front camera 4f and the rear camera 4r, lenses 5f and 5r are provided on the front end side in the imaging direction, respectively.
  • The control unit 10 generates, from the images captured by the cameras 4f and 4r, a bird's-eye image in which the roads around the vehicle are viewed from vertically above. The generated bird's-eye image is displayed on the display unit 30, which includes a liquid crystal display or the like disposed in the vehicle interior.
  • the control unit 10 also functions as a lens abnormality detection device of the present disclosure.
  • the control unit 10 includes an imaging signal input unit 12, a detection signal input unit 14, a memory 16, a display control unit 18, an image processing unit 20, and an abnormality notification unit 32.
  • the imaging signal input unit 12 takes in the imaging signals from the front camera 4f and the rear camera 4r and inputs them to the image processing unit 20 as captured image data.
  • the detection signal input unit 14 takes in detection signals from the wheel speed sensor 6 that detects the rotational speed of each wheel of the vehicle 40 and the steering angle sensor 8 that detects the steering angle of the steering, respectively, and the wheel speed data and the steering angle data. And input to the image processing unit 20.
  • the image processing unit 20 includes a microcomputer including a CPU, a ROM, a RAM, and the like.
  • When the CPU executes a program stored in the ROM, the image processing unit 20 functions as the viewpoint conversion unit 22, the image composition unit 24, the abnormality determination unit 26, and the shadow determination unit 28.
  • The viewpoint conversion unit 22 performs viewpoint conversion on the captured image data of the areas ahead of and behind the vehicle input from the imaging signal input unit 12, thereby generating a bird's-eye image 50f and a bird's-eye image 50r in which the road regions ahead of and behind the vehicle 40 are viewed from above.
  • the viewpoint conversion unit 22 stores the bird's-eye image 50f or the bird's-eye image 50r in the memory 16 in association with the position and direction of the vehicle 40 obtained from the wheel speed data and the steering angle data representing the traveling state of the vehicle 40.
  • That is, when the vehicle 40 is traveling, the viewpoint conversion unit 22 stores in the memory 16 the bird's-eye image covering the area ahead in the traveling direction (hereinafter also referred to as the "front bird's-eye image") 50f or the bird's-eye image covering the area behind in the traveling direction (hereinafter also referred to as the "rear bird's-eye image") 50r, and thereby stores, as a history image 52, a bird's-eye image of the area directly under the vehicle 40 that the cameras 4f and 4r cannot capture in real time.
  • the memory 16 is a storage unit that stores a history image, and includes a readable and writable nonvolatile memory such as an EEPROM or a flash memory.
  • the image composition unit 24 composes the latest bird's-eye view images 50f and 50r generated by the viewpoint conversion unit 22 and the history image 52 stored in the memory 16 so that the vehicle 40 and the entire periphery thereof can be viewed from above. A bird's-eye view image is generated.
  • the generated bird's-eye view image is output from the image processing unit 20 to the display control unit 18 and displayed on the display unit 30 via the display control unit 18.
  • The abnormality determination unit 26 compares the rear bird's-eye image 50r with the history image of the vehicle 40 position corresponding to the rear bird's-eye image 50r, and calculates the difference (in other words, the degree of coincidence) between the rear bird's-eye image 50r and the history image.
  • When the calculated difference is equal to or greater than the abnormality determination value (in other words, when the degree of coincidence is less than the threshold value), it is determined that an abnormality has occurred in at least one of the lenses 5f and 5r of the front camera 4f and the rear camera 4r, and the occurrence of the abnormality is notified via the abnormality notification unit 32.
  • The abnormality notification unit 32 only needs to be able to notify an occupant of the vehicle 40 of an abnormality of the lenses 5f and 5r. For example, it may include a circuit that blinks or lights an indicator lamp such as an LED, a circuit that displays an error message on the display unit 30 or another display device, or a voice synthesis circuit so that the abnormality of the lenses 5f and 5r can be announced by voice.
  • the shadow determination unit 28 determines whether or not the shadow of the vehicle 40 is reflected in at least one of the rear bird's-eye view image and the history image used by the abnormality determination unit 26 for the abnormality determination of the lenses 5f and 5r. When the shadow determination unit 28 determines that one of the rear bird's-eye view image and the history image has a shadow of the vehicle 40, the abnormality determination unit 26 prohibits the abnormality determination of the lenses 5f and 5r.
  • Next, the lens abnormality determination process executed in the image processing unit 20 to realize the functions of the viewpoint conversion unit 22, the abnormality determination unit 26, and the shadow determination unit 28 will be described with reference to the flowchart of FIG. 4.
  • the lens abnormality determination process shown in FIG. 4 represents a program that is repeatedly executed as one of the main routines in the microcomputer (specifically, the CPU) included in the image processing unit 20.
  • In the present embodiment, this program is stored in the ROM in the image processing unit 20, but it may instead be stored in another non-transitory tangible recording medium separate from the microcomputer included in the image processing unit 20, such as the memory 16.
  • First, in S110 (S represents a step), the images captured by the front camera 4f and the rear camera 4r are acquired from the imaging signal input unit 12, and the front bird's-eye image 50f and the rear bird's-eye image 50r are generated.
  • the process of S110 is a process for realizing the function as the viewpoint conversion unit 22.
  • In S110, a bird's-eye image is generated by performing viewpoint conversion on the captured image. This generation method is publicly known and, for example, the technique described in Japanese Patent Laid-Open No. 10-211849 can be used, so a detailed description is omitted here.
  • In S120, the wheel speed data is acquired from the detection signal input unit 14 to determine whether the vehicle 40 is currently traveling. If the vehicle 40 is not traveling, the lens abnormality determination process is temporarily terminated; if the vehicle 40 is traveling, the process proceeds to S130.
  • In S130, the traveling direction of the vehicle 40 (that is, forward or reverse) is determined from the wheel speed data, and the front bird's-eye image 50f is selected from the bird's-eye images 50f and 50r generated in S110. The selected front bird's-eye image 50f is then stored in the memory 16 as a history image taken while the vehicle is traveling.
  • The latest bird's-eye images 50f and 50r generated in S110 and the history images stored in the memory 16 in S130 are used by the display control process serving as the image composition unit 24 to display a bird's-eye image of the surroundings of the host vehicle on the display unit 30.
  • In S140, the rear bird's-eye image 50r is selected from the bird's-eye images 50f and 50r generated in S110, and it is determined whether the shadow of the host vehicle 40 appears in the selected rear bird's-eye image 50r.
  • the front bird's-eye view image 50f is sequentially stored in the memory 16 as a history image while the vehicle 40 is traveling. Therefore, in S150, based on the vehicle speed obtained from the wheel speed data, the front bird's-eye image 50f at the same position as the rear bird's-eye image 50r is extracted from the history image stored in the memory 16.
  • In subsequent S160, it is determined whether the shadow of the host vehicle 40 appears in the history image read from the memory 16 in S150. If it is determined in S160 that the shadow of the host vehicle 40 appears in the history image, an abnormality of the lenses 5f and 5r could be erroneously determined in the processing described later, so the lens abnormality determination process is temporarily terminated. If it is determined in S160 that the shadow of the host vehicle 40 does not appear in the history image, the process proceeds to S170.
  • the processing of S140 and S160 is processing for realizing the function as the shadow determination unit 28.
  • a determination method for determining whether or not there is a shadow in the bird's-eye view image is known, and for example, a shadow detection technique described in Japanese Patent Application Laid-Open No. 2010-237976 can be used.
  • In these processes, for example, a partial region of the bird's-eye image is segmented based on hue and luminance, and from the resulting regions, two regions are extracted whose difference in hue is equal to or less than a predetermined threshold and whose difference in luminance is equal to or greater than a predetermined value.
  • Of the two extracted regions, the region with higher luminance is treated as a non-shadow region and the region with lower luminance as a shadow region, and the vector from the shadow region to the non-shadow region in the color information space is specified as the color information of the light source.
  • Next, the entire bird's-eye image is segmented based on hue and luminance, and when the hue difference between adjacent regions matches the hue of the light source within a predetermined range, the lower-luminance region of the adjacent regions is identified as a shadow.
  • In S140 and S160, it is determined whether the shadow region thus identified corresponds to the body shape of the host vehicle 40, so that it can be determined whether the rear bird's-eye image 50r or the history image includes the shadow of the host vehicle 40.
  • Japanese Patent Laid-Open No. 2010-237976 also introduces a technique for detecting shadows by a procedure different from the above; in S140 and S160, that shadow detection technique may be used to determine the shadow of the host vehicle 40, or another shadow detection technique may be used.
  • Even if the rear bird's-eye image 50r includes a shadow of the host vehicle 40, it may be determined in S140 and S160 that the shadow of the host vehicle 40 does not exist when the shadow region is small and does not affect the abnormality determination of the lenses 5f and 5r.
  • In S170, the difference between the history image and the rear bird's-eye image 50r determined in S140 and S160 to contain no shadow of the host vehicle 40 is calculated. This difference is calculated, for example, by extracting feature points from the rear bird's-eye image 50r and the history image by edge detection or the like and comparing the feature points with each other, thereby computing the difference (in other words, the degree of coincidence) between the rear bird's-eye image 50r and the history image.
  • In S180, the difference calculated in S170 is compared with a preset abnormality determination value. If the difference is less than the abnormality determination value, it is determined that the images captured by the front camera 4f and the rear camera 4r are both normal, and the lens abnormality determination process is temporarily terminated.
  • processing of S170 and S180 is processing for realizing the function as the abnormality determination unit 26.
  • In S190, execution of the display control process serving as the image composition unit 24 is prohibited, thereby prohibiting generation of the bird's-eye image around the host vehicle 40 and output of that bird's-eye image to the display control unit 18, and display of the bird's-eye image on the display unit 30 is stopped.
  • Finally, in S200, the occupant is notified of the abnormality of the lenses 5f and 5r via the abnormality notification unit 32, and the lens abnormality determination process is temporarily terminated. Since the lens abnormality determination process is executed as one of the main routines in the CPU, the process of S110 is started again after a predetermined time has elapsed.
  • When the vehicle 40 is traveling forward, the front bird's-eye images 50f generated from the images captured by the front camera 4f are sequentially stored in the memory 16 as history images.
  • the history image stored in the memory 16 is used to generate a bird's-eye image around the vehicle 40 together with the latest bird's-eye images 50f and 50r generated from images captured by the front camera 4f and the rear camera 4r.
  • a bird's-eye image in the same area as the bird's-eye image 50f (t1) stored in the memory 16 as a history image at time T1 is generated from an image captured by the rear camera 4r at time T2.
  • The bird's-eye image 50r(t2) generated at time T2 is compared with the bird's-eye image 50f(t1), which is the history image corresponding to the bird's-eye image 50r(t2), and when the difference between the two images is equal to or greater than the abnormality determination value, it is determined that the lens 5f or 5r is abnormal.
  • If the bird's-eye image 50r(t2) and the history image 50f(t1) contain a marking 60 representing a pedestrian crossing, the edges of the marking 60 are detected in each image, and the abnormality of the lenses 5f and 5r is determined from the differences in their positions and shapes.
  • According to the imaging system 2 of the present embodiment, it is not necessary to arrange the cameras subject to abnormality determination so that their imaging regions overlap. Furthermore, unlike the apparatus of Patent Document 1, it is not necessary to extract road lane markings from the images captured by the front camera 4f and the rear camera 4r and to calculate their straightness and parallelism.
  • For this reason, according to the present embodiment, by mounting the plurality of cameras 4f and 4r on the vehicle 40, an abnormality of the lenses 5f and 5r of the cameras 4f and 4r can be determined even if there is no specific target such as a road lane marking on the road surface on which the vehicle 40 travels.
  • In addition, this abnormality determination only requires comparing the bird's-eye images of the same vehicle 40 position generated from the images captured by the cameras 4f and 4r and obtaining the difference between them, so it can be carried out more simply and in a shorter time than extracting road lane markings and calculating their straightness and parallelism.
  • In the present embodiment, when the shadow of the host vehicle is included in one of the rear bird's-eye image and the history image used for the abnormality determination of the lenses 5f and 5r, the abnormality determination is not performed. For this reason, when a shadow is cast on the front or rear side of the vehicle 40 by sunlight, for example, erroneous determination of an abnormality of the lenses 5f and 5r due to the influence of the shadow can be prevented or suppressed.
  • When a lens abnormality is detected, display of the bird's-eye image on the display unit 30 by the image composition unit 24 is prohibited, and the abnormality of the lenses 5f and 5r is reported via the abnormality notification unit 32. For this reason, when an abnormality such as breakage or dirt occurs in the lens of the front camera 4f or the rear camera 4r and a bird's-eye image around the vehicle 40 cannot be generated normally, display of a degraded bird's-eye image on the display unit 30 can be suppressed. In addition, by notifying the abnormality of the lenses 5f and 5r, the user can be prompted to check each of the cameras 4f and 4r.
  • The present disclosure is not limited to the above-described embodiment, and various modifications can be made.
  • Although the above embodiment has been described on the assumption that the front camera 4f and the rear camera 4r are mounted on the vehicle 40, as long as a plurality of cameras are mounted on the vehicle 40, a lens abnormality can be determined in the same manner as above.
  • the vehicle 40 may be equipped with side cameras that capture the left and right sides of the vehicle, respectively.
  • For example, a front bird's-eye image and a rear bird's-eye image of the vehicle are generated using the front camera and one of the side cameras, the front bird's-eye image is stored as a history image, and a lens abnormality is determined by comparing image regions at the same position in the rear bird's-eye image and the history image. In this way, the lens abnormality of each camera can be determined in the same manner as in the above embodiment; similarly, a lens abnormality can be determined using the rear camera and one of the side cameras.
  • Alternatively, the side cameras may be used together with one or both of the front camera 4f and the rear camera 4r to generate bird's-eye images, and a lens abnormality may be determined by extracting and comparing image regions whose positions coincide from the generated bird's-eye images.
  • In the above embodiment, the image processing unit 20 includes a microcomputer, and the functions of the viewpoint conversion unit 22, the image composition unit 24, the abnormality determination unit 26, and the shadow determination unit 28 are realized by the CPU of the microcomputer executing the program. However, the image processing unit 20 may be configured to realize part or all of these functions using hardware combining logic circuits, analog circuits, or the like, in place of or in addition to the microcomputer.
  • In the above embodiment, the wheel speed sensor 6 and the steering angle sensor 8 are described as being used so that the position and traveling direction of the vehicle 40 can be determined, but a satellite positioning system may also be used in addition to these sensors.
  • a plurality of functions of one constituent element in the above embodiment may be realized by a plurality of constituent elements, or one function of one constituent element may be realized by a plurality of constituent elements.
  • a plurality of functions possessed by a plurality of constituent elements may be realized by one constituent element, or one function realized by a plurality of constituent elements may be realized by one constituent element.
  • at least a part of the configuration of the above embodiment may be added to or replaced with the configuration of the other embodiment.
  • all the aspects included in the technical idea specified only by the wording described in the claims are embodiments of the present disclosure.
  • In addition, the present disclosure can be realized in various forms, such as a system including the imaging system 2 as a component, a program for causing a computer to function as the image processing unit 20 of the imaging system 2, a non-transitory tangible recording medium such as a semiconductor memory in which the program is recorded, and a lens abnormality detection method realized by the image processing unit 20.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

A lens abnormality detection device provided with a plurality of onboard cameras (4f, 4r), a visual point conversion unit (22), a storage unit (16), and an abnormality determination unit (26). The storage unit stores a front bird's-eye image, among bird's-eye images generated by the visual point conversion unit, that includes the front in the traveling direction of a vehicle, as a history image, successively in accordance with the traveling of the vehicle. The abnormality determination unit compares a rear bird's-eye image that includes the rear in the traveling direction of the vehicle with the history image stored in the storage unit, and determines that there is lens abnormality in one of the plurality of onboard cameras when a difference between the rear bird's-eye image and the history image is greater than or equal to an abnormality determination value.

Description

In-vehicle camera lens abnormality detection device

Cross-reference of related applications
This international application claims priority based on Japanese Patent Application No. 2016-003752 filed with the Japan Patent Office on January 12, 2016, the entire contents of which are incorporated herein by reference.
The present disclosure relates to a lens abnormality detection device that detects an abnormality of a lens of a vehicle-mounted camera.
A system is known in which in-vehicle cameras are used to capture images of the surroundings of a vehicle in order to monitor the running state of the vehicle and report an abnormality, or to generate a bird's-eye image of the surroundings of the vehicle as viewed from above.
In this type of system, if an abnormality such as a scratch or dirt occurs on the lens of an in-vehicle camera, the surroundings of the vehicle can no longer be imaged satisfactorily; it has therefore been proposed to detect a lens abnormality by processing the captured images.
For example, Patent Document 1 below proposes an apparatus in which two in-vehicle cameras image the roads ahead of and behind the vehicle in the traveling direction, road lane markings such as a road center line and lane boundary lines are extracted from the images captured by those cameras, and the reliability of the road lane markings is calculated. In this proposed apparatus, the reliability histories of the road lane markings calculated for each camera are compared, and if there is an in-vehicle camera whose reliability has relatively decreased, it is determined that the lens of that camera is abnormal.
JP 2014-115814 A
According to the proposed apparatus, a lens abnormality can be detected more accurately than with an apparatus that performs abnormality determination by extracting and comparing overlapping image regions from two captured images obtained by two cameras. This is because, if overlapping image regions are extracted from two captured images for abnormality determination, an abnormality of a lens portion that does not correspond to the overlapping image regions (for example, dirt on the lens) cannot be detected.
However, in the proposed apparatus, the reliability of the road lane markings is obtained by extracting the road lane markings from the image captured by each in-vehicle camera and calculating the straightness of the extracted markings and the parallelism of the edges on both sides of each marking in the width direction. As a result of detailed studies by the inventors, it was found that the proposed apparatus cannot detect a lens abnormality of an in-vehicle camera when the vehicle is traveling on a road without road lane markings.
In addition, extracting road lane markings from captured images and calculating their straightness and parallelism requires more complicated image processing than comparing two images. As a result of detailed studies by the inventors, it was also found that the proposed apparatus takes time to calculate the reliability of the road lane markings and, consequently, to determine a lens abnormality.
In one aspect of the present disclosure, it is desirable that a lens abnormality of an in-vehicle camera can be accurately detected using captured images obtained from a plurality of in-vehicle cameras even when the vehicle is traveling on a road without road lane markings.
A lens abnormality detection device for in-vehicle cameras according to one aspect of the present disclosure includes a plurality of in-vehicle cameras, a viewpoint conversion unit, a storage unit, and an abnormality determination unit.
The plurality of in-vehicle cameras capture images of the surroundings of the vehicle, and the viewpoint conversion unit generates bird's-eye images by performing viewpoint conversion on the images captured by the plurality of in-vehicle cameras.
The storage unit sequentially stores, as history images and in accordance with the travel of the vehicle, the front bird's-eye image covering the area ahead of the vehicle in the traveling direction among the bird's-eye images generated by the viewpoint conversion unit.
The abnormality determination unit compares the rear bird's-eye image covering the area behind the vehicle in the traveling direction, among the bird's-eye images generated by the viewpoint conversion unit, with the history image stored in the storage unit.
The history image used for this comparison is a history image corresponding to the rear bird's-eye image, and the abnormality determination unit determines that a lens abnormality has occurred in one of the plurality of in-vehicle cameras when the difference between the rear bird's-eye image and the history image is equal to or greater than an abnormality determination value.
Here, a lens abnormality is an abnormality of the lens that prevents the in-vehicle camera from capturing a clear image, and includes, for example, a scratch, damage, or dirt on the lens.
Such a lens abnormality detection device determines a lens abnormality of an in-vehicle camera by comparing bird's-eye images obtained from images captured by a plurality of in-vehicle cameras. Therefore, in such a lens abnormality detection device, a lens abnormality can be determined even if the cameras subject to the determination are not arranged so that their imaging regions overlap.
In addition, the lens abnormality determination only requires comparing the rear bird's-eye image with the history image corresponding to the rear bird's-eye image to obtain the difference between them; there is no need to extract road lane markings from the bird's-eye images and calculate their straightness or parallelism. For this reason, such a lens abnormality detection device can perform the abnormality determination even if there is no specific target such as a road lane marking on the road on which the vehicle travels.
Furthermore, since the lens abnormality determination can be performed by comparing the bird's-eye images obtained from the respective in-vehicle cameras and obtaining the difference between them, the determination can be carried out more simply and in a shorter time than in the related art described above.
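The determination rule summarized above can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the patent disclosure: the function names, the fixed threshold value, and the use of a simple mean pixel difference as the "difference" are all assumptions made here for clarity.

```python
import numpy as np

ABNORMALITY_THRESHOLD = 0.25  # assumed value; the disclosure only requires "an abnormality determination value"

def image_difference(rear_birdseye: np.ndarray, history_image: np.ndarray) -> float:
    """Return a normalized difference (1 - degree of coincidence) between two
    grayscale bird's-eye images covering the same ground area."""
    diff = np.abs(rear_birdseye.astype(np.float32) - history_image.astype(np.float32))
    return float(diff.mean() / 255.0)

def lens_abnormality_detected(rear_birdseye: np.ndarray, history_image: np.ndarray) -> bool:
    """An abnormality is reported when the difference between the current rear
    bird's-eye image and the stored front-camera history image of the same
    position is at or above the determination value."""
    return image_difference(rear_birdseye, history_image) >= ABNORMALITY_THRESHOLD
```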
FIG. 1 is a block diagram showing the overall configuration of the imaging system of the embodiment.
FIG. 2 is an explanatory diagram showing the mounting positions and imaging directions of the in-vehicle cameras on the vehicle.
FIG. 3 is an explanatory diagram showing the imaging ranges of the in-vehicle cameras and the bird's-eye image regions after viewpoint conversion.
FIG. 4 is a flowchart showing the lens abnormality determination process executed in the image processing unit of FIG. 1.
FIG. 5 is an explanatory diagram explaining the bird's-eye images obtained by the lens abnormality determination process and the abnormality determination operation using them.
Hereinafter, exemplary embodiments of the present disclosure will be described with reference to the drawings.
As shown in FIG. 1, the imaging system 2 of the present embodiment includes a front camera 4f and a rear camera 4r as in-vehicle cameras, a control unit 10, and a display unit 30.
As shown in FIG. 2, the front camera 4f and the rear camera 4r are attached to the front and rear of the vehicle 40 so as to image the roads ahead of and behind the vehicle 40, respectively.
In the front camera 4f and the rear camera 4r, lenses 5f and 5r are provided on the front end side in the respective imaging directions.
The control unit 10 generates, from the images captured by the cameras 4f and 4r, a bird's-eye image in which the roads around the vehicle are viewed from vertically above. The generated bird's-eye image is displayed on the display unit 30, which includes a liquid crystal display or the like disposed in the vehicle interior. The control unit 10 also functions as the lens abnormality detection device of the present disclosure.
The control unit 10 includes an imaging signal input unit 12, a detection signal input unit 14, a memory 16, a display control unit 18, an image processing unit 20, and an abnormality notification unit 32.
The imaging signal input unit 12 takes in the imaging signals from the front camera 4f and the rear camera 4r and inputs them to the image processing unit 20 as captured image data.
The detection signal input unit 14 takes in detection signals from the wheel speed sensor 6, which detects the rotational speed of each wheel of the vehicle 40, and the steering angle sensor 8, which detects the steering angle, converts them into wheel speed data and steering angle data, and inputs these data to the image processing unit 20.
The image processing unit 20 includes a microcomputer including a CPU, a ROM, a RAM, and the like. When the CPU executes a program stored in the ROM, the image processing unit 20 functions as the viewpoint conversion unit 22, the image composition unit 24, the abnormality determination unit 26, and the shadow determination unit 28.
As shown in FIG. 3, the viewpoint conversion unit 22 performs viewpoint conversion on the captured image data of the areas ahead of and behind the vehicle input from the imaging signal input unit 12, thereby generating a bird's-eye image 50f and a bird's-eye image 50r in which the road regions ahead of and behind the vehicle 40 are viewed from above.
The viewpoint conversion unit 22 then stores the bird's-eye image 50f or the bird's-eye image 50r in the memory 16 in association with the position and direction of the vehicle 40 obtained from the wheel speed data and the steering angle data representing the traveling state of the vehicle 40.
That is, when the vehicle 40 is traveling, the viewpoint conversion unit 22 stores in the memory 16 the bird's-eye image covering the area ahead in the traveling direction (hereinafter also referred to as the "front bird's-eye image") 50f or the bird's-eye image covering the area behind in the traveling direction (hereinafter also referred to as the "rear bird's-eye image") 50r, and thereby stores, as a history image 52, a bird's-eye image of the area directly under the vehicle 40 that the cameras 4f and 4r cannot capture in real time.
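The disclosure states only that the vehicle position and direction are obtained from the wheel speed data and the steering angle data; it does not give a formula. The sketch below therefore assumes a simple bicycle-model dead reckoning and a hypothetical HistoryStore container for associating each front bird's-eye image with the pose at which it was generated.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Pose:
    x: float = 0.0        # m, in a ground-fixed frame
    y: float = 0.0        # m
    heading: float = 0.0  # rad

@dataclass
class HistoryStore:
    """Stores front bird's-eye images keyed by the vehicle pose at which they were taken."""
    wheelbase: float = 2.7                 # m, assumed vehicle parameter
    pose: Pose = field(default_factory=Pose)
    entries: list = field(default_factory=list)  # list of (Pose, image) pairs

    def update_pose(self, speed_mps: float, steering_rad: float, dt: float) -> None:
        # Bicycle-model dead reckoning; steering_rad is assumed to be the
        # road-wheel angle derived from the steering angle sensor.
        self.pose.heading += speed_mps * math.tan(steering_rad) / self.wheelbase * dt
        self.pose.x += speed_mps * math.cos(self.pose.heading) * dt
        self.pose.y += speed_mps * math.sin(self.pose.heading) * dt

    def store_front_birdseye(self, image) -> None:
        # Associate the latest front bird's-eye image with the current pose.
        self.entries.append((Pose(self.pose.x, self.pose.y, self.pose.heading), image))
```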
The memory 16 is a storage unit that stores the history images, and includes a readable and writable nonvolatile memory such as an EEPROM or a flash memory.
The image composition unit 24 composes the latest bird's-eye images 50f and 50r generated by the viewpoint conversion unit 22 with the history image 52 stored in the memory 16, thereby generating a bird's-eye image in which the vehicle 40 and its entire surroundings are viewed from above.
The generated bird's-eye image is output from the image processing unit 20 to the display control unit 18 and displayed on the display unit 30 via the display control unit 18.
The abnormality determination unit 26 compares the rear bird's-eye image 50r with the history image of the vehicle 40 position corresponding to the rear bird's-eye image 50r, and calculates the difference (in other words, the degree of coincidence) between the rear bird's-eye image 50r and the history image.
When the calculated difference is equal to or greater than the abnormality determination value (in other words, when the degree of coincidence is less than the threshold value), the abnormality determination unit 26 determines that an abnormality has occurred in at least one of the lenses 5f and 5r of the front camera 4f and the rear camera 4r, and notifies the occurrence of the abnormality via the abnormality notification unit 32.
The abnormality notification unit 32 only needs to be able to notify an occupant of the vehicle 40 of an abnormality of the lenses 5f and 5r. For example, it may include a circuit that blinks or lights an indicator lamp such as an LED, or a circuit that displays an error message on the display unit 30 or another display device. It may also include a voice synthesis circuit so that the abnormality of the lenses 5f and 5r can be announced by voice.
The shadow determination unit 28 determines whether the shadow of the vehicle 40 appears in at least one of the rear bird's-eye image and the history image used by the abnormality determination unit 26 for the abnormality determination of the lenses 5f and 5r. When the shadow determination unit 28 determines that the shadow of the vehicle 40 appears in one of the rear bird's-eye image and the history image, it causes the abnormality determination unit 26 to refrain from determining an abnormality of the lenses 5f and 5r.
This is because, when the shadow of the vehicle 40 appears in one of the rear bird's-eye image and the history image used for the abnormality determination of the lenses 5f and 5r, comparing these images would yield a large difference due to the influence of the shadow, and an abnormality of the lenses 5f and 5r could be erroneously determined.
Next, the lens abnormality determination process executed in the image processing unit 20 to realize the functions of the viewpoint conversion unit 22, the abnormality determination unit 26, and the shadow determination unit 28 will be described with reference to the flowchart of FIG. 4.
The lens abnormality determination process shown in FIG. 4 represents a program that is repeatedly executed as one of the main routines in the microcomputer (specifically, the CPU) included in the image processing unit 20.
In the present embodiment, this program is stored in the ROM in the image processing unit 20, but it may instead be stored in another non-transitory tangible recording medium separate from the microcomputer included in the image processing unit 20, such as the memory 16.
As shown in FIG. 4, in the lens abnormality determination process, first, in S110 (S represents a step), the images captured by the front camera 4f and the rear camera 4r are acquired from the imaging signal input unit 12, and the front bird's-eye image 50f and the rear bird's-eye image 50r are generated.
The process of S110 is a process for realizing the function of the viewpoint conversion unit 22. In S110, a bird's-eye image is generated by performing viewpoint conversion on the captured image; this generation method is publicly known and, for example, the technique described in Japanese Patent Laid-Open No. 10-211849 can be used, so a detailed description is omitted here.
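Since the viewpoint conversion itself is described as publicly known, the sketch below only illustrates one common way of performing it: a planar homography applied with OpenCV. The source and destination point coordinates are placeholders that would in practice be derived from each camera's mounting geometry; nothing here is taken from the referenced publication.

```python
import cv2
import numpy as np

# Four points on the road plane as seen in the camera image (pixels) and the
# corresponding points in the bird's-eye output (pixels). The values below are
# placeholders; real values come from the camera's mounting position and angle.
SRC_POINTS = np.float32([[420, 480], [860, 480], [1200, 720], [80, 720]])
DST_POINTS = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])
BIRDSEYE_SIZE = (400, 600)  # width, height of the bird's-eye image in pixels

HOMOGRAPHY = cv2.getPerspectiveTransform(SRC_POINTS, DST_POINTS)

def to_birdseye(camera_frame: np.ndarray) -> np.ndarray:
    """Warp a camera frame onto the road plane to obtain a bird's-eye view."""
    return cv2.warpPerspective(camera_frame, HOMOGRAPHY, BIRDSEYE_SIZE)
```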
Next, in S120, the wheel speed data is acquired from the detection signal input unit 14 to determine whether the vehicle 40 is currently traveling. If the vehicle 40 is not traveling, the lens abnormality determination process is temporarily terminated; if the vehicle 40 is traveling, the process proceeds to S130.
In S130, the traveling direction of the vehicle 40 (that is, forward or reverse) is determined from the wheel speed data, and the front bird's-eye image 50f is selected from the bird's-eye images 50f and 50r generated in S110. The selected front bird's-eye image 50f is then stored in the memory 16 as a history image taken while the vehicle is traveling.
The latest bird's-eye images 50f and 50r generated in S110 and the history images stored in the memory 16 in S130 are used by the display control process serving as the image composition unit 24 to display a bird's-eye image of the surroundings of the host vehicle on the display unit 30.
Next, in S140, the rear bird's-eye image 50r is selected from the bird's-eye images 50f and 50r generated in S110, and it is determined whether the shadow of the host vehicle 40 appears in the selected rear bird's-eye image 50r.
If it is determined in S140 that the shadow of the host vehicle 40 appears in the rear bird's-eye image 50r, an abnormality of the lenses 5f and 5r could be erroneously determined in the processing described later, so the lens abnormality determination process is temporarily terminated.
On the other hand, if it is determined in S140 that the shadow of the host vehicle 40 does not appear in the rear bird's-eye image 50r, the process proceeds to S150, and the history image corresponding to the rear bird's-eye image 50r selected in S140 is read from the memory 16.
That is, while the vehicle 40 is traveling, the front bird's-eye images 50f are sequentially stored in the memory 16 as history images. Therefore, in S150, based on the vehicle speed obtained from the wheel speed data, the front bird's-eye image 50f at the same position as the rear bird's-eye image 50r is extracted from the history images stored in the memory 16.
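A minimal sketch of this history lookup follows, assuming the stored entries are ((x, y, heading), image) pairs and that a hypothetical rear_offset_m parameter gives the distance from the vehicle reference point to the centre of the rear bird's-eye footprint; neither of these names appears in the disclosure.

```python
import math

def find_matching_history(entries, current_pose, rear_offset_m):
    """Pick the stored front bird's-eye image closest to the ground area that
    the rear camera is covering now.

    entries       -- list of ((x, y, heading), image) pairs saved while driving
    current_pose  -- (x, y, heading) of the vehicle at the present time
    rear_offset_m -- assumed distance from the vehicle reference point to the
                     centre of the rear bird's-eye footprint
    """
    x, y, heading = current_pose
    target = (x - rear_offset_m * math.cos(heading),
              y - rear_offset_m * math.sin(heading))
    # Choose the entry whose recorded position is nearest to the target area.
    return min(
        entries,
        key=lambda e: (e[0][0] - target[0]) ** 2 + (e[0][1] - target[1]) ** 2,
    )[1]
```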
In subsequent S160, it is determined whether the shadow of the host vehicle 40 appears in the history image read from the memory 16 in S150.
If it is determined in S160 that the shadow of the host vehicle 40 appears in the history image, an abnormality of the lenses 5f and 5r could be erroneously determined in the processing described later, so the lens abnormality determination process is temporarily terminated. If it is determined in S160 that the shadow of the host vehicle 40 does not appear in the history image, the process proceeds to S170.
Here, the processes of S140 and S160 are processes for realizing the function of the shadow determination unit 28. In these processes, the method for determining whether a shadow exists in a bird's-eye image is publicly known; for example, the shadow detection technique described in Japanese Patent Application Laid-Open No. 2010-237976 can be used.
That is, in these processes, for example, a partial region of the bird's-eye image is segmented based on hue and luminance, and from the resulting regions, two regions are extracted whose difference in hue is equal to or less than a predetermined threshold and whose difference in luminance is equal to or greater than a predetermined value.
Of the two extracted regions, the region with higher luminance is treated as a non-shadow region and the region with lower luminance as a shadow region, and the vector from the shadow region to the non-shadow region in the color information space is specified as the color information of the light source.
Next, the entire bird's-eye image is segmented based on hue and luminance, and when the hue difference between adjacent regions matches the hue of the light source within a predetermined range, the lower-luminance region of the adjacent regions is identified as a shadow.
Then, in S140 and S160, it is determined whether the shadow region thus identified corresponds to the body shape of the host vehicle 40, so that it can be determined whether the rear bird's-eye image 50r or the history image includes the shadow of the host vehicle 40.
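The actual shadow determination relies on the technique of JP 2010-237976 outlined above. The following is only a greatly simplified stand-in that checks whether a dark region overlaps an expected own-shadow mask; the thresholds and the expected_shadow_mask input are assumptions of this sketch, not features of the disclosed method.

```python
import cv2
import numpy as np

def own_shadow_present(birdseye_bgr: np.ndarray,
                       expected_shadow_mask: np.ndarray,
                       value_drop: int = 60,
                       overlap_ratio: float = 0.5) -> bool:
    """Greatly simplified stand-in for the shadow check in S140/S160.

    birdseye_bgr         -- bird's-eye image (BGR, 8-bit)
    expected_shadow_mask -- boolean mask of where the host vehicle's shadow
                            could plausibly fall (derived from the body shape;
                            an assumption of this sketch)
    """
    hsv = cv2.cvtColor(birdseye_bgr, cv2.COLOR_BGR2HSV)
    value = hsv[:, :, 2].astype(np.int16)
    # Pixels much darker than the median road brightness are shadow candidates.
    dark = value < (int(np.median(value)) - value_drop)
    overlap = np.logical_and(dark, expected_shadow_mask)
    # Report a shadow when the dark area covers enough of the expected region.
    return bool(overlap.sum() >= overlap_ratio * max(int(expected_shadow_mask.sum()), 1))
```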
Japanese Patent Laid-Open No. 2010-237976 also introduces a technique for detecting shadows by a procedure different from the above; in S140 and S160, that shadow detection technique may be used to determine the shadow of the host vehicle 40, or another shadow detection technique may be used.
Even if the rear bird's-eye image 50r includes a shadow of the host vehicle 40, it may be determined in S140 and S160 that the shadow of the host vehicle 40 does not exist when the shadow region is small and does not affect the abnormality determination of the lenses 5f and 5r.
Next, in S170, the difference between the history image and the rear bird's-eye image 50r determined in S140 and S160 to contain no shadow of the host vehicle 40 is calculated. This difference is calculated, for example, by extracting feature points from the rear bird's-eye image 50r and the history image by edge detection or the like and comparing the feature points with each other, thereby computing the difference (in other words, the degree of coincidence) between the rear bird's-eye image 50r and the history image.
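One possible reading of this feature-point comparison is sketched below using Canny edge maps; the Canny thresholds, the pixel tolerance, and the definition of the coincidence score are assumptions made for the sketch, not values given in the disclosure.

```python
import cv2
import numpy as np

def edge_coincidence(rear_birdseye_gray: np.ndarray,
                     history_gray: np.ndarray,
                     tolerance_px: int = 3) -> float:
    """Degree of coincidence between two 8-bit grayscale bird's-eye images,
    based on edge features.

    An edge pixel in one image counts as matched if the other image has an
    edge within `tolerance_px` pixels of it.
    """
    edges_a = cv2.Canny(rear_birdseye_gray, 50, 150) > 0
    edges_b = cv2.Canny(history_gray, 50, 150) > 0
    if not edges_a.any() and not edges_b.any():
        return 1.0  # nothing to compare; treat as coincident
    kernel = np.ones((2 * tolerance_px + 1, 2 * tolerance_px + 1), np.uint8)
    dilated_a = cv2.dilate(edges_a.astype(np.uint8), kernel) > 0
    dilated_b = cv2.dilate(edges_b.astype(np.uint8), kernel) > 0
    matched = (np.logical_and(edges_a, dilated_b).sum()
               + np.logical_and(edges_b, dilated_a).sum())
    total = edges_a.sum() + edges_b.sum()
    return float(matched) / float(total)

def difference(rear_birdseye_gray, history_gray) -> float:
    # The "difference" that would be compared against the abnormality
    # determination value in S180.
    return 1.0 - edge_coincidence(rear_birdseye_gray, history_gray)
```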
In subsequent S180, the difference calculated in S170 is compared with a preset abnormality determination value. If the difference is less than the abnormality determination value, it is determined that the images captured by the front camera 4f and the rear camera 4r are both normal, and the lens abnormality determination process is temporarily terminated.
If, in S180, the difference is equal to or greater than the abnormality determination value, it is determined that one of the images captured by the front camera 4f and the rear camera 4r is abnormal and that one of the lenses 5f and 5r has an abnormality such as breakage or dirt, and the process proceeds to S190.
The processes of S170 and S180 are processes for realizing the function of the abnormality determination unit 26.
In S190, execution of the display control process serving as the image composition unit 24 is prohibited, thereby prohibiting generation of the bird's-eye image around the host vehicle 40 and output of that bird's-eye image to the display control unit 18, and display of the bird's-eye image on the display unit 30 is stopped.
Finally, in S200, the occupant is notified of the abnormality of the lenses 5f and 5r via the abnormality notification unit 32, and the lens abnormality determination process is temporarily terminated. Since the lens abnormality determination process is executed as one of the main routines in the CPU, the process of S110 is started again after a predetermined time has elapsed.
As described above, in the imaging system 2 of the present embodiment, as illustrated in FIG. 5, when the vehicle 40 is traveling forward, the front bird's-eye images 50f generated from the images captured by the front camera 4f are sequentially stored in the memory 16 as history images.
The history images stored in the memory 16 are used, together with the latest bird's-eye images 50f and 50r generated from the images captured by the front camera 4f and the rear camera 4r, to generate a bird's-eye image of the surroundings of the vehicle 40.
As the vehicle 40 travels, a bird's-eye image of the same area as the bird's-eye image 50f(t1) stored in the memory 16 as a history image at time T1 is generated from an image captured by the rear camera 4r at time T2.
 そして、本実施形態では、時刻T2で生成された鳥瞰画像50r(t2)と、この鳥瞰画像50r(t2)に対応した履歴画像である鳥瞰画像50f(t1)とを比較し、両画像の差分が異常判定値以上であるとき、レンズ5f又は5rに異常があると判定する。 In this embodiment, the bird's-eye view image 50r (t2) generated at time T2 is compared with the bird's-eye view image 50f (t1) that is a history image corresponding to the bird's-eye view image 50r (t2), and the difference between the two images is compared. Is greater than or equal to the abnormality determination value, it is determined that the lens 5f or 5r is abnormal.
 Accordingly, as illustrated in FIG. 5, if the bird's-eye image 50r(t2) and the history image 50f(t1) both contain a marking 60 representing a pedestrian crossing, the edges of the marking 60 are detected in each image, and an abnormality of the lenses 5f and 5r is determined from the difference in their positions and shapes.
 Even if the marking 60 does not appear in the images, road-surface patterns and irregularities, color differences at the boundary between the road and its surroundings, and the like are extracted as feature points by edge detection, and whether the lenses 5f and 5r are abnormal is determined from these differences.
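 One way to realize the bookkeeping implied by FIG. 5, in which the front bird's-eye image stored at time T1 is retrieved once the rear camera covers the same area at time T2, is sketched below. Indexing the history by a travel distance derived from the wheel speed sensor is an assumption made for illustration; the embodiment only requires that history images be stored in accordance with the travel of the vehicle.

```python
from collections import deque

class BirdseyeHistory:
    """Minimal sketch of the memory 16: front bird's-eye images are stored
    together with the odometer reading at capture time, so the image of the
    same ground area can be retrieved once the vehicle has moved far enough
    for the rear camera to see it."""

    def __init__(self, capacity=100):
        self._entries = deque(maxlen=capacity)  # (odometer_m, image) pairs

    def store(self, odometer_m, front_birdseye):
        self._entries.append((odometer_m, front_birdseye))

    def matching_history(self, odometer_m, front_to_rear_offset_m, tolerance_m=0.2):
        """Return the stored front image whose ground area now lies within the
        rear bird's-eye view (the time-T1 image matched at time T2). The
        offset and tolerance values are placeholders."""
        target = odometer_m - front_to_rear_offset_m
        for stored_odometer, image in reversed(self._entries):
            if abs(stored_odometer - target) <= tolerance_m:
                return image
        return None
```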
 Therefore, according to the imaging system 2 of the present embodiment, the cameras subject to abnormality determination do not need to be arranged so that their imaging areas overlap. Nor is it necessary, as in Patent Document 1 described above, to extract road marking lines from the images captured by the front camera 4f and the rear camera 4r and to calculate their linearity and parallelism.
 For this reason, according to the present embodiment, by mounting the plurality of cameras 4f and 4r on the vehicle 40, abnormalities of the lenses 5f and 5r of these cameras can be determined even when there is no specific target such as a road marking line on the road surface on which the vehicle 40 travels.
 Moreover, this abnormality determination only requires comparing bird's-eye images of the same position generated from the images captured by the cameras 4f and 4r and obtaining the difference between them, so it can be performed more simply and in a shorter time than extracting road marking lines and calculating their linearity and parallelism.
 In the present embodiment, when either the rear bird's-eye image or the history image used for the abnormality determination of the lenses 5f and 5r contains a shadow of the host vehicle, the abnormality determination is not performed.
 Therefore, according to the present embodiment, when, for example, sunlight casts a shadow in front of or behind the vehicle 40, erroneously determining an abnormality of the lenses 5f and 5r under the influence of that shadow can be prevented or suppressed.
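 The shadow determination itself (S140, S150) is not detailed here; a very rough sketch of one conceivable check is shown below, in which a large dark region connected to the vehicle-side border of the bird's-eye image is treated as the host vehicle's shadow. The luminance threshold, the area ratio, and the choice of border are assumptions, not elements of the embodiment.

```python
import cv2
import numpy as np

def contains_own_shadow(birdseye_bgr, luminance_threshold=60, area_ratio=0.10):
    """Flag the image if a large connected dark region touches the border of
    the bird's-eye view nearest the vehicle body (assumed here to be the
    bottom row of the image)."""
    gray = cv2.cvtColor(birdseye_bgr, cv2.COLOR_BGR2GRAY)
    dark = (gray < luminance_threshold).astype(np.uint8)

    # Label connected dark regions and keep only those touching the bottom row.
    num_labels, labels = cv2.connectedComponents(dark)
    border_labels = set(labels[-1, :]) - {0}
    shadow_pixels = np.isin(labels, list(border_labels))

    return np.count_nonzero(shadow_pixels) > area_ratio * gray.size
```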
 In the present embodiment, when an abnormality of the lenses 5f and 5r is detected, display of the bird's-eye image on the display unit 30 by the image composition unit 24 is prohibited, and the abnormality of the lenses 5f and 5r is reported via the abnormality notification unit 32. Consequently, when an abnormality such as damage or dirt occurs on the lens of the front camera 4f or the rear camera 4r and the bird's-eye image of the surroundings of the vehicle 40 cannot be generated normally, display of an unclear bird's-eye image on the display unit 30 is suppressed. In addition, reporting the abnormality of the lenses 5f and 5r prompts the user to inspect the cameras 4f and 4r.
 Although one embodiment of the present disclosure has been described above, the present disclosure is not limited to the above embodiment and can be carried out with various modifications.
 For example, the above embodiment has been described on the assumption that the front camera 4f and the rear camera 4r are mounted on the vehicle 40; however, as long as a plurality of cameras are mounted on the vehicle 40, the present disclosure can determine a lens abnormality in the same manner as in the above embodiment.
 That is, in addition to the front camera 4f and the rear camera 4r, the vehicle 40 may be equipped with side cameras that capture images of the left and right sides of the vehicle, respectively.
 In this case, a front bird's-eye image and a rear bird's-eye image of the vehicle are generated using the front camera and one of the side cameras, and the front bird's-eye image is retained as a history image. The lens abnormality is then determined by comparing image regions of the rear bird's-eye image and the history image whose positions coincide.
 In this way as well, the lens abnormality of each camera can be determined in the same manner as in the above embodiment. Similarly, a lens abnormality can also be determined using the rear camera and one of the side cameras.
 In this case, bird's-eye images may also be generated using the front camera 4f, the rear camera 4r, and one or both of the side cameras, and the lens abnormality may be determined by extracting image regions whose positions coincide from the generated bird's-eye images and comparing them.
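 Such a configuration can be pictured as a generalization in which any pair of cameras whose bird's-eye views cover, or have covered, the same ground position is compared. The sketch below assumes caller-supplied helpers for finding the common region and computing the difference; these helpers and the threshold are placeholders rather than elements of the claims.

```python
from itertools import combinations

def check_all_camera_pairs(birdseye_by_camera, find_common_region, difference,
                           abnormality_value=0.6):
    """For every pair of cameras, extract the regions of their bird's-eye
    views (current or stored as history) that correspond to the same ground
    position and compare them. Returns the pairs whose difference suggests
    that one lens of the pair is abnormal."""
    suspects = []
    for cam_a, cam_b in combinations(birdseye_by_camera.keys(), 2):
        region_a, region_b = find_common_region(birdseye_by_camera[cam_a],
                                                birdseye_by_camera[cam_b])
        if region_a is None or region_b is None:
            continue  # these two views never cover the same ground area
        if difference(region_a, region_b) >= abnormality_value:
            suspects.append((cam_a, cam_b))
    return suspects
```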
 Next, in the above embodiment, the image processing unit 20 includes a microcomputer, and the functions of the viewpoint conversion unit 22, the image composition unit 24, the abnormality determination unit 26, and the shadow determination unit 28 are realized by the CPU of the microcomputer executing a program.
 However, the image processing unit 20 may be configured to realize some or all of these functions using hardware combining logic circuits, analog circuits, and the like, instead of or in addition to the microcomputer.
 In the above embodiment, the wheel speed sensor 6 and the steering angle sensor 8 are used so that the position and traveling direction of the vehicle 40 can be determined; however, a satellite positioning system may be used separately from these sensors.
 Next, a plurality of functions of one constituent element in the above embodiment may be realized by a plurality of constituent elements, and one function of one constituent element may be realized by a plurality of constituent elements. Likewise, a plurality of functions of a plurality of constituent elements may be realized by one constituent element, and one function realized by a plurality of constituent elements may be realized by one constituent element. Part of the configuration of the above embodiment may be omitted, and at least part of the configuration of the above embodiment may be added to or substituted for the configuration of another embodiment. All aspects included in the technical idea specified only by the wording recited in the claims are embodiments of the present disclosure.
 In addition to the imaging system 2 described above, the present disclosure can also be realized as a system including the imaging system 2 as a constituent element, or as a program for causing a computer to function as the image processing unit 20 of the imaging system 2. The present disclosure can further be realized in various other forms, such as a non-transitory tangible recording medium, for example a semiconductor memory, on which the program is recorded, or a lens abnormality detection method realized by the image processing unit 20.

Claims (4)

  1.  A lens abnormality detection device for an in-vehicle camera, comprising:
      a plurality of in-vehicle cameras (4f, 4r) configured to capture images of the surroundings of a vehicle;
      a viewpoint conversion unit (22, S110) configured to generate bird's-eye images by performing viewpoint conversion on the respective images captured by the plurality of in-vehicle cameras;
      a storage unit (16, S130) configured to sequentially store, as history images in accordance with travel of the vehicle, front bird's-eye images that cover the area ahead of the vehicle in its traveling direction, among the bird's-eye images generated by the viewpoint conversion unit; and
      an abnormality determination unit (26, S160-S180) configured to compare a rear bird's-eye image that covers the area behind the vehicle in its traveling direction, among the bird's-eye images generated by the viewpoint conversion unit, with a history image stored in the storage unit and corresponding to the rear bird's-eye image, and to determine that a lens abnormality has occurred in one of the plurality of in-vehicle cameras when a difference between the rear bird's-eye image and the history image is equal to or greater than an abnormality determination value.
  2.  The lens abnormality detection device for an in-vehicle camera according to claim 1, further comprising a shadow determination unit (28, S140, S150) configured to determine whether a bird's-eye image generated by the viewpoint conversion unit contains a shadow of the host vehicle,
      wherein the abnormality determination unit is configured to be prohibited from determining the lens abnormality using a bird's-eye image that the shadow determination unit has determined to contain the shadow of the host vehicle.
  3.  The lens abnormality detection device for an in-vehicle camera according to claim 1 or claim 2, further comprising an image composition unit (24) configured to generate a bird's-eye image of the surroundings of the host vehicle by combining the bird's-eye image generated by the viewpoint conversion unit with a history image stored in the storage unit,
      wherein the abnormality determination unit is configured to prohibit image composition by the image composition unit when it determines that the lens abnormality has occurred.
  4.  The lens abnormality detection device for an in-vehicle camera according to any one of claims 1 to 3, further comprising an abnormality notification unit (32, S190) configured to report that the lens abnormality has occurred when the abnormality determination unit determines that the lens abnormality has occurred.

Patent Citations (3)

Publication number	Priority date	Publication date	Assignee	Title
WO2011108278A1	2010-03-05	2011-09-09	Panasonic Corporation	Monitoring system, control method thereof, and semiconductor integrated circuit
JP2013246493A	2012-05-23	2013-12-09	Denso Corporation	Vehicle circumference image display controller and vehicle circumference image display control program
JP2014013994A	2012-07-04	2014-01-23	Denso Corporation	Vehicle peripheral image display control device and vehicle peripheral image display control program
