WO2011030699A1 - Vehicle periphery monitoring device - Google Patents
Vehicle periphery monitoring device
- Publication number
- WO2011030699A1 (PCT application PCT/JP2010/064866)
- Authority
- WO
- WIPO (PCT)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/306—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using a re-scaling of images
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
Definitions
- the present invention relates to a vehicle periphery monitoring device that performs a predetermined process on a captured image captured by a camera and displays the image on a display device in a vehicle interior.
- As a technique for performing predetermined image processing (distortion correction) so that a driver can easily view, on a display device in the vehicle interior, an image of the vehicle surroundings captured by a camera, the method described in Patent Document 1, for example, is known.
- In such a technique, the driver's visibility of the displayed image is improved by correcting the distortion. That is, by displaying an image (video) of the vehicle surroundings taken with a camera having a wide angle of view (for example, 160 degrees or more horizontally) as a single image, a wide area can be visually checked on the display screen and the presence or absence of pedestrians and the like can be recognized.
- The present invention has been made in view of the above-mentioned problems, and its purpose is to appropriately present to the driver the presence of nearby obstacles, people, and the like in accordance with the use situation (running situation) of the vehicle.
- The technical means taken in the present invention comprises: at least one imaging means for photographing the periphery of the vehicle; screen generating means for generating, based on a captured image captured by the imaging means, a display screen including a stop-time display image to be displayed when the vehicle is in a stopped state, and a display screen including a non-stop-time display image to be displayed when the vehicle is not in the stopped state, the non-stop-time display image showing an area around the vehicle that is narrower than the area shown by the stop-time display image; and display means for displaying the display screen generated by the screen generating means.
- Preferably, the non-stop-time display image is a partial image of the stop-time display image.
- More preferably, the stopped state is the state from when the shift position of the vehicle is changed to the reverse position until it is determined that the vehicle has started running, and the state in which the vehicle is not stopped is the state in which it is determined that the vehicle has started running from the stopped state.
- It is also more suitable if the device is configured so that, once the display screen including the non-stop-time display image has been displayed, the display screen including the stop-time display image is not displayed unless the shift position of the vehicle is changed to the reverse position again.
- When determining that the vehicle has started traveling, it is more preferable to determine that the vehicle has started traveling when it has traveled a predetermined distance after the shift position of the vehicle is changed to the reverse position, or when the vehicle speed becomes equal to or greater than a predetermined value.
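As a rough illustration of the switching logic summarized above, the following Python sketch models the transitions between a standby screen, the wide stop-time display, and the narrow non-stop-time display. The class name, the thresholds, and the input signals are illustrative assumptions, not part of the disclosure; the sketch follows the variant in which the wide view is not shown again until the shift lever re-enters the reverse position.

```python
from enum import Enum, auto

class ScreenMode(Enum):
    STANDBY = auto()     # e.g. navigation screen
    STOP_VIEW = auto()   # wide stop-time display: central + right + left images
    DRIVE_VIEW = auto()  # narrow non-stop-time display: central image only

class DisplayModeSelector:
    """Hypothetical state machine for the switching rules summarized above."""

    def __init__(self, start_distance_m=0.5, start_speed_kmh=3.0):
        # Assumed thresholds for deciding that the vehicle "started traveling".
        self.start_distance_m = start_distance_m
        self.start_speed_kmh = start_speed_kmh
        self.mode = ScreenMode.STANDBY
        self.distance_since_reverse_m = 0.0

    def update(self, shift_is_reverse, delta_distance_m, speed_kmh):
        if not shift_is_reverse:
            # Outside the reverse position the monitor waits (e.g. navigation screen).
            self.mode = ScreenMode.STANDBY
            self.distance_since_reverse_m = 0.0
            return self.mode

        if self.mode == ScreenMode.STANDBY:
            # Shift changed to reverse: show the wide stop-time display screen.
            self.mode = ScreenMode.STOP_VIEW
            self.distance_since_reverse_m = 0.0

        if self.mode == ScreenMode.STOP_VIEW:
            self.distance_since_reverse_m += delta_distance_m
            if (self.distance_since_reverse_m >= self.start_distance_m
                    or speed_kmh >= self.start_speed_kmh):
                # Vehicle started traveling: switch to the narrow display.
                self.mode = ScreenMode.DRIVE_VIEW

        # Once in DRIVE_VIEW the wide screen is not shown again until the shift
        # position leaves and re-enters the reverse position.
        return self.mode
```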
- FIG. 1 is a schematic view of a vehicle 100 including a vehicle periphery monitoring device 50 according to the present invention.
- the vehicle 100 is provided with a rear camera 11 that captures a scene behind the vehicle 100.
- The rear camera 11 is provided at the rear of the vehicle 100 and is oriented slightly downward with respect to the horizontal direction. That is, the optical axis of the rear camera 11 forms an acute angle with the road surface on which the vehicle 100 stands.
- the rear camera 11 is installed with a depression angle of about 30 degrees toward the rear of the vehicle 100, and can photograph a region up to about 8 m behind the vehicle 100.
- the rear camera 11 is a digital camera that incorporates an image sensor such as a CCD (charge coupled device) or a CIS (CMOS image sensor) and outputs information captured by the image sensor in real time as moving image information.
- the rear camera 11 is preferably configured using, for example, a wide-angle lens (for example, an angle of view of 160 ° or more) or a fish-eye lens.
- the vehicle 100 includes an electronic control unit 20 illustrated in FIG.
- the electronic control unit 20 is composed of, for example, a microcomputer having a ROM and a RAM.
- A captured image (video) captured by the rear camera 11 is transmitted to the electronic control unit 20 and, after predetermined processing described later is performed by the electronic control unit 20, is displayed on the monitor 200 (display means) in the vehicle interior.
- the monitor 200 is also used as a display device for the navigation system.
- The vehicle 100 is also provided with a vehicle speed sensor (not shown) that detects the speed of the vehicle 100. The wheel pulses detected by the vehicle speed sensor and the vehicle speed are transmitted to the electronic control unit 20 by predetermined communication means.
- FIG. 2 shows the vehicle periphery monitoring device 50, and the captured image captured by the rear camera 11 is transmitted to the electronic control unit 20.
- the electronic control unit 20 includes a screen generation unit 21 and an output unit 22.
- the screen generation unit 21 generates a display screen based on the captured image captured by the rear camera 11.
- the generated display screen is transmitted from the screen generation unit 21 to the monitor 200 via the output unit 22 and displayed on the monitor 200.
- FIG. 3 is a diagram showing a display screen 210 displayed on the monitor 200 by the vehicle periphery monitoring device 50 according to the present invention.
- When the shift position of the vehicle 100 is changed to the reverse position, the display is switched to the display screen 210 shown in FIG. 3.
- The captured image captured by the rear camera 11 is displayed as a central image 211, a substantially parallelogram-shaped right image 212 arranged on the right side of the central image 211, and a substantially parallelogram-shaped left image 213 arranged on the left side of the central image 211.
- the center image 211 and the right image 212 and the center image 211 and the left image 213 are partitioned by a dividing line 250.
- A predetermined gap is provided between the central image 211 and the right image 212 and between the central image 211 and the left image 213. Displaying the center image 211 separated from the right image 212 and the left image 213 in this way makes it easy to distinguish the rear from the sides.
- the center image 211 and the right image 212 have continuous image contents (maintaining continuity of the image contents) with the dividing line 250 interposed therebetween. Further, adjacent sides of the center image 211 and the right image 212 are parallel to each other. The same applies to the center image 211 and the left image 213.
- the generation of the center image 211, the right image 212, and the left image 213 and the arrangement of these images on the display screen are due to the function of the screen generation unit 21.
- The statement that the image content is continuous means that, when the center image 211 and the right image 212 (or the left image 213) are joined, the image values on their opposing sides coincide (or are continuous). Even when the image values on the opposing sides of the central image 211 and the right image 212 (or the left image 213) do not exactly match (or continue), the image content is still said to be continuous if the images would match (or continue) in the case where the central image 211 and the right image 212 (or the left image 213) are arranged in the horizontal direction without a gap and the boundary between them is masked with a dividing line of a predetermined width.
- The display screen 210 produced by the vehicle periphery monitoring device 50 in this embodiment lets the rear of the vehicle and the conditions to its left and right be viewed as if through a three-sided mirror in which the left and right mirrors are held at a predetermined angle with respect to the central mirror.
- The central image 211 is an image in the traveling direction when the steering wheel is substantially straight, and the right image 212 and the left image 213 show the surrounding conditions to the right and left, respectively, of the content displayed in the central image 211.
- The positional relationship among the central image 211, the right image 212, and the left image 213 corresponds to the actual positional relationship: the front image (the central image 211) is displayed as a rectangle, and the images corresponding to its left and right (the right image 212 and the left image 213) are displayed as parallelograms. It is therefore possible to confirm safety over a wide range behind the vehicle 100 and to easily distinguish whether an obstacle or a person is at the rear or to the side.
- On the display screen 210, a figure simulating the vehicle 100 and an icon 260 composed of two triangles and a trapezoid are displayed. This allows the driver to recognize which areas around the vehicle 100 the center image 211, the right image 212, and the left image 213 show.
- a warning text is displayed at the bottom of the display screen 210.
- An index line 300, which serves as a guide for the driver while driving, is superimposed on the display.
- Index lines extending in the horizontal direction of the screen indicate predicted positions measured from the rear end of the vehicle 100. In this embodiment, they indicate, from the bottom of the screen upward, positions 50 cm, 1 m, and 3 m from the rear end of the vehicle 100. The two left and right lines extending substantially in the vertical direction of the display screen 210 indicate the width of the vehicle 100, or that width increased or decreased by a predetermined value.
- the index line 300 may be a fixed line or may be linked to the steering angle.
- The index line 300 can also be displayed three-dimensionally, with at least one of a shadow portion and a side portion. This makes it easier for the driver to recognize that the index line 300 lies along the road surface shown in the central image 211.
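A minimal sketch of the guide-line overlay is given below. It assumes that the mapping from ground distances to pixel rows and from the vehicle-width corridor to pixel columns is supplied by an external camera calibration (the callables `dist_to_row` and `width_to_cols` are hypothetical), and it draws the 50 cm / 1 m / 3 m lines and two width lines with plain NumPy indexing.

```python
import numpy as np

def draw_index_lines(frame, dist_to_row, width_to_cols,
                     distances_m=(0.5, 1.0, 3.0), color=(0, 255, 255)):
    """Superimpose distance and vehicle-width guide lines on the central image.

    frame         : HxWx3 uint8 image.
    dist_to_row   : hypothetical callable mapping a distance behind the rear
                    bumper [m] to a pixel row (from camera calibration).
    width_to_cols : hypothetical callable mapping a distance [m] to the
                    (left_col, right_col) of the vehicle-width corridor there.
    """
    out = frame.copy()
    h, w = out.shape[:2]

    # Horizontal index lines at 0.5 m, 1 m and 3 m behind the rear end.
    for d in distances_m:
        row = int(np.clip(dist_to_row(d), 0, h - 1))
        left, right = width_to_cols(d)
        out[row, int(left):int(right)] = color

    # Two roughly vertical lines marking the vehicle width: connect the corridor
    # edges sampled at the nearest and farthest guide distances.
    r_near, r_far = dist_to_row(distances_m[0]), dist_to_row(distances_m[-1])
    denom = r_near - r_far
    if abs(denom) > 1e-6:
        for side in (0, 1):
            c_near = width_to_cols(distances_m[0])[side]
            c_far = width_to_cols(distances_m[-1])[side]
            for r in range(int(min(r_near, r_far)), int(max(r_near, r_far))):
                t = (r - r_far) / denom
                c = int(round(c_far + t * (c_near - c_far)))
                if 0 <= r < h and 0 <= c < w:
                    out[r, c] = color
    return out
```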
- FIG. 4 is a real image of a display screen displayed by the vehicle periphery monitoring device according to the embodiment of the present invention.
- the screen generation unit 21 first performs distortion correction processing described in, for example, Japanese Patent Application Laid-Open No. 2009-12652 on the captured image captured by the rear camera 11.
- An outline of the distortion correction processing described in Japanese Patent Application Laid-Open No. 2009-12652, as applied to the present embodiment, is as follows. The captured image is kept substantially fixed in the first-axis direction (the horizontal direction of the display screen), while it is enlarged in the second-axis direction (the vertical direction of the display screen) with an enlargement ratio that depends on the distance from the second axis. This enlargement ratio increases non-linearly as the distance from the second axis increases, that is, as the absolute value of the display-screen horizontal coordinate increases.
- The image obtained by applying this distortion correction processing to the captured image from the rear camera 11 is denoted IM0. The origin O is taken at the center of the captured image (image sensor).
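To make the described correction concrete, the sketch below applies a toy version of it: each column of the image is stretched vertically by a factor that grows with its distance from the vertical center line, while the horizontal direction is left unchanged. The quadratic magnification profile and the gain `k` are assumptions for illustration and do not reproduce the exact processing of JP 2009-12652.

```python
import numpy as np

def vertical_stretch_correction(img, k=0.6):
    """Toy version of the correction described above: the horizontal axis is
    left untouched, while each column is stretched vertically by a factor that
    grows with its distance from the vertical center line. The quadratic
    profile and the gain k are illustrative assumptions."""
    h, w = img.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    out = np.zeros_like(img)

    xs = np.arange(w)
    # Enlargement ratio per column: 1 at the center, increasing non-linearly
    # with the absolute horizontal coordinate (distance from the second axis).
    mag = 1.0 + k * ((np.abs(xs - cx) / cx) ** 2)

    rows = np.arange(h)
    for x in xs:
        # Inverse mapping: stretch the column about the horizontal center line.
        src_rows = np.clip(np.round(cy + (rows - cy) / mag[x]), 0, h - 1).astype(int)
        out[:, x] = img[src_rows, x]
    return out
```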
- the center image 211, the right image 212, and the left image 213 are generated as follows.
- A rectangular area of IM0 shown in FIG. 6(b), whose x coordinate satisfies −t ≤ x ≤ t (t > 0) and whose y coordinate satisfies −H/2 ≤ y ≤ H/2 (H > 0), is taken as IM1 and becomes the center image 211. In other words, the central image 211 is a rectangular partial image (IM1) of IM0 with a vertical side length of H and a horizontal side length of 2t, as shown in FIG. 6(b).
- The right image 212 is generated as follows. First, as shown in FIG. 6(b), the region IM2 of IM0 whose x coordinate satisfies t ≤ x ≤ t + W is specified. That is, IM2 is a partial image of IM0 and is adjacent to IM1 within IM0; IM2 is therefore a rectangle with a vertical side length of H and a horizontal side length of W, as shown in FIG. 6(b). IM2 is then deformed to generate IM2′. This image deformation means that the pixel value at the coordinates (X0, Y0) in IM2 is set as the pixel value at the coordinates (X1, Y1) in IM2′.
- As a result, the right image 212 becomes an image that is lifted, by shearing in the y-axis direction, increasingly toward the right side (the direction in which the x coordinate increases) relative to the original image IM2 (a part of IM0).
- the left image 213 is also generated as follows.
- First, the region IM3 of IM0 whose x coordinate satisfies −(t + W) ≤ x ≤ −t is specified. That is, IM3 is a partial image of IM0 and is adjacent to IM1 within IM0. IM3 is therefore a rectangle with a vertical side length of H and a horizontal side length of W, as shown in FIG. 6(b).
- the image of IM3 is deformed to generate IM3 ′.
- This image deformation means that, if the pixel value at the coordinates (X0, Y0) in IM3 is p, the pixel value at the coordinates (X1, Y1) in IM3′ is set to p.
- As a result, the left image 213 becomes an image that is lifted, by shearing in the y-axis direction, increasingly toward the left side (the direction in which the x coordinate decreases) relative to the original image IM3 (a part of IM0).
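The extraction of IM1, IM2, and IM3 and the y-direction shear of the side regions can be sketched as follows. The shear slope and the zero-filled (black) pixels left behind are illustrative choices; the actual deformation in the embodiment is defined by the coordinate mapping described above.

```python
import numpy as np

def split_and_shear(im0, t, W, shear=0.35):
    """Illustrative construction of the center/right/left images from IM0.

    im0   : HxWx3 corrected image (origin O at the image center).
    t, W  : half-width of the central region and width of each side region, in pixels.
    shear : assumed shear slope; each column of a side image is shifted upward
            in proportion to its distance from the edge adjacent to the center image.
    """
    w = im0.shape[1]
    cx = w // 2
    assert 2 * (t + W) <= w, "central and side regions must fit inside IM0"

    im1 = im0[:, cx - t: cx + t]        # IM1 -> center image 211 (H x 2t)
    im2 = im0[:, cx + t: cx + t + W]    # IM2, right of IM1 (H x W)
    im3 = im0[:, cx - t - W: cx - t]    # IM3, left of IM1 (H x W)

    def shear_up(img, outward_is_right):
        """Shift each column upward by shear * (distance from the inner edge)."""
        hh, ww = img.shape[:2]
        out = np.zeros_like(img)            # vacated pixels stay black
        for j in range(ww):
            d = j if outward_is_right else (ww - 1 - j)
            dy = int(round(shear * d))
            if dy < hh:
                out[:hh - dy, j] = img[dy:, j]
        return out

    right_212 = shear_up(im2, outward_is_right=True)   # lifted toward the right
    left_213 = shear_up(im3, outward_is_right=False)   # lifted toward the left
    return im1, right_212, left_213
```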
- The screen generation unit 21 arranges the right image 212 and the left image 213 generated in this way to the right and left of the central image 211, respectively, and thereby generates the display screen 210 of FIGS. 3 and 4.
- When the shift position of the vehicle 100 is changed to the reverse position, the display is switched to the display screen 210 shown in FIGS. 3 and 4. That is, the display screen 210 shown in FIGS. 3 and 4 is a display screen including an image (stop-time display image) to be displayed while the vehicle 100 is stopped.
- Otherwise, the vehicle periphery monitoring device shifts to a standby mode in which it waits without providing a periphery-monitoring display; in this standby mode, for example, a navigation screen is displayed.
- In other words, the center image 211, the right image 212, and the left image 213 together constitute the stop-time display image.
- FIG. 5 shows a display screen including the image displayed when the vehicle 100 is not stopped (the non-stop-time display image), that is, when the vehicle 100 starts moving backward (traveling) after the driver has confirmed safety. In this embodiment, the central image 211 serves as the non-stop-time display image. The screen generation unit 21 thus generates both a display screen including the stop-time display image and a display screen including the non-stop-time display image. Since the central image 211, the right image 212, and the left image 213 together form the stop-time display image while the central image 211 alone forms the non-stop-time display image, the area around the vehicle 100 shown by the non-stop-time display image is narrower than the area around the vehicle 100 shown by the stop-time display image. That is, the stop-time display image is a wide-angle image and the non-stop-time display image is a narrow-angle image.
- In this embodiment, the non-stop-time display image is a partial image of the stop-time display image.
- The display screen including the non-stop-time display image is the display screen 220 shown in FIG. 5.
- In the display screen 220, the central image 211 is displayed enlarged to occupy the space freed by not displaying the right image 212 and the left image 213. Alternatively, the central image 211 need not be enlarged; an image showing only the central image 211 from the display screens of FIGS. 3 and 4, without the right image 212 and the left image 213, may be used.
- When switching from the display screen including the stop-time display image (display screen 210) to the display screen including the non-stop-time display image (display screen 220), the display may be switched immediately to the enlarged view of the central image 211. In this embodiment, however, the switching is performed as follows. First, the right image 212 and the left image 213 are deleted from the display screen 210. Then, as schematically illustrated in FIG. 7, the central image 211a (the central image 211 immediately after the right image 212 and the left image 213 are deleted) is gradually enlarged through the central image 211b and the central image 211c to the central image 211d. The enlargement may be continuous or stepwise. By presenting the switching process to the driver in this way, the driver can easily recognize which part of the display screen (image) before switching corresponds to the display screen (image) after switching.
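A minimal sketch of the stepwise enlargement (211a through 211d) is shown below; the number of steps and the nearest-neighbour scaling are arbitrary assumptions.

```python
import numpy as np

def zoom_steps(center_img, target_h, target_w, n_steps=4):
    """Yield intermediate frames that enlarge the central image step by step,
    as in the 211a -> 211d transition. The step count and the nearest-neighbour
    scaling are illustrative choices."""
    h, w = center_img.shape[:2]
    for i in range(1, n_steps + 1):
        a = i / n_steps
        hh = int(round(h + a * (target_h - h)))
        ww = int(round(w + a * (target_w - w)))
        rows = (np.arange(hh) * h / hh).astype(int)   # nearest-neighbour resize
        cols = (np.arange(ww) * w / ww).astype(int)
        yield center_img[rows][:, cols]
```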
- The start of reverse travel of the vehicle 100 is determined, for example, when the speed of the vehicle 100 detected by the vehicle speed sensor exceeds a predetermined speed, or when the electronic control unit 20 determines, from the accumulated count of wheel pulses supplied by the vehicle speed sensor serving as movement-distance detecting means, that the vehicle has traveled a predetermined distance from the stopped state.
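For illustration, the travelled distance can be estimated from the accumulated wheel-pulse count roughly as follows; the wheel circumference and pulses-per-revolution values are made-up example parameters.

```python
def traveled_distance_m(pulse_count, wheel_circumference_m=1.9, pulses_per_rev=48):
    """Distance travelled since the shift lever entered reverse, estimated from
    the accumulated wheel-pulse count. Circumference and pulses-per-revolution
    are made-up example parameters, not values taken from the disclosure."""
    return pulse_count * wheel_circumference_m / pulses_per_rev

# The vehicle is considered to have "started traveling" once this estimate
# exceeds a predetermined distance (or once the measured speed exceeds a
# predetermined value), as described above.
```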
- When the shift position is changed out of the reverse position, the vehicle periphery monitoring device 50 stands by; for example, a navigation screen is displayed.
- When the vehicle 100 stops again, the display may be returned to the display screen including the stop-time display image (display screen 210). In that case, the driver may be notified by a sound or voice that the display has switched to the display screen including the stop-time display image.
- In this way, by automatically switching to an image of an appropriate range, instead of an image with a wider angle of view than necessary, when the vehicle travels backwards, the driver can monitor the surroundings appropriately according to the situation.
- Moreover, the driver does not need to perform any switching operation, so no extra burden is placed on the driver.
- The vehicle 100 may not be equipped with lamps that illuminate the areas shown in the right image 212 and the left image 213. Therefore, when it is determined to be nighttime (or when the vehicle 100 is in a low-illuminance location such as indoors), it is preferable to improve visibility by applying luminance correction processing so that, of the center image 211, the right image 212, and the left image 213, only the right image 212 and the left image 213 are displayed more brightly. The nighttime determination may be made based on luminance information of the entire captured image.
- Alternatively, the luminance correction processing of the right image 212 may be performed based on the luminance information of the right image 212 itself or of the partial image (IM2) from which it is generated, and the luminance correction processing of the left image 213 may be performed in the same way. When the luminance information is equal to or less than a predetermined threshold value, it is determined to be nighttime and the right image 212 and the left image 213 are displayed brightly. The nighttime determination may also be based on time information instead of luminance information, or on the driver turning on the headlights or the clearance lamps.
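A hedged sketch of this night-time correction is given below: if the mean luminance of a side image falls below a threshold, only the right and left images are brightened by a simple gain. The threshold, the gain, and the BT.601 luma weights are illustrative choices.

```python
import numpy as np

def brighten_side_images_if_dark(center, right, left, night_threshold=60, gain=1.8):
    """If a side image looks dark (mean luminance below a threshold), brighten
    only the right and left images; the center image is left untouched.
    Threshold, gain and the BT.601 luma weights are illustrative choices."""
    def mean_luma(img):
        return float(np.dot(img[..., :3].reshape(-1, 3).mean(axis=0),
                            [0.299, 0.587, 0.114]))

    def brighten(img):
        return np.clip(img.astype(np.float32) * gain, 0, 255).astype(np.uint8)

    if mean_luma(right) <= night_threshold or mean_luma(left) <= night_threshold:
        right, left = brighten(right), brighten(left)
    return center, right, left
```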
- In the above embodiment, the electronic control unit 20 is provided separately from the rear camera 11, but the configuration is not limited to this; the electronic control unit 20 may be built into the rear camera 11 so that the two are integrated.
- the present invention can of course be applied when the vehicle 100 moves forward.
- a front camera is provided instead of the rear camera 11, and the shift position may be determined as a forward position (D position or the like) instead of the reverse position.
- In the above embodiment, the captured image from the single rear camera 11 is divided and displayed as the center image 211, the right image 212, and the left image 213; however, images captured by a plurality of cameras may be used instead. For example, a right side camera that captures the right rear of the vehicle 100 and a left side camera that captures the left rear of the vehicle 100 may be provided in addition to the rear camera 11, and the captured image of the rear camera 11 may be displayed as the central image 211, the video of the right side camera as the right image 212, and the video of the left side camera as the left image 213.
- In other words, the present invention is not limited to dividing the image from a single camera; even when images from a plurality of cameras are displayed, the positional relationship between the images and their positional and directional relationship to the vehicle can be presented accurately to the driver.
- the imaging ranges of the rear camera 11, the right side camera, and the left side camera may or may not overlap.
- the present invention can also be applied to the case where the front camera monitors the front periphery of the vehicle 100.
- the present invention can also be applied to the case where the side periphery of the vehicle 100 is monitored by a side camera that monitors the side of the vehicle 100.
- In the above embodiment, the right image 212 and the left image 213 have a parallelogram shape in which the vertical side not adjacent to the central image 211 sits higher than the central image 211. Modifications are shown in FIGS. 8 to 10. In FIG. 8, the right image 212 and the left image 213 are rectangular. In FIG. 9, they are trapezoidal. In FIG. 10, they have parallelogram shapes in which the vertical sides not adjacent to the center image 211 sit lower than the center image 211. The right image 212 and the left image 213 can also be triangular or polygonal.
- The right image 212 and the left image 213 may be generated by applying the image deformation described above in accordance with the shape in which they are displayed on the display screen, or, without performing the image deformation, images simply cut out in accordance with the shape displayed on the display screen may be used.
- the central image 211 has a rectangular shape, but is not limited thereto, and may be a hexagon as shown in FIG. 11 or other polygons.
- the adjacent sides of the center image 211 and the right image 212 are parallel to each other and have a predetermined gap.
- the image content is continuous across the gap (partition line).
- In the above embodiment, the center image 211, the right image 212, and the left image 213 are displayed while the vehicle 100 is stopped, and only the center image 211 is displayed while the vehicle 100 is traveling. Alternatively, an image from a camera with a wide angle of view (for example, 160°) may be displayed while the vehicle is stopped, and an image with a narrow angle of view (for example, equivalent to about 130°) obtained by extracting a part (partial image) of that wide-angle image may be displayed while the vehicle is traveling.
- Alternatively, two cameras, one with a wide angle of view and one with a narrow angle of view, may be provided, and the display may be switched between them in the same manner.
- Further, while the vehicle is stopped, the image of the rear camera 11 may be displayed together with at least one of the images of a right side camera that captures the right rear of the vehicle 100 and a left side camera that captures the left rear of the vehicle 100, and only the image of the rear camera 11 may be displayed while the vehicle is traveling.
- In the above embodiment, the right image 212 and the left image 213 are displayed brightly at night by applying luminance correction processing. Instead, image processing that converts these images to monochrome, inverts them (negative/positive inversion), or converts them to monochrome and then inverts them may be performed. Converting to monochrome yields an image that is as bright as possible even when the subject is dark, and inversion yields a bright image when the subject is dark. Since the sensitivity adjustment of the human eye is quicker for light adaptation than for dark adaptation, and the eye's focus adjustment responds better to a brighter target, this is also preferable from an ergonomic point of view. When a color image is negative/positive inverted, however, its colors deviate greatly from the actual colors; therefore, when the captured image is a color image, it is desirable to convert it to grayscale and then apply negative/positive inversion in order to reduce the sense of strangeness.
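The three alternative presentations mentioned above can be sketched as follows; the BT.601 grayscale conversion is an assumed, conventional choice.

```python
import numpy as np

def night_view_variants(img):
    """Alternative dark-scene presentations mentioned above: monochrome,
    negative/positive inversion, and grayscale followed by inversion (which
    avoids the unnatural colours of inverting a colour image)."""
    gray = np.dot(img[..., :3], [0.299, 0.587, 0.114]).astype(np.uint8)
    mono = np.stack([gray] * 3, axis=-1)                  # monochrome
    inverted = 255 - img                                  # negative/positive inversion
    gray_inverted = np.stack([255 - gray] * 3, axis=-1)   # grayscale, then inverted
    return mono, inverted, gray_inverted
```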
- In the embodiment described above, when the vehicle 100 stops again, the display is returned to the display screen including the wide-angle image (stop-time display image), that is, display screen 210. Alternatively, once the display screen including the narrow-angle image (non-stop-time display image) has been shown, the display need not be returned to the display screen including the wide-angle image (display screen 210 shown in FIGS. 3 and 4) even if the vehicle 100 subsequently stops. The former operation mode (this embodiment) and the latter operation mode (the alternative) may be made switchable by the driver using a switch.
- In the above embodiment, the captured image from the camera is used without viewpoint conversion (without changing the viewpoint position or line-of-sight direction). However, the present invention is not limited to this, and a viewpoint-converted captured image may be used. For example, a viewpoint-converted image processed so as to increase or decrease the depression angle of the captured image within a range of gaze directions forming an acute angle with the horizontal direction (that is, an image showing the vehicle periphery at an acute depression angle), or a viewpoint-converted image obtained by combining such processes, may be used.
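As an illustration of a depression-angle change by viewpoint conversion, the sketch below rotates a virtual pinhole camera about its horizontal axis and warps the image with the induced homography H = K R K⁻¹. The focal length and the pinhole model are assumptions; a real implementation would use calibrated camera intrinsics and proper interpolation.

```python
import numpy as np

def change_depression_angle(img, delta_deg, f=500.0):
    """Rotate a virtual pinhole camera about its horizontal axis by delta_deg
    and warp the HxWx3 image with the induced homography H = K R K^-1
    (small angles assumed). Focal length f and the pinhole model are assumptions."""
    h, w = img.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    a = np.deg2rad(delta_deg)
    K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1.0]])
    R = np.array([[1, 0, 0],
                  [0, np.cos(a), -np.sin(a)],
                  [0, np.sin(a),  np.cos(a)]])
    Hinv = np.linalg.inv(K @ R @ np.linalg.inv(K))

    # Inverse warp with nearest-neighbour sampling.
    ys, xs = np.mgrid[0:h, 0:w]
    dst = np.stack([xs, ys, np.ones_like(xs)], axis=-1).reshape(-1, 3).T
    src = Hinv @ dst
    src_x = np.round(src[0] / src[2]).astype(int)
    src_y = np.round(src[1] / src[2]).astype(int)
    valid = (src_x >= 0) & (src_x < w) & (src_y >= 0) & (src_y < h)

    out = np.zeros_like(img)
    out.reshape(-1, 3)[valid] = img[src_y[valid], src_x[valid]]
    return out
```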
- the present invention can be used for a vehicle periphery monitoring device that performs a predetermined process on a captured image of a vehicle periphery captured by a camera and displays the image on a display device in a vehicle interior.
Description
20: Electronic control unit
21: Screen generation unit (screen generating means)
50: Vehicle periphery monitoring device
100: Vehicle
200: Monitor (display means)
210: Display screen
220: Display screen
211: Central image (first image)
212: Right image (second image)
213: Left image (second image)
250: Dividing line
260: Icon
300: Index line
Claims (6)
- 1. A vehicle periphery monitoring device comprising: at least one imaging means for photographing the periphery of a vehicle; screen generating means for generating, based on a captured image captured by the imaging means, a display screen including a stop-time display image to be displayed when the vehicle is in a stopped state, and a display screen including a non-stop-time display image, the non-stop-time display image being an image to be displayed when the vehicle is not in the stopped state and showing an area around the vehicle that is narrower than the area around the vehicle shown by the stop-time display image; and display means for displaying the display screen generated by the screen generating means.
- 2. The vehicle periphery monitoring device according to claim 1, wherein the imaging means is a single imaging means and the non-stop-time display image is a partial image of the stop-time display image.
- 3. The vehicle periphery monitoring device according to claim 1 or 2, wherein the stopped state is the state from when the shift position of the vehicle is changed to the reverse position until it is determined that the vehicle has started traveling, and the state in which the vehicle is not stopped is the state in which it is determined that the vehicle has started traveling from the stopped state.
- 4. The vehicle periphery monitoring device according to claim 3, wherein, once the display screen including the non-stop-time display image has been displayed, the display screen including the stop-time display image is not displayed unless the shift position of the vehicle is changed to the reverse position again.
- 5. The vehicle periphery monitoring device according to claim 3 or 4, wherein it is determined that the vehicle has started traveling when the vehicle has traveled a predetermined distance after the shift position of the vehicle is changed to the reverse position.
- 6. The vehicle periphery monitoring device according to claim 3 or 4, wherein it is determined that the vehicle has started traveling when the speed of the vehicle is equal to or greater than a predetermined value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080038865.7A CN102481876B (zh) | 2009-09-11 | 2010-09-01 | 车辆周边监视装置 |
JP2011530811A JPWO2011030699A1 (ja) | 2009-09-11 | 2010-09-01 | 車両周辺監視装置 |
US13/392,171 US20120154590A1 (en) | 2009-09-11 | 2010-09-01 | Vehicle surrounding monitor apparatus |
EP10815296.8A EP2476588B1 (en) | 2009-09-11 | 2010-09-01 | Vehicle surrounding monitor apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009210053 | 2009-09-11 | | |
JP2009-210053 | 2009-09-11 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011030699A1 true WO2011030699A1 (ja) | 2011-03-17 |
Family
ID=43732372
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/064866 WO2011030699A1 (ja) | 2009-09-11 | 2010-09-01 | 車両周辺監視装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120154590A1 (ja) |
EP (1) | EP2476588B1 (ja) |
JP (1) | JPWO2011030699A1 (ja) |
CN (1) | CN102481876B (ja) |
WO (1) | WO2011030699A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9242602B2 (en) * | 2012-08-27 | 2016-01-26 | Fotonation Limited | Rearview imaging systems for vehicle |
JP2017208750A (ja) * | 2016-05-20 | 2017-11-24 | ローム株式会社 | 映像監視装置、映像表示システム、及び車両 |
WO2018029927A1 (ja) * | 2016-08-09 | 2018-02-15 | 株式会社Jvcケンウッド | 表示制御装置、表示装置、表示制御方法及びプログラム |
JP6939494B2 (ja) * | 2017-12-11 | 2021-09-22 | トヨタ自動車株式会社 | 画像表示装置 |
JP7091955B2 (ja) * | 2018-09-06 | 2022-06-28 | トヨタ自動車株式会社 | 車両用周辺表示装置 |
JP7159801B2 (ja) * | 2018-11-15 | 2022-10-25 | トヨタ自動車株式会社 | 車両用電子ミラーシステム |
JP7051667B2 (ja) * | 2018-11-26 | 2022-04-11 | 本田技研工業株式会社 | 車載装置 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3475507B2 (ja) * | 1994-08-08 | 2003-12-08 | 日産自動車株式会社 | 車両用周囲モニタ装置 |
JPH10244891A (ja) * | 1997-03-07 | 1998-09-14 | Nissan Motor Co Ltd | 駐車補助装置 |
US6512858B2 (en) * | 1998-07-21 | 2003-01-28 | Foveon, Inc. | Image scanning circuitry with row and column addressing for use in electronic cameras |
JP3298851B2 (ja) * | 1999-08-18 | 2002-07-08 | 松下電器産業株式会社 | 多機能車載カメラシステムと多機能車載カメラの画像表示方法 |
JP4153202B2 (ja) * | 2001-12-25 | 2008-09-24 | 松下電器産業株式会社 | 映像符号化装置 |
JP3909251B2 (ja) * | 2002-02-13 | 2007-04-25 | アルパイン株式会社 | 視線を用いた画面制御装置 |
JP4195966B2 (ja) * | 2002-03-05 | 2008-12-17 | パナソニック株式会社 | 画像表示制御装置 |
US20050168441A1 (en) * | 2002-11-05 | 2005-08-04 | Fujitsu Limited | Display control device, display control method, computer product |
JP2004257837A (ja) * | 2003-02-25 | 2004-09-16 | Olympus Corp | ステレオアダプタ撮像システム |
JP2005223524A (ja) * | 2004-02-04 | 2005-08-18 | Nissan Motor Co Ltd | 車両周辺監視装置 |
US20080150709A1 (en) * | 2004-02-20 | 2008-06-26 | Sharp Kabushiki Kaisha | Onboard Display Device, Onboard Display System and Vehicle |
DE102004015806A1 (de) * | 2004-03-29 | 2005-10-27 | Smiths Heimann Biometrics Gmbh | Verfahren und Anordnung zur Aufnahme interessierender Bereiche von beweglichen Objekten |
JP2005346648A (ja) * | 2004-06-07 | 2005-12-15 | Denso Corp | 視界支援装置およびプログラム |
JP2006033793A (ja) * | 2004-06-14 | 2006-02-02 | Victor Co Of Japan Ltd | 追尾映像再生装置 |
JP2006140908A (ja) * | 2004-11-15 | 2006-06-01 | Matsushita Electric Ind Co Ltd | 画像合成装置 |
US7423521B2 (en) * | 2004-12-07 | 2008-09-09 | Kabushiki Kaisha Honda Lock | Vehicular visual assistance system |
JP2006256419A (ja) * | 2005-03-16 | 2006-09-28 | Clarion Co Ltd | 駐車支援装置 |
US8130269B2 (en) * | 2005-03-23 | 2012-03-06 | Aisin Aw Co., Ltd. | Visual recognition apparatus, methods, and programs for vehicles |
JP4645254B2 (ja) * | 2005-03-24 | 2011-03-09 | アイシン・エィ・ダブリュ株式会社 | 車両周辺視認装置 |
JP4956915B2 (ja) * | 2005-05-20 | 2012-06-20 | 日産自動車株式会社 | 映像表示装置及び映像表示方法 |
JP2007081932A (ja) * | 2005-09-15 | 2007-03-29 | Auto Network Gijutsu Kenkyusho:Kk | 車両周辺視認装置 |
JP5045172B2 (ja) * | 2007-03-20 | 2012-10-10 | マツダ株式会社 | 車両用運転支援装置 |
JP2008242597A (ja) * | 2007-03-26 | 2008-10-09 | Yuhshin Co Ltd | 車両用監視装置 |
US8218007B2 (en) * | 2007-09-23 | 2012-07-10 | Volkswagen Ag | Camera system for a vehicle and method for controlling a camera system |
JP5194679B2 (ja) * | 2007-09-26 | 2013-05-08 | 日産自動車株式会社 | 車両用周辺監視装置および映像表示方法 |
JP5347257B2 (ja) * | 2007-09-26 | 2013-11-20 | 日産自動車株式会社 | 車両用周辺監視装置および映像表示方法 |
JP4875014B2 (ja) * | 2008-03-19 | 2012-02-15 | シャープ株式会社 | 帯電検知装置、及びそれを搭載した原稿読取装置 |
JPWO2009141846A1 (ja) * | 2008-05-19 | 2011-09-22 | パナソニック株式会社 | 車両周囲監視装置および車両周囲監視方法 |
WO2010080610A1 (en) * | 2008-12-19 | 2010-07-15 | Delphi Technologies, Inc. | Electronic side view display system |
JP4927904B2 (ja) * | 2009-05-07 | 2012-05-09 | 富士通テン株式会社 | 車両の運転支援装置 |
- 2010
- 2010-09-01 US US13/392,171 patent/US20120154590A1/en not_active Abandoned
- 2010-09-01 CN CN201080038865.7A patent/CN102481876B/zh not_active Expired - Fee Related
- 2010-09-01 EP EP10815296.8A patent/EP2476588B1/en not_active Not-in-force
- 2010-09-01 JP JP2011530811A patent/JPWO2011030699A1/ja active Pending
- 2010-09-01 WO PCT/JP2010/064866 patent/WO2011030699A1/ja active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08301010A (ja) * | 1995-05-01 | 1996-11-19 | Mitsubishi Motors Corp | 車両後方監視装置 |
JP2002109697A (ja) * | 2000-10-02 | 2002-04-12 | Matsushita Electric Ind Co Ltd | 運転支援装置 |
JP2003134507A (ja) * | 2001-10-24 | 2003-05-09 | Nissan Motor Co Ltd | 車両後方監視装置 |
JP2003199093A (ja) * | 2001-12-28 | 2003-07-11 | Matsushita Electric Ind Co Ltd | 運転支援装置及び画像出力装置並びにカメラ付きロッド |
JP2003212041A (ja) * | 2002-01-25 | 2003-07-30 | Toyota Central Res & Dev Lab Inc | 車輌後方表示装置 |
JP2009012652A (ja) | 2007-07-05 | 2009-01-22 | Aisin Seiki Co Ltd | 車両の周辺監視装置 |
JP2009060404A (ja) * | 2007-08-31 | 2009-03-19 | Denso Corp | 映像処理装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2476588A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2562047A1 (en) * | 2011-08-23 | 2013-02-27 | Fujitsu General Limited | Drive assisting apparatus |
JP2013046124A (ja) * | 2011-08-23 | 2013-03-04 | Fujitsu General Ltd | 運転支援装置 |
JP2013177095A (ja) * | 2012-02-29 | 2013-09-09 | Murakami Corp | 車外音導入装置 |
JP2014209713A (ja) * | 2013-03-28 | 2014-11-06 | アイシン精機株式会社 | 周辺監視装置、及びプログラム |
US9956913B2 (en) | 2013-03-28 | 2018-05-01 | Aisin Seiki Kabushiki Kaisha | Surroundings-monitoring device and computer program product |
US10710504B2 (en) | 2013-03-28 | 2020-07-14 | Aisin Seiki Kabushiki Kaisha | Surroundings-monitoring device and computer program product |
JP2018069845A (ja) * | 2016-10-26 | 2018-05-10 | 株式会社デンソーテン | 画像表示装置及び画像処理方法 |
JP2020158114A (ja) * | 2020-06-02 | 2020-10-01 | 株式会社ユピテル | 撮像装置、システム、及びこのシステムを装備したフォークリフト |
Also Published As
Publication number | Publication date |
---|---|
CN102481876B (zh) | 2014-12-17 |
EP2476588B1 (en) | 2014-07-30 |
US20120154590A1 (en) | 2012-06-21 |
JPWO2011030699A1 (ja) | 2013-02-07 |
CN102481876A (zh) | 2012-05-30 |
EP2476588A1 (en) | 2012-07-18 |
EP2476588A4 (en) | 2013-02-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201080038865.7; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10815296; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2010815296; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 13392171; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2011530811; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |