
US20090118994A1 - Vehicle and lane recognizing device - Google Patents


Info

Publication number
US20090118994A1
Authority
US
United States
Prior art keywords
lane
information
actual
estimated
hand line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/919,634
Other versions
US7970529B2 (en)
Inventor
Naoki Mori
Sachio Kobayashi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Assigned to HONDA MOTOR CO., LTD. Assignment of assignors interest (see document for details). Assignors: KOBAYASHI, SACHIO; MORI, NAOKI
Publication of US20090118994A1
Application granted
Publication of US7970529B2
Status: Expired - Fee Related (adjusted expiration)


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/04 Monitoring the functioning of the control system
    • B60W50/045 Monitoring control system parameters
    • B60W2050/046 Monitoring control system parameters involving external transmission of data to or from the vehicle, e.g. via telemetry, satellite, Global Positioning System [GPS]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10 Path keeping
    • B60W30/12 Lane keeping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20061 Hough transform
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Definitions

  • the present invention relates to a vehicle and lane recognizing device for recognizing the lane of a road by processing an image of the road obtained by an imaging means such as a camera and obtaining information on the road from a GPS or the like and map data.
  • conventionally, traffic lane recognizing devices of this type are disclosed in, for example, Japanese Patent Laid-Open No. Hei 11-72337/1999 (hereinafter referred to as Patent Document 1) and Japanese Patent Laid-Open No. 2003-123058 (hereinafter referred to as Patent Document 2).
  • related devices are also disclosed in Japanese Patent Laid-Open No. 2002-92794 (hereinafter referred to as Patent Document 3) and Japanese Patent Laid-Open No. 2003-205805 (hereinafter referred to as Patent Document 4).
  • the traffic lane recognizing device in Patent Document 1 includes an image information processing means which recognizes the positions of left and right white lines on the road ahead by processing image information from a camera, and a traffic lane estimating means which estimates the position and feature of the traffic lane and a positional relationship between the traffic lane and the subject vehicle from the white line position image information obtained by the image information processing means.
  • the traffic lane recognizing device further includes a road condition determining means which determines whether there is a particular road area different from an area having normal white lines (for example, a particular white line area having white lines in a striped or zebra pattern) on the road ahead.
  • the road condition determining means determines whether there is the particular road area on the road ahead while associating data on the particular road area stored in a ROM with vehicle position information obtained from a GPS.
  • while the image information processing means recognizes the midpoint between both edges of a white line as the position of the white line in the road width direction for a normal white line, a particular part of the white line area (an edge on the main line side of a recognizable white line area) is taken as the position of the white line in the road width direction if the road condition determining means determines that the particular road area exists on the road ahead of the vehicle.
  • the traffic lane estimating means estimates an area defined by the extracted left and right white lines on the road or an area defined by one of the extracted left and right white lines on the road and a previously recognized lane width as a traffic lane.
  • the traffic lane recognizing device in Patent Document 2 includes an image processing means which detects a traffic lane by processing a road image captured by a camera.
  • the image processing means has a plurality of image processing algorithms different from each other according to a plurality of types of lane marks such as a white line, a raised marker, a post cone and so on.
  • the traffic lane recognizing device receives position information of the road along which the vehicle is currently traveling from a satellite by using a GPS receiving means and determines the type of lane marks on the road ahead along which the vehicle is currently traveling from a road map data file which stores lane mark types for respective roads.
  • the traffic lane recognizing device selects an image processing algorithm suitable for the lane marks ahead out of the plurality of image processing algorithms and detects the traffic lane.
  • a warning device for a vehicle in Patent Document 3 includes a white line recognition system which recognizes left and right division lines (white lines) of the lane of the road along which the vehicle is traveling from the image obtained by a camera by image processing and a sensor group (vehicle speed sensors or the like) for use in obtaining vehicle behaviors.
  • the warning device for a vehicle includes a navigation system which obtains position information and road information on the road along which the vehicle is traveling by comparison with map data read from a CD-ROM on the basis of the position of the subject vehicle obtained from a GPS.
  • the warning device for a vehicle issues a warning for a driver when predicting that the vehicle deviates from the lane on the basis of the left and right division lines of the recognized lane and vehicle behaviors obtained by the sensor group.
  • the warning device for a vehicle changes the warning generation condition to a condition under which the warning is less readily issued than in other cases.
  • a vehicle driving support apparatus in Patent Document 4 includes a surroundings monitoring means which recognizes white lines or the like indicating a lane by processing an image ahead of a vehicle captured by a camera, and a driving support means which supports a driver's operation by a lane deviation warning control or the like on the basis of information from the surroundings monitoring means.
  • the vehicle driving support apparatus includes a road condition recognizing means, which recognizes the road conditions (branch road or the like) of the road ahead of the vehicle from the current position of the vehicle obtained based on radio wave information from a plurality of satellites, map information stored in a storage medium such as a DVD-ROM, and road ahead information obtained by road-vehicle communication from road infrastructure, and which determines whether a problem will occur in driving support control according to the road ahead conditions. Then, the vehicle driving support apparatus provides the driver with information in advance on the basis of information from the road condition recognizing means if the reliability of the operation of the driving support means is projected to decrease.
  • an object of the present invention is to provide a vehicle and lane recognizing device capable of detecting a lane accurately while increasing opportunities for detecting the lane as much as possible by processing a road image obtained via an imaging means such as a camera and obtaining road information from GPS and map data, even if there is an unpredictable skid mark or road repaired part on the road.
  • a vehicle comprising: an imaging means; an image processing means which obtains an image of a road via the imaging means, performs a process of estimating a lane of the road by processing the obtained image, and outputs a result of the process as first lane information; holding means which holds map data of the road; position information obtaining means which obtains the current position information of the vehicle; a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information and the second lane information.
  • a lane recognizing device comprising: an image processing means which performs a process of estimating a lane of a road by processing an image of the road obtained via an imaging means mounted on a vehicle and outputs a result of the process as first lane information; a holding means which holds map data of the road; a position information obtaining means which obtains the current position information of the vehicle; a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information and the second lane information (First aspect of the present invention).
  • the image processing means performs the process of estimating the lane of the road by processing the image of the road obtained via the imaging means and outputs the result of the process as the first lane information.
  • information on the lane along which the vehicle is traveling is thereby obtained. If, however, there is, for example, an unpredictable skid mark or road repaired part on the road when estimating the lane by the image processing, it is difficult to estimate the lane appropriately. Even in this case, it is preferable that the actual lane be recognized as appropriately as possible.
  • the lane estimating means performs the process of estimating the lane along which the vehicle is traveling using the map data and the current position information and outputs the result of the process as the second lane information.
  • the lane information can be obtained in a different method from the method of estimating the lane by the image processing.
  • since the actual lane recognizing means recognizes the actual lane on the basis of both the first lane information and the second lane information, the information on the lane estimated from the map data and position information can be used as the information indicating the actual lane even if the lane is not appropriately estimated by the image processing. Therefore, it is possible to detect the actual lane accurately while increasing the opportunities for detecting the actual lane as much as possible, as sketched below.
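As a concrete illustration of this two-source arrangement, the following minimal Python sketch models the first and second lane information and the basic fallback recognition of the first aspect. The type names, the point-list representation of a lane shape, and the function itself are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (longitudinal x, lateral y) in vehicle coordinates

@dataclass
class LaneInfo:
    estimated: bool                      # estimation execution information
    shape: Optional[List[Point]] = None  # lane shape information, when estimated

def recognize_actual_lane(first: LaneInfo, second: LaneInfo) -> Optional[List[Point]]:
    """Prefer the image-based estimate; fall back to the map/GPS estimate."""
    if first.estimated:
        return first.shape   # lane estimated from the camera image
    if second.estimated:
        return second.shape  # lane estimated from map data and position information
    return None              # actual lane not recognized
```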
  • a vehicle comprising: an imaging means; an image processing means which obtains an image of a road via the imaging means, performs a process of estimating a lane of the road by processing the obtained image, and outputs a result of the process as first lane information; a holding means which holds map data of the road; position information obtaining means which obtains the current position information of the vehicle; lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value.
  • a lane recognizing device comprising: an image processing means which performs a process of estimating a lane of a road by processing an image of the road obtained via an imaging means mounted on a vehicle and outputs a result of the process as first lane information; a holding means which holds map data of the road; a position information obtaining means which obtains the current position information of the vehicle; a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value (Second aspect of the present invention).
  • the image processing means performs the process of estimating the lane of the road by processing the image of the road obtained via the imaging means and outputs the result of the process as the first lane information.
  • information on the lane along which the vehicle is traveling is thereby obtained. If, however, there is, for example, an unpredictable skid mark or road repaired part on the road when estimating the lane by the image processing, it is difficult to estimate the lane appropriately. Even in this case, it is preferable that the actual lane be recognized as appropriately as possible.
  • the lane estimating means performs the process of estimating the lane along which the vehicle is traveling using the map data and the current position information of the vehicle and outputs the result of the process as the second lane information.
  • the lane information can be obtained in a different method from the method of estimating the lane by the image processing.
  • since the actual lane recognizing means recognizes the actual lane on the basis of both the first lane information and the second lane information, the information on the lane estimated from the map data and position information can be used as the information indicating the actual lane even if the lane is not appropriately estimated by the image processing. Therefore, it is possible to detect the actual lane accurately while increasing the opportunities for detecting the actual lane as much as possible. Then, the actual lane recognizing means recognizes the actual lane on the basis of the first lane information, the second lane information, and the result of comparison between the lane similarity, which is the degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information, and the given threshold value.
  • the actual lane recognizing means can grasp the reliability of the information on the lane estimated by the image processing and the reliability of the information on the lane estimated by the map data and position information from the degree of similarity between the shapes of the lanes estimated by both, and therefore it can recognize the actual lane more appropriately.
  • if the shape of the lane estimated by the image processing and the shape of the lane estimated from the map data and position information are each estimated accurately from the shape of the actual lane, the degree of similarity between the two estimated shapes is expected to be high. Furthermore, if the lane is estimated accurately by the image processing, the shape of the lane estimated by the image processing is considered to be more accurate in position than the shape of the lane estimated from the map data and position information, since the image processing of the actual road ahead of the vehicle is more local than the processing by the map data and position information in view of the position-fix accuracy of the GPS or the like and the data density of the map data. A possible similarity measure is sketched below.
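The patent does not prescribe a concrete similarity measure. As one hedged possibility, the sketch below scores two lane shapes, sampled at the same longitudinal positions, by their mean lateral deviation, yielding 1.0 for identical centerlines and decaying toward 0 as they diverge:

```python
import math

def lane_similarity(shape1, shape2):
    """Illustrative lane similarity: exponential decay in the mean lateral
    deviation between two shapes given as equal-length lists of (x, y) points
    sampled at the same longitudinal positions x."""
    deviations = [abs(y1 - y2) for (_, y1), (_, y2) in zip(shape1, shape2)]
    return math.exp(-sum(deviations) / len(deviations))
```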
  • the actual lane recognizing means recognizes the actual lane on the basis of the first lane information in the case where the lane similarity is greater than the given threshold value (Third aspect of the present invention).
  • according to the third aspect of the present invention, if the lane similarity is greater than the given threshold value, the actual lane is recognized on the basis of the first lane information; the reliability of the lane estimation by the image processing is thus verified against the information on the lane estimated from the map data and position information.
  • thereby, the information on the shape of the lane estimated by the image processing, which is thought to be more accurate than the shape of the lane estimated from the map data and position information, is used as information indicating the actual lane, by which the actual lane is appropriately recognized.
  • if the shape of the lane estimated by the image processing differs from the shape of the lane estimated from the map data and position information, the actual lane is not likely to be appropriately estimated in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information. For example, if there is a branch or the like in the road ahead, it is conceivable that the lane shape cannot be appropriately estimated by either approach.
  • the actual lane recognizing means outputs information indicating that the actual lane is not recognized in the case where the lane similarity is equal to or smaller than the given threshold value (Fourth aspect of the present invention).
  • the actual lane recognizing means outputs the information indicating that the actual lane is not recognized if the lane similarity is equal to or smaller than the given threshold value, and therefore if the actual lane is not likely to be appropriately estimated in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information, it can grasp that the actual lane is not recognized appropriately.
  • regarding the process of estimating the lane based on the map data and position information: when the position information obtaining means obtains the vehicle position information from a GPS or another driving information providing service via signals and communication, it is conceivable that the lane cannot be estimated because the current position information of the vehicle cannot be obtained due to a signal or communication failure or the like.
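Assuming both sources have produced an estimate, the third and fourth aspects reduce to a small decision rule. Continuing the sketch above (lane_similarity as defined earlier; the threshold value is purely illustrative):

```python
SIMILARITY_THRESHOLD = 0.8  # illustrative; the patent leaves the threshold open

def recognize_by_similarity(first, second, threshold=SIMILARITY_THRESHOLD):
    """Third and fourth aspects, assuming first.estimated and second.estimated:
    accept the image-based shape only when the map/GPS estimate corroborates it."""
    if lane_similarity(first.shape, second.shape) > threshold:
        return True, first.shape  # image estimate verified against map/GPS
    return False, None            # shapes disagree: report no recognized lane
```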
  • the first lane information includes first estimation execution information indicating whether the lane is estimated by the process of the image processing means and first lane shape information indicating the shape of the estimated lane in the case where the lane is estimated.
  • the second lane information includes second estimation execution information indicating whether the lane is estimated by the process of the lane estimating means and second lane shape information indicating the shape of the estimated lane in the case where the lane is estimated. Then, the means which calculates the lane similarity uses the first lane shape information and the second lane shape information when calculating the lane similarity.
  • the actual lane recognizing means includes: a means which determines and outputs actual lane recognition execution information indicating whether the actual lane is recognized on the basis of the first estimation execution information, the second estimation execution information, and the result of comparison between the lane similarity and the given threshold value; and a means which determines and outputs actual lane shape information indicating the shape of the recognized lane in the case where the lane is recognized on the basis of the first lane shape information and the second lane shape information (Fifth aspect of the present invention).
  • the actual lane recognition execution information indicating whether the actual lane is recognized is determined on the basis of the first estimation execution information, the second estimation execution information, and the result of the comparison between the lane similarity and the given threshold value.
  • the actual lane shape information indicating the shape of the recognized lane in the case where the lane is recognized is determined based on the first lane shape information and the second lane shape information.
  • the lane similarity is calculated using the first lane shape information and the second lane shape information.
  • when the lane is estimated by both the estimation process by the image processing and the estimation process by the map data and position information, it is possible to grasp the reliability of the information on the lane estimated by the image processing and the reliability of the information on the lane estimated by the map data and position information from the degree of similarity between the shapes of the lanes estimated by both, and therefore the actual lane can be recognized more appropriately.
  • the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the second lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated (Sixth aspect of the present invention).
  • the actual lane recognizing means sets the second lane shape information as the actual lane shape information if the first estimation execution information indicates that the lane is not estimated and if the second estimation execution information indicates that the lane is estimated. Therefore, even if the lane cannot be estimated by the image processing, the image processing result is supplemented with the information on the lane estimated by the map data and position information, which increases the opportunities for detecting the actual lane as much as possible.
  • in the case where the second estimation execution information indicates that the lane is not estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is estimated; and sets information indicating that the lane is not recognized as the actual lane recognition execution information in the case where the first estimation execution information indicates that the lane is not estimated (Seventh aspect of the present invention).
  • the actual lane recognizing means can use the result of the lane estimation by the image processing directly as information indicating the actual lane if the lane is estimated by the image processing in the case where the lane is not estimated by the map data and position information. Moreover, if the lane is not estimated by the map data and position information and not estimated by the image processing, either, information indicating that the lane is not recognized is set as the actual lane recognition execution information, and therefore, the actual lane recognizing means can clearly grasp that the actual lane is not recognized.
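Combining the two execution flags with the similarity test, the fifth to seventh aspects can be summarized as one decision table. This continues the earlier sketch (LaneInfo, lane_similarity, SIMILARITY_THRESHOLD) and is an illustration, not the patent's implementation:

```python
def recognize_actual_lane_full(first, second, threshold=SIMILARITY_THRESHOLD):
    """Returns (actual lane recognition execution information,
    actual lane shape information)."""
    if first.estimated and second.estimated:
        if lane_similarity(first.shape, second.shape) > threshold:
            return True, first.shape  # both estimated and mutually consistent
        return False, None            # both estimated but inconsistent
    if first.estimated:               # map/GPS estimate unavailable: use image result
        return True, first.shape
    if second.estimated:              # image estimate unavailable: supplement with map/GPS
        return True, second.shape
    return False, None                # neither source estimated a lane
```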
  • the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information in the case where the lane similarity is greater than the given threshold value. Furthermore, in the case where the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the second lane shape information as the actual lane shape information.
  • the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information.
  • the actual lane recognizing means includes means which sets and outputs the reliability of recognition in the case where the actual lane is recognized, and it sets the reliability to a first level, which is the highest level, in the case where the first estimation execution information indicates that the lane is estimated, the second estimation execution information indicates that the lane is estimated, and the lane similarity is greater than the given threshold value. Furthermore, the actual lane recognizing means sets the reliability to a second level, which is lower than the first level, in the case where the first estimation execution information indicates that the lane is estimated and the second estimation execution information indicates that the lane is not estimated.
  • the actual lane recognizing means sets the reliability to a third level, which is lower than the second level, in the case where the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated (Eighth aspect of the present invention).
  • the reliability of the recognition is output. If the lane similarity is greater than the given threshold value, then the reliability is set to the first level, which is the highest level. Therefore, if it is possible to consider the actual lane to be estimated accurately due to the high similarity between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information, the reliability is set highest.
  • if the first estimation execution information indicates that the lane is estimated and the second estimation execution information indicates that the lane is not estimated, it means that the lane is estimated only by the image processing.
  • in this case, the reliability of the lane estimation by the image processing cannot be verified using the information on the lane estimated from the map data and position information, and therefore the reliability is set to the second level, which is lower than the first level.
  • if the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated, it means that the lane is estimated only by the map data and position information.
  • in this case, the lane estimation by the map data and position information is considered to be less accurate than the lane estimation by the image processing, in which a more local process is performed, in view of the position-fix accuracy of the GPS or the like and the data density of the map data, and therefore the reliability is set to the third level, which is lower than the second level.
  • the setting and output of the reliability as described above allows the reliability of the information indicating the recognized actual lane to be clearly known.
  • the reliability set as described above can be used to control the vehicle or to provide information to the driver in addition to the information indicating the actual lane.
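The three reliability levels of the eighth aspect then follow directly from the same flags. Again a sketch under the earlier assumptions, not the patent's implementation:

```python
def recognition_reliability(first, second, threshold=SIMILARITY_THRESHOLD):
    """Reliability of a recognized lane: 1 (highest) to 3, or None when no
    lane is recognized at all."""
    if first.estimated and second.estimated:
        if lane_similarity(first.shape, second.shape) > threshold:
            return 1  # image estimate corroborated by map/GPS: highest reliability
        return None   # inconsistent estimates: no lane recognized
    if first.estimated:
        return 2      # image only: locally accurate but unverifiable
    if second.estimated:
        return 3      # map/GPS only: limited by fix accuracy and map data density
    return None
```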
  • the lane along which the vehicle travels is generally composed of a left-hand line, which defines the left side of the lane, and a right-hand line, which defines the right side of the lane.
  • the first lane information output by the image processing means includes: first left estimation execution information which indicates whether the left-hand line defining the left side of the lane is estimated by the process of the image processing means; first left-hand line shape information which indicates the shape of the estimated left-hand line in the case where the left-hand line is estimated; first right estimation execution information which indicates whether the right-hand line defining the right side of the lane is estimated by the process of the image processing means; and first right-hand line shape information which indicates the shape of the estimated right-hand line in the case where the right-hand line is estimated.
  • the second lane information output by the lane estimating means includes: second estimation execution information which indicates whether both of the left-hand line and the right-hand line are estimated by the process of the lane estimating means; second left-hand line shape information which indicates the shape of the estimated left-hand line in the case where the left-hand line is estimated; and second right-hand line shape information which indicates the shape of the estimated right-hand line in the case where the right-hand line is estimated.
  • the actual lane recognizing means includes means which calculates left-hand line similarity, which is the degree of similarity between the first left-hand line shape information and the second left-hand line shape information, in the case where the first left estimation execution information indicates that the left-hand line is estimated and the second estimation execution information indicates that the left-hand line and the right-hand line are estimated. Furthermore, the actual lane recognizing means includes means which calculates right-hand line similarity, which is the degree of similarity between the first right-hand line shape information and the second right-hand line shape information, in the case where the first right estimation execution information indicates that the right-hand line is estimated and the second estimation execution information indicates that the left-hand line and the right-hand line are estimated.
  • the actual lane recognizing means includes means which determines and outputs: actual left recognition execution information which indicates whether the actual left-hand line defining the left side of the actual lane is recognized; actual left-hand line shape information which indicates the shape of the recognized actual left-hand line in the case where the actual left-hand line is recognized; actual right recognition execution information which indicates whether the actual right-hand line defining the right hand of the actual lane is recognized; and actual right-hand line shape information which indicates the shape of the recognized actual right-hand line in the case where the actual right-hand line is recognized, according to a result of comparing the calculated left-hand line similarity and right-hand line similarity with a given threshold value (Ninth aspect of the present invention).
  • the actual left recognition execution information, the actual left-hand line shape information, the actual right recognition execution information, and the actual right-hand line shape information are determined according to the result of comparing the calculated left-hand line similarity and right-hand line similarity with the given threshold value.
  • the actual lane recognizing means can grasp the reliability of the information on the left-hand line and the right-hand line estimated by the image processing and the reliability of the information on the left-hand line and the right-hand line estimated by the map data and position information according to the degree of similarity in shape between the left-hand line and the right-hand line estimated by both. Therefore, the actual lane recognizing means can recognize the actual left-hand line and right-hand line more appropriately.
  • the shapes of the left-hand line and right-hand line estimated by the image processing and the shapes of the left-hand line and right-hand line estimated by the map data and position information are accurately estimated from the shapes of the actual left-hand line and right-hand line, it is considered that the shapes of the left-hand line and right-hand line estimated by both have high similarity with each other. Furthermore, the image processing of the actual road ahead of the vehicle is more local than the processing by the map data and position information in view of the position-fix accuracy of the GPS or the like and data density of the map data.
  • thus, the shapes of the left-hand line and right-hand line estimated by the image processing are thought to be more accurate in position than the shapes of the left-hand line and right-hand line estimated from the map data and position information.
  • the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where one or both of the left-hand line similarity and the right-hand line similarity are calculated and where one or both of the calculated left-hand line similarity and right-hand line similarity are greater than the given threshold value.
  • the actual lane recognizing means sets the first left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is estimated, sets the second left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is not estimated, sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated, and sets the second right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is not estimated (Tenth aspect of the present invention).
  • the actual lane recognizing means sets the information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where the one or both of the left-hand line similarity and the right-hand line similarity are calculated and where one or both of the calculated left-hand line similarity and right-hand line similarity are greater than the given threshold value.
  • the reliability of the estimation of the left-hand line and right-hand line by the image processing is verified by the information on the left-hand line and right-hand line estimated by the map data and position information and thereby the left-hand line and the right-hand line are considered to be estimated accurately by the image processing, information indicating that the lane is recognized is appropriately set as the information on the actual left-hand line and right-hand line.
  • the actual lane recognizing means sets the first left-hand line shape information as the actual left-hand line shape information if the first left estimation execution information indicates that the left-hand line is estimated and sets the first right-hand line shape information as the actual right-hand line shape information if the first right estimation execution information indicates that the right-hand line is estimated, and therefore the information on the shapes of the left-hand line and right-hand line estimated by the image processing, which is thought to be more accurate than the shapes of the left-hand line and right-hand line estimated by the map data and position information, is set appropriately as information on the shapes of the actual left-hand line and right-hand line.
  • the actual lane recognizing means sets the second left-hand line shape information as the actual left-hand line shape information if the first left estimation execution information indicates that the left-hand line is not estimated and sets the second right-hand line shape information as the actual right-hand line shape information if the first right estimation execution information indicates that the right-hand line is not estimated. Therefore, even if the left-hand line and the right-hand line cannot be estimated by the image processing, the image processing result is supplemented with the information on the left-hand line and right-hand line estimated from the map data and position information, which thereby increases the opportunities for detecting the actual left-hand line and right-hand line as much as possible.
  • if the shapes of the left-hand line and right-hand line estimated by the image processing differ from the shapes of the left-hand line and right-hand line estimated from the map data and position information, there is a possibility that the actual left-hand line and right-hand line are not estimated appropriately in one or both of the estimation of the left-hand line and right-hand line by the image processing and the estimation of the left-hand line and right-hand line by the map data and position information.
  • preferably the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are not recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where both of the left-hand line similarity and the right-hand line similarity are calculated and at least one of them is equal to or smaller than the given threshold value, or where only one of the left-hand line similarity and the right-hand line similarity is calculated and the calculated similarity is equal to or smaller than the given threshold value (11th aspect of the present invention).
  • the actual lane recognizing means sets the information indicating that the actual left-hand line and the actual right-hand line are not recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where both of the left-hand line similarity and the right-hand line similarity are calculated and at least one of them is equal to or smaller than the given threshold value, or where only one of them is calculated and the calculated similarity is equal to or smaller than the given threshold value.
  • the information indicating that the left-hand line and the right-hand line are not recognized is appropriately set as the actual left-hand line and right-hand line information.
  • the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information and sets the second left-hand line shape information as the actual left-hand line shape information and the second right-hand line shape information as the actual right-hand line shape information, in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated (12th aspect of the present invention).
  • the actual lane recognizing means sets the second left-hand line shape information as the actual left-hand line shape information and sets the second right-hand line shape information as the actual right-hand line shape information in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated.
  • the image processing result is supplemented with the information on the left-hand line and right-hand line estimated by the map data and position information, which thereby increases the opportunities for detecting the actual left-hand line and right-hand line as much as possible.
  • in the case where the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, the actual lane recognizing means sets information indicating that the actual left-hand line is recognized as the actual left recognition execution information and sets the first left-hand line shape information as the actual left-hand line shape information if the first left estimation execution information indicates that the left-hand line is estimated. Furthermore, the actual lane recognizing means sets information indicating that the actual left-hand line is not recognized as the actual left recognition execution information in the case where the first left estimation execution information indicates that the left-hand line is not estimated.
  • the actual lane recognizing means sets information indicating that the actual right-hand line is recognized as the actual right recognition execution information and sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated. Furthermore, preferably the actual lane recognizing means sets information indicating that the actual right-hand line is not recognized as the actual right recognition execution information in the case where the first right estimation execution information indicates that the right-hand line is not estimated (13th aspect of the present invention).
  • the actual lane recognizing means can use a result of the estimation of the left-hand line or right-hand line by the image processing directly as the information indicating the actual left-hand line or right-hand line if the left-hand line or the right-hand line is estimated by the image processing in the case where the left-hand line and the right-hand line are not estimated by the map data and position information.
  • moreover, the information indicating that the left-hand line or the right-hand line is not recognized is set as the actual left recognition execution information or as the actual right recognition execution information, by which it can be clearly grasped that the actual left-hand line or right-hand line is not recognized.
  • the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where one or both of the left-hand line similarity and the right-hand line similarity are calculated and where one or both of the calculated left-hand line similarity and right-hand line similarity are greater than the given threshold value.
  • the actual lane recognizing means sets the first left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is estimated, sets the second left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is not estimated, sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated, and sets the second right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is not estimated.
  • the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and where the second estimation execution information indicates that the lane is estimated. Additionally, the actual lane recognizing means sets the second left-hand line shape information as the actual left-hand line shape information and sets the second right-hand line shape information as the actual right-hand line shape information.
  • in the case where the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, the actual lane recognizing means sets information indicating that the actual left-hand line is recognized as the actual left recognition execution information and sets the first left-hand line shape information as the actual left-hand line shape information if the first left estimation execution information indicates that the left-hand line is estimated.
  • the actual lane recognizing means sets information indicating that the actual left-hand line is not recognized as the actual left recognition execution information in the case where the first left estimation execution information indicates that the left-hand line is not estimated.
  • the actual lane recognizing means sets information indicating that the actual right-hand line is recognized as the actual right recognition execution information and sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated. Still further the actual lane recognizing means sets information indicating that the actual right-hand line is not recognized as the actual right recognition execution information in the case where the first right estimation execution information indicates that the right-hand line is not estimated.
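The per-line variants of the ninth to thirteenth aspects apply the same pattern separately to the left-hand and right-hand lines. The sketch below treats each side independently; the coupling of the 11th aspect, in which a failed similarity test on either side invalidates both lines, is deliberately omitted for brevity (SIMILARITY_THRESHOLD as in the earlier sketches):

```python
def recognize_line(first_est, first_shape, second_est, second_shape, similarity):
    """One side (left or right). `similarity` is None when it could not be
    calculated (one source produced no line for this side); otherwise it is
    the calculated line similarity."""
    if similarity is not None and similarity <= SIMILARITY_THRESHOLD:
        return False, None         # estimates disagree: line not recognized
    if first_est:
        return True, first_shape   # prefer the image-based line shape
    if second_est:
        return True, second_shape  # fall back to the map/GPS line shape
    return False, None             # neither source estimated this line
```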
  • the actual lane recognizing means includes means which sets and outputs reliability of recognition in the case where at least one of the actual left-hand line and the actual right-hand line are recognized, and it sets the reliability to a first level, which is the highest level, in the case where both of the left-hand line similarity and the right-hand line similarity are calculated and the calculated left-hand line similarity and right-hand line similarity are each greater than the given threshold value.
  • the actual lane recognizing means sets the reliability to a second level, which is lower than the first level, in the case where only one of the left-hand line similarity and the right-hand line similarity is calculated and the calculated similarity is greater than the given threshold value, or in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated.
  • the actual lane recognizing means sets the reliability to a third level, which is lower than the second level, in the case where one of the first left estimation execution information and the first right estimation execution information indicates that one of the left-hand line and the right-hand line is estimated and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated and in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are estimated (14th aspect of the present invention).
  • the reliability of the recognition is output. If both of the left-hand line similarity and the right-hand line similarity are calculated and both are greater than the given threshold value, the reliability is set to the first level, which is the highest level. Therefore, if the actual left-hand line and right-hand line can be considered to be estimated accurately owing to the relatively high similarity between the shapes of the left-hand line and right-hand line estimated by the image processing and those estimated from the map data and position information, the reliability is set highest.
  • if only one of the left-hand line similarity and the right-hand line similarity is calculated and the calculated similarity is greater than the given threshold value, the reliability is set to the second level, which is lower than the first level.
  • if the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, it means that the left-hand line and the right-hand line are estimated only by the image processing. The reliability of the estimation of the left-hand line and that of the right-hand line by the image processing therefore cannot be verified using information on the left-hand line and right-hand line estimated from the map data and position information, and the reliability is likewise set to the second level, which is lower than the first level.
  • if only one of the first left estimation execution information and the first right estimation execution information indicates that the corresponding line is estimated and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, only one of the left-hand line and the right-hand line is estimated, and only by the image processing.
  • in this case, the reliability of the estimation by the image processing cannot be verified using information on the left-hand line and right-hand line estimated from the map data and position information, and only one of the left-hand line and the right-hand line is estimated; therefore, the reliability is set to the third level, which is lower than the second level.
  • likewise, when the left-hand line and the right-hand line are estimated only by the map data and position information, the estimation is considered to be less accurate than the estimation by the image processing in consideration of the GPS position-fix accuracy or the like, and therefore the reliability is set to the third level, which is lower than the second level.
  • the lane recognizing device can clearly recognize the reliability of information indicating the recognized actual lane by setting and outputting the reliability as described hereinabove. Moreover, the reliability set in this manner can be used to control the vehicle or inform the driver in addition to the information indicating the actual lane.
  • FIG. 1 is a functional block diagram of a lane recognizing device according to a first embodiment of the present invention.
  • FIGS. 2 to 4 are flowcharts of a lane recognition process in the lane recognizing device in FIG. 1.
  • FIG. 5 is an illustrative diagram of a lane to be recognized in the lane recognition process by the lane recognizing device.
  • FIGS. 6 to 8 are explanatory diagrams of the lane recognition process in the lane recognizing device in FIG. 1.
  • the first embodiment corresponds to the first aspect of the present invention.
  • a lane recognizing device 1 is an electronic unit composed of a microcomputer and the like and is mounted on a vehicle 7, including, as its processing functions, an image processing means 2 which performs a process of estimating a lane by obtaining a road image, a map data storage medium 3 which holds map data, a GPS unit 4 which obtains the current position information of a vehicle by GPS, a lane estimating means 5 which performs a process of estimating a lane from the map data and the current position information of the vehicle, and an actual lane recognizing means 6 which performs a process of recognizing an actual lane on the basis of a result of the process of the image processing means 2 and a result of the process of the lane estimating means 5.
  • the image processing means 2 obtains a road image via a video camera 8 (a CCD camera or the like, corresponding to the imaging means of the present invention) which is attached to the front of the vehicle 7 and captures an image of the road ahead of the vehicle 7 . Then, the image processing means 2 performs a process of estimating a lane along which the vehicle 7 travels by processing the obtained road image and outputs a result of the process as first lane information.
  • the vehicle of the present invention includes the video camera 8 and the lane recognizing device 1 .
  • the map data storage medium 3 (the holding means of the present invention) is a CD-ROM, DVD, HDD or other storage medium which records map data. A road position, a lane width of a road, and the like are recorded as map data in the map data storage medium 3 .
  • the GPS unit 4 receives information transmitted from a plurality of global positioning system (GPS) satellites to obtain the current position information (latitude, longitude, and traveling direction) of the vehicle 7 on the basis of the received information.
  • the lane estimating means 5 identifies the information on the road along which the vehicle 7 is currently traveling by using the map data read from the map data storage medium 3 and the current position information of the vehicle 7 obtained by the GPS unit 4 . Then, the lane estimating means 5 performs the process of estimating the lane from the identified road information and outputs a result of the process as second lane information.
  • the actual lane recognizing means 6 performs the process of recognizing actual lane information on the basis of the first lane information output from the image processing means 2 and the second lane information output from the lane estimating means 5 and outputs a result of the process as information indicating an actual lane (hereinafter, referred to as actual lane information). Moreover, if the actual lane is recognized, the actual lane recognizing means 6 sets the reliability of the recognition and outputs it together with the actual lane information.
  • the reliability is an index of recognition accuracy and is set to one of three levels, level 1 to level 3 according to the recognition result. Level 1 indicates that the recognition reliability is highest (high accuracy), level 2 indicates that the reliability is lower than level 1 (medium accuracy), and level 3 indicates that the reliability is still lower than level 2 (low accuracy).
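  • As a rough illustration only (the naming below is hypothetical, not prescribed by the embodiment), the three reliability levels could be represented as a simple enumeration:

```python
from enum import IntEnum

class Reliability(IntEnum):
    """The three recognition-reliability levels: a sketch with hypothetical names."""
    LEVEL_1 = 1  # highest accuracy
    LEVEL_2 = 2  # medium accuracy, lower than level 1
    LEVEL_3 = 3  # low accuracy, lower than level 2
```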
  • FIG. 2 is a flowchart showing the general operation of the lane recognition process (the main routine process of the lane recognizing device 1 ).
  • FIG. 3 is a flowchart showing a process of estimating a lane by image processing (a subroutine process).
  • FIG. 4 is a flowchart showing a process of estimating the lane from map data and position information (a subroutine process).
  • In the following, assuming that the traveling direction of the vehicle 7 corresponds to the arrow direction shown in FIG. 5 , the operation is described by giving an example in which the left side of the lane of a road along which the vehicle 7 is traveling is defined by a lane mark A 0 and the right side of the lane is defined by a lane mark A 1 .
  • the lane marks A 0 and A 1 are assumed to be white lines by way of example.
  • the image processing means 2 obtains a road image by receiving a video signal output from the video camera 8 (step 001 ).
  • the image processing means 2 performs a process of estimating a lane from the obtained road image (hereinafter, referred to as a first lane estimation process) and outputs the first lane information (step 002 ).
  • the output first lane information includes first left estimation execution information indicating whether the lane mark A 0 is already estimated by the first lane estimation process, first left-hand line shape information indicating the shape of the estimated lane mark A 0 in the case where the lane mark A 0 is already estimated, first right estimation execution information indicating whether the lane mark A 1 is already estimated by the first lane estimation process, and first right-hand line shape information indicating the shape of the estimated lane mark A 1 in the case where the lane mark A 1 is already estimated.
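  • As a sketch of how the first lane information might be held in software (Python, with hypothetical field names; the embodiment does not prescribe a data layout):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Quadratic = Tuple[float, float, float]  # coefficients (a, b, c) of y = a*x^2 + b*x + c

@dataclass
class FirstLaneInfo:
    left_estimated: bool              # first left estimation execution information ("Yes"/"No")
    left_shape: Optional[Quadratic]   # first left-hand line shape information
    right_estimated: bool             # first right estimation execution information
    right_shape: Optional[Quadratic]  # first right-hand line shape information
```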
  • the first left estimation execution information is set to “Yes” if the lane mark A 0 is already estimated, or otherwise, it is set to “No.”
  • the first right estimation execution information is set to “Yes” if the lane mark A 1 is already estimated, or otherwise, it is set to “No.”
  • the first lane estimation process is performed as shown in FIG. 3 .
  • the image processing means 2 performs an edge detection process to detect a white line from the obtained image (step 101 ). Subsequently, the image processing means 2 performs a Hough transform for output data of the edge detection (step 102 ). Then, the image processing means 2 searches the Hough space for straight line components and extracts them (step 103 ). Thereafter, the image processing means 2 applies a projective transformation to data of the extracted straight line components from the Hough space to the image space and further applies a projective transformation to the data from the image space to the real space (the coordinate space fixed to the vehicle) (step 104 ).
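  • Steps 101 to 103 can be sketched with off-the-shelf image-processing primitives as follows (a minimal illustration using OpenCV; the thresholds are placeholders, and the projective transformations of step 104 are only indicated in a comment because they require a camera calibration that the embodiment does not detail):

```python
import cv2
import numpy as np

def detect_line_candidates(road_image_bgr):
    """Edge detection (step 101) followed by a Hough transform and
    extraction of straight-line components (steps 102 and 103)."""
    gray = cv2.cvtColor(road_image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)                    # step 101: edge detection
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)   # steps 102-103
    return [] if lines is None else [tuple(ln[0]) for ln in lines]  # (rho, theta) pairs

# Step 104 (projecting the extracted components from the image space into the
# vehicle-fixed real space) would apply a calibrated homography H, e.g.:
#   pts_real = cv2.perspectiveTransform(pts_image.reshape(-1, 1, 2), H)
```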
  • the image processing means 2 selects data of straight line components estimated to be the lane mark A 0 out of the data of the straight line components transformed to the real space and determines the coordinates of a plurality of points included in the selected data of the straight line components to be point sequence data A 2 .
  • the image processing means 2 selects data of straight line components estimated to be the lane mark A 1 and determines the coordinates of a plurality of points included in the selected data of the straight line components to be point sequence data A 3 .
  • the selected point sequence data A 2 and A 3 are as shown in FIG. 6 .
  • the coordinate system is a plane rectangular coordinate system having x axis and y axis as coordinate axes, with the center position of the vehicle 7 as an origin and the traveling direction of the vehicle 7 as the x-axis direction.
  • the image processing means 2 then approximates the point sequence data A 2 by a quadratic equation F 2 (y=ax^2+bx+c) and the point sequence data A 3 by a quadratic equation F 3 (y=dx^2+ex+f); a least squares method is used as the approximation method.
  • unless the quadratic equation F 3 is appropriately obtained, for example, when the point sequence data A 3 is not obtained, when the quadratic equation F 3 cannot be obtained due to an insufficient number of data points in the point sequence data A 3 , or when the obtained quadratic equation F 3 poorly approximates the point sequence data A 3 (the point sequence data A 3 varies widely with respect to the quadratic equation F 3 ), the image processing means 2 sets the first right estimation execution information to “No.”
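  • The approximation and its validity checks can be sketched as follows (a Python illustration; the residual threshold and minimum point count are assumptions standing in for the embodiment's unspecified criteria):

```python
import numpy as np

def fit_lane_quadratic(points, min_points=3, max_rmse=0.5):
    """Least-squares fit of y = a*x^2 + b*x + c to point sequence data.
    Returns (coefficients, ok); ok is False when the data are missing,
    too few, or poorly approximated, mirroring how the estimation
    execution information would be set to "No"."""
    if points is None or len(points) < min_points:
        return None, False
    xs, ys = np.asarray(points, dtype=float).T
    coeffs = np.polyfit(xs, ys, 2)                                   # least squares
    rmse = float(np.sqrt(np.mean((np.polyval(coeffs, xs) - ys) ** 2)))
    return coeffs, rmse <= max_rmse                                  # reject wide scatter
```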
  • the GPS unit 4 performs a process of obtaining the current position information of the vehicle 7 (step 003 ).
  • the current position information of the vehicle 7 to be obtained includes a position (latitude x 0 , longitude y 0 ) and a traveling orientation θ.
  • the information is represented in a spherical coordinate system, and the traveling orientation θ is set with the northern direction as zero degrees and the clockwise direction as positive.
  • the lane estimating means 5 reads map data from the map data storage medium 3 (step 004 ).
  • the map data to be read includes the coordinates (X, Y) of a point sequence on the center line of the lane of the road (the lane along which the vehicle travels) and the lane width w at each point (X, Y): these data are represented in the plane rectangular coordinate system with the x axis set in the north-south direction and the y axis set in the east-west direction.
  • the lane estimating means 5 performs a process of estimating the lane by the map data and position information (hereinafter, referred to as the second lane estimation process) and outputs the second lane information (step 005 ).
  • the output second lane information includes second estimation execution information indicating whether the lane mark A 0 and the lane mark A 1 are already estimated by the second lane estimation process, second left-hand line shape information indicating the shape of the estimated lane mark A 0 in the case where the lane mark A 0 is already estimated, and second right-hand line shape information indicating the shape of the estimated lane mark A 1 in the case where the lane mark A 1 is already estimated.
  • the second estimation execution information is set to “Yes” if the lane mark A 0 and the lane mark A 1 are already estimated, or otherwise it is set to “No.”
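  • Continuing the earlier sketch of the first lane information, the second lane information could be held analogously (hypothetical field names; note the single execution flag covering both lines):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Quadratic = Tuple[float, float, float]  # coefficients (a, b, c) of y = a*x^2 + b*x + c

@dataclass
class SecondLaneInfo:
    estimated: bool                   # second estimation execution information ("Yes"/"No")
    left_shape: Optional[Quadratic]   # second left-hand line shape information
    right_shape: Optional[Quadratic]  # second right-hand line shape information
```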
  • the second lane estimation process is performed as shown in FIG. 4 .
  • the lane estimating means 5 checks whether the current position information of the vehicle 7 is already obtained by the GPS unit 4 (step 201 ). If the current position information of the vehicle 7 cannot be obtained due to poor reception from GPS satellites or the like, the control proceeds to step 202 , where the lane estimating means 5 sets the second estimation execution information to “No.”
  • in step 203 , the lane estimating means 5 coordinate-transforms the obtained position (latitude x 0 , longitude y 0 ) of the vehicle 7 from the spherical coordinate system to the plane rectangular coordinate system in the map data.
  • Position P 0 of the vehicle 7 obtained by the coordinate transformation is as shown in FIGS. 7( a ) and 7 ( b ).
  • the x axis, y axis, and origin O represent the coordinate system xy of the read map data.
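  • The spherical-to-plane conversion of step 203 is not spelled out in the embodiment; as a rough stand-in, a local equirectangular approximation around a reference point could look like this (an assumption, with x toward north and y toward east to match the map data's axis convention):

```python
import numpy as np

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def latlon_to_plane(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Approximate plane rectangular coordinates (meters) of (lat, lon)
    relative to a reference point (lat0, lon0)."""
    x = np.radians(lat_deg - lat0_deg) * EARTH_RADIUS_M
    y = np.radians(lon_deg - lon0_deg) * EARTH_RADIUS_M * np.cos(np.radians(lat0_deg))
    return x, y
```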
  • in step 204 , the lane estimating means 5 coordinate-transforms (translates) the coordinate system xy of the map data to a coordinate system x′y′ with the position P 0 of the vehicle 7 as the origin and the x′ axis and y′ axis as coordinate axes.
  • the traveling direction of the vehicle 7 is as indicated by an arrow, which is shown as a direction rotated by the traveling orientation of θ degrees from the x′ axis.
  • in step 205 , as shown in FIG. 7 ( b ), the lane estimating means 5 rotationally transforms the coordinate system x′y′ (rotates it by θ degrees) to a coordinate system x′′y′′ with the traveling direction of the vehicle 7 as the x′′ direction and with the x′′ axis and y′′ axis as coordinate axes. This transforms the map data to the coordinate system x′′y′′.
  • the processes of subsequent steps 206 to 208 are performed in the coordinate system x′′y′′.
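  • Steps 204 and 205 amount to a translation followed by an axis rotation; a compact sketch (sign conventions assumed: x toward north, y toward east, θ measured clockwise from north):

```python
import numpy as np

def map_points_to_vehicle_frame(map_xy, p0_xy, theta_deg):
    """Translate the map coordinates so the vehicle position P0 becomes the
    origin (step 204), then rotate so the traveling direction becomes the
    x'' axis (step 205)."""
    t = np.radians(theta_deg)
    rot = np.array([[np.cos(t),  np.sin(t)],
                    [-np.sin(t), np.cos(t)]])                        # axis rotation by theta
    shifted = np.asarray(map_xy, dtype=float) - np.asarray(p0_xy, dtype=float)  # step 204
    return shifted @ rot.T                                           # step 205
```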
  • in step 206 , the lane estimating means 5 searches for a point (reference point P 1 ) on the map data closest to the origin to obtain the coordinates (x 1 , y 1 ) of the reference point P 1 and the lane width w at the reference point P 1 . Then, in step 207 , the lane estimating means 5 extracts a plurality of points (P 2 , - - - , Pn) in the range from the reference point P 1 to a point a given distance X ahead thereof in the vehicle traveling direction (n is an integer of 3 or greater). The given distance X is assumed to be, for example, 100 m or so.
  • the coordinate data ⁇ (x 1 , y 1 ), (x 2 , y 2 ), - - - , (xn, yn) ⁇ of the reference point P 1 and the extracted points (P 2 , - - - , and Pn) are considered to be point sequence data A 4 .
  • the point sequence data A 4 is as shown in FIG. 8 .
  • in step 208 , the lane estimating means 5 approximates the point sequence data A 4 by a quadratic equation F 4 (y=gx^2+hx+i); a least squares method is then used as the approximation method.
  • the lane estimating means 5 estimates the lane marks A 0 and A 1 from the obtained quadratic equation F 4 and the lane width w.
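  • Steps 206 to 208 can then be sketched as follows (Python; the look-ahead filter along the x′′ axis and the −w/2 and +w/2 offsets follow the description above, the rest is an assumption):

```python
import numpy as np

def estimate_lane_from_map(points_xy, lane_width_w, distance_x=100.0):
    """Find the reference point P1 closest to the origin (step 206), extract
    the points up to the given distance X ahead (step 207), fit the quadratic
    F4 by least squares, and offset it by -w/2 and +w/2 to estimate the lane
    marks A0 and A1 (step 208)."""
    pts = np.asarray(points_xy, dtype=float)
    i1 = int(np.argmin(np.hypot(pts[:, 0], pts[:, 1])))      # reference point P1
    x1 = pts[i1, 0]
    ahead = pts[(pts[:, 0] >= x1) & (pts[:, 0] <= x1 + distance_x)]
    if len(ahead) < 3:
        return None                                           # cannot fit a quadratic
    f4 = np.polyfit(ahead[:, 0], ahead[:, 1], 2)              # center-line quadratic F4
    left = f4.copy();  left[2] -= lane_width_w / 2.0          # lane mark A0
    right = f4.copy(); right[2] += lane_width_w / 2.0         # lane mark A1
    return f4, left, right
```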
  • the actual lane recognizing means 6 performs a process of recognizing an actual lane from the first lane information and the second lane information (hereinafter, referred to as actual lane recognition process) and outputs actual lane information and reliability (step 006 ).
  • the output actual lane information includes actual left recognition execution information indicating whether the lane mark A 0 is already recognized by the actual lane recognition process, actual left-hand line shape information indicating the shape of the recognized lane mark A 0 in the case where the lane mark A 0 is already recognized, actual right recognition execution information indicating whether the lane mark A 1 is already recognized by the actual lane recognition process, and actual right-hand line shape information indicating the shape of the recognized lane mark A 1 in the case where the lane mark A 1 is already recognized.
  • the actual left recognition execution information is set to “Yes” if the lane mark A 0 is already recognized, or otherwise, it is set to “No.”
  • the actual right recognition execution information is set to “Yes” if the lane mark A 1 is already recognized, or otherwise, it is set to “No.”
  • the actual lane recognizing means 6 checks the set values of the first left estimation execution information, the first right estimation execution information, and the second estimation execution information. Thereafter, it performs the actual lane recognition process for the following eight cases (a) to (h) according to the combination of the above three set values.
  • the actual lane recognizing means 6 first calculates a left-hand line similarity index value L 1 , which indicates a degree of similarity between the first left-hand line shape information and the second left-hand line shape information.
  • a value of a comparison function F 1 expressed by the following equation (1) is used as the left-hand line similarity index value L 1 .
  • the actual lane recognizing means 6 takes the value of the left-hand side of equation (1) calculated by using ax^2+bx+c (the first left-hand line shape information) for f 1 ( x ) in the right-hand side of equation (1) and using gx^2+hx+i−w/2 (the second left-hand line shape information) for f 2 ( x ) as the left-hand line similarity index value L 1 .
  • the calculated left-hand line similarity index value L 1 decreases as the degree of similarity between f 1 ( x ) and f 2 ( x ) increases.
  • the actual lane recognizing means 6 calculates a right-hand line similarity index value R 1 , which indicates the degree of similarity between the first right-hand line shape information and the second right-hand line shape information, by using equation (1).
  • the actual lane recognizing means 6 takes the value of the left-hand side of equation (1) calculated by using dx^2+ex+f (the first right-hand line shape information) for f 1 ( x ) in the right-hand side of equation (1) and using gx^2+hx+i+w/2 (the second right-hand line shape information) for f 2 ( x ) as the right-hand line similarity index value R 1 .
  • the calculated right-hand line similarity index value R 1 decreases as the degree of similarity between f 1 ( x ) and f 2 ( x ) increases, as is the case with the left-hand line similarity index value L 1 .
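  • Since the body of equation (1) is not reproduced here, the following is only a hypothetical stand-in for the comparison function F 1 : a mean squared difference between the two quadratics sampled over a look-ahead range, which, like the index values described above, decreases as the two shapes become more similar:

```python
import numpy as np

def similarity_index(f1_coeffs, f2_coeffs, x_range=(0.0, 100.0), samples=50):
    """Hypothetical comparison function: mean squared difference between
    f1(x) and f2(x) over the look-ahead range (lower = more similar)."""
    xs = np.linspace(x_range[0], x_range[1], samples)
    d = np.polyval(f1_coeffs, xs) - np.polyval(f2_coeffs, xs)
    return float(np.mean(d ** 2))
```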
  • the actual lane recognizing means 6 compares the left-hand line similarity index value L 1 and the right-hand line similarity index value R 1 each with a given threshold value S 1 . Then, the actual lane recognizing means 6 sets the actual lane information and the reliability for the following two cases (a1) and (a2) according to a result of the comparison.
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information.
  • the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information.
  • the information indicating that the lane is already recognized is appropriately set as the actual lane information if the degree of similarity is high between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information and if it is possible to consider that the lane is accurately estimated by the image processing as a result of verification of the reliability of lane estimation by the image processing based on the information on the lane estimated by the map data and position information.
  • the shape information on the lane estimated by the image processing that can be considered to be higher in accuracy than the shape information on the lane estimated by the map data and position information is set as the actual lane information regarding the lane marks A 0 and A 1 .
  • the actual lane recognizing means 6 sets the reliability to level 1 . Thereby, if both of the lane marks A 0 and A 1 can be considered to be estimated accurately, the reliability is set to the highest level.
  • the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” Thereby, information indicating that the lane is not recognized is appropriately set as the actual lane information if the shape of the lane estimated by the image processing differs from the shape of the lane estimated by the map data and position information and if the actual lane is likely to be estimated inappropriately in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information.
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information.
  • the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information.
  • the actual lane recognizing means 6 sets the reliability to level 2 . Thereby, if the lane is estimated only by the image processing and the reliability of the lane estimation by the image processing cannot be verified using the information on the lane estimated by the map data and position information, the reliability is set lower than level 1 .
  • the actual lane recognizing means 6 first calculates the left-hand line similarity index value L 1 from the first left-hand line shape information and the second left-hand line shape information using equation (1). Then, the actual lane recognizing means 6 compares the left-hand line similarity index value L 1 with the given threshold value S 1 . Thereafter, the actual lane recognizing means 6 sets the actual lane information and reliability for the following two cases (c1) and (c2) according to a result of the comparison.
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information.
  • the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the second right-hand line shape information as the actual right-hand line shape information.
  • the information indicating that the lane is already recognized is appropriately set as the actual lane information if the degree of similarity is high between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information and if it is possible to consider that the lane is accurately estimated by the image processing as a result of verification of the reliability of the lane estimation by the image processing based on the information on the lane estimated by the map data and position information.
  • the shape information on the lane estimated by the image processing that can be considered to be higher in accuracy than the shape information on the lane estimated by the map data and position information is set as the actual lane information regarding the lane mark A 0 .
  • the image processing result is supplemented with the information on the lane estimated by the map data and position information, by which the opportunities for detecting the lane can be increased as much as possible.
  • the actual lane recognizing means 6 sets the reliability to level 2 . Thereby, if only one of the lane marks A 0 and A 1 can be considered to be estimated accurately, the reliability is set lower than level 1 .
  • the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” Thereby, information indicating that the lane is not recognized is appropriately set as the actual lane information if the shape of the lane estimated by the image processing differs from the shape of the lane estimated by the map data and position information and if the actual lane is likely to be estimated inappropriately in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information.
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “No.” Thereby, if the lane is estimated by the image processing in the case where the lane is not estimated by the map data and position information, the result of the lane estimation by the image processing is directly used as actual lane information.
  • the actual lane recognizing means 6 sets the reliability to level 3 . Since the lane is estimated only by the image processing and the reliability of the estimation cannot be verified using the information on the lane estimated by the map data and position information, the reliability is set lower than level 1 ; furthermore, only one of the lane marks A 0 and A 1 is already estimated by the image processing, and therefore the reliability is set lower than level 2 .
  • the actual lane recognizing means 6 first calculates the right-hand line similarity index value R 1 from the first right-hand line shape information and the second right-hand line shape information using equation (1). Then, the actual lane recognizing means 6 compares the right-hand line similarity index value R 1 with the given threshold value S 1 . Thereafter, the actual lane recognizing means 6 sets the actual lane information and reliability for the following two cases (e1) and (e2) according to a result of the comparison.
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the second left-hand line shape information as the actual left-hand line shape information.
  • the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information.
  • the information indicating that the lane is already recognized is appropriately set as the actual lane information if the degree of similarity is high between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information and if it is possible to consider that the lane is accurately estimated by the image processing as a result of verification of the reliability of the lane estimation by the image processing based on the information on the lane estimated by the map data and position information.
  • the shape information on the lane estimated by the image processing that can be considered to be higher in accuracy than the shape information on the lane estimated by the map data and position information is set as the actual lane information regarding the lane mark A 1 .
  • the image processing result is supplemented with the information on the lane estimated by the map data and position information, by which the opportunities for detecting the lane can be increased as much as possible.
  • the actual lane recognizing means 6 sets the reliability to level 2 . Thereby, if only one of the lane marks A 0 and A 1 can be considered to be estimated accurately, the reliability is set lower than level 1 .
  • the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” Thereby, the information indicating that the lane is not recognized is appropriately set as the actual lane information if the shape of the lane estimated by the image processing differs from the shape of the lane estimated by the map data and position information and if the actual lane is likely to be estimated inappropriately in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information.
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “No.” In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and the first right-hand line shape information as the actual right-hand line shape information. Thereby, if the lane is already estimated by the image processing in the case where the lane is not estimated by the map data and position information, the result of the lane estimation by the image processing is directly used as actual lane information.
  • the actual lane recognizing means 6 sets the reliability to level 3 . Since the lane is estimated only by the image processing and the reliability of the estimation cannot be verified using the information on the lane estimated by the map data and position information, the reliability is set lower than level 1 ; furthermore, only one of the lane marks A 0 and A 1 is estimated by the image processing, and therefore the reliability is set lower than level 2 .
  • the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the second left-hand line shape information as the actual left-hand line shape information.
  • the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and the second right-hand line shape information as the actual right-hand line shape information.
  • the actual lane recognizing means 6 sets the reliability to level 3 . Thereby, if the lane is estimated only by the map data and position information, the accuracy is considered to be lower than in the case of estimation only by the image processing in consideration of the GPS position-fix accuracy or the like and then the reliability is set lower than level 2 .
  • the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” In this case, the actual lane recognizing means 6 makes no settings as the actual left-hand line shape information and the actual right-hand line shape information. Thereby, it is clearly indicated that no actual lane has been recognized.
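  • The eight cases (a) to (h) can be condensed into the following sketch (Python; the field names are the hypothetical ones from the earlier dataclass sketches, the strict '<' against the threshold S 1 is an assumption, and lower L 1 /R 1 means higher similarity):

```python
def recognize_actual_lane(first, second, L1, R1, S1):
    """Returns (actual left shape, actual right shape, reliability level);
    (None, None, None) means no actual lane is recognized. L1/R1 are only
    consulted when the corresponding line was estimated by both processes."""
    if first.left_estimated and first.right_estimated:
        if second.estimated:                                    # case (a)
            if L1 < S1 and R1 < S1:
                return first.left_shape, first.right_shape, 1   # (a1)
            return None, None, None                             # (a2)
        return first.left_shape, first.right_shape, 2           # case (b)
    if first.left_estimated and second.estimated:               # case (c)
        return (first.left_shape, second.right_shape, 2) if L1 < S1 else (None, None, None)
    if first.right_estimated and second.estimated:              # case (e)
        return (second.left_shape, first.right_shape, 2) if R1 < S1 else (None, None, None)
    if first.left_estimated:                                    # case (d)
        return first.left_shape, None, 3
    if first.right_estimated:                                   # case (f)
        return None, first.right_shape, 3
    if second.estimated:                                        # case (g)
        return second.left_shape, second.right_shape, 3
    return None, None, None                                     # case (h)
```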
  • the lane can be detected accurately while increasing the opportunities for detecting the lane as much as possible by using the information on the lane estimated by the image processing and the information on the lane estimated by the map data and position information.
  • the vehicle is controlled and information is provided to the driver on the basis of the output actual lane information and reliability.
  • if the reliability is set to level 1 , the steering control of the vehicle is performed on the basis of the actual lane shape information.
  • if the reliability is set to level 2 or level 3 , no steering control of the vehicle is performed, but the driver is provided with information when there is a possibility that the vehicle will deviate from the lane.
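  • A minimal sketch of this reliability-dependent behavior (the controller and warning functions are placeholders, not part of the embodiment):

```python
def act_on_recognition(reliability, actual_lane_shape, deviation_likely):
    """Steer only on level-1 recognition; on level 2 or 3, limit the
    response to informing the driver of a possible lane deviation."""
    if reliability == 1:
        steer_along(actual_lane_shape)       # steering control from the actual lane shape
    elif reliability in (2, 3) and deviation_likely:
        warn_driver()                        # information provision only

def steer_along(shape):   # placeholder steering controller
    print("steering along lane:", shape)

def warn_driver():        # placeholder driver information device
    print("warning: possible lane deviation")
```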
  • although the lane is defined by the left-hand line and the right-hand line in the first embodiment, the lane can be defined, for example, by only one of the left-hand line and the right-hand line, as in a second embodiment (which corresponds to the second aspect of the present invention).
  • This embodiment provides the same operation and effect as in the case where the lane is defined by the left-hand line and the right-hand line as described above.
  • although the GPS unit 4 is used as the position information obtaining means in the first and second embodiments, it is also possible to use position information obtained by autonomous navigation instead of obtaining the vehicle position information from the GPS.
  • the present invention is adapted for use in providing a driver with information in a vehicle or controlling vehicle behaviors since it can accurately detect a lane while increasing the opportunities for detecting the lane as much as possible by processing the image of a road ahead of the vehicle and obtaining information on the road from a GPS or the like and map data.


Abstract

A lane recognizing device comprises: an image processing means which performs a process of estimating a lane of a road by processing an image of the road and outputs a result of the process as first lane information; a lane estimating means which performs a process of estimating the lane using map data of the road and the current position information of a vehicle and outputs a result of the process as second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information and the second lane information. Thereby, even if there is an unpredictable skid mark or road repaired part, it is possible to detect the lane accurately while increasing the opportunities for detecting the lane as much as possible by processing the road image and obtaining the road information and the map data.

Description

    TECHNICAL FIELD
  • The present invention relates to a vehicle and lane recognizing device for recognizing the lane of a road by processing an image of the road obtained by an imaging means such as a camera and obtaining information on the road from a GPS or the like and map data.
  • BACKGROUND ART
  • In recent years, there has been known a technique of detecting a lane mark such as a white line on a road by capturing an image of the road along which a vehicle is traveling and processing the captured image with an imaging means such as a CCD camera mounted on the vehicle and performing vehicle control or provision of information to a driver on the basis of information on the lane (traffic lane) along which the vehicle is traveling recognized from a detection result. In this technical field, there has been suggested a technique for recognizing a lane accurately by performing image processing according to the situation of a road ahead based on the road information obtained from a GPS or map data or the like when recognizing the lane by image processing from the road image (refer to, for example, Japanese Patent Laid-Open No. Hei 11-72337/1999 (hereinafter, referred to as Patent Document 1) and Japanese Patent Laid-Open No. 2003-123058 (hereinafter, referred to as Patent Document 2)). Moreover, there has been suggested a technique of performing the control or the provision of information by supplementing an image processing result with road information obtained from a GPS, map data, or the like (refer to, for example, Japanese Patent Laid-Open No. 2002-92794 (hereinafter, referred to as Patent Document 3) and Japanese Patent Laid-Open No. 2003-205805 (hereinafter, referred to as Patent Document 4)).
  • The traffic lane recognizing device in Patent Document 1 includes an image information processing means which recognizes the positions of left and right white lines on the road ahead by processing image information from a camera, and a traffic lane estimating means which estimates the position and feature of the traffic lane and a positional relationship between the traffic lane and the subject vehicle from the white line position image information obtained by the image information processing means. The traffic lane recognizing device further includes a road condition determining means which determines whether there is a special particular road area (for example, a particular white line area having white lines in a striped pattern or zebra pattern) different from an area having normal white lines on the road ahead. The road condition determining means determines whether there is the particular road area on the road ahead while associating data on the particular road area stored in a ROM with vehicle position information obtained from a GPS. Although the image information processing means recognizes the midpoint of both edges of a white line as a position in the road width direction of the white line for a normal white line, a particular part of the white line area (an edge on the main line side of a recognizable white line area) is considered to be a position in the road width direction of the white line if the road condition determining means determines that there is the particular road area on the road ahead of the vehicle. Thereafter, the traffic lane estimating means estimates an area defined by the extracted left and right white lines on the road or an area defined by one of the extracted left and right white lines on the road and a previously recognized lane width as a traffic lane.
  • The traffic lane recognizing device in Patent Document 2 includes an image processing means which detects a traffic lane by processing a road image captured by a camera. The image processing means has a plurality of image processing algorithms different from each other according to a plurality of types of lane marks such as a white line, a raised marker, a post cone and so on. In this situation, the traffic lane recognizing device receives position information of the road along which the vehicle is currently traveling from a satellite by using a GPS receiving means and determines the type of lane marks on the road ahead along which the vehicle is currently traveling from a road map data file which stores lane mark types for respective roads. The traffic lane recognizing device selects an image processing algorithm suitable for the lane marks ahead out of the plurality of image processing algorithms and detects the traffic lane.
  • A warning device for a vehicle in Patent Document 3 includes a white line recognition system which recognizes left and right division lines (white lines) of the lane of the road along which the vehicle is traveling from the image obtained by a camera by image processing and a sensor group (vehicle speed sensors or the like) for use in obtaining vehicle behaviors. Moreover, the warning device for a vehicle includes a navigation system which obtains position information and road information on the road along which the vehicle is traveling by comparison with map data read from a CD-ROM on the basis of the position of a subject vehicle obtained from a GPS. The warning device for a vehicle issues a warning for a driver when predicting that the vehicle will deviate from the lane on the basis of the left and right division lines of the recognized lane and vehicle behaviors obtained by the sensor group. In this situation, if the vehicle is traveling in a given road section (a road section determined to be a section where a false warning has been issued in the past) when determined from the obtained road information, the warning device for a vehicle changes a warning generation condition to a condition under which the warning is less likely to be issued in comparison with other cases.
  • A vehicle driving support apparatus in Patent Document 4 includes a surroundings monitoring means which recognizes white lines or the like indicating a lane by processing an image ahead of a vehicle captured by a camera, and a driving support means which supports a driver's operation by a lane deviation warning control or the like on the basis of information from the surroundings monitoring means. Furthermore, the vehicle driving support apparatus includes a road condition recognizing means, which recognizes the road conditions (branch road or the like) of the road ahead of the vehicle from the current position of the vehicle obtained based on radio wave information from a plurality of satellites, map information stored in a storage medium such as a DVD-ROM, and road ahead information obtained by a road-vehicle communication from road infrastructure and which determines whether a problem will occur in driving support control according to the road ahead conditions. Then, the vehicle driving support apparatus previously provides a driver with information on the basis of information from the road condition recognizing means if reliability of the operation of the driving support means is projected to decrease.
  • In some cases, however, there may be an unpredictable skid mark or road repaired part on the road. In this situation, the existence of the skid mark or road repaired part cannot be known from previously stored road information. Moreover, it is difficult to recognize a lane accurately by image processing based on a road image due to the skid mark or road repaired part. Therefore, in this case, there is a problem that it is difficult to recognize the lane accurately even if the image processing is performed in consideration of the road ahead condition on the basis of the road information obtained from the GPS, map data, or the like, similarly to Patent Documents 1 and 2. Furthermore, in the above situation, even the device and apparatus in Patent Documents 3 and 4 find it hard to detect the lane by the image processing and cannot supplement the image processing result with the road information obtained from the GPS, map data, or the like, which leads to a problem that they cannot appropriately control the vehicle or provide the driver with information.
  • DISCLOSURE OF THE INVENTION
  • Problem to be Solved by the Invention
  • In order to solve the above problems, an object of the present invention is to provide a vehicle and lane recognizing device capable of detecting a lane accurately while increasing opportunities for detecting the lane as much as possible by processing a road image obtained via an imaging means such as a camera and obtaining road information from GPS and map data, even if there is an unpredictable skid mark or road repaired part on the road.
  • Means for Solving the Problem
  • In order to achieve the above object, according to a first aspect of the present invention, there is provided a vehicle comprising: an imaging means; an image processing means which obtains an image of a road via the imaging means, performs a process of estimating a lane of the road by processing the obtained image, and outputs a result of the process as first lane information; holding means which holds map data of the road; position information obtaining means which obtains the current position information of the vehicle; a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information and the second lane information.
  • Furthermore, according to the first aspect of the present invention, there is provided a lane recognizing device comprising: an image processing means which performs a process of estimating a lane of a road by processing an image of the road obtained via an imaging means mounted on a vehicle and outputs a result of the process as first lane information; a holding means which holds map data of the road; a position information obtaining means which obtains the current position information of the vehicle; a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information and the second lane information (First aspect of the present invention).
  • According to the vehicle and lane recognizing device of the first aspect of the present invention, the image processing means performs the process of estimating the lane of the road by processing the image of the road obtained via the imaging means and outputs the result of the process as the first lane information. Thereby, information on the lane along which the vehicle is traveling is obtained. If, however, there is, for example, an unpredictable skid mark or road repaired part on the road when estimating the lane by the image processing, it is difficult to estimate the lane appropriately. Also in this case, however, it is preferable that the actual lane is recognized as appropriately as possible.
  • In this situation, the lane estimating means performs the process of estimating the lane along which the vehicle is traveling using the map data and the current position information and outputs the result of the process as the second lane information. Thereby, the lane information can be obtained in a different method from the method of estimating the lane by the image processing. Moreover, since the actual lane recognizing means recognizes the actual lane on the basis of both of the first lane information and the second lane information, the information on the lane estimated by the map data and position information can be used as the information indicating the actual lane even if the lane is not appropriately estimated by the image processing. Therefore, it is possible to detect the actual lane accurately while increasing the opportunities for detecting the actual lane as much as possible.
  • Furthermore, according to a second aspect of the present invention, there is provided a vehicle comprising: an imaging means; an image processing means which obtains an image of a road via the imaging means, performs a process of estimating a lane of the road by processing the obtained image, and outputs a result of the process as first lane information; a holding means which holds map data of the road; position information obtaining means which obtains the current position information of the vehicle; lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value.
  • Furthermore, according to the second aspect of the present invention, there is provided a lane recognizing device comprising: an image processing means which performs a process of estimating a lane of a road by processing an image of the road obtained via an imaging means mounted on a vehicle and outputs a result of the process as first lane information; a holding means which holds map data of the road; a position information obtaining means which obtains the current position information of the vehicle; a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information; a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value (Second aspect of the present invention).
  • According to the vehicle and lane recognizing device of the second aspect of the present invention, similarly to the first aspect of the present invention, the image processing means performs the process of estimating the lane of the road by processing the image of the road obtained via the imaging means and outputs the result of the process as the first lane information. Thereby, information on the lane along which the vehicle is traveling is obtained. If, however, there is, for example, an unpredictable skid mark or road repaired part on the road when estimating the lane by the image processing, it is difficult to estimate the lane appropriately. Also in this case, however, it is preferable that the actual lane is recognized as appropriately as possible. In this situation, the lane estimating means performs the process of estimating the lane along which the vehicle is traveling using the map data and the current position information of the vehicle and outputs the result of the process as the second lane information. Thereby, the lane information can be obtained in a different method from the method of estimating the lane by the image processing.
  • Moreover, since the actual lane recognizing means recognizes the actual lane on the basis of both of the first lane information and the second lane information, the information on the lane estimated by the map data and position information can be used as the information indicating the actual lane even if the lane is not appropriately estimated by the image processing. Therefore, it is possible to detect the actual lane accurately while increasing the opportunities for detecting the actual lane as much as possible. Then, the actual lane recognizing means recognizes the actual lane on the basis of the first lane information, the second lane information, and the result of comparison between the lane similarity, which is the degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information, and the given threshold value. Thereby, if the lane is estimated by both of the estimation process by the image processing and the estimation process by the map data and position information, the actual lane recognizing means can grasp the reliability of the information on the lane estimated by the image processing and the reliability of the information on the lane estimated by the map data and position information from the degree of similarity between the shapes of the lanes estimated by both, and therefore it can recognize the actual lane more appropriately.
  • In the above, if the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information are each accurately estimated from the shape of the actual lane, it is conceivable that the degree of similarity between the shapes of the lane estimated by both is high. Furthermore, if the lane is estimated by the image processing accurately, the shape of the lane estimated by the image processing is considered to be more accurate in position than the shape of the lane estimated by the map data and position information since the image processing of the actual road ahead of the vehicle is more local than the processing by the map data and position information in view of the position-fix accuracy of the GPS or the like and data density of the map data.
  • Therefore, in the vehicle and lane recognizing device according to the second aspect of the present invention, preferably the actual lane recognizing means recognizes the actual lane on the basis of the first lane information in the case where the lane similarity is greater than the given threshold value (Third aspect of the present invention).
  • According to the third aspect of the present invention, if the lane similarity is greater than the given threshold value, the actual lane is recognized on the basis of the first lane information, and therefore the reliability of the lane estimation by the image processing is verified according to the information on the lane estimated by the map data and position information.
  • Furthermore, if the lane can be considered to be accurately estimated by the image processing, the information on the shape of the lane estimated by the image processing, which is thought to be more accurate than the shape of the lane estimated by the map data and position information, is used as information indicating the actual lane, by which the actual lane is appropriately recognized.
  • Moreover, if the shape of the lane estimated by the image processing differs from the shape of the lane estimated by the map data and position information, the actual lane is not likely to be appropriately estimated in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information. For example, if there is a branch or the like in the road ahead, it is conceivable that the lane shape cannot be appropriately estimated in either approach.
  • Therefore, in the vehicle and lane recognizing device according to the second aspect of the present invention, preferably the actual lane recognizing means outputs information indicating that the actual lane is not recognized in the case where the lane similarity is equal to or smaller than the given threshold value (Fourth aspect of the present invention).
  • According to the fourth aspect of the present invention, the actual lane recognizing means outputs the information indicating that the actual lane is not recognized if the lane similarity is equal to or smaller than the given threshold value, and therefore if the actual lane is not likely to be appropriately estimated in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information, it can grasp that the actual lane is not recognized appropriately.
  • Moreover, in the case where there is an unpredictable skid mark, road repaired part, or the like on the road when performing the process of estimating the lane through the image processing, it is conceivable that the lane is not estimated accurately or no lane is estimated. Further, also when the process of estimating the lane is performed based on the map data and position information, for example, when the position information obtaining means obtains the vehicle position information from a GPS or other driving information providing services via signals and communication, it is conceivable that the lane cannot be estimated because the current position information of the vehicle cannot be obtained due to a signal or communication failure or the like.
  • Therefore, in the vehicle and lane recognizing device according to the second aspect of the present invention, the first lane information includes first estimation execution information indicating whether the lane is estimated by the process of the image processing means and first lane shape information indicating the shape of the estimated lane in the case where the lane is estimated. Furthermore, the second lane information includes second estimation execution information indicating whether the lane is estimated by the process of the lane estimating means and second lane shape information indicating the shape of the estimated lane in the case where the lane is estimated. Then, the means which calculates the lane similarity uses the first lane shape information and the second lane shape information when calculating the lane similarity. Furthermore, preferably the actual lane recognizing means includes: a means which determines and outputs actual lane recognition execution information indicating whether the actual lane is recognized on the basis of the first estimation execution information, the second estimation execution information, and the result of comparison between the lane similarity and the given threshold value; and a means which determines and outputs actual lane shape information indicating the shape of the recognized lane in the case where the lane is recognized on the basis of the first lane shape information and the second lane shape information (Fifth aspect of the present invention).
  • According to the fifth aspect of the present invention, the actual lane recognition execution information indicating whether the actual lane is recognized is determined on the basis of the first estimation execution information, the second estimation execution information, and the result of the comparison between the lane similarity and the given threshold value. Additionally, the actual lane shape information indicating the shape of the recognized lane in the case where the lane is recognized is determined based on the first lane shape information and the second lane shape information. In this condition, the lane similarity is calculated using the first lane shape information and the second lane shape information. Thereby, information indicating the actual lane is appropriately determined even if the lane is not estimated by the image processing or even if the lane is not estimated by the map data and position information. Therefore, it is possible to detect the actual lane accurately while increasing the opportunities for detecting the actual lane as much as possible.
  • Moreover, if the lane is estimated by both of the estimation process by the image processing and the estimation process by the map data and position information, it is possible to grasp the reliability of the information on the lane estimated by the image processing and the reliability of the information on the lane estimated by the map data and position information from the degree of similarity between the shapes of the lanes estimated by both, and therefore the actual lane can be recognized more appropriately.
  • Furthermore, in the vehicle and lane recognizing device according to the fifth aspect of the present invention, preferably the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the second lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is not estimated and in the case where the second estimation execution information indicates that the lane is estimated (Sixth aspect of the present invention).
  • According to the sixth aspect of the present invention, the actual lane recognizing means sets the second lane shape information as the actual lane shape information if the first estimation execution information indicates that the lane is not estimated and if the second estimation execution information indicates that the lane is estimated. Therefore, even if the lane cannot be estimated by the image processing, the image processing result is supplemented with the information on the lane estimated by the map data and position information, which increases the opportunities for detecting the actual lane as much as possible.
  • Furthermore, in the vehicle and lane recognizing device according to the fifth aspect of the present invention, in the case where the second estimation execution information indicates that the lane is not estimated, preferably the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is estimated; and sets information indicating that the lane is not recognized as the actual lane recognition execution information in the case where the first estimation execution information indicates that the lane is not estimated (Seventh aspect of the present invention).
  • According to the seventh aspect of the present invention, the actual lane recognizing means can use the result of the lane estimation by the image processing directly as information indicating the actual lane if the lane is estimated by the image processing in the case where the lane is not estimated by the map data and position information. Moreover, if the lane is not estimated by the map data and position information and not estimated by the image processing, either, information indicating that the lane is not recognized is set as the actual lane recognition execution information, and therefore, the actual lane recognizing means can clearly grasp that the actual lane is not recognized.
  • Furthermore, in the vehicle and lane recognizing device according to the fifth aspect of the present invention, in the case where the first estimation execution information indicates that the lane is estimated and where the second estimation execution information indicates that the lane is estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information in the case where the lane similarity is greater than the given threshold value. Furthermore, in the case where the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the second lane shape information as the actual lane shape information. Furthermore, in the case where the first estimation execution information indicates that the lane is estimated and the second estimation execution information indicates that the lane is not estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information.
  • Moreover, the actual lane recognizing means includes means which sets and outputs the reliability of the recognition in the case where the actual lane is recognized, and this means sets the reliability to a first level, which is the highest level, in the case where the first estimation execution information indicates that the lane is estimated, the second estimation execution information indicates that the lane is estimated, and the lane similarity is greater than the given threshold value. Furthermore, the actual lane recognizing means sets the reliability to a second level, which is lower than the first level, in the case where the first estimation execution information indicates that the lane is estimated and the second estimation execution information indicates that the lane is not estimated. Furthermore, the actual lane recognizing means sets the reliability to a third level, which is lower than the second level, in the case where the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated (Eighth aspect of the present invention).
  • According to the above, if the actual lane is recognized, the reliability of the recognition is output. If the lane similarity is greater than the given threshold value, then the reliability is set to the first level, which is the highest level. Therefore, if it is possible to consider the actual lane to be estimated accurately due to the high similarity between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information, the reliability is set highest.
  • Furthermore, if the first estimation execution information indicates that the lane is estimated and the second estimation execution information indicates that the lane is not estimated, it means that the lane is estimated only by the image processing. In this case, the reliability of the lane estimation by the image processing cannot be verified using the information on the lane estimated by the map data and position information, and therefore the reliability is set to the second level, which is lower than the first level.
  • Still further, if the first estimation execution information indicates that the lane is not estimated and the second estimation execution information indicates that the lane is estimated, it means that the lane is estimated only by the map data and position information. In this case, the lane estimation by the map data and position information is considered to be less accurate than the lane estimation by the image processing, in which a more local process is performed, in view of the position-fix accuracy of the GPS or the like and data density of the map data, and therefore the reliability is set to the third level, which is lower than the second level.
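  • As a concrete illustration of the reliability assignment described above, the following Python sketch maps the two estimation execution flags and the lane similarity to a reliability level. It is a minimal sketch, not the patented implementation; the function and parameter names are invented, and it follows the convention used in these aspects, in which a larger lane similarity means closer agreement between the two estimates.

```python
def assign_reliability(first_estimated: bool, second_estimated: bool,
                       lane_similarity: float, threshold: float):
    """Illustrative mapping of estimation results to a reliability level.

    Returns 1 (highest), 2, or 3, or None when the actual lane is not
    recognized at all. Names and types are assumptions, not the patent's.
    """
    if first_estimated and second_estimated:
        # Both sources agree closely enough: highest reliability.
        return 1 if lane_similarity > threshold else None
    if first_estimated:
        # Image processing only; no cross-check against map data.
        return 2
    if second_estimated:
        # Map data and position information only; considered less accurate.
        return 3
    return None  # Neither source estimated the lane.
```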
  • The setting and output of the reliability as described above allows the reliability of the information indicating the recognized actual lane to be clearly known. In addition, the reliability set as described above can be used to control the vehicle or to provide information to the driver in addition to the information indicating the actual lane.
  • Note that the lane along which the vehicle travels is generally composed of a left-hand line, which defines the left side of the lane, and a right-hand line, which defines the right side of the lane.
  • Therefore, in the vehicle and lane recognizing device according to the second aspect of the present invention, the first lane information output by the image processing means includes: first left estimation execution information which indicates whether the left-hand line defining the left side of the lane is estimated by the process of the image processing means; first left-hand line shape information which indicates the shape of the estimated left-hand line in the case where the left-hand line is estimated; first right estimation execution information which indicates whether the right-hand line defining the right side of the lane is estimated by the process of the image processing means; and first right-hand line shape information which indicates the shape of the estimated right-hand line in the case where the right-hand line is estimated. Furthermore, the second lane information output by the lane estimating means includes: second estimation execution information which indicates whether both of the left-hand line and the right-hand line are estimated by the process of the lane estimating means; second left-hand line shape information which indicates the shape of the estimated left-hand line in the case where the left-hand line is estimated; and second right-hand line shape information which indicates the shape of the estimated right-hand line in the case where the right-hand line is estimated.
  • Moreover, the actual lane recognizing means includes means which calculates left-hand line similarity, which is the degree of similarity between the first left-hand line shape information and the second left-hand line shape information, in the case where the first left estimation execution information indicates that the left-hand line is estimated and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated. Furthermore, the actual lane recognizing means includes means which calculates right-hand line similarity, which is the degree of similarity between the first right-hand line shape information and the second right-hand line shape information, in the case where the first right estimation execution information indicates that the right-hand line is estimated and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated. Furthermore, preferably the actual lane recognizing means includes means which determines and outputs: actual left recognition execution information which indicates whether the actual left-hand line defining the left side of the actual lane is recognized; actual left-hand line shape information which indicates the shape of the recognized actual left-hand line in the case where the actual left-hand line is recognized; actual right recognition execution information which indicates whether the actual right-hand line defining the right side of the actual lane is recognized; and actual right-hand line shape information which indicates the shape of the recognized actual right-hand line in the case where the actual right-hand line is recognized, according to a result of comparing the calculated left-hand line similarity and right-hand line similarity with a given threshold value (Ninth aspect of the present invention).
  • According to the ninth aspect of the present invention, the actual left recognition execution information, the actual left-hand line shape information, the actual right recognition execution information, and the actual right-hand line shape information are determined according to the result of comparing the calculated left-hand line similarity and right-hand line similarity with the given threshold value. Thereby, in the case where the left-hand line and the right-hand line are estimated by both of the estimation process by the image processing and the estimation process by the map data and position information, the actual lane recognizing means can grasp the reliability of the information on the left-hand line and the right-hand line estimated by the image processing and the reliability of the information on the left-hand line and the right-hand line estimated by the map data and position information according to the degree of similarity in shape between the left-hand line and the right-hand line estimated by both. Therefore, the actual lane recognizing means can recognize the actual left-hand line and right-hand line more appropriately.
  • In the above, in the case where the shapes of the left-hand line and right-hand line estimated by the image processing and the shapes of the left-hand line and right-hand line estimated by the map data and position information are accurately estimated from the shapes of the actual left-hand line and right-hand line, it is considered that the shapes of the left-hand line and right-hand line estimated by both have high similarity with each other. Furthermore, the image processing of the actual road ahead of the vehicle is more local than the processing by the map data and position information in view of the position-fix accuracy of the GPS or the like and the data density of the map data. Therefore, if the left-hand line and the right-hand line are estimated accurately by the image processing, the shapes of the left-hand line and right-hand line estimated by the image processing are thought to be more accurate in position than the shapes of the left-hand line and right-hand line estimated by the map data and position information.
  • Therefore, in the vehicle and lane recognizing device according to the ninth aspect of the present invention, the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where one or both of the left-hand line similarity and the right-hand line similarity are calculated and where one or both of the calculated left-hand line similarity and right-hand line similarity are greater than the given threshold value. Additionally, preferably the actual lane recognizing means sets the first left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is estimated, sets the second left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is not estimated, sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated, and sets the second right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is not estimated (Tenth aspect of the present invention).
  • According to the tenth aspect of the present invention, the actual lane recognizing means sets the information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where the one or both of the left-hand line similarity and the right-hand line similarity are calculated and where one or both of the calculated left-hand line similarity and right-hand line similarity are greater than the given threshold value. Therefore, if the reliability of the estimation of the left-hand line and right-hand line by the image processing is verified by the information on the left-hand line and right-hand line estimated by the map data and position information and thereby the left-hand line and the right-hand line are considered to be estimated accurately by the image processing, information indicating that the lane is recognized is appropriately set as the information on the actual left-hand line and right-hand line.
  • Additionally, the actual lane recognizing means sets the first left-hand line shape information as the actual left-hand line shape information if the first left estimation execution information indicates that the left-hand line is estimated and sets the first right-hand line shape information as the actual right-hand line shape information if the first right estimation execution information indicates that the right-hand line is estimated, and therefore the information on the shapes of the left-hand line and right-hand line estimated by the image processing, which is thought to be more accurate than the shapes of the left-hand line and right-hand line estimated by the map data and position information, is set appropriately as information on the shapes of the actual left-hand line and right-hand line.
  • Moreover, the actual lane recognizing means sets the second left-hand line shape information as the actual left-hand line shape information if the first left estimation execution information indicates that the left-hand line is not estimated and sets the second right-hand line shape information as the actual right-hand line shape information if the first right estimation execution information indicates that the right-hand line is not estimated. Therefore, even if the left-hand line and the right-hand line cannot be estimated by the image processing, the image processing result is supplemented with the information on the left-hand line and right-hand line estimated by the map data and position information, which thereby increases the opportunities for detecting the actual left-hand line and right-hand line as much as possible.
  • In addition, if the shapes of the left-hand line and right-hand line estimated by the image processing differ from the shapes of the left-hand line and right-hand line estimated by the map data and position information, there is a possibility that the actual left-hand line and right-hand line are not estimated appropriately in one or both of the estimation of the left-hand line and right-hand line by the image processing and the estimation of the left-hand line and right-hand line by the map data and position information.
  • Accordingly, in the vehicle and lane recognizing device according to the ninth aspect of the present invention, preferably the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are not recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where both of the left-hand line similarity and the right-hand line similarity are calculated and at least one of them is equal to or smaller than the given threshold value, or in the case where only one of the left-hand line similarity and the right-hand line similarity is calculated and the calculated similarity is equal to or smaller than the given threshold value (11th aspect of the present invention).
  • According thereto, the actual lane recognizing means sets the information indicating that the actual left-hand line and the actual right-hand line are not recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where both of the left-hand line similarity and the right-hand line similarity are calculated and at least one of them is equal to or smaller than the given threshold value, or in the case where only one of the left-hand line similarity and the right-hand line similarity is calculated and the calculated similarity is equal to or smaller than the given threshold value. Therefore, if there is a possibility that the actual left-hand line and right-hand line are not estimated appropriately in one or both of the estimation of the left-hand line and right-hand line by the image processing and the estimation of the left-hand line and right-hand line by the map data and position information, the information indicating that the left-hand line and the right-hand line are not recognized is appropriately set as the actual left-hand line and right-hand line information.
  • Furthermore, in the vehicle and lane recognizing device according to the ninth aspect of the present invention, preferably the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information and sets the second left-hand line shape information as the actual left-hand line shape information and the second right-hand line shape information as the actual right-hand line shape information, in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated (12th aspect of the present invention).
  • According to the 12th aspect of the present invention, the actual lane recognizing means sets the second left-hand line shape information as the actual left-hand line shape information and sets the second right-hand line shape information as the actual right-hand line shape information in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated. Therefore, even if the left-hand line and the right-hand line cannot be estimated by the image processing, the image processing result is supplemented with the information on the left-hand line and right-hand line estimated by the map data and position information, which thereby increases the opportunities for detecting the actual left-hand line and right-hand line as much as possible.
  • Further, in the vehicle and lane recognizing device according to the ninth aspect of the present invention, in the case where the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, the actual lane recognizing means sets information indicating that the actual left-hand line is recognized as the actual left recognition execution information and sets the first left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is estimated. Furthermore, the actual lane recognizing means sets information indicating that the actual left-hand line is not recognized as the actual left recognition execution information in the case where the first left estimation execution information indicates that the left-hand line is not estimated. Still further, the actual lane recognizing means sets information indicating that the actual right-hand line is recognized as the actual right recognition execution information and sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated. Furthermore, preferably the actual lane recognizing means sets information indicating that the actual right-hand line is not recognized as the actual right recognition execution information in the case where the first right estimation execution information indicates that the right-hand line is not estimated (13th aspect of the present invention).
  • According to the 13th aspect of the present invention, the actual lane recognizing means can use a result of the estimation of the left-hand line or right-hand line by the image processing directly as the information indicating the actual left-hand line or right-hand line if the left-hand line or the right-hand line is estimated by the image processing in the case where the left-hand line and the right-hand line are not estimated by the map data and position information. Furthermore, if the left-hand line or right-hand line is estimated by neither the map data and position information nor the image processing, the information indicating that the left-hand line or right-hand line is not recognized is set as the actual left recognition execution information or as the actual right recognition execution information, respectively, whereby the actual lane recognizing means can clearly grasp that the actual left-hand line or right-hand line is not recognized.
  • Furthermore, in the vehicle and lane recognizing device according to the ninth aspect of the present invention, the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information, respectively, in the case where one or both of the left-hand line similarity and the right-hand line similarity are calculated and where one or both of the calculated left-hand line similarity and right-hand line similarity are greater than the given threshold value.
  • Additionally, the actual lane recognizing means sets the first left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is estimated, sets the second left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is not estimated, sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated, and sets the second right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is not estimated.
  • Furthermore, the actual lane recognizing means sets information indicating that the actual left-hand line and the actual right-hand line are recognized as the actual left recognition execution information and as the actual right recognition execution information in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and where the second estimation execution information indicates that the left-hand line and the right-hand line are estimated. Additionally, the actual lane recognizing means sets the second left-hand line shape information as the actual left-hand line shape information and sets the second right-hand line shape information as the actual right-hand line shape information.
  • Still further, in the case where the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, the actual lane recognizing means sets information indicating that the actual left-hand line is recognized as the actual left recognition execution information and sets the first left-hand line shape information as the actual left-hand line shape information in the case where the first left estimation execution information indicates that the left-hand line is estimated. In addition, the actual lane recognizing means sets information indicating that the actual left-hand line is not recognized as the actual left recognition execution information in the case where the first left estimation execution information indicates that the left-hand line is not estimated. Furthermore, the actual lane recognizing means sets information indicating that the actual right-hand line is recognized as the actual right recognition execution information and sets the first right-hand line shape information as the actual right-hand line shape information in the case where the first right estimation execution information indicates that the right-hand line is estimated. Still further, the actual lane recognizing means sets information indicating that the actual right-hand line is not recognized as the actual right recognition execution information in the case where the first right estimation execution information indicates that the right-hand line is not estimated.
  • Moreover, the actual lane recognizing means includes means which sets and outputs the reliability of the recognition in the case where at least one of the actual left-hand line and the actual right-hand line is recognized, and it sets the reliability to a first level, which is the highest level, in the case where both of the left-hand line similarity and the right-hand line similarity are calculated and the calculated left-hand line similarity and right-hand line similarity are each greater than the given threshold value. Furthermore, the actual lane recognizing means sets the reliability to a second level, which is lower than the first level, in the case where only one of the left-hand line similarity and the right-hand line similarity is calculated and the calculated similarity is greater than the given threshold value, and in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated. Furthermore, preferably the actual lane recognizing means sets the reliability to a third level, which is lower than the second level, in the case where one of the first left estimation execution information and the first right estimation execution information indicates that one of the left-hand line and the right-hand line is estimated and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, and in the case where the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are estimated (14th aspect of the present invention).
  • According to the 14th aspect of the present invention, if at least one of the actual left-hand line and the actual right-hand line is recognized, the reliability of the recognition is output. If both of the left-hand line similarity and the right-hand line similarity are calculated and both are greater than the given threshold value, the reliability is set to the first level, which is the highest level. Therefore, if the actual left-hand line and right-hand line can be considered to be estimated accurately due to the high similarity between the shapes of the left-hand line and right-hand line estimated by the image processing and the shapes of the left-hand line and right-hand line estimated by the map data and position information, the reliability is set highest.
  • In addition, if one of the left-hand line similarity and the right-hand line similarity is calculated and one of the calculated left-hand line similarity and right-hand line similarity is greater than the given threshold value, it means that only one of the left-hand line and the right-hand line can be considered to be estimated accurately, and therefore the reliability is set to the second level, which is lower than the first level.
  • Furthermore, if the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, it means that the left-hand line and the right-hand line are estimated only by the image processing. In this case, the reliability of the estimation of the left-hand line and that of the right-hand line by the image processing cannot be verified by using information on the left-hand line and right-hand line estimated by the map data and position information, and therefore the reliability is set to the second level, which is lower than the first level.
  • Moreover, if one of the first left estimation execution information and the first right estimation execution information indicates that one of the left-hand line and the right-hand line is estimated and the second estimation execution information indicates that the left-hand line and the right-hand line are not estimated, it means that only one of the left-hand line and the right-hand line is estimated, and only by the image processing. In this case, the reliability of the estimation by the image processing cannot be verified by using information on the left-hand line and right-hand line estimated by the map data and position information, and only one of the left-hand line and the right-hand line is estimated, and therefore the reliability is set to a third level, which is lower than the second level.
  • In addition, if the first left estimation execution information and the first right estimation execution information indicate that the left-hand line and the right-hand line are not estimated, respectively, and the second estimation execution information indicates that the left-hand line and the right-hand line are estimated, it means that the left-hand line and the right-hand line are estimated only by the map data and position information. In this case, the estimation is considered to be less accurate than the estimation by the image processing in consideration of GPS position-fix accuracy or the like, and therefore the reliability is set to the third level, which is lower than the second level.
  • The lane recognizing device according to the present invention can clearly recognize the reliability of information indicating the recognized actual lane by setting and outputting the reliability as described hereinabove. Moreover, the reliability set in this manner can be used to control the vehicle or inform the driver in addition to the information indicating the actual lane.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1
  • It is a functional block diagram of a lane recognizing device according to a first embodiment of the present invention.
  • FIG. 2
  • It is a flowchart showing a lane recognition process of the lane recognizing device in FIG. 1.
  • FIG. 3
  • It is a flowchart showing a lane estimation process by image processing in the lane recognition process of the lane recognizing device in FIG. 1.
  • FIG. 4
  • It is a flowchart showing a lane estimation process by GPS information and map data in the lane recognition process of the lane recognizing device in FIG. 1.
  • FIG. 5
  • It is an illustrative diagram showing a road lane to be recognized by the lane recognizing device in FIG. 1.
  • FIG. 6
  • It is an explanatory diagram of the lane estimation process by image processing of the lane recognizing device in FIG. 1.
  • FIG. 7
  • It is an explanatory diagram of the lane estimation process by GPS information and map data of the lane recognizing device in FIG. 1.
  • FIG. 8
  • It is an explanatory diagram of the lane estimation process by GPS information and map data of the lane recognizing device in FIG. 1.
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • An embodiment of the present invention will be described with reference to the accompanying drawings. FIG. 1 is a functional block diagram of a lane recognizing device according to a first embodiment of the present invention. FIGS. 2 to 4 are flowcharts of a lane recognition process in the lane recognizing device in FIG. 1. FIG. 5 is an illustrative diagram of a lane to be recognized in the lane recognition process by the lane recognizing device. FIGS. 6 to 8 are explanatory diagrams of the lane recognition process in the lane recognizing device in FIG. 1. Note that the first embodiment corresponds to the first aspect of the present invention.
  • Referring to FIG. 1, a lane recognizing device 1 is an electronic unit composed of a microcomputer and the like and is mounted on a vehicle 7. It includes, as its processing functions, an image processing means 2 which performs a process of estimating a lane by obtaining a road image, a map data storage medium 3 which holds map data, a GPS unit 4 which obtains the current position information of the vehicle by GPS, a lane estimating means 5 which performs a process of estimating the lane from the map data and the current position information of the vehicle, and an actual lane recognizing means 6 which performs a process of recognizing an actual lane on the basis of a result of the process of the image processing means 2 and a result of the process of the lane estimating means 5.
  • The image processing means 2 obtains a road image via a video camera 8 (a CCD camera or the like, corresponding to the imaging means of the present invention), which is attached to the front of the vehicle 7 and captures an image of the road ahead of the vehicle 7. Then, the image processing means 2 performs a process of estimating the lane along which the vehicle 7 travels by processing the obtained road image and outputs a result of the process as first lane information. The vehicle of the present invention includes the video camera 8 and the lane recognizing device 1.
  • The map data storage medium 3 (the holding means of the present invention) is a CD-ROM, DVD, HDD or other storage medium which records map data. A road position, a lane width of a road, and the like are recorded as map data in the map data storage medium 3.
  • The GPS unit 4 (the position information obtaining means of the present invention) receives information transmitted from a plurality of global positioning system (GPS) satellites to obtain the current position information (latitude, longitude, and traveling direction) of the vehicle 7 on the basis of the received information.
  • The lane estimating means 5 identifies the information on the road along which the vehicle 7 is currently traveling by using the map data read from the map data storage medium 3 and the current position information of the vehicle 7 obtained by the GPS unit 4. Then, the lane estimating means 5 performs the process of estimating the lane from the identified road information and outputs a result of the process as second lane information.
  • The actual lane recognizing means 6 performs the process of recognizing actual lane information on the basis of the first lane information output from the image processing means 2 and the second lane information output from the lane estimating means 5 and outputs a result of the process as information indicating an actual lane (hereinafter referred to as actual lane information). Moreover, if the actual lane is recognized, the actual lane recognizing means 6 sets the reliability of the recognition and outputs it together with the actual lane information. The reliability is an index of recognition accuracy and is set to one of three levels, level 1 to level 3, according to the recognition result. Level 1 indicates that the recognition reliability is the highest (high accuracy), level 2 indicates that the reliability is lower than level 1 (medium accuracy), and level 3 indicates that the reliability is lower still than level 2 (low accuracy).
  • Subsequently, the operation of the lane recognizing device 1 according to this embodiment will be described with reference to the flowcharts shown in FIG. 2 to FIG. 4. FIG. 2 is a flowchart showing the general operation of the lane recognition process (the main routine process of the lane recognizing device 1). FIG. 3 is a flowchart showing a process of estimating a lane by image processing (a subroutine process). FIG. 4 is a flowchart showing a process of estimating the lane from map data and position information (a subroutine process). In the following, assuming that the traveling direction of the vehicle 7 corresponds to the arrow direction as shown in FIG. 5, the operation is described by giving an example in which the left side of the lane of a road along which the vehicle 7 is traveling is defined by a lane mark A0 and the right side of the lane is defined by a lane mark A1. The lane marks A0 and A1 are assumed to be white lines by way of example.
  • Referring to FIG. 2, first, the image processing means 2 obtains a road image by receiving a video signal output from the video camera 8 (step 001).
  • Subsequently, the image processing means 2 performs a process of estimating a lane from the obtained road image (hereinafter referred to as a first lane estimation process) and outputs the first lane information (step 002). The output first lane information includes first left estimation execution information indicating whether the lane mark A0 is already estimated by the first lane estimation process, first left-hand line shape information indicating the shape of the estimated lane mark A0 in the case where the lane mark A0 is already estimated, first right estimation execution information indicating whether the lane mark A1 is already estimated by the first lane estimation process, and first right-hand line shape information indicating the shape of the estimated lane mark A1 in the case where the lane mark A1 is already estimated. If the lane mark A0 is already estimated, the first left estimation execution information is set to “Yes,” or otherwise, it is set to “No.” Similarly, the first right estimation execution information is set to “Yes” if the lane mark A1 is already estimated, or otherwise, it is set to “No.”
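  • As an informal illustration, the first lane information described above can be pictured as a small record. The following Python sketch uses invented field names and assumes, as in the embodiment below, that each line shape is described by the coefficients of a quadratic equation; it is not the patent's own data format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Coefficients (p, q, r) of a line shape y = p*x*x + q*x + r.
Quadratic = Tuple[float, float, float]

@dataclass
class FirstLaneInformation:
    """Illustrative container for the first lane information."""
    left_estimated: bool               # first left estimation execution information
    left_shape: Optional[Quadratic]    # first left-hand line shape information
    right_estimated: bool              # first right estimation execution information
    right_shape: Optional[Quadratic]   # first right-hand line shape information
```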
  • The first lane estimation process is performed as shown in FIG. 3. First, the image processing means 2 performs an edge detection process to detect a white line from the obtained image (step 101). Subsequently, the image processing means 2 performs a Hough transform for output data of the edge detection (step 102). Then, the image processing means 2 searches the Hough space for straight line components and extracts them (step 103). Thereafter, the image processing means 2 applies a projective transformation to data of the extracted straight line components from the Hough space to the image space and further applies a projective transformation to the data from the image space to the real space (the coordinate space fixed to the vehicle) (step 104).
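  • A rough Python/OpenCV sketch of steps 101 to 104 follows. It is only a plausible reading of the text: the probabilistic Hough transform used here collapses steps 102 and 103 into one call and returns segments directly in image space, and the homography H that maps image coordinates onto the vehicle-fixed road plane is assumed to come from a prior camera calibration, which the text does not describe.

```python
import cv2
import numpy as np

def detect_line_points(road_image: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Steps 101-104 (sketch): edge detection, Hough-based straight-line
    extraction, and projection of the segment endpoints into the
    vehicle-fixed real space. H is an assumed 3x3 calibration homography."""
    gray = cv2.cvtColor(road_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # step 101
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=50, minLineLength=40,
                               maxLineGap=10)             # steps 102-103
    if segments is None:
        return np.empty((0, 2))
    endpoints = segments.reshape(-1, 2).astype(np.float32)
    real = cv2.perspectiveTransform(endpoints.reshape(-1, 1, 2), H)  # step 104
    return real.reshape(-1, 2)
```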
  • Next, in step 105, the image processing means 2 selects data of straight line components estimated to be the lane mark A0 out of the data of the straight line components transformed to the real space and determines the coordinates of a plurality of points included in the selected data of the straight line components to be point sequence data A2. Similarly, the image processing means 2 selects data of straight line components estimated to be the lane mark A1 and determines the coordinates of a plurality of points included in the selected data of the straight line components to be point sequence data A3. The selected point sequence data A2 and A3 are as shown in FIG. 6. In FIG. 6, the coordinate system is a plane rectangular coordinate system having the x axis and the y axis as coordinate axes, with the center position of the vehicle 7 as the origin and the traveling direction of the vehicle 7 as the x-axis direction.
  • Subsequently, as shown in FIG. 6, the image processing means 2 obtains a quadratic equation F2 (y=ax*x+bx+c) which approximates the point sequence data A2 and a quadratic equation F3 (y=dx*x+ex+f) which approximates the point sequence data A3 (a, b, c, d, e, and f are given coefficients). A least squares method is used as the approximation method.
  • Then, in step 106, the image processing means 2 sets the first lane information. If the quadratic equation F2 is appropriately obtained, the image processing means 2 sets the first left estimation execution information to “Yes” and sets the first left-hand line shape information to y=ax*x+bx+c. If the quadratic equation F2 is not appropriately obtained, for example, because the point sequence data A2 is not obtained, because the quadratic equation F2 cannot be obtained due to an insufficient number of points in the point sequence data A2, or because the obtained quadratic equation F2 approximates the point sequence data A2 poorly (the point sequence data A2 varies widely with respect to the quadratic equation F2), the image processing means 2 sets the first left estimation execution information to “No.”
  • If the quadratic equation F3 is appropriately obtained, the image processing means 2 sets the first right estimation execution information to “Yes” and sets the first right-hand line shape information to y=dx*x+ex+f. If the quadratic equation F3 is not appropriately obtained, for example, because the point sequence data A3 is not obtained, because the quadratic equation F3 cannot be obtained due to an insufficient number of points in the point sequence data A3, or because the obtained quadratic equation F3 approximates the point sequence data A3 poorly (the point sequence data A3 varies widely with respect to the quadratic equation F3), the image processing means 2 sets the first right estimation execution information to “No.”
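  • The fit and validity check of steps 105 and 106 can be sketched as below. The minimum point count and the residual bound are invented placeholders; the text only requires that the quadratic approximate the point sequence well. The corresponding estimation execution information would be set to “Yes” exactly when this function returns coefficients.

```python
import numpy as np

def fit_lane_mark(points: np.ndarray, max_rms_error: float = 0.2):
    """Least-squares quadratic fit of a point sequence such as A2 or A3.

    Returns the coefficients of y = p*x*x + q*x + r, or None when the fit
    cannot be obtained or approximates the points poorly (illustrative
    thresholds)."""
    if points.shape[0] < 3:
        return None                           # too few points for a quadratic
    x, y = points[:, 0], points[:, 1]
    p, q, r = np.polyfit(x, y, deg=2)         # least squares method
    residual = y - (p * x * x + q * x + r)
    if np.sqrt(np.mean(residual ** 2)) > max_rms_error:
        return None                           # point sequence varies widely
    return p, q, r
```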
  • Thereafter, returning to FIG. 2, the GPS unit 4 performs a process of obtaining the current position information of the vehicle 7 (step 003). The current position information of the vehicle 7 to be obtained includes a position (latitude x0, longitude y0) and a traveling orientation θ. The information is represented in a spherical coordinate system, and the traveling orientation θ is set with the northern direction as zero degrees and the clockwise direction as positive. Then, the lane estimating means 5 reads map data from the map data storage medium 3 (step 004). The map data to be read includes the coordinates (X, Y) of a point sequence on the center line of the lane of the road (the lane along which the vehicle travels) and the lane width w at each point (X, Y): these data are represented in the plane rectangular coordinate system with the x axis set in the north-south direction and the y axis set in the east-west direction.
  • Subsequently, the lane estimating means 5 performs a process of estimating the lane by the map data and position information (hereinafter, referred to as the second lane estimation process) and outputs the second lane information (step 005). The output second lane information includes second estimation execution information indicating whether the lane mark A0 and the lane mark A1 are already estimated by the second lane estimation process, second left-hand line shape information indicating the shape of the estimated lane mark A0 in the case where the lane mark A0 is already estimated, and second right-hand line shape information indicating the shape of the estimated lane mark A1 in the case where the lane mark A1 is already estimated. The second estimation execution information is set to “Yes” if the lane mark A0 and the lane mark A1 are already estimated, or otherwise it is set to “No.”
  • The second lane estimation process is performed as shown in FIG. 4. First, the lane estimating means 5 checks whether the current position information of the vehicle 7 is already obtained by the GPS unit 4 (step 201). If the current position information of the vehicle 7 cannot be obtained due to poor reception from GPS satellites or the like, the control proceeds to step 202, where the lane estimating means 5 sets the second estimation execution information to “No.”
  • If the current position information of the vehicle 7 is already obtained in step 201, the control proceeds to step 203, where the lane estimating means 5 coordinate-transforms the obtained position (latitude x0, longitude y0) of the vehicle 7 from the spherical coordinate system to the plane rectangular coordinate system of the map data. Position P0 of the vehicle 7 obtained by the coordinate transformation is as shown in FIGS. 7(a) and 7(b). In FIGS. 7(a) and 7(b), the x axis, y axis, and origin O represent the coordinate system xy of the read map data.
  • Next, in step 204, as shown in FIG. 7(a), the lane estimating means 5 coordinate-transforms (translates) the coordinate system xy of the map data to a coordinate system x′y′ with the position P0 of the vehicle 7 as the origin and the x′ axis and y′ axis as coordinate axes. In this situation, the traveling direction of the vehicle 7 is as indicated by an arrow, which is shown as a direction rotated by the traveling orientation of θ degrees from the x′ axis. Next, in step 205, as shown in FIG. 7(b), the lane estimating means 5 rotationally transforms the coordinate system x′y′ (rotates it by θ degrees) to a coordinate system x″y″ with the traveling direction of the vehicle 7 as the x″ direction and with the x″ axis and y″ axis as coordinate axes. This transforms the map data to the coordinate system x″y″. The processes of subsequent steps 206 to 208 are performed in the coordinate system x″y″.
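  • Steps 204 and 205 amount to a rigid translation followed by a rotation. A minimal numpy sketch follows; it assumes the map frame has the x axis pointing north and the y axis pointing east, so that the document's clockwise-positive orientation θ corresponds to the mathematically positive direction from x toward y. The sign of the rotation would flip under the opposite axis convention.

```python
import numpy as np

def map_to_vehicle_frame(map_points: np.ndarray, p0: np.ndarray,
                         theta_deg: float) -> np.ndarray:
    """Transform (n, 2) map points into the vehicle-fixed x''y'' frame.

    p0 is the vehicle position P0 in the map frame; theta_deg is the
    traveling orientation (clockwise-positive from north, assumed here to
    coincide with the map x axis)."""
    theta = np.radians(theta_deg)
    shifted = map_points - p0                     # step 204: x'y' frame
    # Step 205: express the points in a frame rotated by theta so that the
    # x'' axis points along the traveling direction.
    rot = np.array([[np.cos(theta), np.sin(theta)],
                    [-np.sin(theta), np.cos(theta)]])
    return shifted @ rot.T
```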
  • Next, in step 206, the lane estimating means 5 searches for a point (reference point P1) on the map data closest to the origin to obtain the coordinates (x1, y1) of the reference point P1 and the lane width w at the reference point P1. Then, in step 207, the lane estimating means 5 extracts a plurality of points (P2, - - - , Pn) in the range from the reference point P1 to a point a given distance X ahead thereof in the vehicle traveling direction (n is an integer of 3 or greater). The given distance X is assumed to be, for example, 100 m or so. The coordinate data {(x1, y1), (x2, y2), - - - , (xn, yn)} of the reference point P1 and the extracted points (P2, - - - , Pn) are considered to be point sequence data A4. The point sequence data A4 is as shown in FIG. 8.
  • Next, in step 207, as shown in FIG. 8, the lane estimating means 5 obtains a quadratic equation F4 (y=gx*x+hx+i) which approximates the point sequence data A4 (g, h, and i are given coefficients). A least squares method is used as the approximation method.
  • Next, in step 208, as shown in FIG. 8, the lane estimating means 5 estimates the lane marks A0 and A1 from the obtained quadratic equation F4 and the lane width w. In this process, the lane estimating means 5 estimates the lane mark A0 by a quadratic equation F5 (y=gx*x+hx+i−w/2) and estimates the lane mark A1 by a quadratic equation F6 (y=gx*x+hx+i+w/2). Then, in step 209, the lane estimating means 5 sets the second estimation execution information to “Yes” and sets the second left-hand line shape information to y=gx*x+hx+i−w/2 and the second right-hand line shape information to y=gx*x+hx+i+w/2.
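  • A compact sketch of steps 206 to 209 follows, with the map point sequence assumed to be already transformed into the x″y″ frame. The look-ahead distance defaults to 100 m as in the embodiment; everything else (names, the filtering of points ahead of the reference point) is an illustrative reading of the text, not the patented implementation.

```python
import numpy as np

def estimate_lane_from_map(points: np.ndarray, lane_width: float,
                           look_ahead: float = 100.0):
    """Fit the centre-line points with F4 and offset it by +/- w/2.

    points is an (n, 2) array in the vehicle-fixed frame; returns the
    coefficient tuples of F5 (lane mark A0) and F6 (lane mark A1)."""
    # Step 206: reference point P1 closest to the origin (vehicle position).
    i1 = int(np.argmin(np.hypot(points[:, 0], points[:, 1])))
    x1 = points[i1, 0]
    # Step 207: points from P1 up to the given distance X ahead, then a
    # least-squares quadratic fit F4 (y = g*x*x + h*x + i).
    a4 = points[(points[:, 0] >= x1) & (points[:, 0] <= x1 + look_ahead)]
    g, h, i = np.polyfit(a4[:, 0], a4[:, 1], deg=2)
    # Step 208: offset the centre line by half the lane width w.
    f5 = (g, h, i - lane_width / 2.0)   # left-hand line, lane mark A0
    f6 = (g, h, i + lane_width / 2.0)   # right-hand line, lane mark A1
    return f5, f6
```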
  • Subsequently, returning to FIG. 2, the actual lane recognizing means 6 performs a process of recognizing an actual lane from the first lane information and the second lane information (hereinafter, referred to as actual lane recognition process) and outputs actual lane information and reliability (step 006). The output actual lane information includes actual left recognition execution information indicating whether the lane mark A0 is already recognized by the actual lane recognition process, actual left-hand line shape information indicating the shape of the recognized lane mark A0 in the case where the lane mark A0 is already recognized, actual right recognition execution information indicating whether the lane mark A1 is already recognized by the actual lane recognition process, and actual right-hand line shape information indicating the shape of the recognized lane mark A1 in the case where the lane mark A1 is already recognized. If the lane mark A0 is already recognized, the actual left recognition execution information is set to “Yes,” or otherwise, it is set to “No.” Similarly, the actual right recognition execution information is set to “Yes” if the lane mark A1 is already recognized, or otherwise, it is set to “No.”
  • First, the actual lane recognizing means 6 checks the set values of the first left estimation execution information, the first right estimation execution information, and the second estimation execution information. Thereafter, it performs the actual lane recognition process for the following eight cases (a) to (h) according to the combination of the above three set values.
  • (a) If the first left estimation execution information=“Yes,” the first right estimation execution information=“Yes,” and the second estimation execution information=“Yes”:
  • In this case, the actual lane recognizing means 6 first calculates a left-hand line similarity index value L1, which indicates the degree of similarity between the first left-hand line shape information and the second left-hand line shape information. As the left-hand line similarity index value L1, the value of a comparison function F1 expressed by the following equation (1) is used.
  • [Equation 1]

$$F_1\left(f_1(x),\,f_2(x)\right)=\sum_{x=0}^{N}\left(f_1(x)-f_2(x)\right)^{2}\qquad(1)$$
  • The actual lane recognizing means 6 considers the value calculated by using ax*x+bx+c (the first left-hand line shape information) for f1(x) in the right-hand side of equation (1) and using gx*x+hx+i−w/2 (the second left-hand line shape information) for f2(x) to be the left-hand line similarity index value L1. Note that the calculated left-hand line similarity index value L1 decreases as the degree of similarity between f1(x) and f2(x) increases. In addition, N in the right-hand side of equation (1) is the predetermined number of sample values of f1(x) and f2(x); for example, N=50.
  • Subsequently, similarly to the left-hand line similarity index value L1, the actual lane recognizing means 6 calculates a right-hand line similarity index value R1, which indicates the degree of similarity between the first right-hand line shape information and the second right-hand line shape information, by using equation (1). The actual lane recognizing means 6 considers the value calculated by using dx*x+ex+f (the first right-hand line shape information) for f1(x) in the right-hand side of equation (1) and using gx*x+hx+i+w/2 (the second right-hand line shape information) for f2(x) to be the right-hand line similarity index value R1. Note that the calculated right-hand line similarity index value R1 also decreases as the degree of similarity between f1(x) and f2(x) increases, as is the case with the left-hand line similarity index value L1.
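  • Equation (1) can be evaluated directly, for example as below. Whether x runs over integer sample positions or scaled distances is not specified in the text, so integer sampling is assumed here; with N=50 this sums 51 squared differences. For the left-hand line, f1 would be built from the first left-hand line shape information and f2 from the second left-hand line shape information; the right-hand pair is formed the same way.

```python
def similarity_index(f1, f2, n: int = 50) -> float:
    """Comparison function F1 of equation (1); smaller means more similar.

    f1 and f2 are callables built from the shape information, e.g.
    lambda x: a * x * x + b * x + c."""
    return sum((f1(x) - f2(x)) ** 2 for x in range(n + 1))
```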
  • Subsequently, the actual lane recognizing means 6 compares the left-hand line similarity index value L1 and the right-hand line similarity index value R1 each with a given threshold value S1. Then, the actual lane recognizing means 6 sets the actual lane information and the reliability for the following two cases (a1) and (a2) according to a result of the comparison.
  • (a1) If L1<S1 and R1<S1:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information. Thereby, the information indicating that the lane is already recognized is appropriately set as the actual lane information if the degree of similarity is high between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information and if it is possible to consider that the lane is accurately estimated by the image processing as a result of verification of the reliability of the lane estimation by the image processing based on the information on the lane estimated by the map data and position information. In this condition, the shape information on the lane estimated by the image processing, which can be considered to be higher in accuracy than the shape information on the lane estimated by the map data and position information, is set as the actual lane information regarding the lane marks A0 and A1.
  • Moreover, in this case, the actual lane recognizing means 6 sets the reliability to level 1. Thereby, if both of the lane marks A0 and A1 can be considered to be estimated accurately, the reliability is set to the highest level.
  • (a2) If L1≧S1 or R1≧S1:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” Thereby, information indicating that the lane is not recognized is appropriately set as the actual lane information if the shape of the lane estimated by the image processing differs from the shape of the lane estimated by the map data and position information and if the actual lane is likely to be estimated inappropriately in one or both of the lane estimation by the image processing and the lane estimation by the map data and position information.
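  • Case (a) with its sub-cases (a1) and (a2) reduces to a single threshold test. A minimal sketch, with invented names, follows.

```python
def recognize_case_a(l1: float, r1: float, s1: float,
                     first_left_shape, first_right_shape):
    """Sub-cases (a1)/(a2): both lines estimated by image processing and
    the lane estimated from map data and position information.

    Returns (left_recognized, left_shape, right_recognized, right_shape,
    reliability_level)."""
    if l1 < s1 and r1 < s1:
        # (a1) Both index values below the threshold: the image-processing
        # estimate is verified by the map-based estimate; reliability 1.
        return True, first_left_shape, True, first_right_shape, 1
    # (a2) The two estimates disagree on at least one side, so the actual
    # lane is not recognized.
    return False, None, False, None, None
```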
  • (b) If the first left estimation execution information=“Yes,” the first right estimation execution information=“Yes,” and the second estimation execution information=“No”:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information. Thereby, if the lane is estimated by the image processing in the case where the lane is not estimated by the map data and position information, the result of the lane estimation by the image processing is directly used as actual lane information.
  • In this case, the actual lane recognizing means 6 sets the reliability to level 2. Thereby, if the lane is estimated only by the image processing and the reliability of the lane estimation by the image processing cannot be verified using the information on the lane estimated by the map data and position information, the reliability is set lower than level 1.
  • (c) If the first left estimation execution information=“Yes,” the first right estimation execution information=“No,” and the second estimation execution information=“Yes”:
  • In this case, the actual lane recognizing means 6 first calculates the left-hand line similarity index value L1 using equation (1) from the first left-hand line shape information and the second left-hand line shape information. Then, the actual lane recognizing means 6 compares the left-hand line similarity index value L1 with the given threshold value S1. Thereafter, the actual lane recognizing means 6 sets the actual lane information and reliability for the following two cases (c1) and (c2) according to a result of the comparison.
  • (c1) If L1<S1:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the second right-hand line shape information as the actual right-hand line shape information. Thereby, the information indicating that the lane is already recognized is appropriately set as the actual lane information if the degree of similarity is high between the shape of the lane estimated by the image processing and the shape of the lane estimated by the map data and position information and if it is possible to consider that the lane is accurately estimated by the image processing as a result of verification of the reliability of the lane estimation by the image processing based on the information on the lane estimated by the map data and position information. In this condition, the shape information on the lane estimated by the image processing that can be considered to be higher in accuracy than the shape information on the lane estimated by the map data and position information is set as the actual lane information regarding the lane mark A0. Moreover, regarding the lane mark A1 of the lane that cannot be estimated by the image processing, the image processing result is supplemented with the information on the lane estimated by the map data and position information, by which the opportunities for detecting the lane can be increased as much as possible.
  • Moreover, in this case, the actual lane recognizing means 6 sets the reliability to level 2. Thereby, if only one of the lane marks A0 and A1 can be considered to be estimated accurately, the reliability is set lower than level 1.
  • (c2) If L1≧S1:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” Thereby, when the shape of the lane estimated by the image processing differs from the shape estimated by the map data and position information, so that one or both of the two estimations are likely to be inaccurate, information indicating that the lane is not recognized is appropriately set as the actual lane information.
  • (d) If the first left estimation execution information=“Yes,” the first right estimation execution information=“No,” and the second estimation execution information=“No”:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the first left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “No.” Thereby, when the lane is estimated by the image processing but not by the map data and position information, the result of the image-processing estimation is used directly as the actual lane information.
  • In this case, the actual lane recognizing means 6 sets the reliability to level 3. Thereby, if the lane is estimated only by the image processing and the reliability of the lane estimation by the image processing cannot be verified using the information on the lane estimated by the map data and position information, the reliability is set lower than level 1. Furthermore, only one of the lane marks A0 and A1 is estimated by the image processing, and therefore the reliability is set lower than level 2.
  • (e) If the first left estimation execution information=“No,” the first right estimation execution information=“Yes,” and the second estimation execution information=“Yes”:
  • In this case, the actual lane recognizing means 6 first calculates the right-hand line similarity index value R1 from the first right-hand line shape information and the second right-hand line shape information using equation (1). Then, the actual lane recognizing means 6 compares the right-hand line similarity index value R1 with the given threshold value S1. Thereafter, the actual lane recognizing means 6 sets the actual lane information and the reliability for the following two cases (e1) and (e2) according to the result of the comparison.
  • (e1) If R1<S1:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the second left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information. Thereby, when the shape of the lane estimated by the image processing closely resembles the shape of the lane estimated by the map data and position information, so that verification against the map data and position information indicates that the image-processing estimation is accurate, information indicating that the lane is recognized is appropriately set as the actual lane information.
  • In this condition, regarding the lane mark A1, the shape information estimated by the image processing, which can be considered more accurate than the shape information estimated by the map data and position information, is set as the actual lane information. Moreover, regarding the lane mark A0, which cannot be estimated by the image processing, the image processing result is supplemented with the information on the lane estimated by the map data and position information, which increases the opportunities for detecting the lane as much as possible.
  • Moreover, in this case, the actual lane recognizing means 6 sets the reliability to level 2. Thereby, if only one of the lane marks A0 and A1 can be considered to be estimated accurately, the reliability is set lower than level 1.
  • (e2) If R1≧S1:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” Thereby, when the shape of the lane estimated by the image processing differs from the shape estimated by the map data and position information, so that one or both of the two estimations are likely to be inaccurate, information indicating that the lane is not recognized is appropriately set as the actual lane information.
  • (f) If the first left estimation execution information=“No,” the first right estimation execution information=“Yes,” and the second estimation execution information=“No”:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “No.” In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the first right-hand line shape information as the actual right-hand line shape information. Thereby, when the lane is estimated by the image processing but not by the map data and position information, the result of the image-processing estimation is used directly as the actual lane information.
  • In this case, the actual lane recognizing means 6 sets the reliability to level 3. Thereby, if the lane is estimated only by the image processing and the reliability of the lane estimation by the image processing cannot be verified using the information on the lane estimated by the map data and position information, the reliability is set lower than level 1. Furthermore, only one of the lane marks A0 and A1 is estimated by the image processing, and therefore the reliability is set lower than level 2.
  • (g) If the first left estimation execution information=“No,” the first right estimation execution information=“No,” and the second estimation execution information=“Yes”:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information to “Yes” and sets the second left-hand line shape information as the actual left-hand line shape information. In addition, the actual lane recognizing means 6 sets the actual right recognition execution information to “Yes” and sets the second right-hand line shape information as the actual right-hand line shape information. Thereby, even if the lane cannot be estimated by the image processing, the image processing result is supplemented with the information on the lane estimated by the map data and position information, which increases the opportunities for detecting the lane as much as possible.
  • In this case, the actual lane recognizing means 6 sets the reliability to level 3. Thereby, if the lane is estimated only by the map data and position information, the accuracy is considered lower than that of estimation by the image processing alone, in view of the GPS position-fix accuracy and the like, and the reliability is therefore set lower than level 2.
  • (h) If the first left estimation execution information=“No,” the first right estimation execution information=“No,” and the second estimation execution information=“No”:
  • In this case, the actual lane recognizing means 6 sets the actual left recognition execution information and the actual right recognition execution information each to “No.” In this case, the actual lane recognizing means 6 sets neither the actual left-hand line shape information nor the actual right-hand line shape information. Thereby, it is clearly indicated that no actual lane has been recognized.
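The case analysis above amounts to a small decision table. The following is a minimal, illustrative Python sketch of cases (b) through (h), not the patent's implementation: all names (ActualLane, fuse_lane_estimates, similarity_index, s1) are hypothetical, and the similarity index of equation (1) is abstracted behind a caller-supplied function. Case (a), in which both lines are estimated by the image processing and the lane is also estimated by the map data and position information, is described earlier in the document and is omitted here.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class ActualLane:
    left_recognized: bool                  # actual left recognition execution information
    right_recognized: bool                 # actual right recognition execution information
    left_shape: Optional[Any] = None       # actual left-hand line shape information
    right_shape: Optional[Any] = None      # actual right-hand line shape information
    reliability: Optional[int] = None      # level 1 (highest) to level 3 (lowest)

def fuse_lane_estimates(
    first_left: Optional[Any],             # image-processing left line, None if not estimated
    first_right: Optional[Any],            # image-processing right line, None if not estimated
    second_left: Optional[Any],            # map/position left line, None if not estimated
    second_right: Optional[Any],           # map/position right line, None if not estimated
    similarity_index: Callable[[Any, Any], float],  # stand-in for equation (1)
    s1: float,                             # the given threshold value S1
) -> ActualLane:
    image_left = first_left is not None    # first left estimation execution information
    image_right = first_right is not None  # first right estimation execution information
    map_ok = second_left is not None and second_right is not None  # second estimation execution info

    if image_left and image_right and map_ok:                 # case (a)
        raise NotImplementedError("case (a) is described earlier in the document")
    if image_left and image_right and not map_ok:             # case (b)
        return ActualLane(True, True, first_left, first_right, reliability=2)
    if image_left and not image_right and map_ok:             # case (c)
        if similarity_index(first_left, second_left) < s1:    # (c1): shapes agree
            return ActualLane(True, True, first_left, second_right, reliability=2)
        return ActualLane(False, False)                       # (c2): shapes disagree
    if image_left and not image_right and not map_ok:         # case (d)
        return ActualLane(True, False, left_shape=first_left, reliability=3)
    if not image_left and image_right and map_ok:             # case (e)
        if similarity_index(first_right, second_right) < s1:  # (e1): shapes agree
            return ActualLane(True, True, second_left, first_right, reliability=2)
        return ActualLane(False, False)                       # (e2): shapes disagree
    if not image_left and image_right and not map_ok:         # case (f)
        return ActualLane(False, True, right_shape=first_right, reliability=3)
    if not image_left and not image_right and map_ok:         # case (g)
        return ActualLane(True, True, second_left, second_right, reliability=3)
    return ActualLane(False, False)                           # case (h): nothing estimated
```

The threshold sense follows the description above: the index value of equation (1) behaves like a distance, so a value below S1 means the two estimated shapes are similar and the image-processing estimation is accepted.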
  • As a result of the foregoing processes, the lane can be detected accurately while the opportunities for detecting the lane are increased as much as possible, by using both the information on the lane estimated by the image processing and the information on the lane estimated by the map data and position information. Moreover, in this embodiment, the vehicle is controlled and information is provided to the driver on the basis of the output actual lane information and reliability: for example, if the reliability is set to level 1, steering control of the vehicle is performed on the basis of the actual lane shape information; if the reliability is set to level 2 or level 3, no steering control is performed, but the driver is warned when the vehicle may deviate from the lane (see the sketch below).
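To make the level-based gating concrete, here is a minimal sketch under the same assumptions, reusing the hypothetical ActualLane type from the previous sketch; the vehicle methods steer_along_lane, may_depart_lane, and warn_driver are likewise hypothetical placeholders, not part of the patent.

```python
from typing import Any

def apply_lane_output(lane: "ActualLane", vehicle: Any) -> None:
    # ActualLane is the hypothetical type from the previous sketch.
    if not (lane.left_recognized or lane.right_recognized):
        return  # no actual lane recognized: neither steering control nor a warning is based on it
    if lane.reliability == 1:
        # Level 1: perform steering control based on the actual lane shape information.
        vehicle.steer_along_lane(lane.left_shape, lane.right_shape)
    else:
        # Levels 2 and 3: no steering control; only warn the driver
        # when the vehicle may deviate from the lane.
        if vehicle.may_depart_lane(lane.left_shape, lane.right_shape):
            vehicle.warn_driver()
```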
  • While the lane is defined by the left-hand line and the right-hand line in the first embodiment, it may instead be defined by only one of the left-hand line and the right-hand line, as in a second embodiment (which corresponds to the second aspect of the present invention). This embodiment provides the same operation and effects as the case where the lane is defined by both the left-hand line and the right-hand line, as described above.
  • While the GPS unit 4 is used as the position information obtaining means in the first and second embodiments, position information obtained by autonomous navigation may also be used instead of obtaining the vehicle position information from the GPS.
INDUSTRIAL APPLICABILITY
  • As described above, the present invention is suited to providing information to a driver or to controlling vehicle behavior, since it can accurately detect a lane, while increasing the opportunities for detecting the lane as much as possible, by processing an image of the road ahead of the vehicle and by obtaining information on the road from map data and a GPS or the like.

Claims (14)

1. (canceled)
2. A vehicle comprising:
an imaging means;
an image processing means which obtains an image of a road via the imaging means, performs a process of estimating a lane of the road by processing the obtained image, and outputs a result of the process as first lane information;
a holding means which holds map data of the road;
a position information obtaining means which obtains the current position information of the vehicle;
a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information;
a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and
an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value, wherein the actual lane recognizing means outputs information indicating that the actual lane is not recognized in the case where the lane similarity is equal to or smaller than the given threshold value.
3. The vehicle according to claim 2, wherein the actual lane recognizing means recognizes the actual lane on the basis of the first lane information in the case where the lane similarity is greater than the given threshold value.
4. (canceled)
5. A vehicle comprising:
an imaging means;
an image processing means which obtains an image of a road via the imaging means, performs a process of estimating a lane of the road by processing the obtained image, and outputs a result of the process as first lane information;
a holding means which holds map data of the road;
a position information obtaining means which obtains the current position information of the vehicle;
a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information;
a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and
an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value, wherein:
the first lane information includes first estimation execution information indicating whether the lane is estimated by the process of the image processing means and first lane shape information indicating the shape of the estimated lane in the case where the lane is estimated;
the second lane information includes second estimation execution information indicating whether the lane is estimated by the process of the lane estimating means and second lane shape information indicating the shape of the estimated lane in the case where the lane is estimated; and
the actual lane recognizing means includes: a means which determines and outputs actual lane recognition execution information indicating whether the actual lane is recognized on the basis of the first estimation execution information, the second estimation execution information, and the result of comparison between the lane similarity and the given threshold value; and a means which determines and outputs actual lane shape information indicating the shape of the recognized lane, in the case where the lane is recognized, on the basis of the first lane shape information and the second lane shape information.
6. The vehicle according to claim 5, wherein the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the second lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is not estimated and in the case where the second estimation execution information indicates that the lane is estimated.
7. The vehicle according to claim 5, wherein, in the case where the second estimation execution information indicates that the lane is not estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is estimated; and sets information indicating that the lane is not recognized as the actual lane recognition execution information in the case where the first estimation execution information indicates that the lane is not estimated.
8. (canceled)
9. A lane recognizing device comprising:
an image processing means which performs a process of estimating a lane of a road by processing an image of the road obtained via an imaging means mounted on a vehicle and outputs a result of the process as first lane information;
a holding means which holds map data of the road;
a position information obtaining means which obtains the current position information of the vehicle;
a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information;
a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and
an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value, wherein the actual lane recognizing means outputs information indicating that the actual lane is not recognized in the case where the lane similarity is equal to or smaller than the given threshold value.
10. The lane recognizing device according to claim 9, wherein the actual lane recognizing means recognizes the actual lane on the basis of the first lane information in the case where the lane similarity is greater than the given threshold value.
11. (canceled)
12. A lane recognizing device, comprising:
an image processing means which performs a process of estimating a lane of a road by processing an image of the road obtained via an imaging means mounted on a vehicle and outputs a result of the process as first lane information;
a holding means which holds map data of the road;
a position information obtaining means which obtains the current position information of the vehicle;
a lane estimating means which performs a process of estimating the lane of the road using the map data and the current position information and outputs a result of the process as second lane information;
a means which calculates lane similarity, which is a degree of similarity between the shape of the lane indicated by the first lane information and the shape of the lane indicated by the second lane information; and
an actual lane recognizing means which recognizes an actual lane of the road on the basis of the first lane information, the second lane information, and a result of comparison between the lane similarity and a given threshold value,
wherein:
the first lane information includes first estimation execution information indicating whether the lane is estimated by the process of the image processing means and first lane shape information indicating the shape of the estimated lane in the case where the lane is estimated;
the second lane information includes second estimation execution information indicating whether the lane is estimated by the process of the lane estimating means and second lane shape information indicating the shape of the estimated lane in the case where the lane is estimated; and
the actual lane recognizing means includes: a means which determines and outputs actual lane recognition execution information indicating whether the actual lane is recognized on the basis of the first estimation execution information, the second estimation execution information, and the result of comparison between the lane similarity and the given threshold value; and a means which determines and outputs actual lane shape information indicating the shape of the recognized lane, in the case where the lane is recognized, on the basis of the first lane shape information and the second lane shape information.
13. The lane recognizing device according to claim 12, wherein the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the second lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is not estimated and in the case where the second estimation execution information indicates that the lane is estimated.
14. The lane recognizing device according to claim 12, wherein, in the case where the second estimation execution information indicates that the lane is not estimated, the actual lane recognizing means sets information indicating that the lane is recognized as the actual lane recognition execution information and sets the first lane shape information as the actual lane shape information in the case where the first estimation execution information indicates that the lane is estimated; and sets information indicating that the lane is not recognized as the actual lane recognition execution information in the case where the first estimation execution information indicates that the lane is not estimated.
US11/919,634 2005-06-27 2006-06-20 Vehicle and lane recognizing device Expired - Fee Related US7970529B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2005-186383 2005-06-27
JP2005186383A JP4392389B2 (en) 2005-06-27 2005-06-27 Vehicle and lane recognition device
PCT/JP2006/312308 WO2007000912A1 (en) 2005-06-27 2006-06-20 Vehicle and lane recognizing device

Publications (2)

Publication Number Publication Date
US20090118994A1 true US20090118994A1 (en) 2009-05-07
US7970529B2 US7970529B2 (en) 2011-06-28

Family ID=37595166

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/919,634 Expired - Fee Related US7970529B2 (en) 2005-06-27 2006-06-20 Vehicle and lane recognizing device

Country Status (4)

Country Link
US (1) US7970529B2 (en)
EP (1) EP1901259A4 (en)
JP (1) JP4392389B2 (en)
WO (1) WO2007000912A1 (en)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100002911A1 (en) * 2008-07-06 2010-01-07 Jui-Hung Wu Method for detecting lane departure and apparatus thereof
US20100332050A1 (en) * 2008-03-12 2010-12-30 Honda Motor Co., Ltd. Vehicle travel support device, vehicle, and vehicle travel support program
US20110010021A1 (en) * 2008-03-12 2011-01-13 Honda Motor Co., Ltd. Vehicle travel support device, vehicle, and vehicle travel support program
DE102009060600A1 (en) * 2009-12-23 2011-06-30 Volkswagen AG, 38440 Method for assigning driving strip of road to car, involves compensating marking characteristics that are extracted from digital map and determined by image processing device using classifier to find driving path regarding actual position
US20120150428A1 (en) * 2010-12-08 2012-06-14 Wolfgang Niem Method and device for recognizing road signs in the vicinity of a vehicle and for synchronization thereof to road sign information from a digital map
US20120203452A1 (en) * 2011-02-04 2012-08-09 GM Global Technology Operations LLC Method for operating a motor vehicle and motor vehicle
US20130169679A1 (en) * 2011-12-30 2013-07-04 Automotive Research & Test Center Vehicle image display system and correction method thereof
US20130184938A1 (en) * 2012-01-17 2013-07-18 Limn Tech LLC Gps-based machine vision roadway mark locator, inspection apparatus, and marker
US20140032100A1 (en) * 2012-07-24 2014-01-30 Plk Technologies Co., Ltd. Gps correction system and method using image recognition information
US8700251B1 (en) * 2012-04-13 2014-04-15 Google Inc. System and method for automatically detecting key behaviors by vehicles
US20140152829A1 (en) * 2011-07-20 2014-06-05 Denso Corporation Cruising lane recognition device
US20140236472A1 (en) * 2011-12-29 2014-08-21 Barbara Rosario Navigation systems and associated methods
US20140257686A1 (en) * 2013-03-05 2014-09-11 GM Global Technology Operations LLC Vehicle lane determination
US8935057B2 (en) 2012-01-17 2015-01-13 LimnTech LLC Roadway mark data acquisition and analysis apparatus, systems, and methods
US20150248771A1 (en) * 2014-02-28 2015-09-03 Core Logic, Inc. Apparatus and Method for Recognizing Lane
US20150266422A1 (en) * 2014-03-20 2015-09-24 Magna Electronics Inc. Vehicle vision system with curvature estimation
US20170015317A1 (en) * 2015-07-13 2017-01-19 Cruise Automation, Inc. Method for image-based vehicle localization
WO2017023197A1 (en) * 2015-08-03 2017-02-09 Scania Cv Ab Method and system for controlling driving of a vehicle along a road
US9711051B2 (en) 2012-02-03 2017-07-18 Renault S.A.S. Method of determining the position of a vehicle in a traffic lane of a road and methods for detecting alignment and risk of collision between two vehicles
US9784843B2 (en) 2012-01-17 2017-10-10 Limn Tech LLC Enhanced roadway mark locator, inspection apparatus, and marker
CN107636751A (en) * 2015-06-15 2018-01-26 三菱电机株式会社 Traveling lane discriminating gear and traveling lane method of discrimination
TWI625260B (en) * 2012-11-20 2018-06-01 Method and system for detecting lane curvature by using body signal
EP3337197A1 (en) * 2016-12-15 2018-06-20 Dura Operating, LLC Method and system for performing advanced driver assistance system functions using beyond line-of-sight situational awareness
US20180345963A1 (en) * 2015-12-22 2018-12-06 Aisin Aw Co., Ltd. Autonomous driving assistance system, autonomous driving assistance method, and computer program
CN109035218A (en) * 2018-07-09 2018-12-18 武汉武大卓越科技有限责任公司 Pavement patching method for detecting area
US10324472B2 (en) * 2017-03-31 2019-06-18 Honda Motor Co., Ltd. Vehicle control device
US10332400B2 (en) 2015-07-01 2019-06-25 Denso Corporation Intra-lane travel control apparatus and intra-lane travel control method
US20190193726A1 (en) * 2017-12-27 2019-06-27 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and storage medium
US10508922B2 (en) * 2015-12-24 2019-12-17 Hyundai Motor Company Road boundary detection system and method, and vehicle using the same
US10885355B2 (en) * 2016-11-08 2021-01-05 Mitsubishi Electric Corporation Object detection device and object detection method
US10984551B2 (en) 2016-03-07 2021-04-20 Denso Corporation Traveling position detection apparatus and traveling position detection method
US11210953B2 (en) * 2016-12-15 2021-12-28 Denso Corporation Driving support device
US11261571B2 (en) 2012-01-17 2022-03-01 LimnTech LLC Roadway maintenance striping control system
US11321572B2 (en) * 2016-09-27 2022-05-03 Nissan Motor Co., Ltd. Self-position estimation method and self-position estimation device
US20220230017A1 (en) * 2021-01-20 2022-07-21 Qualcomm Incorporated Robust lane-boundary association for road map generation

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4246693B2 (en) * 2004-12-24 2009-04-02 富士通テン株式会社 Driving assistance device
EP2333484B1 (en) * 2008-09-25 2014-04-16 Clarion Co., Ltd. Lane determining device and navigation system
DE102008042538A1 (en) * 2008-10-01 2010-04-08 Robert Bosch Gmbh Method and device for determining a lane course
JP4862898B2 (en) * 2009-01-09 2012-01-25 株式会社デンソー Vehicle alarm generator
JP5461065B2 (en) * 2009-05-21 2014-04-02 クラリオン株式会社 Current position specifying device and current position specifying method
EP2316705B1 (en) 2009-10-28 2012-06-20 Honda Research Institute Europe GmbH Behavior-based learning of visual characteristics from real-world traffic scenes for driver assistance systems
EP2574958B1 (en) 2011-09-28 2017-02-22 Honda Research Institute Europe GmbH Road-terrain detection method and system for driver assistance systems
US9145139B2 (en) * 2013-06-24 2015-09-29 Google Inc. Use of environmental information to aid image processing for autonomous vehicles
SE538984C2 (en) * 2013-07-18 2017-03-14 Scania Cv Ab Determination of lane position
KR101582572B1 (en) * 2013-12-24 2016-01-11 엘지전자 주식회사 Driver assistance apparatus and Vehicle including the same
EP2899669A1 (en) 2014-01-22 2015-07-29 Honda Research Institute Europe GmbH Lane relative position estimation method and system for driver assistance systems
WO2015112133A1 (en) 2014-01-22 2015-07-30 Empire Technology Development Llc Relative gps data refinement
US9460624B2 (en) 2014-05-06 2016-10-04 Toyota Motor Engineering & Manufacturing North America, Inc. Method and apparatus for determining lane identification in a roadway
JP6306429B2 (en) * 2014-05-20 2018-04-04 国立大学法人電気通信大学 Steering control system
JP2018120303A (en) * 2017-01-23 2018-08-02 株式会社Soken Object detection device
US10101745B1 (en) 2017-04-26 2018-10-16 The Charles Stark Draper Laboratory, Inc. Enhancing autonomous vehicle perception with off-vehicle collected data
JP7048353B2 (en) 2018-02-28 2022-04-05 本田技研工業株式会社 Driving control device, driving control method and program
FR3082044A1 (en) * 2018-05-31 2019-12-06 Psa Automobiles Sa METHOD AND DEVICE FOR DETECTING THE TRACK ON WHICH A VEHICLE IS MOVING, ACCORDING TO THE DETERMINED DETERMINATIONS
WO2020004817A1 (en) * 2018-06-26 2020-01-02 에스케이텔레콤 주식회사 Apparatus and method for detecting lane information, and computer-readable recording medium storing computer program programmed to execute same method
KR102483649B1 (en) 2018-10-16 2023-01-02 삼성전자주식회사 Vehicle localization method and vehicle localization apparatus
JP7183729B2 (en) * 2018-11-26 2022-12-06 トヨタ自動車株式会社 Imaging abnormality diagnosis device
US11249184B2 (en) 2019-05-07 2022-02-15 The Charles Stark Draper Laboratory, Inc. Autonomous collision avoidance through physical layer tracking
CN111523471B (en) * 2020-04-23 2023-08-04 阿波罗智联(北京)科技有限公司 Method, device, equipment and storage medium for determining lane where vehicle is located
CN113386771A (en) * 2021-07-30 2021-09-14 蔚来汽车科技(安徽)有限公司 Road model generation method and equipment
KR102477547B1 (en) * 2021-09-02 2022-12-14 (주)이씨스 Real-time driving lane detection method and system for autonomous/remote driving and V2X service

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6748302B2 (en) * 2001-01-18 2004-06-08 Nissan Motor Co., Ltd. Lane tracking control system for vehicle
US20050169501A1 (en) * 2004-01-29 2005-08-04 Fujitsu Limited Method and apparatus for determining driving lane of vehicle, and computer product
US20050273215A1 (en) * 2004-06-02 2005-12-08 Nissan Motor Co., Ltd. Adaptive intention estimation method and system

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05297941A (en) 1992-04-21 1993-11-12 Daihatsu Motor Co Ltd Road shape detecting method
JP3556766B2 (en) 1996-05-28 2004-08-25 松下電器産業株式会社 Road white line detector
JPH1172337A (en) 1997-08-28 1999-03-16 Mitsubishi Motors Corp Recognition apparatus for running lane
JP3378490B2 (en) 1998-01-29 2003-02-17 富士重工業株式会社 Road information recognition device
JP3795299B2 (en) 2000-04-07 2006-07-12 本田技研工業株式会社 Vehicle control device
JP3768779B2 (en) 2000-06-02 2006-04-19 三菱電機株式会社 Vehicle steering driving support device
JP3692910B2 (en) 2000-06-27 2005-09-07 日産自動車株式会社 Lane tracking control device
JP3723065B2 (en) 2000-09-19 2005-12-07 トヨタ自動車株式会社 Vehicle alarm device
JP4327389B2 (en) 2001-10-17 2009-09-09 株式会社日立製作所 Travel lane recognition device
JP3888166B2 (en) 2002-01-16 2007-02-28 三菱自動車工業株式会社 Vehicle driving support device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6748302B2 (en) * 2001-01-18 2004-06-08 Nissan Motor Co., Ltd. Lane tracking control system for vehicle
US20050169501A1 (en) * 2004-01-29 2005-08-04 Fujitsu Limited Method and apparatus for determining driving lane of vehicle, and computer product
US20050273215A1 (en) * 2004-06-02 2005-12-08 Nissan Motor Co., Ltd. Adaptive intention estimation method and system

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8306700B2 (en) 2008-03-12 2012-11-06 Honda Motor Co., Ltd. Vehicle travel support device, vehicle, and vehicle travel support program
US20100332050A1 (en) * 2008-03-12 2010-12-30 Honda Motor Co., Ltd. Vehicle travel support device, vehicle, and vehicle travel support program
US20110010021A1 (en) * 2008-03-12 2011-01-13 Honda Motor Co., Ltd. Vehicle travel support device, vehicle, and vehicle travel support program
US8346427B2 (en) 2008-03-12 2013-01-01 Honda Motor Co., Ltd. Vehicle travel support device, vehicle, and vehicle travel support program with imaging device for recognizing a travel area
US20100002911A1 (en) * 2008-07-06 2010-01-07 Jui-Hung Wu Method for detecting lane departure and apparatus thereof
US8311283B2 (en) * 2008-07-06 2012-11-13 Automotive Research&Testing Center Method for detecting lane departure and apparatus thereof
DE102009060600A1 (en) * 2009-12-23 2011-06-30 Volkswagen AG, 38440 Method for assigning driving strip of road to car, involves compensating marking characteristics that are extracted from digital map and determined by image processing device using classifier to find driving path regarding actual position
CN102568236A (en) * 2010-12-08 2012-07-11 罗伯特·博世有限公司 Method and device for recognizing road signs and comparing with road signs information
US20120150428A1 (en) * 2010-12-08 2012-06-14 Wolfgang Niem Method and device for recognizing road signs in the vicinity of a vehicle and for synchronization thereof to road sign information from a digital map
US8918277B2 (en) * 2010-12-08 2014-12-23 Robert Bosch Gmbh Method and device for recognizing road signs in the vicinity of a vehicle and for synchronization thereof to road sign information from a digital map
US20120203452A1 (en) * 2011-02-04 2012-08-09 GM Global Technology Operations LLC Method for operating a motor vehicle and motor vehicle
GB2488015A (en) * 2011-02-04 2012-08-15 Gm Global Tech Operations Inc Vehicle navigation system including a camera for determining a lane of travel
US20140152829A1 (en) * 2011-07-20 2014-06-05 Denso Corporation Cruising lane recognition device
US9544546B2 (en) * 2011-07-20 2017-01-10 Denso Corporation Cruising lane recognition in a tunnel
US10222225B2 (en) * 2011-12-29 2019-03-05 Intel Corporation Navigation systems and associated methods
US10222227B2 (en) * 2011-12-29 2019-03-05 Intel Corporation Navigation systems and associated methods
US20140236472A1 (en) * 2011-12-29 2014-08-21 Barbara Rosario Navigation systems and associated methods
US10222226B2 (en) * 2011-12-29 2019-03-05 Intel Corporation Navigation systems and associated methods
US10753760B2 (en) 2011-12-29 2020-08-25 Intel Corporation Navigation systems and associated methods
US9651395B2 (en) * 2011-12-29 2017-05-16 Intel Corporation Navigation systems and associated methods
US9043133B2 (en) * 2011-12-29 2015-05-26 Intel Corporation Navigation systems and associated methods
US20160018237A1 (en) * 2011-12-29 2016-01-21 Intel Corporation Navigation systems and associated methods
US20130169679A1 (en) * 2011-12-30 2013-07-04 Automotive Research & Test Center Vehicle image display system and correction method thereof
US9784843B2 (en) 2012-01-17 2017-10-10 Limn Tech LLC Enhanced roadway mark locator, inspection apparatus, and marker
US11261571B2 (en) 2012-01-17 2022-03-01 LimnTech LLC Roadway maintenance striping control system
US20130184938A1 (en) * 2012-01-17 2013-07-18 Limn Tech LLC Gps-based machine vision roadway mark locator, inspection apparatus, and marker
US9298991B2 (en) * 2012-01-17 2016-03-29 LimnTech LLC GPS-based machine vision roadway mark locator, inspection apparatus, and marker
US8935057B2 (en) 2012-01-17 2015-01-13 LimnTech LLC Roadway mark data acquisition and analysis apparatus, systems, and methods
US9711051B2 (en) 2012-02-03 2017-07-18 Renault S.A.S. Method of determining the position of a vehicle in a traffic lane of a road and methods for detecting alignment and risk of collision between two vehicles
US9216737B1 (en) * 2012-04-13 2015-12-22 Google Inc. System and method for automatically detecting key behaviors by vehicles
US8700251B1 (en) * 2012-04-13 2014-04-15 Google Inc. System and method for automatically detecting key behaviors by vehicles
USRE49650E1 (en) * 2012-04-13 2023-09-12 Waymo Llc System and method for automatically detecting key behaviors by vehicles
USRE49649E1 (en) * 2012-04-13 2023-09-12 Waymo Llc System and method for automatically detecting key behaviors by vehicles
US8935034B1 (en) * 2012-04-13 2015-01-13 Google Inc. System and method for automatically detecting key behaviors by vehicles
US9109907B2 (en) * 2012-07-24 2015-08-18 Plk Technologies Co., Ltd. Vehicle position recognition apparatus and method using image recognition information
US20140032100A1 (en) * 2012-07-24 2014-01-30 Plk Technologies Co., Ltd. Gps correction system and method using image recognition information
TWI625260B (en) * 2012-11-20 2018-06-01 Method and system for detecting lane curvature by using body signal
US20140257686A1 (en) * 2013-03-05 2014-09-11 GM Global Technology Operations LLC Vehicle lane determination
US20150248771A1 (en) * 2014-02-28 2015-09-03 Core Logic, Inc. Apparatus and Method for Recognizing Lane
US11745659B2 (en) 2014-03-20 2023-09-05 Magna Electronics Inc. Vehicular system for controlling vehicle
US20150266422A1 (en) * 2014-03-20 2015-09-24 Magna Electronics Inc. Vehicle vision system with curvature estimation
US11247608B2 (en) 2014-03-20 2022-02-15 Magna Electronics Inc. Vehicular system and method for controlling vehicle
US10406981B2 (en) * 2014-03-20 2019-09-10 Magna Electronics Inc. Vehicle vision system with curvature estimation
CN107636751A (en) * 2015-06-15 2018-01-26 三菱电机株式会社 Traveling lane discriminating gear and traveling lane method of discrimination
US20180165525A1 (en) * 2015-06-15 2018-06-14 Mitsubishi Electric Corporation Traveling lane determining device and traveling lane determining method
US10332400B2 (en) 2015-07-01 2019-06-25 Denso Corporation Intra-lane travel control apparatus and intra-lane travel control method
US20170015317A1 (en) * 2015-07-13 2017-01-19 Cruise Automation, Inc. Method for image-based vehicle localization
US9884623B2 (en) * 2015-07-13 2018-02-06 GM Global Technology Operations LLC Method for image-based vehicle localization
WO2017023197A1 (en) * 2015-08-03 2017-02-09 Scania Cv Ab Method and system for controlling driving of a vehicle along a road
US10703362B2 (en) * 2015-12-22 2020-07-07 Aisin Aw Co., Ltd. Autonomous driving assistance system, automated driving assistance method, and computer program
US20180345963A1 (en) * 2015-12-22 2018-12-06 Aisin Aw Co., Ltd. Autonomous driving assistance system, autonomous driving assistance method, and computer program
US10508922B2 (en) * 2015-12-24 2019-12-17 Hyundai Motor Company Road boundary detection system and method, and vehicle using the same
US10984551B2 (en) 2016-03-07 2021-04-20 Denso Corporation Traveling position detection apparatus and traveling position detection method
US11321572B2 (en) * 2016-09-27 2022-05-03 Nissan Motor Co., Ltd. Self-position estimation method and self-position estimation device
US10885355B2 (en) * 2016-11-08 2021-01-05 Mitsubishi Electric Corporation Object detection device and object detection method
CN108388240A (en) * 2016-12-15 2018-08-10 德韧营运有限责任公司 The method and system of advanced driving assistance system function is executed using over the horizon context aware
US11210953B2 (en) * 2016-12-15 2021-12-28 Denso Corporation Driving support device
EP3337197A1 (en) * 2016-12-15 2018-06-20 Dura Operating, LLC Method and system for performing advanced driver assistance system functions using beyond line-of-sight situational awareness
CN110435651A (en) * 2017-03-31 2019-11-12 本田技研工业株式会社 Controller of vehicle
US10324472B2 (en) * 2017-03-31 2019-06-18 Honda Motor Co., Ltd. Vehicle control device
US20190193726A1 (en) * 2017-12-27 2019-06-27 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and storage medium
CN109035218A (en) * 2018-07-09 2018-12-18 武汉武大卓越科技有限责任公司 Pavement patching method for detecting area
US20220230017A1 (en) * 2021-01-20 2022-07-21 Qualcomm Incorporated Robust lane-boundary association for road map generation
US11636693B2 (en) * 2021-01-20 2023-04-25 Qualcomm Incorporated Robust lane-boundary association for road map generation

Also Published As

Publication number Publication date
JP2007004669A (en) 2007-01-11
EP1901259A4 (en) 2008-11-12
EP1901259A1 (en) 2008-03-19
JP4392389B2 (en) 2009-12-24
US7970529B2 (en) 2011-06-28
WO2007000912A1 (en) 2007-01-04

Similar Documents

Publication Publication Date Title
US7970529B2 (en) Vehicle and lane recognizing device
US11639853B2 (en) Self-localization estimation device
CN106767853B (en) Unmanned vehicle high-precision positioning method based on multi-information fusion
JP5747787B2 (en) Lane recognition device
JP6241422B2 (en) Driving support device, driving support method, and recording medium for storing driving support program
US9208389B2 (en) Apparatus and method for recognizing current position of vehicle using internal network of the vehicle and image sensor
JP4557288B2 (en) Image recognition device, image recognition method, position specifying device using the same, vehicle control device, and navigation device
JP5968064B2 (en) Traveling lane recognition device and traveling lane recognition method
CN110969055B (en) Method, apparatus, device and computer readable storage medium for vehicle positioning
CN108573611B (en) Speed limit sign fusion method and speed limit sign fusion system
US11092442B2 (en) Host vehicle position estimation device
US10679077B2 (en) Road marking recognition device
KR101704405B1 (en) System and method for lane recognition
JP2006208223A (en) Vehicle position recognition device and vehicle position recognition method
CN110858405A (en) Attitude estimation method, device and system of vehicle-mounted camera and electronic equipment
US10846546B2 (en) Traffic signal recognition device
US11042759B2 (en) Roadside object recognition apparatus
JPWO2014115319A1 (en) Road environment recognition system
US11928871B2 (en) Vehicle position estimation device and traveling position estimation method
JP2018048949A (en) Object recognition device
CN115917255A (en) Vision-based location and turn sign prediction
JP2018159752A (en) Method and device for learning map information
CN116524454A (en) Object tracking device, object tracking method, and storage medium
US11267477B2 (en) Device and method for estimating the attention level of a driver of a vehicle
Brown et al. Lateral vehicle state and environment estimation using temporally previewed mapped lane features

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONDA MOTOR CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MORI, NAOKI;KOBAYASHI, SACHIO;REEL/FRAME:020126/0281

Effective date: 20071017

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230628