
US7904247B2 - Drive assist system for vehicle - Google Patents

Drive assist system for vehicle

Info

Publication number
US7904247B2
US7904247B2 · US12/079,028 · US7902808A
Authority
US
United States
Prior art keywords
vehicle
information
determination
detection range
autonomous
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US12/079,028
Other versions
US20080243390A1 (en)
Inventor
Yasutaka Nakamori
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Assigned to DENSO CORPORATION. Assignors: NAKAMORI, YASUTAKA (assignment of assignors interest; see document for details).
Publication of US20080243390A1 publication Critical patent/US20080243390A1/en
Application granted granted Critical
Publication of US7904247B2 publication Critical patent/US7904247B2/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • The in-vehicle drive assist system thus exploits the respective advantages of the infrastructural information and the autonomous sensor information: the detection range of the set of autonomous sensors 20 and the sighting of that range are adjusted using the infrastructural information obtained from outside the vehicle.
  • This allows preferential detection in the direction in which an object such as another vehicle or a pedestrian is present around the vehicle, and in the direction in which the object first becomes viewable from the vehicle.
  • By then making a risk determination from the autonomous sensor information produced by the set of autonomous sensors 20, an early risk determination can be performed while maintaining reliability.
  • The method for providing information may also be changed. Switching to a presentation that intensifies the color, shape, sound, or the like of the display can give a sense of relief to the driver.
  • Each of the devices described above may be implemented as a software unit (e.g., a subroutine) or as a hardware unit (e.g., a circuit or an integrated circuit); the hardware unit can be constructed inside a microcomputer.
  • The software unit, or any combination of multiple software units, can be included in a software program, which can be contained in a computer-readable storage medium or can be downloaded and installed in a computer via a communications network.
  • An in-vehicle drive assist system is provided as follows.
  • The system is configured to receive, from outside a vehicle, external information including positional information of at least one object, either an other vehicle or a pedestrian present around the vehicle, and to assist the vehicle to drive using the received external information.
  • The system includes the following.
  • An autonomous detecting unit is configured to detect the at least one object, either the other vehicle or the pedestrian, viewable from the vehicle.
  • A control unit is configured to perform a control operation for adjusting at least one of a detection range of the autonomous detecting unit and sighting of the detection range, using the positional information of the object included in the external information.
  • An object detection determining unit is configured to determine whether or not the object is detected by the autonomous detecting unit after at least one of the detection range and the sighting of the detection range has been adjusted by the control unit.
  • A risk determining unit is configured to determine a risk of the vehicle relative to the object when the object detection determining unit determines that the object is detected.
  • A drive assist unit is configured to provide drive assistance in accordance with a result of the determination by the risk determining unit.
  • The set of autonomous sensors 20 may serve as the above autonomous detecting unit.
  • The detection range control device 32 may serve as the above control unit.
  • The infrastructural information determining device 31 may serve as the object detection determining unit.
  • The risk-level determining device 33 may serve as the risk determining unit.
  • The information providing device 40 and the vehicle-drive control device 50 may serve as the drive assist unit.
  • The above system takes advantage of both the roadside information and the information provided by the autonomous detecting unit (referred to as autonomous sensor information): it adjusts the detection range of the autonomous detecting unit, the sighting of the detection range, and the like using the roadside information obtained from outside the vehicle.
  • This allows the autonomous detecting unit to perform a preferential detection in the direction in which an object such as another vehicle or a pedestrian is present around the vehicle and in the direction in which the object first becomes viewable from the vehicle.
  • By determining a risk from the information detected by the autonomous detecting unit, it is possible to make an early risk determination while maintaining reliability.
  • The control unit may perform the control operation for adjusting the sighting of the detection range based on the position of the object. This allows an early-stage detection of the object using the autonomous detecting unit, which is capable of obtaining highly reliable information.
  • The control unit may also perform the control operation for narrowing down the detection range based on the position of the object.
  • The autonomous detecting unit may include a camera for sensing an image around the vehicle; the control operation achieves a more remarkable effect when it reduces the viewing angle, i.e., the image sensing range of the camera.
  • When the viewing angle is reduced, the number of pixels assigned to the reduced viewing angle is larger than before the reduction. This allows recognition of an object at a greater distance from the camera; for example, when the viewing angle is reduced to 1/n, an object can be recognized at approximately n times the distance.
  • According to another aspect, a corresponding method for performing drive assistance in a vehicle having such an autonomous detecting unit is also provided, as set forth in the SUMMARY OF THE INVENTION below.
  • Adopting the method of this other aspect produces the same advantages as those of the above drive assist system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

An in-vehicle drive assist system adjusts a viewing angle of a camera mounted in a vehicle using environmental information obtained from outside the vehicle. An early risk determination is thus allowed to be made in the vehicle, while maintaining reliability.

Description

CROSS REFERENCE TO RELATED APPLICATION
This application is based on and incorporates herein by reference Japanese Patent Application No. 2007-82707 filed on Mar. 27, 2007.
FIELD OF THE INVENTION
The present invention relates to a drive assist system for a vehicle.
BACKGROUND OF THE INVENTION
Conventionally, drive assist technologies such as those disclosed in Patent Documents 1 to 6 have been proposed, in each of which a roadside apparatus provides an in-vehicle apparatus with information required for the vehicle to drive safely, and the in-vehicle apparatus assists the vehicle based on the provided information.
    • Patent Document 1: JP-2001-23098 A
    • Patent Document 2: JP-2001-101599 A
    • Patent Document 3: JP-2001-167387 A
    • Patent Document 4: JP-2001-319295 A
    • Patent Document 5: JP-2004-38244 A
    • Patent Document 6: JP-2006-236094 A
The roadside information provided by the roadside apparatus includes information on an obstacle which cannot be sensed or viewed by various sensors (hereinafter referred to as autonomous sensors) mounted in the vehicle. Therefore, the use of such roadside information by the vehicle operator allows an early risk determination to be made for an obstacle or the like present around the vehicle but not viewable from the vehicle.
However, the roadside information may be "spoofed" information, and the accuracy of the roadside information (e.g., the accuracy of obstacle detection) is unknown to the vehicle operator. Accordingly, the reliability of the information is generally low compared with information provided by the autonomous sensors.
That is, compared with the information provided by the autonomous sensors, the roadside information allows an early risk determination, but the reliability thereof is low. On the other hand, the information provided by the autonomous sensors is higher in reliability than the roadside information, but it does not allow an early risk determination.
Thus, the roadside information and the information provided by the autonomous sensors have complementary advantages and disadvantages, and an in-vehicle drive assist system which allows an early risk determination while maintaining reliability has been in demand.
SUMMARY OF THE INVENTION
The present invention has been achieved in view of the foregoing problems. It is an object of the present invention to provide an in-vehicle drive assist system which allows an early risk determination to be made while maintaining reliability.
According to an example of the present invention, a drive assist system for a vehicle is provided as follows. The drive assist system receives, from outside the vehicle, external information including positional information of an object outside the vehicle to thereby assist the vehicle to travel. The drive assist system includes the following. An autonomous detecting unit is configured to detect a viewable object which is at least one of (i) an other vehicle and (ii) a pedestrian. A control unit is configured to perform a control operation for adjusting at least one of (i) a detection range of the autonomous detecting unit and (ii) sighting of the detection range, using positional information of the object included in the received external information. An object detection determining unit is configured to make a determination as to whether or not an object is detected by the autonomous detecting unit after at least one of (i) the detection range of the autonomous detecting unit and (ii) the sighting of the detection range is adjusted by the control unit. A risk determining unit is configured to make a determination about a risk of the vehicle relative to the object when the determination by the object detection determining unit is affirmed. A drive assist unit is configured to provide a drive assistance in accordance with a result of the determination by the risk determining unit.
According to another example of the present invention, a method is provided for performing drive assistance in a vehicle having an autonomous detecting unit detecting a viewable object which is at least one of (i) an other vehicle and (ii) a pedestrian. The method includes the following: receiving, from outside the vehicle, external information including positional information of an object outside the vehicle; performing a control operation for adjusting at least one of (i) a detection range of the autonomous detecting unit and (ii) sighting of the detection range, using positional information of the object included in the received external information; making a first determination as to whether or not an object is detected by the autonomous detecting unit after at least one of (i) the detection range of the autonomous detecting unit and (ii) the sighting of the detection range is adjusted; making a second determination about a risk of the vehicle relative to the object when the first determination is affirmed; and providing a drive assistance in accordance with a result of the second determination.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects, features, and advantages of the present invention will become more apparent from the following detailed description made with reference to the accompanying drawings. In the drawings:
FIG. 1 is a block diagram showing an overall structure of an in-vehicle drive assist system according to an embodiment of the present invention;
FIG. 2 is a block diagram showing a structure of a set of autonomous sensors;
FIG. 3 is block diagram showing a structure of an image detecting apparatus;
FIG. 4A is a view showing a viewing angle of a camera before the viewing angle is reduced;
FIG. 4B is a view showing a viewing angle of the camera when the viewing angle is reduced;
FIG. 5 is a bock diagram showing a structure of a radar device;
FIG. 6 is a flow chart for illustrating a drive assist process;
FIG. 7 is a view for illustrating a specific example in making a right-hand turn at an intersection;
FIG. 8 is a view for illustrating a specific example in approaching a T-junction; and
FIG. 9 is a view for illustrating a specific example in approaching an intersection which is low in visibility in the absence of a traffic light.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring to the drawings, an embodiment of an in-vehicle drive assist system according to the present invention will be described. FIG. 1 shows an overall structure of the in-vehicle drive assist system. The in-vehicle drive assist system is mounted in a subject vehicle and includes an infrastructural information receiver 10, a set of autonomous sensors 20, a drive assist apparatus 30, an information providing device 40, and a vehicle-drive control device 50.
The infrastructural information receiver 10 may function as an external information acquiring device as follows. The infrastructural information receiver 10 performs roadside-to-vehicle communication for receiving, from a roadside apparatus disposed on the side of a road, infrastructural information acquired by the roadside apparatus. The infrastructural information, which can also be referred to as external information or environmental information, includes object-by-object positional information (latitude/longitude or the like) of each object, such as another vehicle or a pedestrian, present around the vehicle, travel direction information (e.g., an angle of deflection where due north is taken as 0 degrees), and driving speed information (e.g., a speed per hour). The infrastructural information primarily includes information on an object which cannot be viewed from the vehicle, but may also include information on an object which can be viewed from the vehicle.
The infrastructural information receiver 10 may also be capable of vehicle-to-vehicle communication. In other words, the infrastructural information receiver 10 may also receive, from another vehicle, another-vehicle information including the positional information, the travel direction information, and the driving speed information of the other vehicle.
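As a concrete illustration, one object record carried by the infrastructural information (or by the another-vehicle information) might look like the sketch below. The field names and units are assumptions for illustration; the patent specifies only the kinds of information (position, travel direction, driving speed), not a message format.

    from dataclasses import dataclass

    @dataclass
    class ExternalObjectInfo:
        """One object record from roadside-to-vehicle or vehicle-to-vehicle communication."""
        object_type: str    # e.g. "vehicle", "pedestrian", "two-wheeler" (assumed field)
        latitude: float     # positional information, degrees
        longitude: float
        heading_deg: float  # travel direction: deflection angle, due north = 0 degrees
        speed_kmh: float    # driving speed

    # Hypothetical record: a two-wheeler hidden in the blind spot behind an oncoming vehicle
    hidden_two_wheeler = ExternalObjectInfo("two-wheeler", 35.1700, 136.9066, 180.0, 25.0)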
The set of autonomous sensors 20 is composed of an image detecting apparatus 21, a radar device 22, and a position detecting apparatus 23, as shown in FIG. 2. The position detecting apparatus 23 detects the current position of the subject vehicle and is composed of a geomagnetic sensor 23 a for detecting the bearing of travel of the vehicle from the earth's magnetism, a gyroscope sensor 23 b for detecting the magnitude of the rotational movement of the vehicle, a distance sensor 23 c for detecting a travel distance from an acceleration generated in the fore-and-aft direction of the vehicle or the like, and a GPS receiver 23 d for receiving a radio wave transmitted from a GPS (Global Positioning System) satellite via a GPS antenna to detect the current position of the vehicle.
As shown in FIG. 3, the image detecting apparatus 21 is composed of a camera 21 a using an image sensing element such as a CCD (Charge Coupled Device), a specified pixel extracting device 21 b, and an image processing device 21 c. The specified pixel extracting device 21 b receives image data of all the pixels of an image sensed by the camera 21 a, extracts those pixels specified by the drive assist apparatus 30, and transmits the image data of the extracted pixels to the image processing device 21 c in accordance with the NTSC (National Television Standards Committee; 30 frames per second, about 300 thousand pixels per frame) method. The image processing device 21 c receives the transmitted image data and performs predetermined image processing.
For the camera 21 a, an image sensing element having a high resolution of, e.g., several millions of pixels is adopted. Since the specified pixel extracting device 21 b transmits image data to the image processing device 21 c in accordance with the NTSC method, the number of pixels is limited to about 300 thousand. Accordingly, the specified pixel extracting device 21 b extracts only the pixels specified to reduce the sensed image to an image of about 300 thousand pixels and transmits the extracted image data to the image processing device 21 c.
In a normal case, the drive assist apparatus 30 specifies pixels to be extracted such that the pixels composing the image are generally uniformly thinned out. On the other hand, when the viewing angle of the camera 21 a is reduced to a given angle range, the drive assist apparatus 30 specifies pixels to be extracted such that, of the pixels composing the image, those included in the range corresponding to the reduced viewing angle are extracted. The image processing device 21 c performs the predetermined image processing with respect to the image data and recognizes an object displayed as an image.
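The two pixel-specification modes just described can be captured in a short sketch. The frame size, the NumPy representation, and the function names are illustrative assumptions; the patent fixes only the roughly 300-thousand-pixel NTSC budget.

    import numpy as np

    NTSC_PIXEL_BUDGET = 300_000  # about 300 thousand pixels per frame

    def thin_uniform(frame):
        """Normal case: thin the full frame roughly uniformly down to the budget."""
        h, w = frame.shape[:2]
        step = max(1, round((h * w / NTSC_PIXEL_BUDGET) ** 0.5))
        return frame[::step, ::step]

    def crop_window(frame, x0, y0, width, height):
        """Reduced viewing angle: extract only pixels inside the specified range."""
        return frame[y0:y0 + height, x0:x0 + width]

    # A hypothetical 2560x1440 (~3.7 megapixel) sensor image
    frame = np.zeros((1440, 2560, 3), dtype=np.uint8)
    print(thin_uniform(frame).shape)  # (360, 640, 3): about 230k pixels, within budget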
As shown in FIG. 5, the radar device 22 is composed of an antenna AT including a transmission antenna AS and a reception antenna AR, a transmission circuit 22 a, a reception circuit 22 b, a radar ECU 22 c, and a motor M for rotating the transmission/reception direction (sighting of the detection range or center detection direction of the detection range) of the antenna AT in the lateral direction of the vehicle around the rotation axis CR of the antenna AT. The radar device 22 sends an electric wave in the millimeter wave band from the transmission antenna AS and receives the wave reflected by an object at the reception antenna AR, thereby detecting the distance between the object which has reflected the electric wave and the vehicle, the relative speed of the object, and the relative bearing thereof to the vehicle. The radar ECU 22 c drives the motor M such that the transmission/reception direction of the antenna AT matches the bearing specified from the drive assist apparatus 30.
The drive assist apparatus 30 shown in FIG. 1 acquires the infrastructural information received by the infrastructural information receiver 10 as well as autonomous sensor information detected by the set of autonomous sensors 20, determines the risk of the vehicle relative to an object present around or outside the vehicle, and provides drive assistance in accordance with the result of the determination using an information providing device 40 and a vehicle-drive control device 50.
The information providing device 40 is composed of a display device and a sound output device each not shown to provide information or an alarm for invoking the attention of the driver to an object around the vehicle depending on the level of the risk of the vehicle relative to the object. The vehicle-drive control device 50 drives a brake actuator (not shown) in response to an instruction from the drive assist apparatus 30 to perform a control operation for decelerating or halting the vehicle.
The drive assist apparatus 30 is composed of an infrastructural information determining device 31, a detection range control device 32, a risk-level determining device 33, and a drive-assist-instruction output device 34, as shown in FIG. 1. The infrastructural information determining device 31 determines a target object (hereinafter simply referred to as the target object) having the risk of a collision with the vehicle from the object-by-object positional information, travel direction information, and driving speed information of each object included in the infrastructural information obtained from the infrastructural information receiver 10 and transmits the result of the determination to the risk-level determining device 33. The infrastructural information determining device 31 also transmits the infrastructural information and the result of the determination to the detection range control device 32.
To allow the set of autonomous sensors 20 to detect the target object at an early stage, the detection range control device 32 calculates the objective detection range of the set of autonomous sensors 20 and the sighting of the detection range based on the infrastructural information by using the location of the target object as a reference. That is, based on the infrastructural information, the future behavior (locus) of the target object is estimated from the current position thereof, and the reduced viewing angle θ/n of the camera 21 a and the bearing of the transmission/reception direction of the antenna AT of the radar device 22 are calculated from the estimated behavior, the current position of the vehicle, the speed thereof, the bearing of travel thereof, and the like. From the results of the calculation, the coordinates of the pixels to be extracted corresponding to the viewing angle θ/n after the viewing angle of the camera 21 a is reduced and the angle of rotation of the antenna AT are further calculated and transmitted to the set of autonomous sensors 20.
FIG. 4A shows the viewing angle θ when the viewing angle of the camera 21 a is not reduced. Because the camera 21 a uses a wide angle lens (not shown), it has a viewing angle of about 180 degrees laterally of the vehicle. FIG. 4B shows the viewing angle θ/n when the viewing angle is reduced to 1/n.
Thus, the present embodiment adjusts the viewing angle of the camera 21 a based on the position of the target object such that it is reduced. When the viewing angle of the camera 21 a is reduced, the number of pixels assigned to the reduced viewing angle is larger than before the viewing angle is reduced. This allows recognition of an object at a further distance away from the camera than before the viewing angle is reduced. For example, when the viewing angle θ is reduced to 1/n, it becomes possible to recognize a target object at an approximately n times larger distance away from the camera.
When the viewing angle θ is reduced to the viewing angle θ/n, the pixels at the pixel coordinates specified from the drive assist apparatus 30 are extracted and transmitted by the specified pixel extracting device 21 b to the image processing device 21 c. As a result, the image data received by the image processing device 21 c forms an image at the reduced viewing angle. The specified pixel extracting device 21 b may also be embedded in the camera 21 a.
The radar device 22 drives the motor M such that the transmission/reception direction of the antenna AT matches the bearing specified from the drive assist apparatus 30. This allows an early detection of the target object using the image detecting apparatus 21 and the radar device 22 each capable of obtaining highly reliable autonomous sensor information.
When the target object is detected by the set of autonomous sensors 20, the risk-level determining device 33 shown in FIG. 1 determines the level (e.g., one of two levels, high and low) of the risk of the vehicle relative to the target object. When the risk level as the result of the determination by the risk-level determining device 33 is low, the drive-assist-instruction output device 34 outputs, to the information providing device 40, a drive assist instruction for causing it to provide information or generate an alarm indicating the possible risk of colliding with the target object. As a result, it is possible to invoke the attention of the driver of the vehicle to the target object.
When the risk level as the result of the determination by the risk-level determining device 33 is high, the drive-assist-instruction output device 34 outputs, to the information providing device 40, the drive assist instruction for causing it to provide information or generate an alarm indicating the possible risk of colliding with the target object, while outputting a drive signal for driving the brake actuator to the vehicle-drive control device 50. As a result, it is possible to invoke the attention of the driver of the vehicle to the target object, while preventing a collision with the target object in advance and reducing the shock of the collision.
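The two-level policy described in the last two paragraphs reduces to a small dispatch, sketched here with hypothetical device interfaces (warn and drive_brake_actuator are not names from the patent).

    def output_drive_assist(risk_level, info_device, drive_control):
        """Dispatch per the risk level determined by the risk-level determining device 33."""
        if risk_level == "low":
            info_device.warn("possible collision with target object")  # inform/alarm only
        elif risk_level == "high":
            info_device.warn("possible collision with target object")  # inform/alarm
            drive_control.drive_brake_actuator()  # and decelerate or halt the vehicle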
Next, a description will be given of a drive assist process performed by the drive assist apparatus 30 in the in-vehicle drive assist system, using the flow chart shown in FIG. 6. In a step S1 of FIG. 6, it is determined whether or not newly acquired infrastructural information is present. When YES is given as a result of the determination, the process advances to a step S2. When NO is given as a result of the determination, a standby state is held until the latest infrastructural information is acquired.
In the step S2, the infrastructural information and the autonomous sensor information are acquired. In a step S3, a target object with the possible risk of colliding with the vehicle is determined. In a step S4, it is determined from the result of the determination in the step S3 whether or not there is the target object with the possible risk of colliding with the vehicle. When YES is given as a result of the determination, the process advances to a step S5. When NO is given as a result of the determination, the process moves to the step S1 where the process described above is repeated.
In the step S5, the behavior of the target object is estimated from the infrastructural information. In a step S6, the time (a time measured from the current time or the like) and position at which the target object enters the detectable range of the set of autonomous sensors 20 are calculated from the result of behavior estimation in the step S5.
In a step S7, it is determined whether or not the time at which the target object enters the detectable range of the set of autonomous sensors 20 is reached. When YES is given as a result of the determination, the process advances to a step S8. When NO is given as a result of the determination, the standby state is held until the time at which the target object enters the detectable range of the set of autonomous sensors 20 is reached.
In the step S8, instructions to tune or adjust the detection range of the set of autonomous sensors 20 and the sighting of the detection range based on the position of the target object are outputted to the set of autonomous sensors 20. The instructions include, for the image detecting apparatus 21, the coordinates of the pixels to be extracted and, for the radar device 22, the angle of rotation of the antenna AT. In a step S9, it is determined whether or not the target object is detected by the set of autonomous sensors 20. When YES is given as a result of the determination, the process advances to a step S10. When NO is given as a result of the determination, the standby state is held until the target object is detected by the set of autonomous sensors 20.
In the step S10, the set of autonomous sensors 20 performs a tracking detection of the target object and an instruction is outputted to provide drive assistance in accordance with the level of the risk relative to the target object. In a step S11, it is determined whether or not a normal drive assist using only the set of autonomous sensors 20 is restored. When YES is given as a result of the determination, the present process is ended. When NO is given as a result of the determination, the process moves to the step S1 where the process described above is repeated.
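Condensed into code, the S1 to S11 flow reads roughly as follows. Every helper name here is hypothetical shorthand for processing the patent describes only in prose, so this is a structural sketch rather than the implementation.

    def drive_assist_process(receiver, sensors, assist):
        while True:
            infra = receiver.poll()                    # S1: standby until fresh info
            if infra is None:
                continue
            sensor_info = sensors.read()               # S2: acquire both kinds of info
            target = find_collision_target(infra)      # S3: pick target with collision risk
            if target is None:                         # S4: no target, repeat from S1
                continue
            locus = estimate_behavior(target, infra)             # S5: estimate behavior
            t_entry, p_entry = entry_into_range(locus, sensors)  # S6: entry time/position
            wait_until(t_entry)                        # S7: standby until entry time
            sensors.adjust(detection_range_for(p_entry))  # S8: tune range and sighting
            while not sensors.detects(target):         # S9: standby until detected
                pass
            assist.track_and_assist(target)            # S10: tracking + risk-level assistance
            if assist.normal_mode_restored():          # S11: back to normal assist, end
                return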
With reference to FIGS. 7 to 9, a specific description will be given of the scenes in which the in-vehicle drive assist system according to the present embodiment achieves its effects. The description below covers the case where the viewing angle of the camera 21 a is reduced.
(Right-Hand Turn at Intersection)
FIG. 7 shows a scene in which a two-wheeler is present in a blind spot behind an oncoming vehicle when the vehicle makes a right-hand turn at an intersection. In such a scene, a roadside apparatus can detect the position of the two-wheeler present in the blind spot behind the oncoming vehicle, the direction of travel thereof, and the driving speed thereof and allows the vehicle to recognize the presence of the two-wheeler from the infrastructural information. On the other hand, the set of autonomous sensors 20 cannot detect the two-wheeler which cannot be viewed from the vehicle.
In such a case, therefore, the viewing angle of the camera 21 a is adjusted to be reduced toward the position where the two-wheeler will presumably rush out (appear from the blind spot behind the oncoming vehicle), as shown in FIG. 7. This allows a high-accuracy and early detection of the two-wheeler using the camera 21 a. When the infrastructural information includes information indicating that the target object is the two-wheeler (i.e., information specifying the type of the target object), the recognition of the two-wheeler need not be made from the position and speed of the target object detected by the set of autonomous sensors 20 (or the recognition level required for the two-wheeler may be lowered instead). Information is then provided so that drive assistance can be given in accordance with the risk level once the two-wheeler is detected by the set of autonomous sensors 20, and so that the vehicle can make a safe start after the two-wheeler passes in front of the vehicle.
Thus, the embodiment adopts such a method for tuning the detection range of the set of autonomous sensors 20 to the estimated position where the two-wheeler will appear and for adjusting sighting of the detection range. The method may also include calculating the direction of travel of the vehicle from the tracking locus of the vehicle, the angle of steering of the vehicle, or the like. Alternatively, it is also possible to detect a roadside object or another vehicle using the camera 21 a, the radar device 22, and the like and then calculate the direction of travel of the vehicle from the position of the detected roadside object or other vehicle and the position of the vehicle.
(Approach to T-Junction)
FIG. 8 shows a scene in which, as the vehicle passes through a T-junction at which a stop sign is placed, another vehicle is approaching the T-junction from a non-preferential direction in which it cannot be viewed from the vehicle. In such a case, a roadside apparatus can detect the position, direction of travel, driving speed, and the like of the other vehicle, and allows the vehicle to know of its presence from the infrastructural information. The set of autonomous sensors 20, on the other hand, cannot detect the other vehicle because it cannot be viewed from the vehicle.
In such a case, therefore, the viewing angle of the camera 21 a is reduced and aimed at the position where the other vehicle is expected to rush out (i.e., appear from the blind spot), as shown in FIG. 8. This allows a highly accurate and early detection of the other vehicle using the camera 21 a.
The vehicle is decelerated until the other vehicle is detected using the set of autonomous sensors 20. Once the other vehicle is detected, the response depends on its state: when the other vehicle is at a halt, the deceleration of the vehicle is stopped; when the other vehicle is driving, the vehicle is halted; and when the other vehicle has then halted, information which allows the vehicle to make a safe start is provided. When the vehicle has passed through the T-junction, the drive assist process is ended. One reading of this logic is sketched below.
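The deceleration logic of this scene can be read as a small decision rule. The sketch below is one possible reading; the action strings and argument names are illustrative assumptions, not part of the embodiment.

```python
def t_junction_assist(other_detected: bool,
                      other_moving: bool,
                      passed_junction: bool) -> str:
    """One reading of the T-junction behavior described above."""
    if passed_junction:
        return "end drive assist"
    if not other_detected:
        return "decelerate"        # keep slowing until the sensors confirm
    if other_moving:
        return "halt own vehicle"  # yield to the crossing vehicle
    return "stop decelerating; a safe start is possible"
```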
(Unexpected Encounter at Intersection)
FIG. 9 shows a scene in which, as the vehicle passes through a low-visibility intersection without a traffic light, another vehicle is approaching the intersection from the right, where it cannot be viewed from the vehicle. In such a scene, a conventional method makes a risk determination using only the autonomous sensor information. For instance, if the other vehicle is detected using only the image detecting apparatus 21, whether it is approaching from the right or the left cannot be recognized until the vehicle has come sufficiently close to the intersection. When the viewing angle of the camera 21 a spans about 180 degrees laterally of the vehicle, the image of the other vehicle is small because of the wide viewing angle, so the other vehicle cannot be recognized as such in the sensed image until it has come sufficiently close to the vehicle. Thus, an early risk determination cannot be made with the autonomous sensor information alone.
In the scene described above, a risk determination could instead be made using only the infrastructural information. The infrastructural information, however, is not usually real-time information, because it may be provided only at spots (e.g., by a light beacon) or limited to a specified range by constraints on the detection range of the roadside apparatus. In addition, the infrastructural information is lower in reliability than the autonomous sensor information, as described above.
Therefore, in the scene described above, when information on the approach of the other vehicle from the right-hand side of the vehicle is obtained, the viewing angle of the camera 21 a is reduced and directed diagonally forward and to the right based on that information, as shown in FIG. 9. This allows the other vehicle as the target object to be captured as a large image. When the viewing angle is reduced to 1/n, the other vehicle can be recognized from approximately n times the distance away from the camera 21 a.
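One way to derive the re-aimed, reduced viewing angle from the reported object position is simple trigonometry. The sketch below assumes a vehicle-centered frame (x forward, y left) and an illustrative reduction factor n; the embodiment does not prescribe these details.

```python
import math

def aim_camera(obj_x_m: float, obj_y_m: float,
               full_fov_deg: float = 180.0, n: int = 4):
    """Return (pan_deg, fov_deg): pan toward the object position reported
    by the infrastructure and reduce the viewing angle to 1/n of the
    full angle. Negative pan = right of straight ahead."""
    pan_deg = math.degrees(math.atan2(obj_y_m, obj_x_m))
    return pan_deg, full_fov_deg / n

# Other vehicle about 30 m ahead and 20 m to the right:
print(aim_camera(30.0, -20.0))  # -> (about -33.7 degrees, 45.0 degrees)
```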
When the viewing angle of the camera 21 a cannot be reduced, it is also possible to narrow down the image recognition range of the image processing device 21 c to 1/n. This reduces the image recognition time in the image processing device 21 c to about 1/n and thereby allows an early detection of the target object.
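Narrowing the recognition range to 1/n amounts to cropping a region of interest before recognition, so that the recognizer touches roughly 1/n of the pixels. A minimal sketch, with the frame size and function name as assumptions:

```python
import numpy as np

def crop_recognition_range(frame: np.ndarray, center_col: int, n: int) -> np.ndarray:
    """Keep only 1/n of the frame width around `center_col`; the recognizer
    then processes roughly 1/n of the pixels, which is where the ~1/n
    processing-time figure above comes from."""
    h, w = frame.shape[:2]
    half = max(1, w // (2 * n))
    left = min(max(center_col - half, 0), w - 2 * half)
    return frame[:, left:left + 2 * half]

frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(crop_recognition_range(frame, center_col=500, n=4).shape)  # (480, 160, 3)
```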
Thus, the in-vehicle drive assist system according to the present invention focuses on the respective advantages of the infrastructural information and the autonomous sensor information, and adjusts the detection range of the set of autonomous sensors 20 and the sighting of the detection range using the infrastructural information obtained from outside the vehicle. This allows a preferential detection in the direction in which an object such as another vehicle or a pedestrian present around the vehicle exists and in the direction in which the object can first be viewed from the vehicle. As a result, by making a risk determination from the autonomous sensor information resulting from the detection by the set of autonomous sensors 20, an early risk determination can be made while reliability is maintained.
Although the preferred embodiments of the present invention have been described above, the present invention is by no means limited to these embodiments. Various modifications can be made in practicing the present invention without departing from the gist thereof.
For example, in providing information to the driver, when the same target object can be recognized from both the infrastructural information and the autonomous sensor information, the reliability of the information is sufficiently high compared with the case when the target object can be recognized from only one of them. In that case, therefore, the method for providing information may be changed. By changing it to one which intensifies the color, shape, sound, or the like of the display, it is possible to give a sense of reassurance to the driver.
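Such a change of the information-providing method can be expressed as a small selection rule; the concrete colors and sounds below are illustrative assumptions, not values from the embodiment.

```python
def presentation_style(seen_by_infrastructure: bool,
                       seen_by_autonomous: bool) -> dict:
    """Intensify the presentation when both information sources confirm
    the same target (higher combined reliability)."""
    if seen_by_infrastructure and seen_by_autonomous:
        return {"color": "red", "sound": "chime", "emphasis": "high"}
    if seen_by_infrastructure or seen_by_autonomous:
        return {"color": "amber", "sound": None, "emphasis": "normal"}
    return {"color": None, "sound": None, "emphasis": "none"}
```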
Each process, step, or means explained above, or any combination thereof, can be achieved as a software unit (e.g., a subroutine) and/or a hardware unit (e.g., a circuit or integrated circuit), including or not including a function of a related device; furthermore, the hardware unit can be constructed inside a microcomputer.
Furthermore, the software unit or any combination of multiple software units can be included in a software program, which can be contained in a computer-readable storage medium or can be downloaded and installed in a computer via a communications network.
(Aspects)
Aspects of the disclosure described herein are set out in the following clauses.
As an aspect of the disclosure, an in-vehicle drive assist system is provided as follows. The system is configured to receive, from outside a vehicle, external information including positional information of at least one object which is either an other vehicle or a pedestrian present around the vehicle, and to assist the vehicle to drive using the received external information. The system includes the following. An autonomous detecting unit is configured to detect the at least one object which is either the other vehicle or the pedestrian viewable from the vehicle. A control unit is configured to perform a control operation for adjusting at least one of a detection range of the autonomous detecting unit and sighting of the detection range using the positional information of the object included in the external information. An object detection determining unit is configured to determine whether or not the object is detected by the autonomous detecting unit of which at least one of the detection range and the sighting of the detection range has been adjusted by the control unit. A risk determining unit is configured to determine a risk of the vehicle relative to the object when the object detection determining unit determines that the object is detected. A drive assist unit is configured to provide drive assistance in accordance with a result of the determination by the risk determining unit.
Here, the autonomous sensors 20 may serve as the above autonomous detecting unit. The detection range control device 32 may serve as the above control unit. The infrastructural information determining device 31 may serve as the object detection determining unit. The risk-level determining device 33 may serve as the risk determining unit. The information providing device 40 and the vehicle-drive control device 50 may serve as the drive assist unit.
Thus, the system focuses on the respective advantages of the roadside information and the information provided by the autonomous detecting unit (referred to as autonomous sensor information) and adjusts the detection range of the autonomous detecting unit, the sighting of the detection range, and the like using the roadside information obtained from outside the vehicle. This allows the autonomous detecting unit to perform a preferential detection in the direction in which an object such as another vehicle or a pedestrian present around the vehicle exists and in the direction in which the object can first be viewed from the vehicle. As a result, by determining a risk from the information detected by the autonomous detecting unit, an early risk determination can be made while reliability is maintained.
In the drive assist system, the control unit may perform the control operation for adjusting the sighting of the detection range based on the position of the object. This allows an early-stage detection of the object using the autonomous detecting unit capable of obtaining highly reliable information.
In the drive assist system, the control unit may perform the control operation for adjusting the detection range such that it is narrowed down based on the position of the object. Further, the autonomous detecting unit may include a camera for sensing an image around the vehicle, in which case a more remarkable effect is achieved when the control unit performs a control operation for reducing the viewing angle, which is the image sensing range of the camera.
That is, after the viewing angle of the camera is reduced, the number of pixels assigned to the reduced viewing angle is larger than before the reduction. This allows recognition of an object at a greater distance from the camera than before the viewing angle is reduced. For example, when the viewing angle is reduced to 1/n, it becomes possible to recognize an object at approximately n times the distance from the camera.
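The 1/n relation follows from pixel density: for a fixed imager, pixels per degree scale inversely with the viewing angle, so the distance at which an object still spans a given number of pixels scales by roughly n. A worked numeric check, with the sensor width, object width, and recognition threshold all assumed:

```python
import math

def recognition_distance_m(object_width_m: float, sensor_px: int,
                           fov_deg: float, min_px: int = 20) -> float:
    """Farthest distance at which an object still spans `min_px` pixels,
    under a small-angle pinhole approximation (all parameters assumed)."""
    px_per_deg = sensor_px / fov_deg
    needed_angle_rad = math.radians(min_px / px_per_deg)
    return object_width_m / needed_angle_rad

d_wide = recognition_distance_m(1.8, 1280, 180.0)   # about 36.7 m at 180 degrees
d_narrow = recognition_distance_m(1.8, 1280, 45.0)  # about 146.7 m at 45 degrees (n = 4)
print(d_narrow / d_wide)  # -> 4.0
```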
As another aspect, a method is provided for performing drive assistance in a vehicle having an autonomous detecting unit detecting a viewable object which is at least one of (i) an other vehicle and (ii) a pedestrian. The method includes the following: receiving, from outside the vehicle, external information including positional information of an object outside the vehicle; performing a control operation for adjusting at least one of (i) a detection range of the autonomous detecting unit and (ii) sighting of the detection range, using the positional information of the object included in the received external information; making a first determination as to whether or not an object is detected by the autonomous detecting unit after at least one of (i) the detection range of the autonomous detecting unit and (ii) the sighting of the detection range is adjusted; making a second determination about a risk of the vehicle relative to the object when the first determination is affirmed; and providing drive assistance in accordance with a result of the second determination. Adopting the method of this other aspect produces the same advantages as the drive assist system described above.
It will be obvious to those skilled in the art that various changes may be made in the above-described embodiments of the present invention. However, the scope of the present invention should be determined by the following claims.

Claims (5)

1. A drive assist system for a vehicle, the drive assist system receiving, from outside the vehicle, external information including positional information of an object outside the vehicle to thereby assist the vehicle to travel, the drive assist system comprising:
an autonomous detecting unit configured to detect a viewable object which is at least one of (i) an other vehicle and (ii) a pedestrian;
a control unit configured to perform a control operation for adjusting at least one of (i) a detection range of the autonomous detecting unit and (ii) sighting of the detection range, using positional information of the object included in the received external information;
an object detection determining unit configured to make a determination as to whether or not an object is detected by the autonomous detecting unit after at least one of (i) the detection range of the autonomous detecting unit and (ii) the sighting of the detection range is adjusted by the control unit;
a risk determining unit configured to make a determination about a risk of the vehicle relative to the object when the determination by the object detection determining unit is affirmed; and
a drive assist unit configured to provide a drive assistance in accordance with a result of the determination by the risk determining unit.
2. The drive assist system according to claim 1, wherein
the control unit performs the control operation for adjusting the sighting of the detection range based on the positional information of the object.
3. The drive assist system according to claim 1, wherein
the control unit performs the control operation for adjusting the detection range such that it is narrowed down based on the positional information of the object.
4. The drive assist system according to claim 3, wherein
the autonomous detecting unit includes a camera for sensing an image outside the vehicle, and
the control unit performs the control operation for adjusting a viewing angle, which is an image sensing range of the camera, such that it is reduced.
5. A method for performing drive assistance in a vehicle having an autonomous detecting unit detecting a viewable object which is at least one of (i) an other vehicle and (ii) a pedestrian,
the method comprising:
receiving, from outside the vehicle, external information including positional information of an object outside the vehicle;
performing a control operation for adjusting at least one of (i) a detection range of the autonomous detecting unit and (ii) sighting of the detection range, using positional information of the object included in the received external information;
making a first determination as to whether or not an object is detected by the autonomous detecting unit after at least one of (i) the detection range of the autonomous detecting unit and (ii) the sighting of the detection range is adjusted;
making a second determination about a risk of the vehicle relative to the object when the first determination is affirmed; and
providing a drive assistance in accordance with a result of the second determination.
US12/079,028 2007-03-27 2008-03-24 Drive assist system for vehicle Expired - Fee Related US7904247B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007082707A JP4434224B2 (en) 2007-03-27 2007-03-27 In-vehicle device for driving support
JP2007-082707 2007-03-27

Publications (2)

Publication Number Publication Date
US20080243390A1 US20080243390A1 (en) 2008-10-02
US7904247B2 true US7904247B2 (en) 2011-03-08

Family

ID=39795781

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/079,028 Expired - Fee Related US7904247B2 (en) 2007-03-27 2008-03-24 Drive assist system for vehicle

Country Status (2)

Country Link
US (1) US7904247B2 (en)
JP (1) JP4434224B2 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4788798B2 (en) * 2009-04-23 2011-10-05 トヨタ自動車株式会社 Object detection device
JP5613398B2 (en) * 2009-10-29 2014-10-22 富士重工業株式会社 Intersection driving support device
JP2012027605A (en) * 2010-07-21 2012-02-09 Toyota Motor Corp Surrounding vehicle recognition system
KR101896715B1 (en) * 2012-10-31 2018-09-07 현대자동차주식회사 Apparatus and method for position tracking of peripheral vehicle
RU2668149C2 (en) * 2014-04-02 2018-09-26 Ниссан Мотор Ко., Лтд. Vehicular information presentation device
US10192122B2 (en) * 2014-08-21 2019-01-29 Mitsubishi Electric Corporation Driving assist apparatus, driving assist method, and non-transitory computer readable recording medium storing program
US9649979B2 (en) * 2015-01-29 2017-05-16 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle operation in view-obstructed environments
JP2016143308A (en) * 2015-02-04 2016-08-08 パイオニア株式会社 Notification device, control method, program, and storage medium
KR101702888B1 (en) * 2015-10-12 2017-02-06 현대자동차주식회사 Vehicle view angle controlling apparatus, vehicle having the same and vehicle view angle controlling method
WO2017085857A1 (en) 2015-11-20 2017-05-26 三菱電機株式会社 Driving assistance device, driving assistance system, driving assistance method, and driving assistance program
JP6443318B2 (en) * 2015-12-17 2018-12-26 株式会社デンソー Object detection device
JP6493422B2 (en) 2016-02-10 2019-04-03 株式会社デンソー Driving support device
US9535423B1 (en) * 2016-03-29 2017-01-03 Adasworks Kft. Autonomous vehicle with improved visual detection ability
US10473777B2 (en) * 2016-08-31 2019-11-12 Robert Bosch Gmbh ASIC implemented motion detector
JP2018045482A (en) * 2016-09-15 2018-03-22 ソニー株式会社 Imaging apparatus, signal processing apparatus, and vehicle control system
US10453344B2 (en) * 2017-02-16 2019-10-22 Panasonic Intellectual Corporation Of America Information processing apparatus and non-transitory recording medium
WO2019073526A1 (en) * 2017-10-10 2019-04-18 日産自動車株式会社 Driving control method and driving control apparatus
JP6854357B2 (en) * 2017-10-10 2021-04-07 日産自動車株式会社 Operation control method and operation control device
CN107977641A (en) * 2017-12-14 2018-05-01 东软集团股份有限公司 A kind of method, apparatus, car-mounted terminal and the vehicle of intelligent recognition landform
CN111103874A (en) * 2018-10-26 2020-05-05 百度在线网络技术(北京)有限公司 Method, apparatus, device, and medium for controlling automatic driving of vehicle
JP7148436B2 (en) 2019-02-28 2022-10-05 株式会社日立製作所 Server, vehicle control system
JP7134916B2 (en) * 2019-05-29 2022-09-12 京セラ株式会社 Roadside units, in-vehicle units, and traffic communication systems
JP2019211485A (en) * 2019-08-05 2019-12-12 パイオニア株式会社 Notification device, control method, program, and storage medium
EP3819665B1 (en) 2019-11-06 2022-01-19 Yandex Self Driving Group LLC Method and computer device for calibrating lidar system
JP7533372B2 (en) 2021-06-17 2024-08-14 トヨタ自動車株式会社 Information processing device, information processing method, and program

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5680313A (en) * 1990-02-05 1997-10-21 Caterpillar Inc. System and method for detecting obstacles in a road
JPH10105880A (en) 1996-09-30 1998-04-24 Hitachi Ltd Mobile object control system
US6400308B1 (en) * 1998-02-20 2002-06-04 Amerigon Inc. High performance vehicle radar system
US6069581A (en) * 1998-02-20 2000-05-30 Amerigon High performance vehicle radar system
JP2001023098A (en) 1999-07-05 2001-01-26 Matsushita Electric Ind Co Ltd On-vehicle information communication system
JP2001101599A (en) 1999-09-30 2001-04-13 Toyota Motor Corp On-vehicle device for assisting traveling and method for assisting traveling
JP2001167387A (en) 1999-12-13 2001-06-22 Toyota Motor Corp On-vehicle device for supporting traveling and travelling supporting method
JP2001283380A (en) 2000-03-30 2001-10-12 Toshiba Corp Method and device for managing behavior of event on 'its' platform and method and device for processing on- vehicle information
JP2001319295A (en) 2000-05-02 2001-11-16 Nippon Signal Co Ltd:The Optical beacon device and optical beacon system
JP2001331900A (en) 2000-05-23 2001-11-30 Matsushita Electric Ind Co Ltd On-vehicle danger forecast alarm system and method
JP2002104115A (en) 2000-09-27 2002-04-10 Auto Network Gijutsu Kenkyusho:Kk Front obstacle confirming system
JP2002245597A (en) 2001-02-19 2002-08-30 Nissan Motor Co Ltd Information outputting device for vehicle
JP2004038244A (en) 2002-06-28 2004-02-05 Nissan Motor Co Ltd Traveling assistance information providing method and traveling assistance information providing device
JP2006195641A (en) 2005-01-12 2006-07-27 Nissan Motor Co Ltd Information providing device for vehicle
JP2006236094A (en) 2005-02-25 2006-09-07 Toyota Motor Corp Obstacle recognition system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Office Action dated Dec. 24, 2008 in Japanese Application No. 2007-082707.
Office Action dated Sep. 24, 2009 in corresponding Japanese Application No. 2007-082707.

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11501526B2 (en) 2008-12-19 2022-11-15 Landing Technologies, Inc. System and method for autonomous vehicle control
US10430653B2 (en) 2008-12-19 2019-10-01 Landing Technologies, Inc. System and method for autonomous vehicle control
US10331952B2 (en) 2008-12-19 2019-06-25 Landing Technologies, Inc. System and method for determining an orientation and position of an object
US20160202694A1 (en) * 2008-12-19 2016-07-14 Reconrobotics, Inc. System and method for autonomous vehicle control
US9710710B2 (en) * 2008-12-19 2017-07-18 Xollai Inc. System and method for autonomous vehicle control
US20100198513A1 (en) * 2009-02-03 2010-08-05 Gm Global Technology Operations, Inc. Combined Vehicle-to-Vehicle Communication and Object Detection Sensing
US8229663B2 (en) * 2009-02-03 2012-07-24 GM Global Technology Operations LLC Combined vehicle-to-vehicle communication and object detection sensing
US8456327B2 (en) 2010-02-26 2013-06-04 Gentex Corporation Automatic vehicle equipment monitoring, warning, and control system
US9230183B2 (en) 2010-02-26 2016-01-05 Gentex Corporation Automatic vehicle equipment monitoring, warning, and control system
US20130204516A1 (en) * 2010-09-08 2013-08-08 Toyota Jidosha Kabushiki Kaisha Risk potential calculation apparatus
US9058247B2 (en) * 2010-09-08 2015-06-16 Toyota Jidosha Kabushiki Kaisha Risk potential calculation apparatus
US10044991B2 (en) 2011-06-23 2018-08-07 Gentex Corporation Imager system with median filter and method thereof
US9769430B1 (en) 2011-06-23 2017-09-19 Gentex Corporation Imager system with median filter and method thereof
US9123252B2 (en) * 2011-08-10 2015-09-01 Toyota Jidosha Kabushiki Kaisha Drive assist apparatus
US20140180568A1 (en) * 2011-08-10 2014-06-26 Toyota Jidosha Kabushiki Kaisha Drive assist apparatus
US9041838B2 (en) 2012-02-14 2015-05-26 Gentex Corporation High dynamic range imager system
US11173925B2 (en) * 2016-04-01 2021-11-16 Denso Corporation Driving assistance device and driving assistance program product
US11845462B2 (en) 2016-04-01 2023-12-19 Denso Corporation Driving assistance device and driving assistance program product

Also Published As

Publication number Publication date
JP4434224B2 (en) 2010-03-17
US20080243390A1 (en) 2008-10-02
JP2008242844A (en) 2008-10-09

Legal Events

Date Code Title Description
AS Assignment

Owner name: DENSO CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAKAMORI, YASUTAKA;REEL/FRAME:020743/0828

Effective date: 20080306

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20230308