WO2013034182A1 - Method for creating a vehicle surroundings map, driver assistance device, and vehicle having a driver assistance device - Google Patents
- Publication number
- WO2013034182A1 (application PCT/EP2011/065507, EP2011065507W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- driver
- map
- information
- perceptibility
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9316—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9321—Velocity regulation, e.g. cruise control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9322—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using additional data, e.g. driver condition, road state or weather data
Definitions
- sensor information can be rendered more precise, or additional information which cannot be captured and determined by the sensors can be created, which then can be used as a basis in the electronic system, in particular a driver assistance device.
- With the object perception map as a basis for electronic systems of a vehicle, also further information, for instance about other vehicles in road traffic and/or navigation data, can be considered.
- The object perception map is created on the basis of sensor information from sensors of the vehicle and/or vehicle-external sensors, and in dependency on the object perception map at least one driver probability value for an object perception by the driver is determined.
- a corresponding correction with regard to the object perception is effected through the sensors, on the one hand, and through the person, on the other hand.
- the object perception map is created and an adaptation to the object perception, as it occurs through a person, in particular a driver, is performed.
- Behavioural patterns of the driver and/or his position are considered. With regard to the behaviour of the driver, in particular his viewing direction and/or movements, especially those of his head and/or eyes, are taken into account.
- A driver probability value for an object perception is calculated and/or estimated in dependency on the object perception map.
- The calculation and/or estimation may in particular be based upon the above-named parameters relating to the driver, possibly including human body characteristics such as the eye sensitivity.
- In dependency on the driver probability value, the driver is warned and/or an intervention into the driving behaviour of the vehicle is performed.
- The warning in this set-up may be specified to the effect that it is only issued if the vehicle driver cannot detect an object and its position within the vehicle surroundings. This may for instance depend on the position of the object relative to the viewing direction of the driver. Due to visibility conditions this may even be the case if the driver does look in the direction of the object, but cannot see it because of darkness or other conditions.
- This may for example be the case if the driver cannot detect an object due to a sudden change in the visibility conditions, as occurs with sharp light/dark boundaries, for example when the vehicle enters a tunnel or the shaded area of a tree or the like.
- An intervention into the driving behaviour of the vehicle can take many different forms, possibly by directly interacting with vehicle equipment. In the latter case an activation or adaptation of visibility systems (lighting, wiping) or of specific ADAS applications or functionalities (braking, speed limitation, steering control, ...) can be performed in dependency on the driver probability value.
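The gating of warnings and interventions on the driver probability value can be pictured with a short sketch. This is a minimal illustrative reading, not the patented logic; the 60%/20% thresholds and the action names are assumptions.

```python
def react(driver_probability: float, object_detected_by_sensors: bool) -> str:
    """Decide between no action, a warning, and an intervention, based on
    the driver probability value for perceiving a detected object.
    Thresholds and action set are illustrative assumptions only."""
    if not object_detected_by_sensors:
        return "none"
    if driver_probability >= 60.0:
        return "none"        # driver very likely perceives the object himself
    if driver_probability >= 20.0:
        return "warn"        # e.g. acoustic/optical warning
    return "intervene"       # e.g. braking, speed limitation, lighting/wiping

print(react(80.0, True))   # none
print(react(40.0, True))   # warn
print(react(5.0, True))    # intervene (driver almost certainly cannot see it)
```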
- At least some of the information captured by means of sensors, which is used as a basis for the object perception map, is redundantly captured by various sensor types.
- Different sensor types may be the initially named examples, which may be designed as ultrasonic sensors, radar sensors, infrared sensors, laser scanners, cameras, and the like. With regard to the design of the camera, it may be sensitive within the spectral range visible to human beings.
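How redundant capture by several sensor types could sharpen a detection value can be sketched as a condition-weighted fusion. The weight table and the numbers below are assumptions made for illustration; the patent does not prescribe a fusion formula.

```python
def fuse_redundant(readings: dict, weather: str) -> float:
    """Fuse redundant detection confidences (0..1) from different sensor
    types, de-weighting types that degrade under the current conditions.
    The weight table is an illustrative assumption."""
    weights = {
        ("camera", "clear"): 1.0, ("camera", "fog"): 0.2,
        ("radar", "clear"): 0.9,  ("radar", "fog"): 0.9,
        ("ultrasonic", "clear"): 0.8, ("ultrasonic", "fog"): 0.8,
    }
    num = sum(weights[(s, weather)] * c for s, c in readings.items())
    den = sum(weights[(s, weather)] for s in readings)
    return round(num / den, 2)

# Same raw readings, different weather: the camera dominates only in clear air.
r = {"camera": 0.9, "radar": 0.5, "ultrasonic": 0.4}
print(fuse_redundant(r, "clear"))  # 0.62
print(fuse_redundant(r, "fog"))    # 0.5
```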
- the object perception map is created as two-dimensional or as three-dimensional map.
- the creation of the object perception map is performed in real time.
- an extremely fast object perception map is provided, which is specific to individual demands and can be permanently updated.
- A safety map is created for driving the vehicle in consideration of the object perception map, wherein for the creation of the safety map information about other vehicles in the surroundings of the vehicle and/or critical driving behaviour for which the safety map is determined is considered.
- the object perception map thus serves as a basis for a further surroundings map, namely the safety map, which then allows for making statements with regard to aspects in the individual zones of the map that might lead to critical situations with objects or other vehicles or driver behaviour, as the vehicle moves further on.
- This, too, can be determined in dependency on the driving direction of the vehicle for which the safety map is determined, or on the velocity of this vehicle, or on other vehicle information of this vehicle.
- Navigation data can be used as a basis in this connection. Equally, information may be considered that is exchanged by a car-to-car communication between two vehicles.
- safety probability values can be determined, on the basis of which a statement about possible collisions or the like in the corresponding zone can be made. This may in turn be effected in dependency on the named driver information and/or surroundings information and/or road information.
- information that is created through the safety map can be directly provided to an electronic system of the vehicle to interact with vehicle equipment, in particular a driver assistance device, or a vehicle driver.
- The information can also be displayed as an image on a display unit.
- Other display modes can be considered, such as a front vehicle view in the forward direction. Equally, additionally or instead, acoustic and/or optical warnings may be given off.
- the object perception map and/or the safety map are displayed on a display unit, in particular a screen, in the vehicle.
- the object perception map and/or the safety map are displayed as a plan view of the vehicle and the vehicle surroundings.
- An essential idea of the invention consequently consists in a method in which vehicle surroundings information, such as for instance road information, optical contrast information of the object, and atmospheric conditions, in particular weather conditions, with driver information, such as for instance driver behaviour and/or driver characteristics are combined, in order to create a perceptibility map on the basis of probability values of an object perception in perceptibility zones.
- Such an object perception map can then serve as a basis for the most varied applications. For example, it can be used as the basis for an adaptive activation or non-activation of a warning to the vehicle driver with regard to a critical situation. Through the object perception map the occurrence of unnecessary alarms can be reduced.
- The object perception map, though, can also be rendered the basis for the work of further electronic devices of the vehicle, in particular driver assistance devices. In particular, it can serve as a basis for ADAS systems.
- The relevant object perception maps are created through the system, with these maps being created individually and independently of one another as required, in particular with the situation to be assessed being used as a basis.
- As ADAS information, for instance, all information of the vehicle or of the systems incorporated in the vehicle is collected.
- raw data are equally collected, in order to be able to receive relevant information for the perceptibility determination.
- stationary objects such as tunnels, traffic signs, trees, road markings, rails for a train, which at least partly extend within the road, and the like can be captured.
- This information can for example be obtained via sensors incorporated in the vehicle and/or via navigation information and/or via car-to-car communication.
- Road information and/or surroundings information can be obtained from vehicle-internal sensors and/or via vehicle-external information sources, such as for instance a car-to-car communication, or via navigation systems.
- the invention relates to a driver assistance device which is designed for performing a method according to the invention or an advantageous embodiment thereof.
- the driver assistance device comprises at least one evaluation unit, which on the basis of information obtained, for example from vehicle-internal sensors and/or vehicle-external sensors, creates the object perception map and/or the safety map.
- the driver assistance device moreover can have a display unit on which the object perception map and/or the safety map is displayable.
- The driver assistance device can be designed merely for giving off an acoustic and/or optical warning. However, additionally or instead, it can be designed for at least semi-autonomous intervention into the driving behaviour of the vehicle.
- The driver assistance device, with regard to its functionality, can for example be designed as a parking assistance system, a distance keeping system, a lane keeping system, a lane departure warning system, a blind spot detection system, a night vision system, etc.
- the invention relates to a vehicle with a driver assistance device according to the invention.
- Fig. 1 an embodiment of a created object perception map
- Fig. 2 a further embodiment of an object perception map
- Fig. 3 a block diagram showing the method sequence of creating a safety map
- Fig. 4 a sketched representation of a work scenario of a driver assistance device for which an object perception map is used as a basis.
- Fig. 5 the scenario according to Fig. 4 with an ambient condition that is different to that in the representation according to Fig. 4, in particular with regard to the visibility conditions;
- Fig. 6 a further schematic scenario of the mode of operation of a driver assistance device for which an object perception map is used as a basis;
- Fig. 7 the scenario according to Fig. 6 with an ambient condition that is different to that in the representation according to Fig. 6, in particular with regard to the visibility conditions.
- In Fig. 1 an embodiment of an object perception map 3 is shown as used in the present invention; the corresponding information is stored in a memory of the system so that it can be made available, possibly in real time, for different possible applications.
- This object perception map can be adapted in real time to the current driving situation. Some of this information may be directly represented on a display unit of a driver assistance device of a vehicle.
- The object perception map 3 is shown as a plan view of a vehicle 2 and the vehicle surroundings area. As can be seen, the vehicle surroundings area is shown in a vehicle surroundings map 4, which at least in parts is completely characterized in the embodiment by the object perception map 3.
- The object perception map 3 is subdivided into a plurality of contiguous perceptibility zones 5 to 17. For each of the perceptibility zones 5 to 17, moreover, at least one probability value for the perceptibility of an object in the respective perceptibility zone 5 to 17 is indicated and displayed. In the embodiment shown in Fig. 1, for each perceptibility zone 5 to 17 only one probability value is indicated.
- Each probability value is given as a percentage. It describes the probability with which an object can be perceived in the respective area or perceptibility zone 5 to 17.
- Such a probability value is determined in particular in dependency on the surroundings information of the vehicle 2, which in the shape of an image symbolizes the actual vehicle in the object perception map 3, and/or on vehicle information and/or driver information.
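A minimal data model for such a zone-based map might look as follows. The class names, fields, and the "position" labels are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class PerceptibilityZone:
    """One zone of the object perception map (zones 5 to 17 in Fig. 1)."""
    zone_id: int
    position: str          # e.g. "front-centre", "rear-left" (assumed labels)
    probabilities: list    # one or more percent values per zone

@dataclass
class ObjectPerceptionMap:
    zones: dict = field(default_factory=dict)

    def set_zone(self, zone_id, position, values):
        self.zones[zone_id] = PerceptibilityZone(zone_id, position, values)

# The clear-vision situation of Fig. 1: one value per zone, 90 % straight ahead.
m = ObjectPerceptionMap()
m.set_zone(5, "front-centre", [90.0])
m.set_zone(17, "immediately-around-vehicle", [0.0])  # vision blocked by the vehicle body
print(m.zones[5].probabilities)  # [90.0]
```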
- As surroundings information, for instance weather conditions and/or road information and/or traffic information in the area of the vehicle's position are considered. This may in particular be within the surroundings area of the vehicle, which area may vary in size and shape.
- The perceptibility zones formed in front of the vehicle 2 are, with regard to their extension towards the respectively facing vehicle side, larger than the perceptibility zones arranged laterally to the vehicle 2. This is given in particular when the vehicle 2 moves forward.
- A probability value for an object detection in the perceptibility zone 5, which is formed centrally in front of the vehicle 2, amounts to 90%.
- In Fig. 1 a situation is given which allows for free and clear vision, for instance sunshine.
- As road information, for instance the state of the road surface and/or the course of the road and/or the dimensions of the road and/or the kind of road are considered.
- As vehicle information, for instance the velocity and/or the vehicle dimensions and/or the vehicle's all-round vision design and/or the driving direction of the vehicle is/are considered.
- As driver information, for example the driving behaviour of the driver and/or the age of the driver and/or the agility state of the driver and/or the present viewing direction of the driver are considered.
- a grouping of the objects into classes can be performed.
- at least three classes are formed, which are defined in dependency on the shape and/or the size and/or the optical contrast properties of the respective objects.
- With regard to the optical contrast properties, the classification is performed to the effect that a first class is provided, in which objects having a light source of their own are grouped.
- In a second class those objects are grouped which lack a light source of their own, but have a high-contrast colour difference from the object surroundings.
- In a third class those objects are grouped which have no light source of their own and have a low-contrast colour difference from the object surroundings.
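This three-class grouping by optical contrast property reduces to a small decision rule. The sketch below assumes a normalized contrast measure and an arbitrary 0.5 threshold; neither is specified in the patent.

```python
from enum import Enum

class ContrastClass(Enum):
    OWN_LIGHT_SOURCE = 1   # first class: the object emits light itself
    HIGH_CONTRAST = 2      # second class: no own light, strong colour contrast
    LOW_CONTRAST = 3       # third class: no own light, weak colour contrast

def classify(has_light_source: bool, contrast: float) -> ContrastClass:
    """Group an object by its optical contrast properties. `contrast` is a
    normalized 0..1 colour difference from the object surroundings; the
    0.5 threshold is an arbitrary illustrative choice."""
    if has_light_source:
        return ContrastClass.OWN_LIGHT_SOURCE
    return ContrastClass.HIGH_CONTRAST if contrast >= 0.5 else ContrastClass.LOW_CONTRAST

print(classify(True, 0.1))    # a vehicle with headlights on -> OWN_LIGHT_SOURCE
print(classify(False, 0.2))   # grey clothing on grey asphalt -> LOW_CONTRAST
```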
- For the perceptibility zone 17 a probability of 0% is indicated, since here the vision of the driver through the body of the vehicle 2, in particular down towards the road, is obstructed. In particular, small objects thus cannot be detected in this perceptibility zone 17.
- Such a probability of 0% is typical when considering the driver perception on the basis of the human characteristics.
- A different probability could be set for that zone 17 if detectors like ultrasonic and/or radar and/or other sensors are available at suitable places on the vehicle.
- The further perceptibility zones, in comparison with the perceptibility zone 5, are predetermined with clearly reduced probability values for the object detection. Since a vehicle driver during the forward movement of a vehicle 2 commonly keeps his viewing direction to the front, the perceptibility in these lateral and rear perceptibility zones is clearly reduced. Even though external mirrors and an interior mirror, which facilitate rear vision, may be arranged on and within the vehicle 2, the object perception here is nevertheless clearly reduced.
- For the object perception map 3 in Fig. 1, an object of class 1 with regard to its optical contrast properties is used as a basis. In particular, dry road conditions are also assumed.
- The number, design, and colour of the perceptibility zones 5 to 17 are merely examples. Equally, the probability values are merely exemplary. It is merely to be rendered clear how such an object perception map 3 may look and which information it may provide.
- On the basis of this object perception map 3, due to the highly differentiated and detailed subdivision of the vehicle surroundings into the perceptibility zones 5 to 17 and, additionally, the specified probability values, very precise scenarios for the mode of operation of driver assistance devices can be rendered. On the basis of this object perception map 3, which can for example be used as a basis exclusively for a driver assistance device, detected information from sensors can be rendered more precise or plausible. Equally, on the basis of the information of the object perception map, in particular the probability values, the operating scenarios of the driver assistance device can be improved. In particular, acoustic and/or optical warnings to the driver can be enhanced here. An unnecessary, incorrect, or too frequent emission of such warnings can be avoided. Thereby the safety in driving a vehicle can also be clearly raised, as the driver is not unnecessarily distracted or even startled.
- the shape and/or size of a perceptibility zone 5 to 17 can be predetermined individually in particular in dependency on its position within the vehicle surroundings relative to the vehicle 2. This, too, can for example be performed on the basis of further information about the vehicle 2 and/or the vehicle surroundings.
- A probability value of a perceptibility zone 5 to 17 and/or the number of probability values for a perceptibility zone 5 to 17 can thus be determined in particular in dependency on the position of the respective perceptibility zone 5 to 17 relative to the vehicle 2 and/or the shape of the perceptibility zone 5 to 17. In this way, too, the precision with regard to the probability of the object detection in a zone can be determined and, in particular, the resulting operating mode of a driver assistance device enhanced.
- the object perception map 3 is created on the basis of sensor information from vehicle-internal and/or vehicle-external sensors and the object perception map is provided for an already mentioned driver assistance device.
- The object perception map 3 is created on the basis of this sensor information, and in dependency on the object perception map 3 at least one driver probability value for an object perception by the driver himself is determined.
- The driver probability value can be calculated and/or estimated on the basis of the object perception map 3. This is advantageous, since the human perception can differ considerably from the perception by the sensors.
- At least some information captured by means of sensors and taken as a basis for the determination of the object perception map 3 is redundantly captured through various sensor types of the vehicle 2. Since various sensor types in dependency on the different ambient conditions, in particular weather conditions, may vary in terms of the precision of the information capture, through this redundant information capture the precision of the detected information can be improved. On this basis also the determination of the perceptibility zones 5 to 17 and in particular of the probability values used as a basis can be clearly rendered more precise.
- The object perception map 3 can be designed as a plan view and as a two-dimensional map. However, it can also be created as a three-dimensional map.
- The creation of the object perception map 3 is effected in real time. Some parts of it, however, can be recorded in a previous step or possibly be predefined, as for the vehicle body and the human vision model characteristics.
- In Fig. 2 the object perception map 3 is shown as in Fig. 1, but for different surroundings information, namely a different weather scenario.
- The object perception map 3 in this connection is shown for bad visibility, in particular fog.
- With the vehicle 2 moving forward and otherwise identical conditions in comparison with the scenario representation in Fig. 1, clearly different probability values for an object detection in the respective perceptibility zones 5 to 17 result.
- An exemplary scenario differing merely in terms of the weather conditions is given, which is reflected in the embodiment in that the number and shape as well as the position of the perceptibility zones 5 to 17 remain the same.
- For the perceptibility zone 5, a plurality of probability values is determined.
- With increasing distance from the vehicle 2, the probability value of an object detection decreases, possibly in discrete steps, but possibly also in a continuous way.
- four subzones within the perceptibility zone are formed, the probability values of which decrease from 90% to 10%.
- The determination of the number of subzones can be performed in real time, as in the transition from the situation of Fig. 1 to that of Fig. 2. In fact, the number of subzones may be adapted in real time to the environment and/or to human characteristics of the driver.
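The subdivision of a zone into graded subzones, as in the transition from Fig. 1 to Fig. 2, can be sketched as follows. A linear decrease is assumed for illustration; the description allows discrete steps or a continuous profile.

```python
def subzone_values(near: float, far: float, n_subzones: int) -> list:
    """Grade one perceptibility zone into n subzones whose probability
    decreases from `near` (at the vehicle) to `far` (zone edge).
    The linear profile is an illustrative assumption; the decrease could
    equally follow a fog-visibility model or be continuous."""
    if n_subzones == 1:
        return [near]
    step = (near - far) / (n_subzones - 1)
    return [round(near - i * step, 1) for i in range(n_subzones)]

# The fog scenario of Fig. 2: four subzones in front zone 5, from 90 % down to 10 %.
print(subzone_values(90.0, 10.0, 4))   # [90.0, 63.3, 36.7, 10.0]

# In clear weather (Fig. 1) the same zone may collapse back to a single value:
print(subzone_values(90.0, 10.0, 1))   # [90.0]
```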
- In Fig. 3 the procedure for determining a safety map 18 from an object perception map 3 is shown in a simplified block diagram.
- As inputs serve an object perception map 3, as shown for example in Fig. 1 or Fig. 2, and traffic information, such as information about other vehicles in the vicinity of the vehicle 2. This information is captured according to the shown block 19 and evaluated, then combined in an evaluation unit of a driver assistance device of the vehicle 2, and a safety map 18 is created therefrom.
- the combination of the various pieces of information is characterized in an exemplary way in Fig. 3 through block 20, wherein this also symbolizes the evaluation unit.
- By means of the safety map 18, critical traffic situations for the vehicle 2 can be detected early and a corresponding hint be given to the driver and/or an intervention into the driving behaviour of the vehicle be performed, so that the driving of the vehicle is improved with regard to safety aspects.
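The combination performed in block 20 can be pictured as a rule that merges a zone's perceptibility value with traffic information into a safety rating. The thresholds below are illustrative assumptions, not values from the patent.

```python
def safety_rating(perceptibility_pct: float, time_to_collision_s: float) -> str:
    """Combine a zone's perceptibility probability with traffic information
    (here reduced to a time to collision) into a coarse safety rating.
    The 2 s / 10 s / 30 % thresholds are illustrative assumptions."""
    if time_to_collision_s < 2.0:
        return "critical"   # no time left to react
    if time_to_collision_s <= 10.0 and perceptibility_pct < 30.0:
        return "warn"       # object hard to perceive before it becomes critical
    return "uncritical"

# Block 19 captures the inputs, block 20 combines them into the safety map 18:
for pct, ttc in [(90.0, 12.0), (20.0, 6.0), (50.0, 1.5)]:
    print(pct, ttc, "->", safety_rating(pct, ttc))
```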
- The object perception maps 3 shown in Figs. 1 and 2 are also marked in their perceptibility zones 5 to 17 by specific colouring when shown to the driver.
- especially critical zones can also be marked by specific colourings, so that these can be identified intuitively and quickly by a driver when information is shown to him.
- the perceptibility zone 5 can for example be designed to be green, which means that a high probability of object detection is given for the driver. The driver has sufficient time to respond to the situation himself.
- the perceptibility zones 6, 8, 10, 12, 14, and 16, as well as 17 are coloured red and thereby characterize particularly critical zones with minimal or no perceptibility probability.
- the driver with regard to an object detection in these zones has no way of reacting.
- In the remaining perceptibility zones the object perceptibility for a user is clearly reduced; nevertheless there is still sufficient time left for the vehicle driver to react.
- Such set-up and design is particularly advantageous for the design of a safety map 18.
- The colouring and the evaluation with regard to the detectability and the possibility for the driver to react therefore serve in particular as a basis for the safety map 18. This is because, on the basis of these evaluations of the driver's possibilities to react, in connection with the perceptibility probabilities, traffic situations can be grouped into critical and less critical states.
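The colour grouping described for the zones amounts to banding the probability values. The sketch assumes a three-band scheme with a yellow intermediate band and arbitrary 70%/30% thresholds; the description only names green and red explicitly.

```python
def zone_colour(probability_pct: float) -> str:
    """Map a perceptibility probability (percent) onto a display colour,
    mirroring the colouring described for Figs. 1 and 2. The band
    thresholds are illustrative assumptions."""
    if probability_pct >= 70.0:
        return "green"    # driver can detect objects and react in time
    if probability_pct >= 30.0:
        return "yellow"   # perceptibility reduced, but time to react remains
    return "red"          # minimal or no perceptibility

print([zone_colour(p) for p in (90.0, 50.0, 0.0)])  # ['green', 'yellow', 'red']
```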
- Fig. 4 in a schematic representation shows a traffic scenario with bad visibility conditions, wherein the vehicle 2 moves from the left to the right according to an arrow.
- Tc denotes the period of time until collision with a pedestrian 21. If, according to the first scenario, the period of time D0 until collision is < 2 s, it is detected through the driver assistance device, on the basis of the object perception map or the safety map, that there is no time left for a reaction, and a warning is immediately given off and possibly an intervention into the driving behaviour of the vehicle 2 performed.
- In the second scenario, the period of time D1 until collision of the vehicle 2 with the pedestrian 21 amounts to between 2 s and 10 s. Thus sufficient time for a reaction is given. However, due to the bad visibility conditions an alarm or a warning is given off and/or an intervention into the driving behaviour of the vehicle performed.
- In Fig. 5 a schematic scenario on the basis of the representation in Fig. 4 is shown, with much better visibility conditions than in Fig. 4.
- Here a warning is only given off by the driver assistance device if the pedestrian 21 is in area I. Due to the good visibility conditions, no warning is given off and no intervention into the driving behaviour of the vehicle is performed if the pedestrian 21 is in area II or III. The driver of the vehicle can detect the pedestrian 21 in areas II and III early and react himself accordingly.
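The warning decisions of Figs. 4 and 5 can be condensed into one function of time to collision and visibility. The 2 s and 10 s bounds come from the description; the boolean visibility flag is a simplification assumed for the sketch.

```python
def should_warn(ttc_s: float, visibility_good: bool) -> bool:
    """Warning logic of the scenarios in Figs. 4 and 5:
    - ttc < 2 s: no reaction time left, always warn.
    - 2 s <= ttc <= 10 s: warn only under bad visibility (Fig. 4); under
      good visibility (Fig. 5) the driver sees the pedestrian early enough.
    The 2 s / 10 s bounds follow the description; the rest of the decision
    structure is an illustrative reading of it."""
    if ttc_s < 2.0:
        return True
    if ttc_s <= 10.0:
        return not visibility_good
    return False

print(should_warn(1.5, visibility_good=False))  # True  (D0 < 2 s)
print(should_warn(6.0, visibility_good=False))  # True  (bad visibility, Fig. 4)
print(should_warn(6.0, visibility_good=True))   # False (good visibility, Fig. 5)
```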
- Fig. 6 shows a schematic night time scenario, in which the vehicle 2 moves in the direction of the arrow from the left to the right.
- the vehicle 2 is in the position shown in Fig. 6 in an area illuminated for instance by a street light.
- In area I, by contrast, the brightness is already clearly reduced, and it further decreases in the direction of area II.
- In area III it is then virtually dark. In this scenario there is a sharp bright/dark transition between the area in which the vehicle 2 is located and area I, through which the driver of the vehicle 2 is dazzled.
- The perceptibility of the pedestrian 21 in area I is therefore limited and no reasonable object perception is possible.
- Fig. 7 shows a scenario in analogy to that of Fig. 6, wherein unlike in Fig. 6 the vehicle 2 is not in a brightly illuminated area at dark night time, but in an area to the left of area I, which is relatively dark. An abrupt bright/dark transition thus is not given and the eyes of the driver have already adapted to the dark visibility conditions.
- A warning is given off to the vehicle driver and/or an intervention into the driving behaviour of the vehicle 2 is performed.
- For each function of a driver assistance device, at least one individually established and adjusted object perception map, and possibly therefrom in particular a safety map, can be created, especially in order to be able to generate warnings for more or less critical traffic situations in a way better adapted to the requirements and the situation, and thus to avoid too frequent unnecessary warnings.
- the same can additionally or instead also be performed with regard to the interventions of a driver assistance device into the driving behaviour of the vehicle.
- An automatic activation of fog lamps, as provided within some ADAS, could advantageously be correlated to the defined object perception map. This would permit, e.g. in a foggy night with automatic high beam/low beam switching, that the low beam regime is kept even without any vehicle in front, in order to account for the presence of fog and to avoid glare-back onto the driver.
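The correlation of beam automation with the perception map reduces to a simple override. A sketch under the stated assumptions; `fog_detected` stands in for whatever fog indication the object perception map would supply.

```python
def beam_regime(vehicle_ahead: bool, fog_detected: bool) -> str:
    """High/low beam automation correlated with the object perception map.
    Plain automation would select high beam whenever no vehicle is ahead;
    if the perception map indicates fog, low beam is kept to avoid
    glare-back onto the driver. A sketch of the idea, not the patented
    control law."""
    if fog_detected:
        return "low beam"   # fog: high beam would retro-glare the driver
    return "low beam" if vehicle_ahead else "high beam"

print(beam_regime(vehicle_ahead=False, fog_detected=True))   # low beam despite empty road
print(beam_regime(vehicle_ahead=False, fog_detected=False))  # high beam
```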
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention relates to a method for creating a vehicle surroundings map, in which surroundings information is captured and considered for the creation of the vehicle surroundings map, wherein the vehicle surroundings map is created at least in parts as an object perception map (3), in which perceptibility zones (5 to 17) are created, and for at least one of the perceptibility zones (5 to 17) at least one probability value for the perceptibility of an object (21) within the perceptibility zone (5 to 17) is determined. The invention also relates to a driver assistance device and a vehicle (2).
Description
Method for Creating a Vehicle Surroundings Map, Driver Assistance Device, and Vehicle
Having a Driver Assistance Device
The invention relates to a method for creating a vehicle surroundings map, in which surroundings information is captured and considered for creating the vehicle surroundings map. The invention further relates to a driver assistance device which is designed for performing a method according to the invention. Moreover, the invention relates to a vehicle having a corresponding driver assistance device.
Vehicles are commonly designed to have a plurality of sensors, which can capture both the surroundings of the vehicle and the interior of the vehicle. In this connection for instance ultrasonic sensors and/or radar sensors and/or infrared sensors and/or lidar sensors and/or cameras which can capture corresponding information are known.
Further, sensors are mounted in the vehicle which can capture information about the vehicle itself. There are sensors with regard to the velocity of the vehicle, the steering angle, the inclination of the vehicle, and the like.
Driver assistance devices of the vehicle operate on the basis of at least some of this sensor information. In this connection, parking assistance systems, lane keeping systems, distance keeping systems, night vision systems, and blind spot detection systems are to be named merely as examples.
Driver assistance systems moreover can be designed to emit information or a warning to the driver only if specific conditions, for instance critical driving conditions, are reached. Moreover, driver assistance systems can also be designed for at least semi-autonomous intervention into the driving behaviour of the vehicle.
In particular, ADAS (Advanced Driver Assistance Systems) for a vehicle are also known. Such systems support the vehicle driver in many situations, for example when a headlight is to be automatically switched on or off depending on the surroundings information. From US 2011/0054716 an ADAS system is known in which an exact position of the vehicle on the road is determined from satellite information, vehicle information, and vehicle surroundings information. In this connection an ADAS horizon for supporting the driving of a vehicle with dynamic data of other sensors is already known.
Moreover, from DE 10 2009 022 278 A1 a method for determining a hindrance-free space in the surroundings of a vehicle is known. By means of a sensor system a distance image is established and from the distance image a depth map of the surroundings is created. When creating the depth map, surface measuring points on a basic plane of the surroundings are taken into consideration in such a way that an evidence and/or probability of hindrance measuring points captured in the same position as the surface measuring points on the basic plane is reduced.
In the known systems, object detection is limited and relatively imprecise.
It is the task of the present invention to provide a method for creating a vehicle surroundings map, a driver assistance device, as well as a vehicle, by means of which a precise object detection in the surroundings of the vehicle is possible.
This task is solved by a method, a driver assistance device, as well as a vehicle according to the independent claims.
With the method for creating a vehicle surroundings map according to the invention, surroundings information of the vehicle is captured and taken into consideration for the creation of the vehicle surroundings map. The vehicle surroundings map is created at least in parts as an object perception map, in which perceptibility zones are created. For at least one of the perceptibility zones, at least one probability value for the perceptibility of an object in the perceptibility zone is determined. By proceeding in this way, the vehicle surroundings can be subdivided into highly selective and specific zones, with at least one probability value moreover taken as a basis for the improved assessment of the perceptibility of an object in such a zone. Through such a specific and differentiated characterization of the entire vehicle surroundings for the object detection, very definite statements about possibly critical detection scenarios of objects can be made. On the basis of these clearly specified and detailed indications in the vehicle surroundings, a plurality of precise statements can subsequently be made and reactions performed when driving the vehicle. This leads to clearly enhanced driving of the vehicle, in particular with regard to supporting the driver. Through a zone-like set-up and subdivision, with a classification of the zones by values of the object perception probability, precise operating scenarios can be achieved for driver assistance devices.
Preferably, an object within the scope of the present invention refers not only to physical objects like vehicles, infrastructure, a human body (pedestrian, child), an animal, or an item (ball), but also to other characteristics like the road shape (hairpin curve) or the presence of ice or water on the road. These latter characteristics are also important to detect in time for a correct anticipation of the vehicle behaviour, e.g. in case of braking activation by the driver or of a speed recommended to the driver.
Preferably, it is provided that for each perceptibility zone at least one probability value is determined. Thereby virtually the entire vehicle surroundings can be correspondingly categorized and classified, so that in particular a contiguous area of perceptibility zones is formed around the vehicle, all of which are characterized by specific probability values with regard to the object perceptibility.
Preferably, it is provided that a probability value is determined depending on the surroundings information and/or vehicle information. Thereby highly essential influential factors can be taken into consideration, so that the probability values are very exact and thus also the entire object perception map, in particular covering the entire area contiguously, can be represented by means of very exact probability values with regard to the corresponding object perception.
Preferably, it is provided that as surroundings information weather conditions and/or road information and/or traffic information in the area of the vehicle's position are considered. For instance, visibility conditions and/or the time of day can be considered as weather conditions. The perceptibility of an object, for example, can be very different depending on the time of day. Moreover, weather conditions such as sunshine, on the one hand, or heavy and dense fog, on the other hand, can crucially affect the object perceptibility within a perceptibility zone. In this regard, reference is further made to weather conditions such as rain or snowfall. As these different weather conditions can be captured very precisely using built-in sensors such as a camera, and/or be provided by an application, possibly from a GPS-equipped wireless communication device such as a smartphone, very precise probability value determinations for the object perception in specific perceptibility zones can be established on the basis of this information.
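One way to picture such a weather- and time-dependent determination is a base probability attenuated by condition factors. The factor values below are assumptions chosen for the sketch and reproduce the 90% clear-weather figure of Fig. 1; the patent does not give a formula.

```python
def zone_probability(base_pct: float, weather: str, daytime: bool) -> float:
    """Illustrative determination of a zone's perceptibility probability
    from surroundings information. The base value and the attenuation
    factors are assumed numbers, not values from the patent."""
    weather_factor = {"clear": 1.0, "rain": 0.7, "snow": 0.6, "fog": 0.3}[weather]
    time_factor = 1.0 if daytime else 0.5
    return round(base_pct * weather_factor * time_factor, 1)

print(zone_probability(90.0, "clear", daytime=True))   # 90.0 (Fig. 1 situation)
print(zone_probability(90.0, "fog", daytime=True))     # 27.0
print(zone_probability(90.0, "rain", daytime=False))   # 31.5
```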
As road information, in particular the state of the road surface with regard to its material consistency and/or its surface structure can be taken into consideration. The lane course can equally be taken as a basis. Additionally or instead, the dimensions of the road, such as the width of a lane, can be taken into account. Also the kind of road can be considered, with the "kind of road" referring to whether it is, for example, a motorway with at least two lanes per driving direction, or a federal or state road with only one lane per driving direction. In particular, road information such as a lane course that is as straight as possible, a curvaceous lane, or a very narrow lane, taken individually or in combination, renders a plurality of very specific and individual object perception scenarios, which may depend delicately and subtly on these properties and thus lead to very different probability values for an object perception within a perceptibility zone.
As vehicle information, for instance the velocity and/or the vehicle dimensions and/or the vehicle's all-round vision design and/or the driving direction of the vehicle can be considered. Vehicles can move across a very large velocity range, from very small velocities up to very high velocities beyond 200 km/h, with the perceptibility of objects being very different. The design of a vehicle with regard to its dimensions and/or its all-round vision design also plays a decisive role in this context. With regard to the all-round vision design, consider for example a very low-riding sports car which only has a relatively small and/or very flatly arranged or oriented rear window. Equally, vehicles can be designed without a rear window and, for example, with only two seats, a driver's seat and one passenger's seat, and thus without a rear seating space with side windows. Rear and side vision is thereby considerably limited for a driver. By contrast, a sedan can be named as a vehicle with relatively large windows, having large side windows and a large rear window also in the rear seating space.
With regard to the driving direction it can be taken into account whether the vehicle moves forward or backward. All this vehicle information can equally essentially affect the object perception in a specific perceptibility zone in the surroundings of the vehicle.
Preferably, it is provided that a probability value is determined in dependency on driver information. Thereby, too, very decisive information is considered, which can lead to very strong differences in the probability value of a respective perceptibility zone. As driver information, for instance the driving behaviour of the driver and/or the age of the driver and/or an agility state of the driver and/or the viewing direction of the driver can be considered. With regard to the agility state, the emotional state and/or the degree of tiredness of the driver are taken into account. All this information can be captured with suitable sensors and detectors and analysed to this effect.
Alternatively or in combination, it is provided that a probability value is determined in dependency on human characteristics of the driver, particularly the eye sensitivity. The latter is directly related to, among other things, the ambient light, the size of the detected object, the contrast, and the spatial frequency, and can be predefined from the well-known eye sensitivity curve. Thus, intrinsic characteristics of the human body can be used to define an object detection map linked to the driver perception and possibly to the vehicle equipment perception. The driver perception can advantageously be defined by a human object perception model which is used as a transfer function between the sensor information and the human vision characteristics.
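Such a transfer function between sensor information and human vision could, for illustration, take the form of a contrast-sensitivity check. The threshold formula below is a crude assumed stand-in for the eye sensitivity curve, not a model from the patent.

```python
import math

def driver_can_perceive(contrast: float, ambient_lux: float, spatial_freq_cpd: float) -> bool:
    """Crude stand-in for a human contrast-sensitivity model, used as a
    transfer function between sensor information and driver perception.
    The perception threshold rises in the dark and for very fine detail
    (high cycles/degree); the formula is an illustrative assumption only."""
    brightness = math.log10(ambient_lux + 1.0)
    brightness = max(brightness, 0.05)   # avoid division by zero in total darkness
    threshold = 0.02 * (1.0 + 1.0 / brightness) * (1.0 + spatial_freq_cpd / 10.0)
    return contrast >= threshold

# A pedestrian of moderate contrast: visible in daylight, not at night.
print(driver_can_perceive(0.08, ambient_lux=10_000, spatial_freq_cpd=5.0))  # True
print(driver_can_perceive(0.08, ambient_lux=1, spatial_freq_cpd=5.0))       # False
```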
Preferably, it is provided that objects to be detected are grouped in classes, with the classification being performed in dependency on the shape and/or the size and/or the optical contrast property of an object. These very specific parameters have an essential effect on how well an object can be detected, depending moreover on the specific perceptibility zone around the vehicle in which it is arranged.
It is particularly advantageous if, with regard to the classification on the basis of the optical contrast property, a first class is formed in which objects with a light source of their own are grouped. Such objects, which can emit light themselves, can possibly be perceived more easily and faster. In particular, the position of such an object in a specific perceptibility zone around the vehicle may lead to a larger probability value with regard to the object perceptibility. Especially under corresponding weather conditions and/or at a corresponding time of day, objects which emit light themselves can be perceived more easily than others; for this reason they are assigned a larger probability value. Moreover, additionally or instead, it may be provided that a second class is formed in which objects without a light source of their own, but with a high-contrast colour difference from the object surroundings, are grouped. If such an object is thus designed, at least in parts, to have a colour which is clearly different from and in stark contrast to a colour of the immediate surroundings of this object, here, too, a larger probability value can be taken as a basis with regard to the object recognition.
Additionally or instead, it is envisaged in particular that a third class is formed, in which objects without a light source of their own and with a low-contrast colour difference from the object surroundings are grouped. Such objects are, in comparison with the other classes, much harder to detect, so that they are assigned a lower probability value.
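As a minimal sketch of how the three contrast classes just described could enter the determination of a probability value, the following grouping applies a class-dependent factor to a zone's base probability; the concrete factors are illustrative assumptions, not figures from the application.

```python
from enum import Enum

class ContrastClass(Enum):
    # Grouping by optical contrast property, as described above.
    SELF_LUMINOUS = 1  # first class: objects with a light source of their own
    HIGH_CONTRAST = 2  # second class: no own light source, high-contrast colour difference
    LOW_CONTRAST = 3   # third class: no own light source, low-contrast colour difference

# Illustrative multipliers applied to a zone's base probability value.
CLASS_FACTOR = {
    ContrastClass.SELF_LUMINOUS: 1.0,
    ContrastClass.HIGH_CONTRAST: 0.8,
    ContrastClass.LOW_CONTRAST: 0.5,
}

def class_adjusted_probability(base_probability: float, cls: ContrastClass) -> float:
    return min(1.0, base_probability * CLASS_FACTOR[cls])

print(class_adjusted_probability(0.9, ContrastClass.LOW_CONTRAST))  # 0.45
```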
Due to the plurality of different pieces of information, already named above, that can form the basis for determining the probability value, a plurality of determination scenarios result, which can be of more or less complex design.
Preferably, it is envisaged that a probability value of a perceptibility zone and/or the number of probability values for a single perceptibility zone is/are determined in dependence on the location of the perceptibility zone relative to the vehicle and/or the shape of the perceptibility zone. By means of this advantageous design, too, the precision and exactness of the statement on the possibility of an object detection in the vehicle surroundings can be enhanced in a highly individual way. For instance, it can be provided that for a perceptibility zone at least two probability values are taken as a basis, in order to further subdivide this possibly critical perceptibility zone with regard to the object perception. In this connection a subdivision into discrete, graded probability values can be performed. Equally, though, it may be envisaged that the probability value changes relatively continuously across the width and/or length of a perceptibility zone. Additionally or instead, it may also be provided that the number of probability values for a perceptibility zone depends on the already named driver information and/or surroundings information and/or vehicle information.
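One conceivable data structure for such zones is sketched below, under the assumption that a zone carries either a single probability value or several graded values; the field names are hypothetical and chosen only for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class PerceptibilityZone:
    # One zone of the object perception map. A zone may carry a single
    # probability value or several graded values, e.g. one per distance band.
    zone_id: int
    location: str                    # e.g. "front-centre", "rear-left"
    shape: str                       # e.g. "trapezoid", "sector"
    probabilities: list[float] = field(default_factory=list)  # values in [0, 1]

# A front zone subdivided into discrete distance bands versus a lateral zone
# with a single value -- all figures purely illustrative:
front = PerceptibilityZone(5, "front-centre", "trapezoid", [0.9, 0.6, 0.3, 0.1])
side = PerceptibilityZone(8, "left", "sector", [0.4])
```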
Preferably, it is provided that the object perception map is created on the basis of sensor information from sensors of the vehicle and/or vehicle-external sensors, and that the object perception map is made available to a driver assistance system or a driver assistance device. With such a procedure, the created object perception map is thus not made available to the vehicle driver himself, but merely to an electronic system of the vehicle. Thereby information on the actual real perceptibility or the visibility conditions can be provided, in particular to an ADAS system of the vehicle. In this object perception map the information captured by the individual sensor types is thus rendered more precise. The information capturable more or less exactly by the individual sensors can thereby be rendered more precise, or readjusted in a certain sense, so that their functionality with regard to the object detection, and the operations resulting therefrom, can be better adapted to the individual requirements and the situation. Thus, supportive information from the object perception map can be taken as a basis for individual sensor types which, under specific conditions, for instance particular weather conditions, can only detect up to a certain degree of precision; this information can then be improved by the object perception map. Other sensors have a restricted detection precision in heavy rain, while still others have blind angle fields in which they cannot detect at all, or only to a very limited extent. By such an object perception map the function of individual sensors, in particular the information captured by them, can thus be rendered more precise, and for instance the emission of a warning to the vehicle driver and/or the intervention into the driving behaviour of the vehicle can be effected in a way better adapted to the requirements. Unnecessary optical and/or acoustic warnings and/or undesired vehicle interventions can thereby be minimized, in particular prevented.
By means of such an object perception map, sensor information can thus be rendered more precise, or additional information which cannot be captured and determined by the sensors can be created, which can then be used as a basis in the electronic system, in particular a driver assistance device. With regard to the use of such an object perception map as a basis for electronic systems of a vehicle, further information, for instance about other vehicles in road traffic and/or navigation data, can also be considered.
Moreover, it can also be envisaged that the object perception map is created on the basis of sensor information from sensors of the vehicle and/or vehicle-external sensors, and that in dependency on the object perception map at least one driver probability value for an object perception by the driver is determined. Thus a corresponding correction with regard to the object perception is effected through the sensors, on the one hand, and through the person, on the other hand. On the basis of the information captured by the sensors the object perception map is thus created, and an adaptation to the object perception, as it occurs through a person, in particular the driver, is performed. In this connection, preferably behavioural patterns of the driver and/or his position are considered. With regard to the behaviour of the driver, in particular his viewing direction and/or his movements, in particular those of his head and/or eyes, are taken into account.
Preferably, a driver probability value for an object perception is calculated and/or estimated in dependency on the object perception map.
The calculation and/or estimation may in particular be based upon the above-named parameters relating to the driver, possibly including characteristics of the human body such as the eye sensitivity.
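A minimal sketch of such a calculation follows, assuming the driver probability value is obtained by de-rating the zone probability from the object perception map with driver-related factors (viewing direction, tiredness, age); all factor values are assumptions for illustration.

```python
def driver_probability(zone_probability: float,
                       gaze_on_zone: bool,
                       tiredness: float,   # 0 = fully alert .. 1 = exhausted
                       age_factor: float   # e.g. 1.0 young .. 0.7 elderly
                       ) -> float:
    # Simple transfer function between the sensor-based zone probability
    # and the human perception; the factors are illustrative assumptions.
    gaze_factor = 1.0 if gaze_on_zone else 0.3
    alertness = 1.0 - 0.5 * tiredness
    value = zone_probability * gaze_factor * alertness * age_factor
    return max(0.0, min(1.0, value))

# The sensor map indicates 0.9, but the driver looks elsewhere and is tired:
print(round(driver_probability(0.9, gaze_on_zone=False,
                               tiredness=0.6, age_factor=0.9), 2))  # ~0.17
```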
Preferably, it is envisaged that in dependency on the driver probability value the driver is warned and/or an intervention into the driving behaviour of the vehicle is performed. The warning in this set-up may be specified to the effect that it is only effected if the vehicle driver cannot detect an object and its position within the vehicle surroundings. This may, for instance, equally be the case with regard to the position of the object relative to the viewing direction of the driver. Due to the visibility conditions this may even be the case if the driver does look in the direction of the object, but cannot see it because of darkness or other conditions. It may also be the case if the driver cannot detect an object due to a sudden change in the visibility conditions, as occurs with sharp light/dark boundaries, for example when the vehicle enters a tunnel or the shaded area of a tree or the like. An intervention into the driving behaviour of the vehicle can take many different forms, possibly by directly interacting with some vehicle equipment. In the latter case an activation or adaptation of visibility systems (lighting, wiping) or of specific ADAS applications or functionalities (braking, speed limitation, steering control, ...) can be performed in dependency on the driver probability value.

Preferably, it is envisaged that at least some of the information captured by means of sensors, which is used as a basis for the object perception map, is redundantly captured by various sensor types. Since, as has already been set out, different sensor types detect particularly well under specific ambient conditions, and less well and precisely under others, such redundant information capture by means of different sensor types can counteract this and render a very exact information basis for the determination of the object perception map.
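How such redundant capture could be fused into one probability value is sketched below: each sensor's detection probability is first de-rated by a weather-dependent reliability factor, and the results are then combined in a noisy-OR fashion, so that a sensor type that works well under the current conditions compensates for one that does not. The reliability figures are illustrative assumptions.

```python
def fused_detection_probability(sensor_probs: dict, weather: str) -> float:
    # Noisy-OR fusion: the object is missed only if every sensor misses it.
    # Per-sensor reliabilities under given weather are illustrative assumptions.
    reliability = {
        ("camera", "fog"): 0.3, ("camera", "clear"): 0.95,
        ("radar", "fog"): 0.9,  ("radar", "clear"): 0.9,
        ("ultrasonic", "fog"): 0.8, ("ultrasonic", "clear"): 0.8,
    }
    p_miss_all = 1.0
    for sensor, p in sensor_probs.items():
        p_miss_all *= 1.0 - p * reliability.get((sensor, weather), 0.5)
    return 1.0 - p_miss_all

# In fog the radar largely compensates for the camera's reduced precision:
print(round(fused_detection_probability({"camera": 0.9, "radar": 0.8}, "fog"), 2))  # ~0.8
```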
Different sensor types may be the initially named examples, which may be designed as ultrasonic sensors, radar sensors, infrared sensors, laser scanners, cameras, and the like. With regard to the design of the camera, it may be sensitive within the spectral range visible to human beings.
Preferably, it is envisaged that the object perception map is created as a two-dimensional or as a three-dimensional map.
In particular, it is to be stressed that the creation of the object perception map is performed in real time. Thereby an extremely up-to-date object perception map is provided, which is specific to individual demands and can be permanently updated.
Preferably, it is envisaged that a safety map is created for driving the vehicle in consideration of the object perception map, wherein for the creation of the safety map information about other vehicles in the surroundings of the vehicle and/or about critical driving behaviour, for which the safety map is determined, is considered. On the basis of the safety map, critical driving situations for the vehicle are then detected. The object perception map thus serves as a basis for a further surroundings map, namely the safety map, which then allows statements to be made with regard to aspects in the individual zones of the map that might lead to critical situations with objects, other vehicles, or driver behaviour as the vehicle moves on. This, too, can be determined in dependency on the driving direction of the vehicle for which the safety map is determined, on the velocity of this vehicle, or on other vehicle information of this vehicle. Navigation data can also be used as a basis in this connection. Equally, information exchanged by a car-to-car communication between two vehicles may be considered.
Through the safety map, safety probability values can also be determined, on the basis of which a statement about possible collisions or the like in the corresponding zone can be made. This may in turn be effected in dependency on the named driver information and/or surroundings information and/or road information. Here, too, information created through the safety map can be directly provided to an electronic system of the vehicle to interact with vehicle equipment, in particular a driver assistance device, or to the vehicle driver. In this regard the information can also be displayed as an image on a display unit. Other display modes can be considered, such as a front vehicle vision in the forward direction. Equally, additionally or instead, acoustic and/or optical warnings may be given off.
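As an illustration only, a safety probability value for one zone could be derived from the zone's perception probability together with traffic information and the remaining time until the vehicle reaches the zone, along the following lines; the weights and the threshold are assumptions, not values from the application.

```python
def safety_value(perception_probability: float,
                 other_vehicle_nearby: bool,
                 time_to_reach_zone_s: float) -> float:
    # Low perceptibility, nearby traffic and little remaining time all
    # lower the safety value of a zone; the figures are illustrative.
    traffic_factor = 0.6 if other_vehicle_nearby else 1.0
    time_factor = min(1.0, time_to_reach_zone_s / 10.0)
    return perception_probability * traffic_factor * time_factor

# A poorly perceptible zone reached in 2 s, with another vehicle nearby:
if safety_value(0.3, other_vehicle_nearby=True, time_to_reach_zone_s=2.0) < 0.2:
    print("critical zone -- warn the driver and/or consider an intervention")
```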
In all embodiments it can be provided that the object perception map and/or the safety map are displayed on a display unit, in particular a screen, in the vehicle. Preferably, the object perception map and/or the safety map are displayed as a plan view of the vehicle and the vehicle surroundings.
An essential idea of the invention consequently consists in a method in which vehicle surroundings information, such as road information, optical contrast information of the object, and atmospheric conditions, in particular weather conditions, is combined with driver information, such as the driver's behaviour and/or the driver's characteristics, in order to create a perceptibility map on the basis of probability values of an object perception in perceptibility zones. Such an object perception map can then serve as a basis for the most varied applications. For example, it can be used as the basis for an adaptive activation or non-activation of a warning to the vehicle driver with regard to a critical situation. Through the object perception map the occurrence of unnecessary alarms can be reduced. The object perception map can, however, also form the basis for the work of further electronic devices of the vehicle, in particular driver assistance devices. In particular it can serve as a basis for ADAS applications or functionalities.
Preferably, the relevant object perception maps are created by the system, with these maps being created individually and as required for the respective situation. With regard to creating the maps as required, in particular the situation to be assessed is used as a basis.
As ADAS information, for instance, all information of the vehicle or of the systems incorporated in the vehicle is collected. In this connection raw data are equally collected, in order to receive information relevant for the perceptibility determination. In this regard, for example, stationary objects such as tunnels, traffic signs, trees, road markings, rails of a train that at least partly extend within the road, and the like can also be captured. This information can be obtained, for example, via sensors incorporated in the vehicle and/or via navigation information and/or via car-to-car communication.
Equally, road information and/or surroundings information can be obtained from vehicle-internal sensors and/or via vehicle-external information sources, such as car-to-car communication or navigation systems.
Further, the invention relates to a driver assistance device which is designed for performing a method according to the invention or an advantageous embodiment thereof. The driver assistance device comprises at least one evaluation unit which, on the basis of information obtained, for example, from vehicle-internal sensors and/or vehicle-external sensors, creates the object perception map and/or the safety map. The driver assistance device can moreover have a display unit on which the object perception map and/or the safety map is displayable. The driver assistance device can be designed merely for giving off an acoustic and/or optical warning. Additionally or instead, however, it can be designed for at least semi-autonomous intervention into the driving behaviour of the vehicle. With regard to its functionality, the driver assistance device can for example be designed as a parking assistance system, a distance keeping system, a lane keeping system, a lane departure warning system, a blind angle detection system, a night vision system, etc.
Moreover, the invention relates to a vehicle with a driver assistance device according to the invention.
Further features of the invention derive from the claims, the figures, and the description of the figures. The features and feature combinations previously mentioned in the description and the features and feature combinations named in the following in the description of the figures and/or the figures alone, are applicable not only in the respectively indicated combination, but also in any other combination or taken alone, without departing from the scope of the invention.
Embodiments of the invention in the following are set out in more detail by referring to schematic drawings. These show in:
Fig. 1 an embodiment of a created object perception map;
Fig. 2 a further embodiment of an object perception map;
Fig. 3 a block diagram showing the method sequence of creating a safety map;
Fig. 4 a sketched representation of a work scenario of a driver assistance device for which an object perception map is used as a basis;
Fig. 5 the scenario according to Fig. 4 with an ambient condition that is different to that in the representation according to Fig. 4, in particular with regard to the visibility conditions;
Fig. 6 a further schematic scenario of the mode of operation of a driver assistance device for which an object perception map is used as a basis;
Fig. 7 the scenario according to Fig. 6 with an ambient condition which is different from that in Fig. 6, in particular with different visibility conditions.
In the figures equal elements or elements of equal function are equipped with the same reference signs.
In Fig. 1 an embodiment of an object perception map 3 is shown, as it is used in the present invention by storing corresponding information in a memory of the system, to be made available, possibly in real time, for different possible applications. This object perception map can be adapted in real time to the current driving situation. Some of this information may be directly represented on a display unit of a driver assistance device of a vehicle. The object perception map 3 is shown as a plan view of a vehicle 2 and the vehicle surroundings area. As can be seen, the vehicle surroundings area is shown in a vehicle surroundings map 4, which in the embodiment is, at least in parts, completely characterized by the object perception map 3.
The object perception map 3 is subdivided into a plurality of contiguous perceptibility zones 5 to 17. For each of the perceptibility zones 5 to 17 moreover also at least one probability value for the perceptibility of an object in the respective perceptibility zone 5 to 17 is indicated and displayed. In the embodiment shown in Fig. 1 for each
perceptibility zone 5 to 17 only one probability value is indicated. Each probability value is given as a percentage. It describes the probability with which an object can be perceived in the respective area, i.e. in the respective perceptibility zone 5 to 17. Such a probability value is determined in particular in dependency on surroundings information and/or vehicle information and/or driver information of the vehicle 2, which is symbolized as an image in the object perception map 3. As surroundings information, for instance, weather conditions and/or road information and/or traffic information in the area of the vehicle's position are considered. This may in particular concern the surroundings area of the vehicle, which area may vary in size and shape. According to the representation in Fig. 1 it is provided that those perceptibility zones formed in front of the vehicle 2 are, with regard to their extension away from the respectively facing vehicle side, larger than the perceptibility zones arranged laterally to the vehicle 2. This applies in particular if the vehicle 2 moves forward.
In this connection it can be seen that the probability value for an object detection in the perceptibility zone 5, which is formed centrally in front of the vehicle 2, amounts to 90%. Fig. 1 depicts a situation which allows free and clear vision, for instance sunshine.
With regard to the calculation of the probability values as road information for instance the state of the road surface and/or the course of the road and/or the dimensions of the road and/or the kind of road are considered. As vehicle information for instance the velocity and/or the vehicle dimensions and/or the vehicle all-round vision design and/or the driving direction of the vehicle is/are considered.
In particular, driver information can also be taken into account already for the creation of the object perception map 3, wherein as driver information, for example, the driving behaviour of the driver and/or the age of the driver and/or the agility state of the driver and/or the present viewing direction of the driver are considered.
For the probability value with which an object is detected within one of the formed perceptibility zones 5 to 17, a grouping of the objects into classes can be performed. In the embodiment it is envisaged that at least three classes are formed, which are defined in dependency on the shape and/or the size and/or the optical contrast properties of the respective objects. With regard to the optical contrast properties classifications are performed to the effect that a first class is provided, in which objects having a light source of their own are grouped. In a second class those objects are grouped which lack a light source of their own, but have a high-contrast colour difference from the object surroundings. In the third class those objects are grouped which have no light source of their own and have a low-contrast colour difference from the object surroundings.
It can be seen that an object detection in the perceptibility zones 6 and 16, which extend in front of the vehicle 2, is not possible for the vehicle driver. This is the case because the A-pillars of the vehicle 2 are positioned in this field of vision, whereby the driver's field of vision is obstructed.
Also in the immediate vicinity of the vehicle, in the perceptibility zone 17, a probability of 0% is indicated, since here the driver's view, in particular down towards the road, is obstructed by the body of the vehicle 2. In particular, small objects thus cannot be detected in this perceptibility zone 17. A probability of 0% is typical when considering the driver's perception on the basis of human characteristics. A different probability could be set for that zone 17 if detectors such as ultrasonic and/or radar and/or other sensors are available at suitable places on the vehicle.
Moreover, it can be seen that the further perceptibility zones are also predetermined with clearly reduced probability values for the object detection in comparison with the perceptibility zone 5. Since a vehicle driver commonly keeps his viewing direction to the front during the forward movement of a vehicle 2, the perceptibility in these lateral and rear perceptibility zones is clearly reduced. Even though external mirrors, and within the vehicle 2 an interior mirror facilitating rear vision, may be arranged here, the object perception nevertheless is clearly reduced. For the determination of the object perception map 3 in Fig. 1, an object of class 1 with regard to its optical contrast properties is used as a basis. In particular, dry road conditions are also assumed.
The number, design, and colour of the perceptibility zones 5 to 17 are merely examples. Equally, the probability values are merely exemplary. They are merely meant to make clear how such an object perception map 3 may look and which information it may provide.
On the basis of this object perception map 3, due to the highly differentiated and detailed subdivision of the vehicle surroundings into these perceptibility zones 5 to 17, and additionally the specified probability values, very precise scenarios for the mode of operation of driver assistance devices can be rendered. On the basis of this object perception map 3, which can then, for example, be used as a basis exclusively by a driver assistance device, information detected by sensors can be rendered more precise or checked for plausibility. Equally, on the basis of the information of the object perception map, in particular the probability values, operation scenarios of the driver assistance device can be improved. In particular, acoustic and/or optical warnings to the driver can be enhanced here. An unnecessary, incorrect, or too frequent emission of such warnings can be avoided. Thereby the safety in driving a vehicle can also be clearly raised, as the driver is not unnecessarily distracted or even startled.
The shape and/or size of a perceptibility zone 5 to 17 can be predetermined individually, in particular in dependency on its position within the vehicle surroundings relative to the vehicle 2. This, too, can for example be performed on the basis of further information about the vehicle 2 and/or the vehicle surroundings. A probability value of a perceptibility zone 5 to 17 and/or the number of probability values for a perceptibility zone 5 to 17 can thus be determined in particular in dependency on the position of the respective perceptibility zone 5 to 17 relative to the vehicle 2 and/or the shape of the perceptibility zone 5 to 17. In this way, too, the precision with regard to the probability of the object detection in a zone can be determined, and in particular the resulting operating mode of a driver assistance device enhanced.
In particular the object perception map 3 is created on the basis of sensor information from vehicle-internal and/or vehicle-external sensors and the object perception map is provided for an already mentioned driver assistance device.
In particular, it is envisaged that the object perception map 3 is created on the basis of this sensor information, and that in dependency on the object perception map 3 at least one driver probability value for an object perception by the driver himself is determined. The driver probability value can be calculated and/or estimated on the basis of the object perception map 3. This is advantageous since, due to the human perception characteristics (eye sensitivity curve) and/or the position of the driver within the vehicle, in particular of the driver's eyes, perception scenarios occur which differ from the purely electronic perception through the sensors, so that in this regard a corresponding combination or correction of the probability values takes place. Thus a quasi-intelligent system can be provided which, whilst based on electronic components, is oriented towards the driver and his human object perception and can respond accordingly. In an alternative, the human characteristics alone, adapted to a specific setting, i.e. the driver within the vehicle, may be sufficient to achieve the goal of the present invention, i.e. creating the vehicle surroundings map. In this regard, namely on the basis of the driver perception values, operation scenarios of at least one driver assistance device may be rendered that differ from operation scenarios merely based on the probability values indicated in the object perception map 3.
Moreover, it may be envisaged that at least some information captured by means of sensors and taken as a basis for the determination of the object perception map 3 is redundantly captured through various sensor types of the vehicle 2. Since various sensor types may vary in the precision of the information capture depending on the ambient conditions, in particular the weather conditions, the precision of the detected information can be improved through this redundant information capture. On this basis, the determination of the perceptibility zones 5 to 17, and in particular of the probability values used as a basis, can also be rendered clearly more precise.
In the embodiment according to Fig. 1 the object perception map 3 can be designed as a plan view and as a two-dimensional map. However, it can also be created as a three-dimensional map.
In particular it is to be mentioned that the creation of the object perception map 3 is effected in real time. Some parts of it, however, can be recorded in a previous step or possibly predefined, such as the vehicle body and the characteristics of the human vision model.
In Fig. 2 the object perception map 3 is shown, unlike the design in Fig. 1, for a different surroundings information with regard to a different weather scenario. In Fig. 2 the object perception map 3 is shown for bad visibility, in particular fog. With the vehicle 2 moving forward and otherwise identical conditions in comparison with the scenario representation in Fig. 1, clearly different probability values for an object detection in the respective perceptibility zones 5 to 17 result. As can be seen from the representations in Fig. 1 and Fig. 2, an exemplary scenario differing only in terms of the weather conditions is given, which is reflected in the embodiment in that the number and shape as well as the position of the perceptibility zones 5 to 17 remain the same.
Besides a different determination of the probability values in the respective perceptibility zones 5 to 17, it can moreover be seen in Fig. 2 that, for instance, in the perceptibility zone 5 a plurality of probability values is determined. As can be seen, the probability value of an object detection decreases with increasing distance, possibly in discrete steps, but it could also decrease in a continuous way. In the embodiment, four subzones are thus formed within the perceptibility zone 5, the probability values of which decrease from 90% to 10%. The determination of the number of subzones can be performed in real time, as in the transition from the situation of Fig. 1 to that of Fig. 2. Indeed, the number of subzones may be adapted in real time to the environment and/or the human characteristics of the driver.
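The decrease of the probability values with distance in fog can be illustrated with Koschmieder's law, under which contrast decays exponentially with distance and the extinction coefficient follows from the meteorological visibility (k ≈ 3/V for a 5 % threshold contrast). Mapping the residual contrast directly onto a probability value, and the concrete figures, are assumptions for illustration.

```python
import math

def fog_perception_profile(max_distance_m: float,
                           meteorological_visibility_m: float,
                           n_subzones: int = 4) -> list:
    # Graded probability values for the subzones of a front zone in fog.
    # Koschmieder: residual contrast ~ exp(-k * d) with k = 3 / visibility.
    k = 3.0 / meteorological_visibility_m
    values = []
    for i in range(n_subzones):
        d = (i + 0.5) * max_distance_m / n_subzones  # subzone centre distance
        values.append(round(0.9 * math.exp(-k * d), 2))
    return values

# Visibility of 80 m, zone reaching 60 m ahead -- the values drop with distance:
print(fog_perception_profile(60.0, 80.0))  # [0.68, 0.39, 0.22, 0.13]
```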
In Fig. 3 the procedure for determining a safety map 18 from an object perception map 3 is shown in a simplified block diagram. Starting from an object perception map 3, as shown for example in Fig. 1 or Fig. 2, further traffic information, such as information about other vehicles in the vicinity of the vehicle 2, is additionally considered. This may, for example, be based among other things on navigation data and/or car-to-car communication information. This information is captured and evaluated according to the shown block 19, then combined in an evaluation unit of a driver assistance device of the vehicle 2, and a safety map 18 is created therefrom. The combination of the various pieces of information is characterized in an exemplary way in Fig. 3 by block 20, which also symbolizes the evaluation unit.
By means of the safety map 18 critical traffic situations for the vehicle 2 can be detected early and a corresponding hint be given to the driver and/or an intervention into the driving behaviour of the vehicle be performed, so that the driving of the vehicle is improved with regard to safety aspects.
It can also be envisaged that the object perception maps 3 shown in Figs. 1 and 2 are marked in their perceptibility zones 5 to 17 by specific colourings when shown to the driver. Thereby especially critical zones in particular can be marked by specific colourings, so that these can be identified intuitively and quickly by the driver when the information is shown to him. In this way the perceptibility zone 5 can, for example, be coloured green, which means that a high probability of object detection is given for the driver; the driver has sufficient time to respond to the situation himself. Moreover, it may be envisaged that the perceptibility zones 6, 8, 10, 12, 14, and 16, as well as 17, are coloured red and thereby characterize particularly critical zones with minimal or no perceptibility probability. With regard to an object detection in these zones the driver has no way of reacting. In the remaining perceptibility zones 7, 9, 11, 13, and 15 the object perceptibility for a user is clearly reduced; nevertheless there is still sufficient time left for the vehicle driver to react.
Such a set-up and design is particularly advantageous for the design of a safety map 18. The colouring and the evaluation with regard to the detectability, and the possibility for the driver to react, therefore serve in particular as a basis for the safety map 18. This is because, on the basis of these probability evaluations with regard to the possibilities for a driver to react, in connection with the perceptibility probabilities, traffic situations can be grouped into critical and less critical states.
Fig. 4 shows, in a schematic representation, a traffic scenario with bad visibility conditions, wherein the vehicle 2 moves from left to right according to an arrow. On the basis of the object perception map or a safety map, information is transmitted to the vehicle driver. In the representation according to Fig. 4, Tc denotes the period of time until collision with a pedestrian 21. If, according to the first scenario, the pedestrian 21 is in the area I and the period of time D0 until collision is < 2 s, the driver assistance device detects on the basis of the object perception map or the safety map that there is no time left for a reaction; a warning is immediately given off and possibly an intervention into the driving behaviour of the vehicle 2 is performed. If the pedestrian 21 is in the area II, the period of time D1 until collision of the vehicle 2 with the pedestrian 21 amounts to between 2 s and 10 s. Thus sufficient time for a reaction is given. However, due to the bad visibility conditions an alarm or a warning is given off and/or an intervention into the driving behaviour of the vehicle is performed.
If the pedestrian 21 is in the area III and thus still at a far distance from the vehicle 2, it is detected that there remains time for a reaction despite the bad visibility conditions. As the pedestrian 21 is thus outside of the area of interest for the function of the driver assistance device, no warning is given off.
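The behaviour described for Figs. 4 and 5 can be condensed into a simple time-to-collision rule, sketched below; the thresholds of 2 s and 10 s are taken from the figure description, while everything else is an illustrative assumption rather than the claimed method itself.

```python
def warning_decision(distance_m: float, closing_speed_ms: float,
                     visibility_good: bool) -> str:
    # Area I (TTC < 2 s) always triggers, area II (2-10 s) triggers only
    # under bad visibility, area III (> 10 s) never triggers.
    if closing_speed_ms <= 0.0:
        return "no warning"  # vehicle is not approaching the pedestrian
    ttc_s = distance_m / closing_speed_ms
    if ttc_s < 2.0:
        return "immediate warning / intervention"              # area I
    if ttc_s <= 10.0:
        return "no warning" if visibility_good else "warning"  # area II
    return "no warning"                                        # area III

print(warning_decision(40.0, 10.0, visibility_good=False))  # 4 s in fog -> warning
print(warning_decision(40.0, 10.0, visibility_good=True))   # 4 s, clear -> no warning
```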
In Fig. 5 a schematic scenario on the basis of the representation in Fig. 4 is shown, with much better visibility conditions than in Fig. 4. Unlike in the representation according to Fig. 4, here a warning is only given off by the driver assistance device if the pedestrian 21 is in the area I. Due to the good visibility conditions, no warning is given off and no intervention into the driving behaviour of the vehicle is performed if the pedestrian 21 is in the area II or III. The driver of the vehicle can detect the pedestrian 21 in the areas II and III early and can react accordingly himself.
Fig. 6 shows a schematic night-time scenario, in which the vehicle 2 moves in the direction of the arrow from left to right. In this scenario it is envisaged that the vehicle 2, in the position shown in Fig. 6, is in an area illuminated for instance by a street light. In area I, by contrast, the brightness is already clearly reduced and decreases further in the direction of area II. In area III, in particular, it is then virtually dark. In this scenario there is thus a certain bright/dark transition between the area in which the vehicle 2 is located and the area I, by which the driver of the vehicle 2 is dazzled. The perceptibility of the pedestrian 21 in the area I is therefore limited, and no reasonable object perception is possible. Sufficient time for a reaction consequently is not given, and thus a warning is given off by the driver assistance device to the driver of the vehicle 2 and/or an intervention into the driving behaviour of the vehicle 2 is performed by the driver assistance device. The same happens if the pedestrian 21, according to the representation in Fig. 6, is in the area II. If the pedestrian is in the area III, there is, on the one hand, sufficient time for a reaction of the driver of the vehicle 2; on the other hand, the distance from the vehicle 2 is large enough for this area III to lie outside of the area of interest with regard to the function of the driver assistance device, so that in particular no warning is given off either.
Fig. 7 shows a scenario analogous to that of Fig. 6, wherein, unlike in Fig. 6, the vehicle 2 is not in a brightly illuminated area at dark night time, but in an area to the left of area I which is relatively dark. An abrupt bright/dark transition is thus not given, and the eyes of the driver have already adapted to the dark visibility conditions. In this constellation, in analogy to the explanations concerning Fig. 6, due to the short distance of the pedestrian 21 from the vehicle 2, a warning is nevertheless given off to the vehicle driver here too and/or an intervention into the driving behaviour of the vehicle 2 is performed.
If the pedestrian 21 is in the area II, there is, due to the eyes of the driver having already adapted to the darkness, sufficient time for detecting the pedestrian 21 and for reacting with a corresponding driving behaviour. Consequently, in the scenario shown in Fig. 7, no warning is given off by the driver assistance device for a position of the pedestrian 21 in the area II when the human characteristics of the driver (the time required for the human eyes to adapt to darkness) are considered.
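The consideration of the adaptation time of the human eye, which distinguishes the scenarios of Figs. 6 and 7, could for instance be modelled as a recovery factor applied to the perception probability after an abrupt bright/dark transition; the exponential form and the time constant below are assumptions, not measured data.

```python
import math

def dark_adaptation_factor(seconds_since_light_drop: float,
                           adaptation_time_s: float = 30.0) -> float:
    # 0 immediately after leaving a street-lit area, approaching 1 once
    # the driver's eyes have adapted to the darkness (illustrative model).
    return 1.0 - math.exp(-seconds_since_light_drop / adaptation_time_s)

# Fig. 6 (just after the transition) versus Fig. 7 (eyes already adapted):
print(round(dark_adaptation_factor(2.0), 2))    # ~0.06 -- perception strongly limited
print(round(dark_adaptation_factor(120.0), 2))  # ~0.98 -- close to fully adapted
```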
For each function of a driver assistance device, at least one individually established and adjusted object perception map, and possibly a safety map derived from it, can be created, especially in order to generate warnings about more or less critical traffic situations in a way better adapted to the requirements and the situation, and thus to avoid too frequent unnecessary warnings. The same can additionally or instead also be performed with regard to the interventions of a driver assistance device into the driving behaviour of the vehicle. For example, an automatic activation of fog lamps, as provided in some ADAS, could advantageously be correlated with the defined object perception map. In foggy night conditions with automatic high-beam/low-beam switching, this would permit the low-beam regime to be kept even without any vehicle in front, in order to account for the presence of fog and to avoid glare being reflected back at the driver.
Claims
1. A method for creating a vehicle surroundings map, in which surroundings
information is captured and considered for creating the vehicle surroundings map, characterized in that
the vehicle surroundings map at least in parts is created as object perception map (3), in which perceptibility zones (5 to 17) are created, and at least for one of the perceptibility zones (5 to 17) at least one probability value for the perceptibility of an object (21) in the perceptibility zone (5 to 17) is determined.
2. The method according to claim 1,
characterized in that
for each perceptibility zone (5 to 17) at least one probability value is determined.
3. The method according to claim 1 or 2,
characterized in that
a probability value in dependency on the surroundings information and/or vehicle information is determined.
4. The method according to claim 3,
characterized in that
as surroundings information weather conditions and/or road information and/or traffic information in the area of the vehicle position are considered.
5. The method according to claim 4,
characterized in that
as weather conditions visibility conditions and/or the time of day are considered.
6. The method according to claim 4 or 5,
characterized in that
as road information the state of the road surface and/or the road course and/or the road dimensions and/or the kind of road are considered.
7. The method according to one of claims 2 to 6,
characterized in that
as vehicle information the velocity and/or the dimensions of the vehicle and/or the all-round vision design of the vehicle and/or the driving direction of the vehicle (2) are considered.
8. The method according to any one of the preceding claims,
characterized in that
a probability value is determined in dependency on driver information.
9. The method according to claim 8,
characterized in that
as driver information the driving behaviour of the driver and/or the age of the driver and/or the agility state of the driver and/or the viewing direction of the driver are considered.
10. The method according to any one of the preceding claims,
characterized in that
objects (21) are grouped in classes, wherein a classification is performed depending on the shape and/or the size and/or the optical contrast property of an object (21).
11. The method according to claim 10,
characterized in that
with regard to the classification on the basis of the optical contrast property a first class is formed in which objects (21) with a light source of their own are grouped and/or a second class is formed in which objects (21) without light source of their own but with high-contrast colour difference from the object surroundings are grouped, and/or a third class is formed in which objects (21) without light source of their own and with low-contrast colour difference from the object surroundings are grouped.
12. The method according to any one of the preceding claims, characterized in that
a probability value of a perceptibility zone (5 to 17) and/or the number of probability values for a perceptibility zone (5 to 17) is determined at least in dependency on the location of the perceptibility zone (5 to 17) relative to the vehicle (2) and/or the shape of the perceptibility zone (5 to 17).
13. The method according to any one of the preceding claims,
characterized in that
the object perception map (3) is created on the basis of sensor information from sensors of the vehicle and/or vehicle-external sensors and the object perception map (3) is made available to a driver assistance device.
14. The method according to one of the preceding claims,
characterized in that
the object perception map (3) is created on the basis of sensor information from sensors of the vehicle (2) and/or vehicle-external sensors and in dependency on the object perception map (3) at least one driver probability value for an object perception is determined through the driver.
15. The method according to claim 14,
characterized in that
in dependency on the driver probability value the driver is warned and/or an intervention into the driving behaviour of the vehicle (2) is performed and/or a vehicle equipment activation is performed.
16. The method according to one of claims 13 to 15,
characterized in that
at least some of the information captured by means of the sensors upon which the determination of the object perception map (3) is based are redundantly captured by different sensor types.
17. The method according to one of the preceding claims,
characterized in that the object perception map (3) is created as two-dimensional or three-dimensional map.
18. The method according to one of the preceding claims,
characterized in that
the creation of the object perception map (3) is performed in real time.
19. The method according to one of the preceding claims,
characterized in that
a safety map (18) is created for driving the vehicle in consideration of the object perception map (3), wherein for the creation of the safety map (18) at least information about other vehicles in the surroundings of the vehicle (2), for which the safety map (18) is determined, is considered, and critical driving situations for the vehicle (2) are detected on the basis of the safety map (18).
20. A driver assistance device designed for performing a method according to one of the preceding claims.
21. A vehicle (2) with a driver assistance device according to claim 20.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2011/065507 WO2013034182A1 (en) | 2011-09-08 | 2011-09-08 | Method for creating a vehicle surroundings map, driver assistance device, and vehicle having a driver assistance device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2011/065507 WO2013034182A1 (en) | 2011-09-08 | 2011-09-08 | Method for creating a vehicle surroundings map, driver assistance device, and vehicle having a driver assistance device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013034182A1 true WO2013034182A1 (en) | 2013-03-14 |
Family
ID=44587826
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2011/065507 WO2013034182A1 (en) | 2011-09-08 | 2011-09-08 | Method for creating a vehicle surroundings map, driver assistance device, and vehicle having a driver assistance device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013034182A1 (en) |
Patent Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS4871402A (en) * | 1971-12-28 | 1973-09-27 | ||
WO1995025322A1 (en) * | 1994-03-15 | 1995-09-21 | Gallium Software Inc. | Blind spot detector |
JP2001108745A (en) * | 1999-10-13 | 2001-04-20 | Matsushita Electric Ind Co Ltd | On-vehicle radar unit |
US20030218919A1 (en) * | 2002-02-08 | 2003-11-27 | Omron Corporation | Distance measuring apparatus |
JP2004114931A (en) * | 2002-09-27 | 2004-04-15 | Nissan Motor Co Ltd | Apparatus for detecting looking aside |
US20050073136A1 (en) * | 2002-10-15 | 2005-04-07 | Volvo Technology Corporation | Method and arrangement for interpreting a subjects head and eye activity |
US20050063565A1 (en) * | 2003-09-01 | 2005-03-24 | Honda Motor Co., Ltd. | Vehicle environment monitoring device |
EP1681213A2 (en) * | 2005-01-17 | 2006-07-19 | Kabushiki Kaisha Toyota Chuo Kenkyusho | Collision behavior control apparatus |
US20070010937A1 (en) * | 2005-07-08 | 2007-01-11 | Denso Corporation | Road shape recognition apparatus |
EP1785326A1 (en) * | 2005-11-09 | 2007-05-16 | Nissan Motor Co., Ltd. | Vehicle driving assist |
JP2007318387A (en) * | 2006-05-25 | 2007-12-06 | Nissan Motor Co Ltd | Inter-vehicle communication device |
US20080288140A1 (en) * | 2007-01-11 | 2008-11-20 | Koji Matsuno | Vehicle Driving Assistance System |
JP2009120147A (en) * | 2007-11-19 | 2009-06-04 | Aisin Seiki Co Ltd | Vehicular lamp control system |
US20110054716A1 (en) | 2008-02-15 | 2011-03-03 | Continental Teves Ag & Co Hg | Vehicle system for navigation and/or driver assistance |
JP2009237776A (en) * | 2008-03-26 | 2009-10-15 | Mazda Motor Corp | Vehicle drive supporting apparatus |
US20100191433A1 (en) * | 2009-01-29 | 2010-07-29 | Valeo Vision | Method for monitoring the environment of an automatic vehicle |
US20100253539A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Vehicle-to-vehicle communicator on full-windshield head-up display |
DE102009022278A1 (en) | 2009-05-22 | 2010-01-21 | Daimler Ag | Obstruction-free area determining method for drive assistance system in vehicle, involves considering surface-measuring point on ground plane during formation of depth map such that evidence of obstruction-measuring points is reduced |
JP2011198247A (en) * | 2010-03-23 | 2011-10-06 | Toyota Motor Corp | Driving support device |
JP2011210098A (en) * | 2010-03-30 | 2011-10-20 | Toyota Motor Corp | Drive assistance system and drive assistance method |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019037907A1 (en) * | 2017-08-21 | 2019-02-28 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for determining a probability with which an object will be located in a field of view of a driver of a vehicle |
CN110891841A (en) * | 2017-08-21 | 2020-03-17 | 宝马股份公司 | Method and device for ascertaining the probability of an object being in the field of view of a vehicle driver |
US20200193629A1 (en) * | 2017-08-21 | 2020-06-18 | Bayerische Motoren Werke Aktiengesellschaft | Method and Device for Determining a Probability With Which an Object Will Be Located in a Field of View of a Driver of a Vehicle |
CN110891841B (en) * | 2017-08-21 | 2023-05-12 | 宝马股份公司 | Method and device for ascertaining the probability that an object is in the field of view of a vehicle driver |
US11935262B2 (en) | 2017-08-21 | 2024-03-19 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for determining a probability with which an object will be located in a field of view of a driver of a vehicle |
CN112009483A (en) * | 2019-05-30 | 2020-12-01 | 罗伯特·博世有限公司 | Redundancy information for object interfaces for highly and fully automated driving |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11338820B2 (en) | Vehicle automated driving system | |
US10595176B1 (en) | Virtual lane lines for connected vehicles | |
EP3216667B1 (en) | Control system for vehicle | |
JP6319349B2 (en) | Information presentation device | |
US10120378B2 (en) | Vehicle automated driving system | |
US10067506B2 (en) | Control device of vehicle | |
US9649936B2 (en) | In-vehicle device, control method of in-vehicle device, and computer-readable storage medium | |
US9589464B2 (en) | Vehicular headlight warning system | |
US20160046289A1 (en) | Method of Warning Road Users of Potential Danger Areas Caused by a Vehicle that is or Will be Performing a Maneuver | |
US20130058116A1 (en) | Method and device for changing a light emission of at least one headlight of a vehicle | |
US12037006B2 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
US10974642B2 (en) | Device for luminously signalling a change of lane for a motor vehicle | |
JP2019045901A (en) | Information presenting device | |
WO2016157892A1 (en) | Information presentation apparatus | |
JP5531919B2 (en) | Headlight control device | |
WO2013034182A1 (en) | Method for creating a vehicle surroundings map, driver assistance device, and vehicle having a driver assistance device | |
JP2021039554A (en) | Gazing guidance device for vehicle | |
CN112896117B (en) | Method and subsystem for controlling an autonomous braking system of a vehicle | |
WO2016157891A1 (en) | Information presentation apparatus | |
CN114523905A (en) | System and method for displaying detection and track prediction of targets around vehicle | |
US10767989B2 (en) | Method and device for detecting a light-emitting object at a traffic junction for a vehicle | |
WO2020167110A1 (en) | System and method based on the correlation of interior and exterior visual information in a vehicle for improving safety in manual or semi-automatic driving |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11754869; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 11754869; Country of ref document: EP; Kind code of ref document: A1 |