CN108240819B - Driving support device and driving support method - Google Patents
- Publication number: CN108240819B (application CN201711295856.6A)
- Authority: CN (China)
- Prior art keywords: driving, driver, unit, route, plan
- Legal status: Active (the legal status is an assumption and not a legal conclusion; Google has not performed a legal analysis)
Classifications
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
- B60W30/10—Path keeping
- B60W50/08—Interaction between the driver and the control system
- B60W50/082—Selecting or switching between different modes of propelling
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3484—Personalized, e.g. from learned user behaviour or user-defined profiles
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/146—Display means
- B60W2540/22—Psychological state; Stress level or workload
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of positioning data, e.g. GPS [Global Positioning System] data
Abstract
A driving assistance apparatus includes: a proficiency detection unit (11) that detects the driving proficiency of the driver; a biological information acquisition unit (20) that acquires biological information of the driver; an emotion estimation unit (12) that estimates the emotion of the driver from the biological information acquired by the biological information acquisition unit (20); a plan generation unit (13) that generates a driving plan on the basis of the driving proficiency detected by the proficiency detection unit (11) and the emotion of the driver estimated by the emotion estimation unit (12); and a plan providing unit (14) that provides the driving plan generated by the plan generation unit (13) to the driver.
Description
Technical Field
The present invention relates to a driving assistance device and a driving assistance method that generate and provide a driving plan corresponding to driving proficiency of a driver.
Background
A device that performs a route search reflecting the driving proficiency of the driver is known in the art. For example, in the device described in Japanese Patent Laid-Open Publication No. 2005-106475 (JP2005-106475A), a plurality of candidate routes from a departure point to a destination are provided (presented) to the driver, and when the driver's driving proficiency is low, the candidate route containing the fewest difficult driving sections is selected and provided to the driver.
However, the apparatus described in JP2005-106475A performs the route search in consideration of driving proficiency alone, regardless of the driver's feelings (emotions) such as enthusiasm for driving. The same candidate route is therefore provided whether the driver's enthusiasm for driving is high or low, and the driver's satisfaction with the provided candidate routes cannot be sufficiently improved.
Disclosure of Invention
A driving assistance device according to an aspect of the present invention includes: a proficiency detection unit that detects the driving proficiency of a driver of the vehicle; a biological information acquisition unit that acquires biological information of the driver; an emotion estimation unit that estimates the emotion of the driver from the biological information acquired by the biological information acquisition unit; a plan generation unit that generates a driving plan based on the driving proficiency detected by the proficiency detection unit and the emotion of the driver estimated by the emotion estimation unit; and a plan providing unit that provides (presents) the driving plan generated by the plan generation unit to the driver.
A driving assistance method as another aspect of the present invention includes the steps of: detecting the driving proficiency of a driver of the vehicle; acquiring biological information of the driver; estimating the emotion of the driver from the acquired biological information; generating a driving plan based on the detected driving proficiency and the estimated emotion; and providing the generated driving plan to the driver.
Drawings
The objects, features, and advantages of the present invention will become more apparent from the following description of the embodiments in conjunction with the accompanying drawings. In the drawings:
Fig. 1A is a block diagram showing a schematic configuration of a driving assistance device according to an embodiment of the present invention;
Fig. 1B is a block diagram showing the functional configuration of the ECU of Fig. 1A;
Fig. 2 is a diagram showing the relationship between driving proficiency and target routes used in the processing of the 1st route generation unit of Fig. 1B;
Fig. 3 is a diagram showing the relationship between driving proficiency and target routes used in the processing of the 2nd route generation unit of Fig. 1B;
Fig. 4 is a flowchart showing an example of processing executed by the ECU of Fig. 1A;
Fig. 5 is a diagram showing a modification of Figs. 1A and 1B.
Detailed Description
Embodiments of the present invention will be described below with reference to Figs. 1A to 5. The driving assistance device of the present invention generates various driving plans and provides them to the driver; a target route from the current position to a destination will be described here as an example of such a driving plan.
Fig. 1A is a block diagram showing a schematic configuration of a driving assistance device 100 according to an embodiment of the present invention. The driving assistance device 100 is configured to include a navigation device mounted on a vehicle, for example. As shown in fig. 1A, a biological information acquisition unit 20, a sensor group 30, a navigation device 40, and an actuator 50 are connected to an ECU (electronic control unit) 10 provided in a vehicle.
The biological information acquisition unit 20 acquires biological information of the driver, such as images of the driver's facial expression and the driver's voice, and includes a camera (in-vehicle camera) 21 and a microphone 22 mounted in the vehicle. The camera 21 has an imaging element such as a CCD sensor or a CMOS sensor and images the occupant's upper body, including the face; that is, the camera 21 captures the occupant's expression and posture. The microphone 22 acquires the voice uttered by the occupant. The voice signal from the microphone 22 is input to the ECU 10 as voice data, for example through an A/D converter, and voice recognition is performed by the ECU 10.
The sensor group 30 includes a plurality of detection devices that detect the conditions around the vehicle, such as an obstacle detector 31, a camera (vehicle-exterior camera) 32, and a vehicle state detector 33. The obstacle detector 31 is constituted by, for example, a millimeter-wave radar or a laser radar that transmits radio waves or light to the surroundings of the vehicle, detects an object (a person or thing) by receiving the radio waves or light reflected by it, and measures the distance to the object from the time required to receive the reflection. The camera 32 is, for example, a front camera provided at the front of the vehicle to capture an image of the area ahead. A rear camera provided at the rear of the vehicle to capture the area behind it, or a side camera provided on the side of the vehicle to capture the area beside it, may also be used as the camera 32. The camera 32 includes an imaging element such as a CCD sensor or a CMOS sensor.
The vehicle state detector 33 includes various detectors that detect information corresponding to the running state of the vehicle, such as a vehicle speed sensor, an acceleration sensor, a yaw rate sensor, and the like. The vehicle state detector 33 further includes various detectors that detect information corresponding to the driving operation of the driver, for example, an accelerator pedal sensor that detects the amount of depression of an accelerator pedal, a brake pedal sensor that detects the amount of depression of a brake pedal, a steering wheel sensor that detects the steering torque or steering angle of a steering wheel, and the like.
The navigation device 40 includes: a GPS signal receiving unit 41 that receives signals from GPS satellites and measures the current position of the vehicle; a map database 42 (storage section) in which map information is stored; a display unit 43 that is provided in front of the driver's seat and displays the position of the vehicle on the map; a speaker 44 that notifies the driver of various information by voice; an input unit 45 with which the driver inputs various commands; and an arithmetic unit 48 that performs various calculations. The arithmetic unit 48 has, as functional components, a route calculation unit 46 that calculates target routes and a route guidance unit 47 that guides the vehicle along the target route.
The map information stored in the map database 42 includes, in addition to information such as the position and shape of roads, information such as the position and shape of various facilities, for example parking lots. The information stored in the map database 42 may instead be stored in advance in the memory of the ECU 10 rather than in the navigation device 40, or may be acquired from outside the vehicle through a wireless or wired communication system. The display unit 43 may be a liquid crystal display or a touch panel; when it is a touch panel, the input unit 45 may be provided on the touch panel. Switches provided on the steering wheel may also serve as the input unit 45.
The route calculation unit 46 calculates a target route (for example, a plurality of target routes) from the current position to the target point based on the current position information of the vehicle measured by the GPS signal reception unit 41 and the map information of the map database 42. Then, the target route information is output to the ECU10 together with the current position information of the vehicle. The route guidance unit 47 displays the current position of the vehicle and the target route (final target route) specified by the ECU10 as described later on the display unit 43, and outputs the target route information from the speaker 44 in a voice form to perform route guidance.
The actuator 50 is a travel actuator for realizing travel operations including basic vehicle actions such as driving, turning, and stopping. The actuator 50 includes: a drive actuator 51 that generates travel driving force; a brake actuator 52 that generates braking force; and a steering actuator 53 that generates steering force. The actuator 50 is driven by control signals from the ECU 10 corresponding to the driver's operation of the accelerator pedal, steering wheel, brake pedal, and so on, that is, control signals output by the ECU 10 in accordance with the detection signals of the vehicle state detector 33.
When the vehicle is driven by an engine, the drive actuator 51 is constituted by, for example, a throttle actuator that controls the throttle opening. In an electric vehicle or a hybrid vehicle, the drive actuator 51 is constituted by, for example, a traveling motor. The brake actuator 52 is constituted by, for example, a hydraulic device that supplies hydraulic pressure to a brake disc. The steering actuator 53 is constituted by, for example, a steering motor that controls the steering torque of the electric power steering system.
The ECU 10 includes a computer having an arithmetic unit 1 such as a CPU, a storage unit 2 such as a ROM and a RAM, and other peripheral circuits. Fig. 1B is a diagram showing the functional configuration of the ECU 10 (mainly the arithmetic unit 1). As shown in Fig. 1B, the ECU 10 includes a proficiency detection unit 11, an emotion estimation unit 12, a route generation unit 13, a route providing unit 14, and a travel control unit 15.
The proficiency detection unit 11 detects the driving proficiency of the driver from the driver information stored in advance in the storage unit 2. The driver information includes driver identification information (the driver's name and the like) and information specific to each driver, such as the time elapsed since the driver's license was acquired, the driving frequency, the total travel distance of the vehicle, and the total driving time. The number and frequency of rapid acceleration, rapid braking, rapid steering, and the like, as well as the travel distance on each road type such as national roads, main local roads, and expressways, may also be stored in advance in the storage unit 2 and included in the driver information. The driving proficiency is classified into, for example, three levels, proficiency A to proficiency C. For example, a driver who has just obtained a driver's license is classified as proficiency A, and the classification changes to proficiency B and then proficiency C as driving proficiency increases.
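The patent only lists the kinds of driver information used; it does not give thresholds. As a minimal sketch, proficiency levels A to C could be derived from that stored information roughly as follows (the scoring scheme, thresholds, and parameter names are illustrative assumptions):

```python
def classify_proficiency(years_licensed, total_km, harsh_events_per_100km):
    """Map stored driver information to proficiency level A (novice),
    B (intermediate), or C (advanced). All thresholds are illustrative."""
    score = 0
    if years_licensed >= 3:           # time since license acquisition
        score += 1
    if total_km >= 30_000:            # total travel distance of the vehicle
        score += 1
    if harsh_events_per_100km < 0.5:  # rapid accel/brake/steer frequency
        score += 1
    return ("A", "B", "B", "C")[score]
```

A freshly licensed driver with little mileage lands in level A; a long-time driver with a low harsh-operation rate reaches level C.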
The emotion estimation unit 12 estimates the emotion of the driver (occupant emotion) from the driver's expression (mouth corners and the like) based on the image signal acquired by the camera 21. Specifically, using, for example, Plutchik's wheel of emotions, emotions are divided in advance into eight basic emotions (anticipation, joy, trust, fear, surprise, sadness, disgust, and anger) and applied emotions formed by combining two adjacent basic emotions. Images representing the features of each emotion are compared with the image from the camera 21, and the occupant's emotion is estimated according to which of these human emotion patterns it matches. Human emotion patterns other than Plutchik's wheel can also be applied. The occupant emotion is estimated in consideration not only of the image signal from the camera 21 but also of the voice signal from the microphone 22, that is, in consideration of the content, intonation, and prosody of the speech uttered by the occupant.
The emotion estimation unit 12 also converts the occupant emotion into a numerical value according to the matched human emotion pattern. For example, a positive value indicates a pleasant emotion (positive emotion) that is desirable for the occupant, such as joy or happiness, and a negative value indicates an unpleasant emotion (negative emotion) that is undesirable for the occupant, such as anger or disgust. In this case, the absolute value indicating the emotion increases as the emotion becomes stronger (that is, the further inward it lies on the emotion wheel). In this way, the degree of the occupant's emotion (pleasant or unpleasant) can be obtained. After numericizing the occupant emotion, the emotion estimation unit 12 determines whether the value is equal to or less than a predetermined value; this determines whether the driver has an unpleasant emotion. Alternatively, without numericizing the occupant emotion, the occupant may be determined to have an unpleasant emotion when an emotion of a predetermined pattern (for example, anger or disgust) is estimated.
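As a sketch of the numericizing step, each basic emotion can carry a signed valence that is scaled by its intensity on the emotion wheel; the specific valence mapping, the 1-to-3 intensity scale, and the threshold value below are illustrative assumptions, not values from the patent:

```python
# Signed valence for Plutchik's eight basic emotions (assumed mapping;
# surprise is treated as neutral here).
VALENCE = {
    "joy": +1, "trust": +1, "anticipation": +1, "surprise": 0,
    "sadness": -1, "fear": -1, "disgust": -1, "anger": -1,
}

def emotion_value(emotion, intensity):
    """intensity: 1 (outer ring, weak) to 3 (innermost ring, strong)."""
    return VALENCE[emotion] * intensity

def is_unpleasant(emotion, intensity, threshold=-1):
    # The patent checks whether the numericized emotion is at or below
    # a predetermined value; -1 is an assumed example threshold.
    return emotion_value(emotion, intensity) <= threshold
```

Strong anger thus scores lower (more negative) than mild annoyance, and crossing the threshold triggers the unpleasant-emotion branch.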
The route generation unit 13 includes a 1st route generation unit 13A and a 2nd route generation unit 13B. The 1st route generation unit 13A generates target routes (1st recommended routes) corresponding to the driving proficiency of the driver detected by the proficiency detection unit 11. That is, the 1st route generation unit 13A reads target route information including a plurality of target routes from the current position to the destination calculated by the route calculation unit 46, and generates the 1st recommended routes corresponding to the driving proficiency based on that information. Fig. 2 is a diagram showing an example of the relationship between driving proficiency and the 1st recommended routes. Routes A to C in the figure are different routes from the current position to the destination.
Routes A to C are target routes for beginner, intermediate, and advanced drivers, respectively, and are calculated by the route calculation unit 46 in consideration of road width, traffic volume, the number of left and right turns, speed limits, the history of past traffic accidents, and the like. For example, route A satisfies conditions such as a wide road, light traffic, few left and right turns, and infrequent lane changes, and is therefore suitable for a driving beginner. Route C, on the other hand, has a narrow road, heavy traffic, many left and right turns, frequent lane changes, and the like, and is therefore unsuitable for a driving beginner.
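The suitability criteria listed above (road width, traffic volume, turns, lane-change frequency, accident history) suggest a weighted per-route difficulty score. The weights and field names below are hypothetical; the patent does not specify how the route calculation unit 46 combines these factors:

```python
def route_difficulty(route):
    """Combine the factors the description mentions into a single
    difficulty number (higher = harder). Weights are assumptions."""
    return (20.0 / max(route["avg_road_width_m"], 1.0)  # narrow roads are harder
            + 0.5 * route["traffic_level"]              # e.g. 0 (empty) to 10 (jammed)
            + 1.0 * route["num_turns"]                  # left/right turns
            + 1.5 * route["lane_changes_per_km"]
            + 3.0 * route["accidents_per_year"])        # past accident history
```

Under such a score, a route like route A (wide, quiet, few turns) comes out far easier than a route like route C.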
As shown in Fig. 2, from among the plurality of target routes calculated by the route calculation unit 46, the 1st route generation unit 13A generates route A as the target route when the driving proficiency is A, routes A and B when the proficiency is B, and routes A to C when the proficiency is C. That is, the 1st route generation unit 13A generates one or more 1st recommended routes suitable for the driver in consideration of driving proficiency.
When the emotion estimation unit 12 determines that the driver has an unpleasant emotion, the 2nd route generation unit 13B generates target routes (2nd recommended routes) different from the 1st recommended routes from among the plurality of target routes calculated by the route calculation unit 46. That is, the target routes are generated in consideration not only of the driving proficiency detected by the proficiency detection unit 11 but also of the occupant emotion estimated by the emotion estimation unit 12. Alternatively, the route calculation unit 46 may calculate the target routes again, and the 2nd recommended routes may be generated from the newly read target route information. A 2nd recommended route is a route with a lower degree of recommendation than a 1st recommended route.
Fig. 3 is a diagram showing an example of the relationship between driving proficiency and the 2nd recommended routes; the routes in parentheses in the figure are the 1st recommended routes. As shown in Fig. 3, the 2nd route generation unit 13B generates routes B and C as the target routes when the driving proficiency is A, route C when the proficiency is B, and route D when the proficiency is C. Route D is a new route for advanced drivers.
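The mappings in Figs. 2 and 3 amount to two lookup tables keyed by proficiency level. The table contents below follow the figures; the function and constant names are ours:

```python
# Fig. 2: 1st recommended routes per proficiency level.
FIRST_RECOMMENDED = {"A": ["A"], "B": ["A", "B"], "C": ["A", "B", "C"]}

# Fig. 3: 2nd recommended routes, offered when an unpleasant emotion
# is detected (route D is the new route for advanced drivers).
SECOND_RECOMMENDED = {"A": ["B", "C"], "B": ["C"], "C": ["D"]}

def recommended_routes(proficiency, unpleasant_emotion):
    """Select the candidate routes to present to the driver."""
    table = SECOND_RECOMMENDED if unpleasant_emotion else FIRST_RECOMMENDED
    return table[proficiency]
```

The 2nd table always offers routes one step beyond the driver's detected level, which is why the driving assistance unit later intervenes when one of them is chosen.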
The route providing unit 14 provides the target routes (1st and 2nd recommended routes) generated by the route generation unit 13 to the driver, for example by outputting a control signal to the display unit 43 so that the target routes are displayed there. In this case, the 1st recommended routes are displayed first, and the 2nd recommended routes are displayed on the display unit 43 afterwards when the driver is determined to have an unpleasant emotion. The target routes may also be provided by voice, by causing the speaker 44 to output a voice signal.
When a single route is provided by the route providing unit 14 and the driver approves it by operating the input unit 45 or by voice input through the microphone 22, that route is determined as the final target route. When a plurality of routes are provided, the final target route is determined when the driver selects one of them by operating the input unit 45 or by voice input through the microphone 22. Thereafter, the route guidance unit 47 guides the vehicle while displaying the final target route on the display unit 43.
The vehicle can travel not only in a manual driving mode, in which the actuator 50 is driven in accordance with the driver's operation of the operation members (accelerator pedal, brake pedal, steering wheel, and so on), but also in an automatic driving mode, in which at least part of the actuator 50 (for example, the steering actuator) is driven automatically. The driving mode is switched, for example, by operating a mode switch (not shown). While the manual driving mode is selected, the travel control unit 15 outputs control signals to the actuator 50 based on signals from the vehicle state detector 33 (accelerator pedal sensor, brake pedal sensor, and the like) so that the vehicle travels manually. While the automatic driving mode is selected, the travel control unit 15 outputs control signals to the actuator 50 based on signals from the obstacle detector 31 and the camera 32 so that the vehicle travels automatically along the predetermined travel route (target route).
The travel control unit 15 includes a driving assistance unit 15A that assists the driving operation of the driver. When a 2nd recommended route is determined as the final target route, the driving assistance unit 15A outputs control signals to the actuator 50 based on the detection signals of the sensor group 30 to intervene in the driver's operation. That is, the driving assistance operation is executed when route B or route C is determined as the final target route for a driver with proficiency A, when route C is determined for a driver with proficiency B, or when route D is determined for a driver with proficiency C.
Specifically, the vehicle state, the conditions around the vehicle, and the like are monitored based on the signals from the sensor group 30, and control signals that adjust the operation amounts, operation speeds, and the like of the various operation members are output to the actuator 50 based on the monitoring results. When a 1st recommended route is determined as the final target route, on the other hand, the ECU 10 disables the function of the driving assistance unit 15A.
The greater the deviation between the 1st recommended route and the 2nd recommended route, the greater the degree of intervention (assistance) by the driving assistance unit 15A. For example, when route C is selected for a driver with proficiency A, the degree of intervention is larger than when route B is selected. The larger the degree of intervention, the larger the proportion of intervention in (correction of) the driver's operation, and the closer the driving of the vehicle approaches the automatic driving mode; that is, the value of the control signal output to the actuator 50 approaches the value that would be output in the automatic driving mode.
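One way to realize "the greater the deviation, the greater the intervention" is to scale an intervention ratio with the gap between the chosen route's difficulty rank and the driver's proficiency rank, then blend the driver's command toward the automatic-driving command. The rank values and the 0.3-per-level gain are assumptions for illustration:

```python
ROUTE_RANK = {"A": 1, "B": 2, "C": 3, "D": 4}  # route difficulty rank
PROFICIENCY_RANK = {"A": 1, "B": 2, "C": 3}    # driver proficiency rank

def intervention_ratio(proficiency, chosen_route):
    """0.0 = fully manual; 1.0 = equivalent to the automatic driving mode."""
    gap = ROUTE_RANK[chosen_route] - PROFICIENCY_RANK[proficiency]
    return min(max(gap, 0) * 0.3, 1.0)  # 0.3 per level is an assumed gain

def blended_command(driver_cmd, auto_cmd, ratio):
    # Correct the driver's operation toward the automatic-mode command.
    return (1.0 - ratio) * driver_cmd + ratio * auto_cmd
```

A proficiency-A driver on route C thus gets twice the intervention of the same driver on route B, matching the example in the description.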
In the above description, the target route is calculated by the route calculation unit 46 of the navigation device 40, but the function of calculating a route may instead be provided in the ECU 10. For example, the route generation unit 13 may calculate the target routes and generate the 1st and 2nd recommended routes from them. Not only the route calculation unit 46 but also the route guide unit 47 may be provided in the ECU 10. Conversely, at least a part of the ECU 10 (for example, the route generation unit 13 or the route providing unit 14) may be provided in the navigation device 40. Further, a part of the functions, for example the route calculation function, may be executed on the server side in a client/server configuration. In this case, the driving assistance device 100 is further provided with a wireless unit (not shown) capable of wireless communication via a mobile phone network or via a mobile wireless terminal such as a smartphone, and the functions of the route generation unit 13, the route calculation unit 46, and the like are provided on the server side. With such a client/server configuration, although some additional hardware such as the wireless unit is required, the information needed to specify routes A to C (for example, traffic volume and the past history of traffic accidents) can always be kept up to date through centralized management on the server side. Further, since information from a plurality of driving assistance devices 100 is collected on the server, there is the advantage that the route generation logic can be improved by machine learning or the like.
Fig. 4 is a flowchart showing an example of processing executed by the ECU10 in accordance with a program stored in advance in a memory. For example, when the target point is set by the navigation device 40 in a state where the manual driving mode is selected, the processing shown in the flowchart is started.
First, in S1 (S: processing step), the proficiency level detecting unit 11 detects the driving proficiency level of the driver based on the driver information stored in advance in the memory. Next, in S2, the 1 st route generator 13A generates the 1 st recommended route (fig. 2) corresponding to the driving skill level detected in S1. Next, at S3, the route providing unit 14 causes the display unit 43 to display the 1 st recommended route or routes generated at S2, and provides the 1 st recommended route to the driver.
Next, in S4, the emotion estimation unit 12 estimates the emotion of the driver from the information acquired by the biological information acquisition unit 20. Then, in S5, the emotion estimation unit 12 determines whether or not the estimated emotion is an unpleasant emotion. When an unpleasant emotion is estimated, it is assumed that the driver is not satisfied with the provided route. In this case, the result of S5 is affirmative, the process proceeds to S6, and the 2nd route generation unit 13B generates the 2nd recommended route (fig. 3), a new route corresponding to the driving proficiency detected in S1. Next, in S7, the route providing unit 14 causes the display unit 43 to display the 2nd recommended route generated in S6, thereby providing it to the driver.
Next, in S8, the 2nd recommended route is determined as the final target route. When a single 2nd recommended route is displayed and the driver agrees to it, the displayed route is determined as the final target route; when a plurality of 2nd recommended routes are displayed and the driver selects one of them, the selected route is determined as the final target route. Next, in S9, the driving assistance unit 15A performs a driving assistance operation corresponding to the deviation between the 1st recommended route and the target route (the 2nd recommended route); that is, the greater the degree of deviation, the greater the degree of intervention.
On the other hand, if it is determined in S5 that the estimated emotion is not unpleasant, the routine proceeds to S10. In this case, it is assumed that the driver is satisfied with the 1st recommended route displayed in S3. Therefore, in S10, when a single 1st recommended route was displayed in S3, that route is determined as the final target route; when a plurality of 1st recommended routes were displayed, the route selected by the driver is determined as the final target route. In this case, the driving assistance operation is not performed.
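The branch structure of S5 to S10 described above can be sketched as follows (function name and arguments are illustrative assumptions; the proficiency detection, route generation, and emotion estimation of S1 to S4 are taken as already performed and passed in as inputs):

```python
def select_target_route(unpleasant: bool, first_routes: list,
                        second_routes: list, choice: int = 0):
    """Sketch of the S5-S10 branch of the flowchart of Fig. 4.

    'first_routes' are the 1st recommended routes shown in S3;
    'second_routes' are the 2nd recommended routes that would be shown in S7;
    'choice' stands in for the driver's selection on the display.
    """
    if unpleasant:                      # S5: driver dissatisfied with 1st routes
        target = second_routes[choice]  # S6-S8: show 2nd routes, driver selects
        assist = True                   # S9: driving assistance is performed
    else:
        target = first_routes[choice]   # S10: a 1st route becomes the final target
        assist = False                  # driving assistance is not performed
    return target, assist
```

For example, a dissatisfied driver choosing the second of two 2nd recommended routes ends up with that route and assistance enabled.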
The operation of the driving assistance device according to the present embodiment will now be described in more detail. When a target point of the vehicle is set by the navigation device 40 and a plurality of target routes to the target point are calculated, the 1st route generation unit 13A generates the 1st recommended route corresponding to the driving proficiency of the driver from these target routes and causes the display unit 43 to display it (S1 to S3). The emotion estimation unit 12 determines from the driver's emotion at this time whether the driver is satisfied with the provided 1st recommended route, and when the driver is determined to be satisfied, the 1st recommended route is determined as the final target route (S4 → S5 → S10). Thereafter, the navigation device 40 (route guide unit 47) performs route guidance of the vehicle in accordance with the final target route.
On the other hand, when it is determined that the driver is dissatisfied (unpleasant) with the 1 st recommended route provided, the 2 nd route generating unit 13B generates the 2 nd recommended route corresponding to the driving proficiency and causes the display unit 43 to display the 2 nd recommended route (for example, a plurality of 2 nd recommended routes) (S4 to S7). In this state, when the driver selects one 2 nd recommended route, the selected 2 nd recommended route is determined as the final target route (S8). Accordingly, the number of target paths that the driver can select increases, and thus the driver's satisfaction can be improved. When the 2 nd recommended route is determined as the final target route, the driving assistance operation is performed (S9). Therefore, appropriate driving assistance according to the driving proficiency can be performed.
According to the present embodiment, the following operational effects can be achieved.
(1) The driving assistance device 100 includes: a driving skill detection unit 11 that detects the driving skill of the driver; a biological information acquisition unit 20 that acquires biological information of the driver; an emotion estimation unit 12 that estimates the emotion of the driver from the biological information acquired by the biological information acquisition unit 20; a route generation unit 13 that generates a target route based on the driving proficiency detected by the proficiency detection unit 11 and the emotion of the driver estimated by the emotion estimation unit 12; and a route providing unit 14 that provides the driver (fig. 1A and 1B) with the target route generated by the route generating unit 13. In this way, the target route is generated in consideration of not only the driving proficiency but also the emotion of the driver, and thus, it is possible to avoid a route that is unpleasant to the driver from becoming the final target route, and it is possible to improve the satisfaction of the driver with respect to the provision of the target route.
(2) The route generation unit 13 generates a 1 st recommended route from the driving proficiency detected by the proficiency detection unit 11, and then generates a 2 nd recommended route from the emotion of the driver estimated by the emotion estimation unit 12 after the 1 st recommended route is provided by the route providing unit 14 (S2, S6). Accordingly, after the 1 st recommended route corresponding to the driving proficiency is provided, the 2 nd recommended route having a lower recommendation level than the 1 st recommended route is provided in consideration of the emotion of the driver. In other words, the 1 st recommended route corresponding to the driving proficiency is provided in preference to the 2 nd recommended route, and therefore, the target route can be provided in a manner preferable to the driver.
(3) The driving assistance device 100 further includes a driving assistance unit 15A that assists the driving operation of the driver in accordance with the driving proficiency detected by the proficiency detecting unit 11 (fig. 1B). Accordingly, when the target route is determined in consideration of the emotion of the driver, that is, when a route that does not match the driving proficiency (the 2nd recommended route) is determined as the final target route, the driver's inexperience in the driving operation can be compensated for, and good vehicle travel can be realized.
(4) The driving assistance unit 15A is activated (validated) when the vehicle travels along the 2nd recommended route and deactivated (invalidated) when the vehicle travels along the 1st recommended route (S5, S9). Accordingly, the driving assistance unit 15A can be prevented from unnecessarily intervening in the driving operation, and a good driving feeling can be obtained.
(5) When the driving proficiency detected by the proficiency detecting unit 11 is equal to or greater than a predetermined value, that is, when the driving proficiency is the proficiency B and the proficiency C, the route generating unit 13 generates a plurality of 1 st recommended routes (fig. 2) that the driver can select. Accordingly, the driver can select a desired route as the final target route, so that high satisfaction can be obtained.
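The rule in (5) — a single route below the proficiency threshold, a selectable plurality at or above it — could be sketched as follows (the ordering of candidates and the exact proficiency-to-route mapping are assumptions for illustration, not the mapping of fig. 2):

```python
def generate_first_recommended(proficiency: int, candidates: list,
                               threshold: int = 2) -> list:
    """Generate the 1st recommended route(s) from the candidate target routes.

    Illustrative rule: candidates are ordered from least to most demanding;
    a driver at or above the threshold proficiency (e.g. proficiency B or C)
    is offered a selectable plurality, while a less proficient driver
    (e.g. proficiency A) is offered only the single least demanding route.
    """
    if proficiency >= threshold:
        # Plural selectable routes for the more proficient driver.
        return candidates[:proficiency]
    # Single route for the less proficient driver.
    return candidates[:1]
```

A driver of proficiency 3 would thus see three selectable routes, while a driver of proficiency 1 sees only one.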
(6) The driving assistance device 100 further includes a route guide unit 47, and the route guide unit 47 guides the route of the vehicle from the current position to the target point in accordance with the target route (final target route) provided by the route providing unit 14 (fig. 1A). Accordingly, it is possible to provide useful services to the driver using the target route obtained in consideration of the driving proficiency and the emotion of the driver.
(7) The driving assistance method according to the present embodiment includes the steps of: detecting a driving proficiency of the driver (S1); acquiring biological information of a driver; estimating the emotion of the driver from the acquired biological information (S4); generating a target route based on the detected driving proficiency and the estimated emotion of the driver (S6); the generated target path is provided to the driver (S7). Accordingly, the driver's satisfaction with the provision of the target route can be improved.
In the above-described embodiment, the driving assistance device 100 is configured using the ECU10 provided in the vehicle, but may be configured using a mobile terminal such as a smartphone carried by the driver instead of the ECU 10. Fig. 5 is a diagram showing an example thereof. In the driving assistance device 100A of fig. 5, the same portions as those in fig. 1A and 1B are denoted by the same reference numerals. As shown in fig. 5, the mobile terminal 60 has a camera 21 and a microphone 22, and the camera 21 and the microphone 22 constitute a biological information acquiring unit 20. The mobile terminal 60 includes a controller 10A, and the controller 10A has the same functional configuration as the ECU10 of fig. 1B, that is, includes a skill level detecting unit 11, an emotion estimating unit 12, a route generating unit 13, and a route providing unit 14. The mobile terminal 60 includes a display unit 61, a speaker 62, and an input unit 63.
The mobile terminal 60 is configured so as to be able to communicate with the navigation device 40. With this configuration, as with the configuration described above, a target route that takes both the driving proficiency and the emotion of the driver into consideration can be provided to the driver. Although the travel control unit is omitted in fig. 5, the travel control unit 15 may, for example, be provided in a vehicle control ECU that communicates with the controller 10A, and may control the actuator 50 based on the signals from the sensor group 30 in accordance with commands from the mobile terminal 60, in the same manner as the ECU 10 of figs. 1A and 1B.
The route calculation unit 46 and the route guide unit 47 of the navigation device 40 can also be provided to the controller 10A of the mobile terminal 60. The information on the target route may be provided not through the display unit 43 and the speaker 44 of the navigation device 40 but through the display unit 61 and the speaker 62 of the mobile terminal 60. Various commands may be input through the input unit 63 of the mobile terminal 60, not through the input unit 45 of the navigation device 40. The mobile terminal 60 may be provided with a GPS signal receiving unit. The mobile terminal 60 can also communicate with an external device of the vehicle to acquire map data. Therefore, the driving assistance device can be constituted by only the mobile terminal without using the navigation device 40.
In the above embodiment, the proficiency detecting unit 11 detects the driving proficiency of the driver based on driver information stored in advance, but various other detection methods are possible. For example, a driving test may be performed in advance, and the proficiency may be detected based on its result. Alternatively, at vehicle startup (ignition switch on), the driver information may be transmitted to the proficiency detecting unit 11 of the ECU 10 from a wireless terminal held by the driver in which the driving proficiency is recorded, for example a smartphone or a wearable terminal. In this case the hardware and software configuration becomes more complicated, but since driver information need not be recorded on the vehicle side, this is preferable in terms of vehicle security. Further, since this configuration remains applicable when the driver moves among a plurality of vehicles, it is suitable, for example, for rental cars or car sharing services. As described above, the configuration of the proficiency detecting unit is not limited to the above configuration.
In the above-described embodiment, the biological information of the occupant is acquired by the biological information acquiring unit 20 including the camera 21 and the microphone 22, but the configuration of the biological information acquiring unit is not limited to this. For example, the wearable terminal may be worn by the occupant to acquire biological information such as pulse, blood pressure, body temperature, and blood oxygen level of the occupant. The body temperature (e.g., facial temperature) of the occupant may also be acquired by the thermal imager as the biological information.
In the above embodiment, the emotion estimation unit 12 estimates the emotion of the occupant by matching against Plutchik's wheel of emotions, but the emotion of the occupant may instead be estimated by matching against other emotion classification models, and the configuration of the emotion estimation unit is not limited to the above configuration.
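As a minimal sketch of the unpleasant-emotion judgment of S5 against such an emotion wheel (the pleasant/unpleasant grouping of Plutchik's eight basic emotions shown here is an assumption; in particular, the valence of surprise is context-dependent):

```python
# Plutchik's eight basic emotions, split into pleasant/unpleasant groups.
# The grouping itself is an illustrative assumption, not the patent's.
UNPLEASANT = {"anger", "disgust", "fear", "sadness"}
PLEASANT = {"joy", "trust", "anticipation", "surprise"}

def is_unpleasant(emotion: str) -> bool:
    """Judge whether an estimated basic emotion counts as unpleasant (S5)."""
    if emotion not in UNPLEASANT | PLEASANT:
        raise ValueError("not one of Plutchik's eight basic emotions: " + emotion)
    return emotion in UNPLEASANT
```

A real estimator would first map biological signals (facial expression, voice) onto one of these categories; only the final categorical judgment is sketched here.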
In the above embodiment, the route generation unit 13 generates the 1 st recommended route based on the driving proficiency detected by the proficiency detection unit 11, and thereafter generates the 2 nd recommended route based on the emotion of the driver estimated by the emotion estimation unit 12 after the 1 st recommended route is provided, but the route generation unit may have any configuration as long as it generates the target route based on the driving proficiency and the emotion of the driver.
In the above embodiment, the route providing unit 14 provides the target route to the driver through the display unit 43 or the speaker 44, but it is sufficient that the target route is output in a manner recognizable to the driver; the target route may therefore be output through means other than the display unit or the speaker, and the configuration of the route providing unit is not limited to the above configuration.
In the above-described embodiment, an example of a case where a target route from a current position to a target point is used as a driving plan has been described. That is, the route generation unit 13 and the route providing unit 14 are used as examples of the plan generation unit and the plan providing unit, respectively, but the present invention may be applied to other driving plans in the same manner. Therefore, the configuration of the plan generating unit that generates the driving plan based on the driving proficiency and the emotion of the driver and the configuration of the plan providing unit that provides the generated driving plan to the driver are not limited to the above-described configurations.
In the above embodiment, the assist function for the driving operation based on the instruction of the driving assist unit 15A is invalidated when the 1 st recommended route is determined as the final target route, and thereby the degree of assist for the driving operation when the 2 nd recommended route is determined as the final target route is made larger than the degree of assist for the driving operation when the 1 st recommended route is determined as the final target route. However, the structure of the driving assistance portion is not limited to this, and for example, even when the 1 st recommended route is determined as the final target route, the assistance function of the driving operation may be enabled.
In the above embodiment, when the emotion estimation unit 12 determines that the driver has an unpleasant emotion, the 2nd route generation unit 13B generates a target route (the 2nd recommended route) different from the 1st recommended route from among the plurality of target routes calculated by the route calculation unit 46. However, the 2nd recommended route may also be generated in cases where the driver is not determined to have an unpleasant emotion. For example, the 2nd recommended route may be generated when it is determined that the driver has an extremely strong pleasant emotion. The reason is that when the driver is excited, for example about an event at the destination, the driver's concentration on driving may be reduced, so it is appropriate to treat an extremely strong pleasant emotion in the same way as an unpleasant emotion.
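The modified trigger condition described above — generating the 2nd recommended route on unpleasant emotion or on extremely strong pleasant emotion — could be sketched as follows (the intensity scale, threshold value, and emotion labels are illustrative assumptions):

```python
def should_generate_second_route(emotion: str, intensity: float,
                                 strong_threshold: float = 0.8) -> bool:
    """Decide whether the 2nd route generation unit should produce a route.

    Triggers on any unpleasant emotion, and also on a pleasant emotion
    whose estimated intensity is extremely strong, since an over-excited
    driver's concentration on driving may likewise be reduced.
    """
    if emotion in {"anger", "disgust", "fear", "sadness"}:
        return True  # unpleasant emotion: always trigger
    # Pleasant emotion: trigger only when extremely strong.
    return intensity >= strong_threshold
```

A mildly pleased driver thus keeps the 1st recommended route, while an ecstatic one is offered alternatives just as a dissatisfied one would be.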
The above-described embodiment and one or more modifications can be arbitrarily combined, and modifications can be combined with each other.
According to the present invention, the driving plan is generated and provided in accordance with the driving proficiency and the emotion of the driver, and therefore the satisfaction of the driver with the driving plan can be sufficiently improved.
While the present invention has been described in connection with the preferred embodiments of the present invention, it will be apparent to those skilled in the art that various modifications and variations can be made without departing from the scope of the invention disclosed in the claims.
Claims (4)
1. A driving assistance apparatus, characterized by comprising:
a proficiency level detection unit (11) that detects the driving proficiency level of a driver of a vehicle;
a driving assistance unit (15A) that assists the driving operation of the driver in accordance with the driving proficiency detected by the proficiency detection unit (11);
a biological information acquisition unit (20) that acquires biological information of the driver;
an emotion estimation unit (12) that estimates the emotion of the driver from the biological information acquired by the biological information acquisition unit (20);
a plan generation unit (13) that generates a driving plan based on the driving proficiency detected by the proficiency detection unit (11) and the emotion of the driver estimated by the emotion estimation unit (12); and
a plan providing unit (14) that provides the driving plan generated by the plan generating unit (13) to the driver,
the plan generating unit (13) generates a 1 st plan based on the driving proficiency detected by the proficiency detecting unit (11), and thereafter generates a 2 nd plan based on the emotion of the driver estimated by the emotion estimating unit (12) after the 1 st plan is provided by the plan providing unit (14),
when the vehicle travels according to the 2 nd plan, the driving assistance unit (15A) increases the degree of assistance for the driving operation as compared to when the vehicle travels according to the 1 st plan.
2. The driving assistance apparatus according to claim 1,
when the driving proficiency detected by the proficiency detecting unit (11) is equal to or greater than a predetermined value, the plan generating unit (13) generates a plurality of driving plans that can be selected by the driver.
3. The driving assistance apparatus according to claim 1 or 2,
the driving plan is a target path from a current position to a target location,
the driving assistance device further includes a route guide unit (47), and the route guide unit (47) guides the route of the vehicle in accordance with the target route provided by the plan providing unit (14).
4. A driving assistance method, characterized by comprising the steps of:
detecting a driving proficiency of a driver of the vehicle;
assisting the driving operation of the driver in accordance with the detected driving proficiency;
acquiring biological information of a driver;
estimating the emotion of the driver according to the acquired biological information;
generating a driving plan based on the detected driving proficiency and the estimated emotion of the driver;
the generated driving plan is provided to the driver, wherein,
generating a 1 st plan based on the detected driving proficiency, thereafter generating a 2 nd plan based on the emotion of the driver estimated after the 1 st plan is provided,
when the vehicle travels according to the 2 nd plan, the degree of assistance for the driving operation is increased as compared to when the vehicle travels according to the 1 st plan.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016253777A JP6513069B2 (en) | 2016-12-27 | 2016-12-27 | Driving support device and driving support method |
JP2016-253777 | 2016-12-27 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108240819A CN108240819A (en) | 2018-07-03 |
CN108240819B true CN108240819B (en) | 2021-10-26 |
Family
ID=62625827
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711295856.6A Active CN108240819B (en) | 2016-12-27 | 2017-12-08 | Driving support device and driving support method |
Country Status (3)
Country | Link |
---|---|
US (1) | US10576987B2 (en) |
JP (1) | JP6513069B2 (en) |
CN (1) | CN108240819B (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11313692B2 (en) * | 2015-09-01 | 2022-04-26 | Honda Motor Co., Ltd. | Navigation server and navigation system |
JP6770680B2 (en) * | 2016-06-21 | 2020-10-21 | 富士ゼロックス株式会社 | Control device, processing device system and program |
JP6499682B2 (en) * | 2017-01-31 | 2019-04-10 | 本田技研工業株式会社 | Information provision system |
US20180215392A1 (en) * | 2017-02-02 | 2018-08-02 | Denso Ten Limited | Vehicle control device and vehicle control method |
WO2019176230A1 (en) * | 2018-03-13 | 2019-09-19 | アイシン・エィ・ダブリュ株式会社 | Route searching system, route guiding system, route searching program, and route guiding program |
CN110871810A (en) * | 2018-08-21 | 2020-03-10 | 上海博泰悦臻网络技术服务有限公司 | Vehicle, vehicle equipment and driving information prompting method based on driving mode |
JP7023817B2 (en) * | 2018-09-19 | 2022-02-22 | 本田技研工業株式会社 | Display system, display method, and program |
JP2020091777A (en) * | 2018-12-07 | 2020-06-11 | 株式会社デンソー | Information processing system |
JP7100575B2 (en) | 2018-12-28 | 2022-07-13 | 本田技研工業株式会社 | Information processing equipment and programs |
JP6999540B2 (en) | 2018-12-28 | 2022-01-18 | 本田技研工業株式会社 | Information processing equipment and programs |
CN111483458B (en) * | 2019-01-25 | 2022-08-12 | 宇通客车股份有限公司 | Power system control method and device |
JP7277186B2 (en) * | 2019-03-08 | 2023-05-18 | 株式会社Subaru | Information processing device, information processing system, and vehicle control device |
JP2020165692A (en) * | 2019-03-28 | 2020-10-08 | 本田技研工業株式会社 | Controller, method for control, and program |
US20220126864A1 (en) | 2019-03-29 | 2022-04-28 | Intel Corporation | Autonomous vehicle system |
DE102019112922A1 (en) * | 2019-05-16 | 2020-11-19 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Method and device for navigation |
JP7264804B2 (en) * | 2019-12-27 | 2023-04-25 | 本田技研工業株式会社 | Recommendation system, recommendation method and program |
US20240278796A1 (en) | 2020-09-18 | 2024-08-22 | Nec Corporation | Notification system, notification method, and non-transitory storage medium |
CN114312815B (en) * | 2020-09-30 | 2024-05-07 | 比亚迪股份有限公司 | Driving prompt method and device and automobile |
CN113060014B (en) * | 2021-04-16 | 2023-01-31 | 国家石油天然气管网集团有限公司华南分公司 | Method and device for improving control safety performance of motor |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2890001B2 (en) * | 1991-12-18 | 1999-05-10 | 本田技研工業株式会社 | Vehicle travel guidance device |
JP2003072488A (en) * | 2001-08-31 | 2003-03-12 | Sony Corp | Onboard device and processing method of vehicle and vehicle information |
JP2004325255A (en) * | 2003-04-24 | 2004-11-18 | Denso Corp | Navigation system for vehicle |
JP2005106475A (en) * | 2003-09-26 | 2005-04-21 | Hitachi Software Eng Co Ltd | Navigation system |
JP4342362B2 (en) * | 2004-03-26 | 2009-10-14 | パイオニア株式会社 | Navigation device |
JP4246693B2 (en) * | 2004-12-24 | 2009-04-02 | 富士通テン株式会社 | Driving assistance device |
JP2008058039A (en) * | 2006-08-29 | 2008-03-13 | Toyota Motor Corp | On-vehicle device for collecting dissatisfaction information, information collection center, and system for collecting dissatisfaction information |
US11067405B2 (en) * | 2010-06-07 | 2021-07-20 | Affectiva, Inc. | Cognitive state vehicle navigation based on image processing |
US8364395B2 (en) * | 2010-12-14 | 2013-01-29 | International Business Machines Corporation | Human emotion metrics for navigation plans and maps |
US9317983B2 (en) * | 2012-03-14 | 2016-04-19 | Autoconnect Holdings Llc | Automatic communication of damage and health in detected vehicle incidents |
US8876535B2 (en) * | 2013-03-15 | 2014-11-04 | State Farm Mutual Automobile Insurance Company | Real-time driver observation and scoring for driver's education |
KR102043637B1 (en) * | 2013-04-12 | 2019-11-12 | 한국전자통신연구원 | Route guidance apparatus based on emotion and method thereof |
US20150260531A1 (en) * | 2014-03-12 | 2015-09-17 | Logawi Data Analytics, LLC | Route planning system and methodology which account for safety factors |
JP6520506B2 (en) * | 2014-09-03 | 2019-05-29 | 株式会社デンソー | Vehicle travel control system |
US9573600B2 (en) * | 2014-12-19 | 2017-02-21 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method and apparatus for generating and using driver specific vehicle controls |
US10228256B2 (en) * | 2015-01-15 | 2019-03-12 | Pcms Holdings, Inc. | Systems and methods for providing navigation directions based on emotions and activities |
CN104634358A (en) * | 2015-02-05 | 2015-05-20 | 惠州Tcl移动通信有限公司 | Multi-route planning recommendation method, system and mobile terminal |
JP6269557B2 (en) * | 2015-04-08 | 2018-01-31 | トヨタ自動車株式会社 | Vehicle driving support control device |
US9785145B2 (en) * | 2015-08-07 | 2017-10-10 | International Business Machines Corporation | Controlling driving modes of self-driving vehicles |
JP2017181449A (en) * | 2016-03-31 | 2017-10-05 | カシオ計算機株式会社 | Electronic device, route search method, and program |
DE112017002041T5 (en) * | 2016-04-14 | 2019-01-10 | Sony Corporation | Information processing apparatus, information processing method and mobile apparatus |
US10527441B2 (en) * | 2016-05-06 | 2020-01-07 | Michael K Colby | Load-based mapping |
US9945679B2 (en) * | 2016-06-27 | 2018-04-17 | International Business Machines Corporation | Personalized travel routes to reduce stress |
CN107703931B (en) * | 2016-08-09 | 2019-04-05 | 北京百度网讯科技有限公司 | Method and apparatus for controlling automatic driving vehicle |
US10449968B2 (en) * | 2016-09-23 | 2019-10-22 | Ford Motor Company | Methods and apparatus for adaptively assisting developmentally disabled or cognitively impaired drivers |
US20180109924A1 (en) * | 2016-10-17 | 2018-04-19 | International Business Machines Corporation | Cognitive Based Optimal Grouping of Users and Trip Planning Based on Learned User Skills |
JP2018100936A (en) * | 2016-12-21 | 2018-06-28 | トヨタ自動車株式会社 | On-vehicle device and route information presentation system |
- 2016-12-27: JP application JP2016253777A, patent JP6513069B2, status Active
- 2017-12-08: CN application CN201711295856.6A, patent CN108240819B, status Active
- 2017-12-12: US application US15/839,449, patent US10576987B2, status Active
Also Published As
Publication number | Publication date |
---|---|
US20180178807A1 (en) | 2018-06-28 |
JP6513069B2 (en) | 2019-05-15 |
CN108240819A (en) | 2018-07-03 |
JP2018106530A (en) | 2018-07-05 |
US10576987B2 (en) | 2020-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108240819B (en) | Driving support device and driving support method | |
KR101793370B1 (en) | Vehicle control apparatus | |
US9522700B2 (en) | Driving support apparatus for vehicle and driving support method | |
CN109835346B (en) | Driving advice device and driving advice method | |
CN110050301A (en) | Controller of vehicle | |
CN107914711B (en) | Vehicle control device | |
JP2007200274A (en) | Merge support device and merge support system | |
CN108885836A (en) | Driving assistance method and drive assistance device, automatic Pilot control device, vehicle, driving assistance system and the program for utilizing the driving assistance method | |
JP2017162406A (en) | Vehicle automatic driving control system | |
JP6612707B2 (en) | Information provision device | |
CN109841088A (en) | Vehicle drive assist system and method | |
JP6509940B2 (en) | Driving support device and driving support method | |
JP2007148917A (en) | Driving support device | |
JP2009025239A (en) | Route guide device | |
JP6090340B2 (en) | Driver emotion estimation device | |
US20200307644A1 (en) | Control system for vehicle, notification method for vehicle, and medium | |
CN109661338A (en) | Determination method, parking assistance method, outbound householder method and the obstacle judgment device of barrier | |
CN113135190A (en) | Automatic driving assistance device | |
JP4141895B2 (en) | Vehicle travel control device | |
JP4926182B2 (en) | Signal recognition apparatus, signal recognition method, signal recognition program, and recording medium | |
JP2015064584A (en) | Driver feeling estimation device and method | |
JP2007233744A (en) | Driving support apparatus | |
JP6627810B2 (en) | Operation mode switching control device, method and program | |
JP6648551B2 (en) | Automatic driving device | |
JP2015084253A (en) | Driver's feeling estimation device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||