
KR101657900B1 - Determining method for danger information on driving car and transferring method for danger information on driving car - Google Patents

Determining method for danger information on driving car and transferring method for danger information on driving car Download PDF

Info

Publication number
KR101657900B1
KR101657900B1 (application KR1020150078339A)
Authority
KR
South Korea
Prior art keywords
driver
image
map
vehicle
ham
Prior art date
Application number
KR1020150078339A
Other languages
Korean (ko)
Inventor
정민영
이석한
구자헌
안현국
Original Assignee
성균관대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 성균관대학교산학협력단 filed Critical 성균관대학교산학협력단
Priority to KR1020150078339A priority Critical patent/KR101657900B1/en
Application granted granted Critical
Publication of KR101657900B1 publication Critical patent/KR101657900B1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2040/08
    • B60W2050/08

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method for determining danger information while driving a car comprises: a step in which a gaze tracking device installed in the car tracks the driver's gaze while a camera installed in the car acquires an image of the outside of the car; a step in which the gaze tracking device uses the gaze tracking data to create a human attention map (HAM); a step in which the camera, or a computing device that receives the image, creates a saliency map based on the image; and a step in which the computing device calculates a correlation between the HAM and the saliency map and determines danger information based on the degree of that correlation.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a method for determining risk information while driving a vehicle.

The techniques described below relate to techniques for determining risk information while driving a vehicle and for transmitting danger information around the vehicle.

Various safety devices automatically control the movement of the vehicle body in preparation for dangerous situations that may occur while the vehicle is in operation. Devices that call the driver's attention to potential danger factors are also provided. In recent years, technology has been developed that tracks the driver's gaze movement and notifies the driver of a dangerous situation.

Korean Patent Publication No. 10-2012-0055011 Korean Patent Publication No. 10-2012-0071220

The technique described below provides a method of recognizing a dangerous situation by combining the driver's gaze tracking with image information acquired by a camera, and of informing the driver of the dangerous situation through a terminal the driver possesses.

A method for determining risk information during driving includes: tracking the driver's gaze with a gaze tracking device installed in the vehicle while a camera installed in the vehicle acquires an image of the outside of the vehicle; generating a human attention map (HAM) using the gaze tracking data; generating a saliency map based on the image, by the camera or by a computing device to which the image is transmitted; and calculating a correlation between the HAM and the saliency map and determining the risk information based on the degree of the correlation.

A method of delivering risk information during driving includes: establishing a V2D (Vehicle-to-Device) communication session between the driving assistance system of the vehicle and the driver terminal; calculating, by the driving assistance system, a correlation between a HAM generated from the driver's gaze tracking and a saliency map of the external image of the vehicle acquired while the driver's gaze was being tracked, and determining the risk information based on the calculated correlation; transmitting a message including the risk information to the driver terminal through the V2D communication session; and outputting the risk information, by the driving assistance system or the driver terminal, through at least one of sound, vibration, or image.

The technology described below can determine a dangerous situation more precisely by using the driver's gaze tracking together with the image acquired by the camera, and can deliver the dangerous situation to the driver or to surrounding vehicles through a form of D2D (Device-to-Device) communication.

FIG. 1 is an example of a flowchart of a method for determining risk information during driving.
FIG. 2 is an example of generating a correlation map from gaze tracking and a saliency map in a specific environment.
FIG. 3 is an example of a process of determining risk information in the driving assistance system.
FIG. 4 is an example of a block diagram showing the configuration of a system for transmitting a driver's danger information.
FIG. 5 is an example of a process of transmitting a driver's danger information to a vehicle or a traffic facility device.

Since various changes may be made and several embodiments are possible, specific embodiments are illustrated in the drawings and described below. However, the following description is not limited to the specific embodiments; it includes all changes, equivalents, and alternatives falling within its spirit and scope.

The terms first, second, A, B, and the like may be used to describe various components, but the components are not limited by these terms; the terms are used only to distinguish one component from another. For example, without departing from the scope of the following description, a first component may be referred to as a second component, and similarly a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related listed items, or any one of them.

As used herein, singular expressions should be understood to include plural expressions unless the context clearly dictates otherwise, and terms such as "comprises" or "includes" specify the presence of stated features, integers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.

Before describing the drawings in detail, it should be clarified that the division of components in this specification is merely a division by the main function of each component. That is, two or more components described below may be combined into a single component, or a single component may be divided into two or more components by more subdivided functions. Each component described below may additionally perform some or all of the functions of other components in addition to its own main functions, and some of its main functions may instead be carried out by another component in a dedicated manner.

Also, in performing a method or an operation method, the processes constituting the method may take place in an order different from the stated order unless a specific order is clearly described in the context. That is, the processes may occur in the order described, be performed substantially concurrently, or be performed in the reverse order.

The technique described below can be divided into a technique for extracting danger information using the driver's gaze tracking and an image taken in the gaze direction from the vehicle, and a technique for transmitting the extracted danger information to surrounding vehicles.

First, the technique for extracting risk information is described. Risk information refers to information about dangerous situations occurring around the vehicle while it is in operation. A system in the vehicle that assists driving recognizes the risk information; such a system is hereinafter referred to as the driving assistance system. Driving assistance systems may also fall into the category of infotainment systems.

There are two source data for determining risk information. One is the data tracking the driver's gaze and the other is the image data of the driver's gaze direction.

There have been prior studies on driver gaze tracking, and various devices have been devised. Hereinafter, the apparatus that tracks the driver's gaze is referred to as a gaze tracking device. A gaze tracking device photographs the driver and analyzes the direction and the point at which the driver's two eyes are looking. In recent years, gaze tracking devices such as smart glasses have also come on the market. In the following description, the gaze tracking device is not limited to a particular method or form, so a detailed description of it is omitted. Gaze tracking data is data indicating the point or area toward which the driver's gaze is directed in a specific image.

A device for photographing the scene in the driver's line of sight is required in the vehicle. Many camera devices are already in use in vehicles, for example a black box (dashcam) or a camera for driving assistance. Whichever camera device is used, an image in the direction the gaze tracking device is tracking can be acquired. Since the gaze tracking device itself already uses a camera, the image acquired by the gaze tracking device may also be used without a separate camera. Hereinafter, the device that photographs the source image used to generate a saliency map is referred to as the camera device. The camera device may be a dedicated camera for generating the saliency map, or a camera that photographs the outside of the vehicle for other purposes.

FIG. 1 is an example of a flowchart of a method (100) for determining risk information during driving. First, the gaze tracking device tracks the driver's gaze, and at the same time the camera device acquires an image outside the vehicle (110). The image acquired by the camera device should be an image of the driver's gaze direction. It is therefore preferable that the camera photographs the front, side, and rear directions of the vehicle in advance, which requires multiple cameras.

The gaze tracking device generally generates a human attention map (HAM) from the gaze tracking (120), so the HAM image generated by the gaze tracking device may be used. In some cases the gaze tracking device only generates the gaze tracking data, and the driving assistance system generates the HAM from it.

The camera device generates a saliency map based on the photographed external image (the source image) (130). Alternatively, the driving assistance system generates the saliency map using the source image acquired by the camera device (130). The camera device must acquire the image while the gaze tracking device is tracking the gaze.

The driving assistance system calculates the correlation between the HAM and the saliency map (140) and determines the risk information in the image according to the correlation (140). Further, the driving assistance system may rank the risk factors indicated in the image according to the degree of correlation (150). The correlation calculation is described later.

FIG. 2 is an example of generating a correlation map from gaze tracking and a saliency map in a specific environment. FIG. 2(a) is an image taken by a gaze tracking device, although it is not an image taken from a vehicle. In FIG. 2(a), the small yellow circles represent fixation points, where the gaze rested. The HAM collects all fixation points in one map; an area with many fixation points corresponds to a part on which the person's gaze was concentrated. The HAM expresses, as image data, where the fixation points are most densely collected or where the gaze stayed the longest. FIG. 2(b) is an example of the HAM for FIG. 2(a); the areas with many fixation points are represented by bright colors. The HAM image can be expressed in different colors, but it is basically expressed by brightness; the HAM of FIG. 2(b) shows the distribution as white in a binary image.
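The HAM construction described above (fixation points accumulated into a brightness map, weighted by how long the gaze stayed) can be sketched as follows. The Gaussian spreading and its width are illustrative assumptions; the patent does not specify how fixation points are rasterized into the map.

```python
import numpy as np

def _gaussian_blur(img, sigma):
    """Separable Gaussian blur using only NumPy (kernel truncated at 3*sigma)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, out)

def build_ham(fixations, shape, sigma=10.0, durations=None):
    """Accumulate gaze fixation points into a human attention map (HAM).

    fixations: (row, col) pixel coordinates of fixation points.
    durations: optional per-fixation dwell weights (defaults to 1 each).
    Returns a map normalized to [0, 1]; brighter means more attention.
    """
    ham = np.zeros(shape, dtype=np.float64)
    if durations is None:
        durations = [1.0] * len(fixations)
    for (r, c), w in zip(fixations, durations):
        if 0 <= r < shape[0] and 0 <= c < shape[1]:
            ham[r, c] += w            # weight each fixation by its dwell time
    ham = _gaussian_blur(ham, sigma)  # spread each fixation into a bright blob
    if ham.max() > 0:
        ham /= ham.max()
    return ham
```

Areas where many fixations cluster end up brightest, matching the binary-image description of FIG. 2(b).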

There are several ways to generate a saliency map from the source image captured by the camera device. A saliency map corresponds to an image in which the objects in a scene are segmented out, and it can be generated using criteria such as the color, shape, and orientation of the objects.

Bottom-up and top-down saliency techniques are used for the saliency map. The basic bottom-up model is Itti's saliency model, which detects color, intensity, and orientation features in the surrounding environment and builds a map from them.

Top-down saliency draws on a person's memory of past experiences or other information held in the brain. When implemented on a computer, important feature values for the surrounding environment are supplied as input.

FIG. 2(c) is an example of a saliency map generated by applying the bottom-up and top-down methods to the source image of FIG. 2(a). The bottom-up technique can use feature values such as skin color, edges, and color, and the top-down technique can use a skin-color histogram, face shape, and specific object shapes as feature values. A detailed description of generating the saliency map is omitted.
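A minimal bottom-up sketch in the spirit of Itti's model, using only an intensity channel: saliency is taken as center-surround contrast at several scales. This is a simplification; the full Itti model also uses color-opponency and orientation channels and Gaussian pyramids rather than the box blur assumed here.

```python
import numpy as np

def _box_blur(img, radius):
    """Simple separable box blur (a stand-in for a Gaussian pyramid level)."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, out)

def bottom_up_saliency(gray, scales=(2, 4, 8)):
    """Intensity-only, Itti-style bottom-up saliency sketch.

    For each scale, compare each pixel ("center") with a blurred
    neighborhood ("surround"); high absolute difference = salient.
    Returns a map normalized to [0, 1].
    """
    gray = gray.astype(np.float64)
    sal = np.zeros_like(gray)
    for s in scales:
        surround = _box_blur(gray, s)     # coarse surround estimate
        sal += np.abs(gray - surround)    # center-surround difference
    if sal.max() > 0:
        sal /= sal.max()
    return sal
```

On an image containing a bright object on a dark background, the object's boundary stands out, which is the segmentation-like effect described above.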

FIG. 2(d) is an example of a correlation image in which the HAM of FIG. 2(b) and the saliency map of FIG. 2(c) are combined. In FIG. 2(d), red represents the HAM, blue represents the saliency map, and purple represents the portions where the two overlap.

Now, the process of determining the correlation (correlation coefficient) using the HAM and the saliency map is described. The correlation can be expressed as Equation 1 below.

$$r = \frac{\sum_{x}\bigl(M_h(x)-\mu_h\bigr)\bigl(M_c(x)-\mu_c\bigr)}{\sqrt{\sum_{x}\bigl(M_h(x)-\mu_h\bigr)^2}\,\sqrt{\sum_{x}\bigl(M_c(x)-\mu_c\bigr)^2}}$$

Here x is a pixel coordinate of the image, M_h(x) is the HAM, μ_h is the average pixel value of the HAM, M_c(x) is the saliency map, and μ_c is the average pixel value of the saliency map. The closer the correlation coefficient is to 1, the higher the degree of correlation; the closer it is to 0, the lower the degree of correlation.
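Equation 1 is rendered only as an image in the source, but the variable definitions (differences from each map's mean pixel value, coefficient near 1 for high correlation) are consistent with a normalized cross-correlation. A sketch under that assumption:

```python
import numpy as np

def correlation(ham, sal):
    """Correlation between a HAM and a saliency map.

    Computed as the normalized cross-correlation suggested by the
    variable definitions (each map centered on its mean pixel value).
    This is a consistent reconstruction, not a verbatim copy of the
    patent's Equation 1.
    """
    h = ham.astype(np.float64) - ham.mean()
    c = sal.astype(np.float64) - sal.mean()
    denom = np.sqrt((h * h).sum()) * np.sqrt((c * c).sum())
    if denom == 0:
        return 0.0  # a constant map carries no attention/saliency signal
    return float((h * c).sum() / denom)
```

Identical maps yield a coefficient of 1, matching the interpretation that values near 1 indicate a strong match between where the driver looks and where the scene is salient.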

When this correlation is applied to the driver of a vehicle, a high correlation at a point (or area) where a specific object exists means the driver is watching that point. If the correlation is high, it is determined that a risk factor exists there.

FIG. 3 is an example of a process of determining risk information in the driving assistance system. In the vehicle, the driving assistance system calculates the correlation between the HAM and the saliency map in real time.

The vehicle's gaze tracking device tracks the driver's gaze (211). For example, when the driver looks ahead, he watches the vehicle in front, a lane change by that vehicle, the traffic signal ahead, or an accident vehicle ahead. The gaze tracking device monitors the driver's forward gaze (212). Meanwhile, the black box or another camera (the camera device described above) photographs the driver's gaze direction (221). Since the driver's gaze may shift to other directions, the camera device may photograph other directions as well, such as forward or rearward (222).

The gaze tracking device or the driving assistance device processes the gaze tracking data, including the fixation points of the gaze, in sequence and computes the attention map data (HAM) (230). The driving assistance device processes the image acquired by the camera device in real time to compute the saliency map data (240).

The driving assistance device calculates the correlation coefficient between the HAM and the saliency map (250) and derives the perceived risk from the result (260). In other words, the higher the correlation coefficient (the closer to 1), the more likely the area corresponds to a risk factor. The driving assistance device may also select several risk factors by ranking the correlation coefficients; for example, the regions with the three highest correlation coefficients can be determined to be risk factors.

There may be exceptions to this risk factor determination. For example, if there is a point or region where the correlation coefficient is low but the driver's gaze stays longer than a predetermined reference time (e.g., 5 seconds), the driving assistance device may still determine that region to be a risk factor (270). The driving assistance device processes the information identifying the risk factors into risk information.
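The ranking rule of steps 260-270 (the top regions by correlation coefficient become risk factors, and a long-watched region is promoted regardless of its rank) can be sketched as follows. The region representation and field names are illustrative, not from the patent.

```python
def rank_risk_factors(regions, top_k=3, dwell_limit=5.0):
    """Rank candidate regions as risk factors.

    regions: list of dicts with keys 'id', 'corr' (correlation
    coefficient of the region) and 'dwell' (seconds the gaze stayed
    on it). The top_k regions by correlation become risk factors,
    and any region watched at least dwell_limit seconds is promoted
    to highest risk regardless of its correlation rank.
    """
    ranked = sorted(regions, key=lambda r: r["corr"], reverse=True)[:top_k]
    promoted = [r for r in regions if r["dwell"] >= dwell_limit]
    # Promoted (long-dwell) regions come first, then the ranked ones,
    # with duplicates removed while preserving order.
    out, seen = [], set()
    for r in promoted + ranked:
        if r["id"] not in seen:
            seen.add(r["id"])
            out.append(r)
    return out
```

A region with a low coefficient but a 6-second dwell would therefore be reported ahead of a high-correlation region, as the exception above describes.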

Hereinafter, the process of transmitting the danger information generated by the driving assistance system using a terminal possessed by the driver is described. FIG. 4 is an example of a block diagram showing the configuration of a system for transmitting a driver's danger information. Basically, it is assumed that the danger information is transmitted using D2D (Device-to-Device) communication, i.e., inter-terminal communication based on a mobile communication network. D2D communication as defined in the 3GPP standards includes both a mode in which terminals exchange data without involvement of a base station and a mode in which the base station controls the data communication between two terminals. A detailed description of D2D communication techniques is omitted. When D2D communication is applied to a vehicle system, it is sometimes referred to as V2V (Vehicle-to-Vehicle) or V2I (Vehicle-to-Infrastructure): V2V means communication between vehicles, and V2I means communication between a vehicle and infrastructure installed at the roadside (signal lamps, traffic signal control devices, etc.).

First, the driving assistance system 10 of the vehicle establishes a communication channel with the driver terminal 20, which is owned by the vehicle driver or placed in the vehicle (①). This corresponds to establishing a V2D (Vehicle-to-Device) session between the vehicle and the terminal; V2D is a form of D2D communication.

Thereafter, the driving assistance system 10 receives the driver's gaze tracking data and the image captured by the camera device (②). The driving assistance system 10 calculates the correlation based on the HAM and the saliency map and generates danger information A indicating a risk element in the image.

The driving assistance system 10 can immediately inform the driver of the danger information A (③), for example by outputting information to the display device or a warning message through the speaker.

The driving assistance system 10 transmits the danger information A to the driver terminal 20. The driver terminal 20 transmits the danger information A to the terminal 80 of a nearby vehicle. As shown in FIG. 4, the danger information A can be transmitted to one or more surrounding vehicles via the base station 50. Alternatively, the driver terminal 20 may transmit the danger information A directly to the terminal of the surrounding vehicle without involvement of the base station.

Meanwhile, the driver terminal 20, having received the danger information A, may itself notify the driver of the danger information A through vibration, sound, video, and the like.

The peripheral terminal 80 that has received the danger information A can inform its driver of the danger information A directly, or inform the driver through the driving assistance system of its vehicle.

Furthermore, the peripheral terminal 80 can also transmit danger information B, generated by the driving assistance system of its own vehicle, to the driver terminal 20. The vehicle and driving assistance system in which the peripheral terminal 80 is located are not shown in FIG. 4. The danger information B may be the same as or related to the danger information A, or may be completely separate danger information.

The driver terminal 20 can inform the user of the danger information B directly. In addition, the driver terminal 20 can transmit the danger information B to the driving assistance system (④), and the driving assistance system can then inform the driver of the danger information B (④).
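The message flow of FIG. 4 (danger information relayed between terminals, with or without the base station) can be sketched as a payload plus a forwarding rule. All field names and the hop-count limit are illustrative assumptions; the patent does not define a message format.

```python
from dataclasses import dataclass, field
import time

@dataclass
class RiskMessage:
    """Hypothetical V2D/D2D payload carrying one piece of risk information."""
    source_id: str            # terminal that generated the risk information
    region: tuple             # (x, y, w, h) of the risk area in the image
    corr: float               # correlation coefficient that triggered it
    timestamp: float = field(default_factory=time.time)
    hops: int = 0             # how many terminals have relayed it so far

def relay(message, max_hops=2):
    """Decide whether a receiving terminal forwards the message onward.

    Returns a copy with an incremented hop count, or None when the
    hop limit is reached (a simple guard against unbounded flooding).
    """
    if message.hops >= max_hops:
        return None
    return RiskMessage(message.source_id, message.region,
                       message.corr, message.timestamp, message.hops + 1)
```

In the FIG. 4 scenario, terminal 20 would originate a `RiskMessage` for danger information A, and terminal 80 would call `relay` before passing it further.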

A concrete example in which the danger information is transmitted will be described. FIG. 5 is an example of a process of transmitting danger information of a driver to a vehicle or a traffic facility device.

In FIG. 5, a vehicle accident has occurred in the driving direction of the vehicle 510, the driver of the vehicle 510 has observed the accident, and the driving assistance system of the vehicle 510 has determined the accident area to be a risk factor.

The vehicle 510 transmits the danger information directly to the surrounding vehicle 520, transmits it to the vehicle 530 through the base station (BS), and also transmits it to the traffic facility device 550 (e.g., a traffic signal control device) through the base station (BS).

The driving assistance system of the vehicle 510 can operate the braking system of the vehicle before an accident occurs, even without any operation by the driver. The vehicle 520 recognizes that a vehicle accident has occurred on its anticipated route and operates its navigation to calculate a bypass route. The vehicle 530 receives the danger information, but if the accident is not related to its expected route it may only inform its driver of the danger information. Furthermore, the traffic facility device 550 receives the danger information, recognizes that an accident has occurred on the affected travel route, and can temporarily stop traffic toward that route.

It should be noted that this embodiment and the accompanying drawings illustrate only a part of the technical idea included in the above-described technology, and that all variations and specific embodiments that those skilled in the art can readily deduce within the scope of that technical idea are included in the scope of the above-described technology.

10: travel assistance system 20: driver terminal
50: base station (AP) 80: terminal of another vehicle
510, 520, 530: vehicle 550: traffic facility device

Claims (11)

Tracking a driver's gaze with a gaze tracking device installed in a vehicle, and acquiring an image of the outside of the vehicle with a camera installed in the vehicle while the driver's gaze is tracked;
Generating a human attention map (HAM) using the gaze tracking data;
Generating a saliency map based on the image, by the camera or by a computing device that receives the image; And
Calculating the correlation between the HAM and the salient map and determining the risk information based on the degree of the correlation,
Wherein the correlation is calculated based on the difference between the pixel value at the driver's gaze position in the HAM and the average pixel value of the HAM, and the difference between the pixel value at the driver's gaze position in the saliency map and the average pixel value of the saliency map, in a method for determining risk information during driving.
The method according to claim 1,
Wherein the computing device determines a ranking for certain areas of the image in order of the correlation, the ranking corresponding to the degree of risk, in the method for determining risk information during driving.
The method according to claim 1,
Wherein the computing device, if it determines based on the gaze tracking data transmitted by the gaze tracking device that the driver's gaze has watched the same area beyond a reference time, determines that area to be the highest risk information regardless of the ranking, in the method for determining risk information during driving.
The method according to claim 1,
Wherein the computing device computes the correlation based on the degree to which color or brightness differs from the background in the HAM and in the saliency map, respectively, in the method for determining risk information during driving.
The method according to claim 1,
Wherein the correlation is expressed by the following equation, in the method for determining risk information during driving:
$$r = \frac{\sum_{x}\bigl(M_h(x)-\mu_h\bigr)\bigl(M_c(x)-\mu_c\bigr)}{\sqrt{\sum_{x}\bigl(M_h(x)-\mu_h\bigr)^2}\,\sqrt{\sum_{x}\bigl(M_c(x)-\mu_c\bigr)^2}}$$

(where x is a pixel coordinate of the image, M_h(x) is the HAM, μ_h is the average pixel value of the HAM, M_c(x) is the saliency map, and μ_c is the average pixel value of the saliency map)
Establishing a V2D (Vehicle-to-Device) communication session between the driving assistance system of the vehicle and the driver terminal;
Calculating, by the driving assistance system, a correlation between a human attention map (HAM) generated based on the driver's gaze tracking and a saliency map of the external image of the vehicle acquired at the time of tracking the driver's gaze, and determining risk information based on the calculated correlation;
Transmitting the message including the risk information to the driver terminal through the V2D communication session; And
Outputting, by the driving assistance system or the driver terminal, the danger information through at least one of sound, vibration, or image,
Wherein the correlation is calculated based on the difference between the pixel value at the driver's gaze position in the HAM and the average pixel value of the HAM, and the difference between the pixel value at the driver's gaze position in the saliency map and the average pixel value of the saliency map, in a method of delivering risk information during driving.
The method according to claim 6,
The step of determining
Tracking the driver's gaze with the gaze tracking device installed in the vehicle, and acquiring an image of the outside of the vehicle with the camera installed in the vehicle while the driver's gaze is tracked;
Generating a human attention map (HAM) using the gaze tracking data;
Generating a saliency map based on the image, by the camera or by the driving assistance system that received the image; And
And calculating the correlation between the HAM and the salient map and determining the risk information based on the calculated correlation.
The method according to claim 6,
Wherein the driving assistance system determines a ranking for certain areas of the image in order of the correlation, the ranking corresponding to the degree of risk, in the method of delivering risk information during driving.
The method according to claim 6,
Wherein the driving assistance system, when the driver's gaze is deemed based on the gaze tracking to have watched the same area beyond the reference time, determines that area to be the highest risk information regardless of the ranking, in the method of delivering risk information during driving.
The method according to claim 6,
Wherein the correlation is expressed by the following equation, in the method of delivering risk information during driving:
$$r = \frac{\sum_{x}\bigl(M_h(x)-\mu_h\bigr)\bigl(M_c(x)-\mu_c\bigr)}{\sqrt{\sum_{x}\bigl(M_h(x)-\mu_h\bigr)^2}\,\sqrt{\sum_{x}\bigl(M_c(x)-\mu_c\bigr)^2}}$$

(where x is a pixel coordinate of the image, M_h(x) is the HAM, μ_h is the average pixel value of the HAM, M_c(x) is the saliency map, and μ_c is the average pixel value of the saliency map)
The method according to claim 6,
Further comprising the step of the driver terminal transmitting the message to a terminal of another vehicle in the vicinity of the vehicle, or to a traffic facility device in the vicinity of the vehicle, using a D2D (device-to-device) communication method of the mobile communication network.
KR1020150078339A 2015-06-03 2015-06-03 Determining method for danger information on driving car and transferring method for danger information on driving car KR101657900B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150078339A KR101657900B1 (en) 2015-06-03 2015-06-03 Determining method for danger information on driving car and transferring method for danger information on driving car

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150078339A KR101657900B1 (en) 2015-06-03 2015-06-03 Determining method for danger information on driving car and transferring method for danger information on driving car

Publications (1)

Publication Number Publication Date
KR101657900B1 true KR101657900B1 (en) 2016-09-19

Family

ID=57102801

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150078339A KR101657900B1 (en) 2015-06-03 2015-06-03 Determining method for danger information on driving car and transferring method for danger information on driving car

Country Status (1)

Country Link
KR (1) KR101657900B1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005267108A (en) * 2004-03-17 2005-09-29 Denso Corp Driving support device
KR20120055011A (en) 2010-11-22 2012-05-31 현대자동차주식회사 Method for tracking distance of eyes of driver
KR20120071220A (en) 2010-12-22 2012-07-02 한국전자통신연구원 Apparatus and method for supporting safe driving based on drivers'scan pattern and fixation
JP2013254409A (en) * 2012-06-08 2013-12-19 Toyota Central R&D Labs Inc Careless driving detection device and program
KR20120106691A (en) * 2012-09-12 2012-09-26 한양대학교 에리카산학협력단 Method of detecting and controlling unsafe zigzag driving of a vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lee, Seok-Han, et al. A comparative study and analysis of gaze paths and saliency maps using an eye-tracker. HCI 2010, Jan. 2010, pp. 730-732. *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111565978A (en) * 2018-01-29 2020-08-21 华为技术有限公司 Primary preview area and gaze-based driver distraction detection
US11977675B2 (en) 2018-01-29 2024-05-07 Futurewei Technologies, Inc. Primary preview region and gaze based driver distraction detection

Similar Documents

Publication Publication Date Title
US11562550B1 (en) Vehicle and mobile device interface for vehicle occupant assistance
US10298741B2 (en) Method and device for assisting in safe driving of a vehicle
KR20240074777A (en) Vehicle and mobile device interface for vehicle occupant assistance
CN105225508B (en) Road condition advisory method and device
US20220180483A1 (en) Image processing device, image processing method, and program
Schoop et al. Hindsight: enhancing spatial awareness by sonifying detected objects in real-time 360-degree video
JP6939283B2 (en) Image processing device, image processing method, and program
CN111681455B (en) Control method of electronic device, and recording medium
CN111216127A (en) Robot control method, device, server and medium
CN111243105B (en) Augmented reality processing method and device, storage medium and electronic equipment
US11999371B2 (en) Driving assistance processing method and apparatus, computer-readable medium, and electronic device
JP2017123029A (en) Information notification apparatus, information notification method and information notification program
US10748264B2 (en) Image processing apparatus and image processing method
JP2020032866A (en) Vehicular virtual reality providing device, method and computer program
CN108140124B (en) Prompt message determination method and device and electronic equipment
JP2010218568A (en) Communication vehicle display device
CN110012215B (en) Image processing apparatus, image processing method, and program
KR101657900B1 (en) Determining method for danger information on driving car and transferring method for danger information on driving car
JP2020154569A (en) Display device, display control method, and display system
KR101935853B1 (en) Night Vision System using LiDAR(light detection and ranging) and RADAR(Radio Detecting And Ranging)
JP2020154375A (en) Vehicle hazardous condition identification device, vehicle hazardous condition identification method, and program
KR101744718B1 (en) Display system and control method therof
CN206914229U (en) Outdoor scene internet is called a taxi accessory system
TWI719330B (en) Electronic device and driving safety reminding method
CN115472039B (en) Information processing method and related product

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant