CN109145852B - Driver fatigue state identification method based on eye opening and closing state - Google Patents
- Publication number
- CN109145852B (application CN201811009416.4A)
- Authority
- CN
- China
- Prior art keywords
- driver
- fatigue
- eyes
- determining
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Ophthalmology & Optometry (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
- Eye Examination Apparatus (AREA)
Abstract
The invention discloses a driver fatigue state identification method based on eye opening and closing states, comprising the following steps. Step 1: collect a video of the driver, split it into frames, and determine the position of the driver's face. Step 2: locate the driver's eyes with a frame-difference method, detect the corner points on the boundary between the white of the eye and the eyeball with the Harris algorithm, and determine the exposed eyeball area S; when S ≤ 0.35·S₀, where S₀ is the eyeball area when the eye is fully open, the eye is judged to be closed. Step 3: determine the driving fatigue coefficient k. Step 4: when k ≤ 15%, the driver is awake; when 15% < k ≤ 50%, the driver is in first-level fatigue and should be reminded; when k > 50%, the driver is in second-level fatigue, an alarm should be given, and if the driver does not respond, the vehicle is forced to stop. The method extracts frames from the driver's driving video, determines the eyeball area to judge the eye opening and closing state, and determines the driver's fatigue state from the driving fatigue coefficient, giving a more accurate result.
Description
Technical Field
The invention relates to the technical field of driving safety, in particular to a driver fatigue state identification method based on eye opening and closing states.
Background
Driver fatigue detection technology is increasingly mature. Detection methods fall into three main categories: those based on driver behavior characteristics, on driver physiological parameters, and on vehicle behavior characteristics. Detection from driver behavior covers two aspects: facial feature changes and hand movement. Facial features mainly include head posture, eye state, and mouth state; hand movement mainly includes the force applied to the steering wheel and its rotation angle. Driver physiological parameters mainly include the electroencephalogram (EEG) and electrocardiogram (ECG), but their application is limited because the driver must wear the corresponding equipment during detection, which interferes to some degree with operating the vehicle. Vehicle behavior is mainly measured by sensing parameters such as steering wheel angle, vehicle speed, and steering angle.
In existing fatigue judgment based on facial feature changes, judgment via the mouth relies mainly on the degree of mouth opening; but when the driver speaks or laughs the mouth also opens wide, which degrades detection and reduces accuracy. Judgment via head posture relies mainly on nodding frequency; it requires establishing three-dimensional head coordinates with a point on the body as the base and applying projection transformations, and because the head tilts sideways under fatigue, the computation is heavy. Judgment via the degree of eye closure mainly uses two methods: counting the dark (pupil and iris) pixels of the eyeball, and recognizing the eye state under the PERCLOS rule. All three approaches judge fatigue in only a single form.
Disclosure of Invention
The invention designs and develops a driver fatigue state identification method based on the eye opening and closing state. It extracts frames from the driver's driving video, determines the eyeball area to judge the eye opening and closing state, and determines the driver's fatigue state from a driving fatigue coefficient, giving a more accurate result.
The technical scheme provided by the invention is as follows:
a driver fatigue state identification method based on eye opening and closing states comprises the following steps:
step 1: collecting a driver video, performing framing processing, and determining the face position of the driver;
step 2: locating the driver's eyes with a frame-difference method, detecting the corner points on the boundary between the white of the eye and the eyeball with the Harris algorithm, and determining the exposed eyeball area S; when S ≤ 0.35·S₀, where S₀ is the eyeball area when the eye is fully open, the eye is in the closed state;
step 3: determining the driving fatigue coefficient k, using one expression when T₀ > 0, another when T₀ < 0, and a third when T₀ = 0, wherein T₀ is the ambient temperature, T is the in-vehicle temperature, α is the rainfall, β is the snowfall, G is the ultraviolet intensity, f(v) and g(v) are speed functions, v is the vehicle speed, k is the driving fatigue coefficient, e is the base of the natural logarithm, N₁ is the number of closed-eye frames, and N is the total number of frames;
step 4: when k ≤ 15%, the driver is awake; when 15% < k ≤ 50%, the driver is in first-level fatigue and should be reminded; when k > 50%, the driver is in second-level fatigue, an alarm should be given, and if the driver does not respond, the vehicle is forced to stop.
Preferably, in step 1, a driver video is collected, each frame image is extracted, and after preprocessing the driver's face position is determined with the AdaBoost algorithm.
Preferably, in step 2, the image of the upper part of the driver's face is selected for differencing, and an image containing no moving object is used as the background image for the difference.
Preferably, detecting the corner points on the boundary between the white of the eye and the eyeball with the Harris algorithm comprises:
calculating the gradients Ix and Iy of each frame image I(x, y) in the x and y directions;
calculating the directional products of I(x, y): Ixx = Ix², Iyy = Iy², Ixy = Ix·Iy;
applying Gaussian weighting with a Gaussian window w to Ix², Iy², and Ix·Iy to obtain the elements A, B, and C of the matrix M;
calculating the Harris response value R for each pixel and setting R to zero where it is below a threshold t: R = {R : det M − α(trace M)² < t}, where det M is the determinant of the matrix M, trace M is the trace of the matrix M, and α is an empirical constant;
performing non-maximum suppression within a neighborhood and determining the corner points in the image.
Preferably, determining the exposed eyeball area comprises:
when the eye is fully open, determining the three corner points on the boundary between the white of the eye and the eyeball; the eyeball area S₀ when the eye is fully open is then:
where a, b, and c are the straight-line distances between adjacent corner points;
when the eye is partly closed, determining the point at the junction of the upper eyelid and the eyeball edge with the Harris algorithm; the eyeball area S is then:
where h is the straight-line distance between the points at the junction of the upper eyelid and the eyeball edge.
Preferably, when no corner points are detected in 5 consecutive frames, the driver is judged to be in the sleep state.
The invention has the following beneficial effects:
(1) The driver fatigue state identification method based on the eye opening and closing state extracts frames from the driver's driving video, determines the eyeball area to judge the driver's eye opening and closing state, and determines the driver's fatigue state from the driving fatigue coefficient, giving a more accurate result.
(2) The invention performs eye detection with the Harris algorithm, which can judge not only the driver's state but also the direction of head movement, and from the head movement direction can judge whether the vehicle is about to turn left or right. For fatigue judgment, only the area of the dark eyeball needs to be calculated, so the computation is small and the detection precision high; the large gray-level difference between the white of the eye and the dark eyeball keeps external interference small and improves the detection effect.
Drawings
Fig. 1 is a flowchart of a driver fatigue state identification method based on the eye opening and closing state according to the present invention.
Detailed Description
The present invention is described in further detail below with reference to the accompanying drawing so that those skilled in the art can implement it by referring to the description.
As shown in fig. 1, the present invention provides a method for identifying a fatigue state of a driver based on an eye opening/closing state, comprising the steps of:
Step 1: collect a driver video, split it into frames, and determine the driver's face position.
Each frame of the collected driver video is extracted and preprocessed to reduce noise interference and enhance the image, and the face is located with the AdaBoost algorithm.
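A minimal sketch of this preprocessing step is given below. Note the assumptions: the text only says the frames are preprocessed "to reduce noise interference and enhance image effect", so a simple grayscale conversion plus histogram equalization is used here as an illustration; the AdaBoost face detector itself would typically be a pretrained cascade (e.g., OpenCV's Haar cascade) and is not reproduced.

```python
import numpy as np

def preprocess(frame_rgb):
    """Grayscale conversion + histogram equalization for one video frame.
    This particular pipeline is an assumption, not specified by the patent;
    face detection (AdaBoost) would then run on the returned image."""
    gray = np.round(0.299 * frame_rgb[..., 0]
                    + 0.587 * frame_rgb[..., 1]
                    + 0.114 * frame_rgb[..., 2]).astype(np.uint8)
    # histogram equalization: remap intensities through the normalized CDF
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.clip(np.round((cdf - cdf_min) / max(cdf[-1] - cdf_min, 1) * 255),
                  0, 255).astype(np.uint8)
    return lut[gray]
```

The equalized grayscale frame stretches the intensity range, which helps both the later differencing and the Harris corner detection under uneven cabin lighting.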
Step 2: locate the driver's eyes with a frame-difference method.
To reduce computation and improve accuracy, the upper half of the face image is cropped for differencing, following the distribution of the facial features. When motion detection starts, a frame containing no moving object is selected as the background image; when a moving object appears, the current image is differenced against the background; when detection of that moving object ends, the background image is updated, and differencing resumes when the next moving object appears. The difference result removes part of the noise as well as the static background regions irrelevant to moving-object detection, and the background-update mechanism also adapts, to a degree, to changes in background and lighting. After differencing, only the moving object and some noise remain in the difference image, which is then filtered and denoised.
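The differencing-with-background-update scheme above can be sketched as follows. The specific threshold value and the "motion has ended" criterion (fraction of moving pixels) are assumptions for illustration; the patent does not give numeric values.

```python
import numpy as np

def detect_motion(current, background, thresh=25):
    """Difference the current grayscale frame against the background image
    and return a binary motion mask (the threshold value is an assumption)."""
    diff = np.abs(current.astype(np.int16) - background.astype(np.int16))
    return diff > thresh

def update_background(background, current, mask, motion_ratio=0.01):
    """Once the moving object has left (almost no motion pixels), take the
    current frame as the new background; the 1% criterion is an assumption."""
    if mask.mean() < motion_ratio:
        return current.copy()
    return background
```

In use, each cropped upper-face frame is differenced against the stored background, the mask is filtered and denoised, and the background is refreshed whenever the scene goes quiet again.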
Step 3: detect the corner points on the boundary between the white of the eye and the eyeball with the Harris algorithm, specifically as follows:
calculate the gradients Ix and Iy of each frame image I(x, y) in the x and y directions;
calculate the directional products of I(x, y): Ixx = Ix², Iyy = Iy², Ixy = Ix·Iy;
apply Gaussian weighting with a Gaussian window w to Ix², Iy², and Ix·Iy to obtain the elements A, B, and C of the matrix M;
calculate the Harris response value R for each pixel and set R to zero where it is below a threshold t: R = {R : det M − α(trace M)² < t}, where det M is the determinant of the matrix M, trace M is the trace of the matrix M, and α is an empirical constant;
perform non-maximum suppression within a neighborhood and determine the corner points (i.e., local maxima) in the image; the smaller the sliding window, the more accurately the corners are located, so a 3×3 window is used.
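The Harris steps above can be sketched in NumPy as below. One deviation, flagged as an assumption: the threshold is taken relative to the maximum response rather than the absolute t of the text, and sigma/alpha values are illustrative.

```python
import numpy as np

def harris_corners(img, sigma=1.0, alpha=0.04, rel_thresh=0.01):
    """Harris corner detection following the steps above:
    gradients -> directional products -> Gaussian weighting (A, B, C)
    -> response R = det M - alpha*(trace M)^2 -> threshold -> 3x3 NMS.
    A threshold relative to max(R) is used here; the text uses an
    absolute threshold t."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)                 # gradients along rows (y) and columns (x)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    # separable 1-D Gaussian window w
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    g /= g.sum()

    def smooth(a):
        a = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 0, a)
        return np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 1, a)

    A, B, C = smooth(Ixx), smooth(Iyy), smooth(Ixy)   # elements of M = [[A, C], [C, B]]
    R = (A * B - C * C) - alpha * (A + B) ** 2        # det M - alpha*(trace M)^2
    R[R < rel_thresh * R.max()] = 0                   # zero out weak responses

    # 3x3 non-maximum suppression: keep only local maxima
    corners = np.zeros_like(R, dtype=bool)
    for i in range(1, R.shape[0] - 1):
        for j in range(1, R.shape[1] - 1):
            if R[i, j] > 0 and R[i, j] == R[i - 1:i + 2, j - 1:j + 2].max():
                corners[i, j] = True
    return R, corners
```

On the eye region, the surviving maxima are the sclera–eyeball boundary corners used in the area computation of the next step.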
When no corner points are detected in 5 consecutive frames, the driver is judged to be in a sleep state.
Step 4: determine the exposed eyeball area S.
When the eye is fully open, determine the three corner points on the boundary between the white of the eye and the eyeball; the eyeball area S₀ when the eye is fully open is then:
where a, b, and c are the straight-line distances between adjacent corner points.
When the eye is partly closed, determine the point at the junction of the upper eyelid and the eyeball edge with the Harris algorithm; the eyeball area S is then:
where h is the straight-line distance between the points at the junction of the upper eyelid and the eyeball edge.
When S ≤ 0.35·S₀, the eye is in the closed state.
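The per-frame decisions above (eye closed when S ≤ 0.35·S₀; asleep after 5 consecutive frames without corner detections) can be sketched directly:

```python
from dataclasses import dataclass

CLOSED_RATIO = 0.35  # from the text: eye closed when S <= 0.35 * S0

def eye_closed(area_s, area_s0):
    """Frame-level eye state from the exposed eyeball area S and the
    fully-open area S0."""
    return area_s <= CLOSED_RATIO * area_s0

@dataclass
class SleepMonitor:
    """Counts consecutive frames in which no corner points were detected;
    5 such frames mean the driver is judged to be asleep."""
    limit: int = 5
    streak: int = 0

    def update(self, corners_found):
        """Feed one frame's detection result; returns True once asleep."""
        self.streak = 0 if corners_found else self.streak + 1
        return self.streak >= self.limit
```

Closed-eye frames counted by `eye_closed` accumulate into N₁ for the fatigue coefficient of the next step.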
Step 5: determine the driving fatigue coefficient k, using one expression when T₀ > 0, another when T₀ < 0, and a third when T₀ = 0,
where T₀ is the ambient temperature (°C), T is the in-vehicle temperature (°C), α is the rainfall (m), β is the snowfall (m), G is the ultraviolet intensity (a value between 0 and 15), f(v) and g(v) are speed functions, v is the vehicle speed (km/h), k is the driving fatigue coefficient, e is the base of the natural logarithm, N₁ is the number of closed-eye frames, and N is the total number of frames.
Step 6: when k ≤ 15%, the driver is awake; when 15% < k ≤ 50%, the driver is in first-level fatigue and should be reminded; when k > 50%, the driver is in second-level fatigue, an alarm should be given, and if the driver does not respond, the vehicle is forced to stop.
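The three-level decision of step 6 can be sketched as below. A caveat: the patent's full coefficient k also weights ambient temperature, rainfall/snowfall, ultraviolet intensity, and vehicle speed through expressions not reproduced in this text, so the closed-eye frame ratio N₁/N is used here as a simplified stand-in, for illustration only.

```python
def fatigue_level(closed_frames, total_frames):
    """Map the closed-eye frame ratio to the fatigue levels of step 6.

    k = N1 / N is a simplified placeholder for the patent's full
    coefficient, which additionally weights environmental terms."""
    k = closed_frames / total_frames
    if k <= 0.15:
        return "awake"
    if k <= 0.50:
        return "first-level fatigue: remind the driver"
    return "second-level fatigue: alarm; force a stop if no response"
```

For example, 30 closed-eye frames out of 100 fall in the 15–50% band and trigger the first-level reminder.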
The driver fatigue state identification method based on the eye opening and closing state extracts frames from the driver's driving video, determines the eyeball area to judge the driver's eye opening and closing state, and determines the driver's fatigue state from the driving fatigue coefficient, giving a more accurate result.
The invention performs eye detection with the Harris algorithm, which can judge not only the driver's state but also the direction of head movement, and from the head movement direction can judge whether the vehicle is about to turn left or right. For fatigue judgment, only the area of the dark eyeball needs to be calculated, so the computation is small and the detection precision high; the large gray-level difference between the white of the eye and the dark eyeball keeps external interference small and improves the detection effect.
While embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments; it is applicable in all fields to which it pertains, and further modifications can readily be made by those skilled in the art. The invention is therefore not limited to the details shown and described herein, within the general concept defined by the appended claims and their equivalents.
Claims (6)
1. A driver fatigue state recognition method based on an eye opening/closing state, characterized by comprising the steps of:
step 1: collecting a driver video, performing framing processing, and determining the face position of the driver;
step 2: locating the driver's eyes with a difference method, detecting the corner points on the boundary between the white of the eye and the eyeball with the Harris algorithm, and determining the exposed eyeball area S; when S ≤ 0.35·S₀, where S₀ is the eyeball area when the eye is fully open, the eye is in the closed state;
step 3: determining the driving fatigue coefficient k, with one expression when T₀ > 0, another when T₀ < 0, and a third when T₀ = 0, wherein T₀ is the ambient temperature, T is the in-vehicle temperature, α is the rainfall, β is the snowfall, G is the ultraviolet intensity, f(v) and g(v) are speed functions, v is the vehicle speed, k is the driving fatigue coefficient, e is the base of the natural logarithm, N₁ is the number of closed-eye frames, and N is the total number of frames;
step 4: when k ≤ 15%, the driver is awake; when 15% < k ≤ 50%, the driver is in first-level fatigue and should be reminded; when k > 50%, the driver is in second-level fatigue, an alarm should be given, and if the driver does not respond, the vehicle is forced to stop.
2. The driver fatigue state identification method based on the eye opening and closing state as claimed in claim 1, wherein in step 1 a driver video is collected, each frame image is extracted, and after preprocessing the driver's face position is determined with the AdaBoost algorithm.
3. The driver fatigue state identification method based on the eye opening and closing state as claimed in claim 1, wherein in step 2 the image of the upper part of the driver's face is selected for differencing, and an image containing no moving object is used as the background image for the difference.
4. The driver fatigue state identification method based on the eye opening and closing state as claimed in claim 3, wherein detecting the corner points on the boundary between the white of the eye and the eyeball with the Harris algorithm comprises:
calculating the gradients Ix and Iy of each frame image I(x, y) in the x and y directions;
calculating the directional products of I(x, y): Ixx = Ix², Iyy = Iy², Ixy = Ix·Iy;
applying Gaussian weighting with a Gaussian window w to Ix², Iy², and Ix·Iy to obtain the elements A, B, and C of the matrix M;
calculating the Harris response value R for each pixel and setting R to zero where it is below a threshold t: R = {R : det M − α(trace M)² < t}, where det M is the determinant of the matrix M, trace M is the trace of the matrix M, and α is an empirical constant;
performing non-maximum suppression within a neighborhood and determining the corner points in the image.
5. The driver fatigue state identification method based on the eye opening and closing state as claimed in claim 4, wherein determining the exposed eyeball area comprises:
when the eye is fully open, determining the three corner points on the boundary between the white of the eye and the eyeball; the eyeball area S₀ when the eye is fully open is then:
where a, b, and c are the straight-line distances between adjacent corner points;
when the eye is partly closed, determining the point at the junction of the upper eyelid and the eyeball edge with the Harris algorithm; the eyeball area S is then:
where h is the straight-line distance between the points at the junction of the upper eyelid and the eyeball edge.
6. The driver fatigue state identification method based on the eye opening and closing state as claimed in claim 5, wherein the driver is judged to be in the sleep state when no corner points are detected in 5 consecutive frames.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811009416.4A CN109145852B (en) | 2018-08-31 | 2018-08-31 | Driver fatigue state identification method based on eye opening and closing state |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109145852A CN109145852A (en) | 2019-01-04 |
CN109145852B true CN109145852B (en) | 2022-06-17 |
Family
ID=64825878
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811009416.4A Active CN109145852B (en) | 2018-08-31 | 2018-08-31 | Driver fatigue state identification method based on eye opening and closing state |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109145852B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109703340B (en) * | 2019-02-12 | 2022-02-08 | 合肥京东方光电科技有限公司 | Sun shield, automobile and adjusting method of sun shield |
CN109878304B (en) * | 2019-03-29 | 2021-04-09 | 合肥京东方光电科技有限公司 | Sun visor, sun visor control method and automobile |
JP7127661B2 (en) * | 2020-03-24 | 2022-08-30 | トヨタ自動車株式会社 | Eye opening degree calculator |
CN111741250A (en) * | 2020-07-07 | 2020-10-02 | 全时云商务服务股份有限公司 | Method, device and equipment for analyzing participation degree of video conversation scene and storage medium |
CN113449670B (en) * | 2021-07-09 | 2022-04-15 | 浙江正元智慧科技股份有限公司 | Drowsiness detection method based on human eye state |
CN113591682B (en) * | 2021-07-28 | 2024-09-24 | 地平线(上海)人工智能技术有限公司 | Fatigue state detection method, fatigue state detection device, readable storage medium, and electronic device |
CN113703335A (en) * | 2021-10-27 | 2021-11-26 | 江苏博子岛智能产业技术研究院有限公司 | Intelligent home brain control system based on internet of things and provided with brain-computer interface |
CN116469085B (en) * | 2023-03-30 | 2024-04-02 | 万联易达物流科技有限公司 | Monitoring method and system for risk driving behavior |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102054163A (en) * | 2009-10-27 | 2011-05-11 | 南京理工大学 | Method for testing driver fatigue based on monocular vision |
CN107943061A (en) * | 2018-01-09 | 2018-04-20 | 辽宁工业大学 | A kind of model automobile automatic Pilot experimental provision and method based on machine vision |
CN108309311A (en) * | 2018-03-27 | 2018-07-24 | 北京华纵科技有限公司 | A kind of real-time doze of train driver sleeps detection device and detection algorithm |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7855743B2 (en) * | 2006-09-08 | 2010-12-21 | Sony Corporation | Image capturing and displaying apparatus and image capturing and displaying method |
CN107292251B (en) * | 2017-06-09 | 2020-08-28 | 湖北天业云商网络科技有限公司 | Driver fatigue detection method and system based on human eye state |
CN206914227U (en) * | 2017-06-22 | 2018-01-23 | 辽宁工业大学 | A kind of steering wheel deviates alarm set |
- 2018-08-31: application CN201811009416.4A filed in China; granted as patent CN109145852B (status: Active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102054163A (en) * | 2009-10-27 | 2011-05-11 | 南京理工大学 | Method for testing driver fatigue based on monocular vision |
CN107943061A (en) * | 2018-01-09 | 2018-04-20 | 辽宁工业大学 | A kind of model automobile automatic Pilot experimental provision and method based on machine vision |
CN108309311A (en) * | 2018-03-27 | 2018-07-24 | 北京华纵科技有限公司 | A kind of real-time doze of train driver sleeps detection device and detection algorithm |
Non-Patent Citations (7)
Title |
---|
A fast driver fatigue detection method; Jiang Wenbo et al.; Electronic Design Engineering; 2015-12-05; Vol. 23, No. 23; Sections 1 and 2.4 *
Research on fatigue driving detection based on facial key points; Huang Jiacai et al.; Journal of Nanjing Institute of Technology (Natural Science Edition); 2017-12-15, No. 4; Abstract *
Research on a fatigue early-warning system based on eyeball motion state detection; Zhang Zhiwen et al.; Computer & Digital Engineering; 2016-02-20, No. 2; Abstract *
Human-eye localization and template extraction based on accumulated frame difference; Shu Mei et al.; Journal of Xihua University (Natural Science Edition); 2008-11-15; Vol. 27, No. 6; Section 1.1 *
Analysis of casualty-evacuation missions by vehicles in harsh environments; Zhang Junhui et al.; Science and Technology Innovation Herald; 2018-04-01, No. 10; Abstract *
A brief analysis of driver fatigue driving monitoring technology; Ying Jianming; Driving Garden; 2009-09-15, No. 9; Abstract *
Research progress on causes and monitoring of driving fatigue; Xiao Sai et al.; Transportation Technology and Economy; 2017-08-05; Vol. 19, No. 4; Sections 2.2 and 4 *
Also Published As
Publication number | Publication date |
---|---|
CN109145852A (en) | 2019-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109145852B (en) | Driver fatigue state identification method based on eye opening and closing state | |
CN104616438B (en) | A kind of motion detection method of yawning for fatigue driving detection | |
CN108053615B (en) | Method for detecting fatigue driving state of driver based on micro-expression | |
Tian et al. | Recognizing action units for facial expression analysis | |
Fuhl et al. | Eyes wide open? eyelid location and eye aperture estimation for pervasive eye tracking in real-world scenarios | |
CN102696041B | System and method for cost-effective and robust eye tracking and driver drowsiness confirmation | |
CN108875642A (en) | A kind of method of the driver fatigue detection of multi-index amalgamation | |
CN109840565A (en) | A kind of blink detection method based on eye contour feature point aspect ratio | |
CN111616718B (en) | Method and system for detecting fatigue state of driver based on attitude characteristics | |
CN112434611B (en) | Early fatigue detection method and system based on eye movement subtle features | |
CN110728241A (en) | Driver fatigue detection method based on deep learning multi-feature fusion | |
CN108596087B (en) | Driving fatigue degree detection regression model based on double-network result | |
CN106650635B (en) | Method and system for detecting viewing behavior of rearview mirror of driver | |
CN106919913A (en) | Method for detecting fatigue driving and device based on computer vision | |
CN109711239B (en) | Visual attention detection method based on improved mixed increment dynamic Bayesian network | |
CN109740477A (en) | Study in Driver Fatigue State Surveillance System and its fatigue detection method | |
CN111460950A (en) | Cognitive distraction method based on head-eye evidence fusion in natural driving conversation behavior | |
Luo et al. | The driver fatigue monitoring system based on face recognition technology | |
CN113989788A (en) | Fatigue detection method based on deep learning and multi-index fusion | |
Nakamura et al. | Driver drowsiness estimation from facial expression features computer vision feature investigation using a CG model | |
CN108921010A (en) | A kind of pupil detection method and detection device | |
CN112069986A (en) | Machine vision tracking method and device for eye movements of old people | |
CN114241452A (en) | Image recognition-based driver multi-index fatigue driving detection method | |
CN111104817A (en) | Fatigue detection method based on deep learning | |
CN113408389A (en) | Method for intelligently recognizing drowsiness action of driver |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |